While preparing an invited talk on Shiny, I organized my experience and notes on reactive programming, and found that the storyline I developed may be a good alternative to the usual tutorials on this topic. So I'm expanding the talk slides into a blog post and sharing it here.
You can customize Shiny to a much greater extent once you know that Shiny UI functions simply generate HTML. For example, you can make a link button with creative use of Shiny functions.
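A minimal sketch of the idea, assuming the `shiny` package is installed (the `"go"` id, button label, and the `https://example.com` URL are placeholders):

```r
library(shiny)

# A Shiny UI function is just a tag builder: printing it as a
# character string reveals the HTML it generates.
cat(as.character(actionButton("go", "Run")))

# A "link button": an <a> tag given Bootstrap button classes,
# so it looks like a button but navigates like a plain link.
link_button <- tags$a(
  href  = "https://example.com",  # placeholder target URL
  class = "btn btn-default",
  "Open example"
)
cat(as.character(link_button))
```

Because every UI function returns a tag object, the same trick works for mixing hand-written HTML with standard Shiny widgets.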
This is a summary of my experience synchronizing colors across multiple ggplots of the same dataset.
- My experience with Carto.com in creating web maps for data analysis
- I wrote an R package to wrap Carto.com API calls
- Some notes from my experience managing gigabyte-scale data for mapping
Recently I found that RStudio began to provide an addin mechanism. The examples looked simple and the addin API easy to use, so I immediately started writing one myself. It would be a good practice project for writing an R package, and I could implement some features I wanted that are not on RStudio's high-priority list.
- This is my second post on the topic of data cleaning.
- Cleaning address formats turned out to have a substantial positive impact on geocoding performance.
- A deep understanding of the address format standard is needed to handle all kinds of special cases.
- Data cleaning is a cumbersome but important task in real-world data science projects.
- This is a discussion of my data cleaning practice for the NYC Taxi Trip data.
- A lot of domain knowledge, common sense, and business thinking is involved.
- I discuss all the problems I met, the approaches I tried, and the improvements I achieved in the geocoding task.
- There are many subtle details, some open questions, and areas that can be improved.
- The final working script and complete workflow are hosted on GitHub.
- This post discusses the background, approaches, and Windows and Linux environment setup for my geocoding task.
- See the next post for more details about the script and workflow.
- This is a write-up of my volunteer data science project for the American Red Cross.
- The project used public data to help the Red Cross direct limited resources to the homes most vulnerable to fire risk and loss.
- My work on the project:
- Discovered hidden information in the NFIRS dataset; obtained and analyzed 10 GB of NFIRS data.
- Made major contributions to the model design, implemented NFIRS-related predictors, and built interactive visualizations.
- Geocoded 18 million addresses in the NFIRS data using an AWS server, PostgreSQL, and PostGIS.