• Bike Counts App: Helping SABA Handle Bike Audits

Two weeks ago, Code for Sacramento hosted Jim Brown, Executive Director of the Sacramento Area Bicycle Advocates (SABA). Jim came to us a few weeks back with a problem: SABA conducts quarterly bike counts on Sacramento streets. These counts require volunteers to watch an intersection for two hours, count every bicycle that enters it, and record certain demographic information. Until now, the counts were handled mainly on paper, with sheets that had to be tallied and deciphered, then entered into a database. Because this is a volunteer effort, consistency in data collection has been an issue.

  • Unlocking Value in Health by Freeing the Data

The California Health and Human Services (CHHS) Agency has taken the forward-thinking step of releasing a number of datasets on its Open Data Portal. As of this writing, the portal contains 73 datasets, along with 43 associated charts and 14 maps: more than 100 pieces of interesting health care information available to Californians. The data released so far includes population health indicators, health care facility performance, the availability of health care in communities, environmental chemicals, social indicators, and societal behaviors. And much more is coming, with encouragement from the California HealthCare Foundation’s Free the Data Initiative.

  • Roll Call

    #roll call for weekly meetup 2015-03-18

  • Jekylling Jekyll with Jekyll

Code for Sacramento is a couple of weeks into using our new Jekyll-based website, and we figured it would be an awesome way to introduce members to contributing to open source projects and working with code! Here are a few reasons you should contribute, or learn to contribute, to the Code for Sacramento website:

  • Webscraping with R

At the last meetup, Adam Kalsey gave a great presentation on scraping data from the web, illustrating the ideas with his beer-loving Twitterbot @sactaps. In this tutorial, I will show you how to scrape data from the web in R using the rvest package (GitHub link).
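The full tutorial uses R's rvest, but the core idea is language-agnostic: fetch a page's HTML, select the elements you care about, and pull out their text. As a rough sketch of that idea using only the Python standard library (the HTML snippet and the `beer` class name are made up for the example, not taken from the tutorial):

```python
from html.parser import HTMLParser

class BeerListParser(HTMLParser):
    """Collect the text of every <li class="beer"> element."""

    def __init__(self):
        super().__init__()
        self.in_beer = False
        self.beers = []

    def handle_starttag(self, tag, attrs):
        # Turn collection on when we enter a matching <li>.
        if tag == "li" and ("class", "beer") in attrs:
            self.in_beer = True

    def handle_endtag(self, tag):
        if tag == "li":
            self.in_beer = False

    def handle_data(self, data):
        if self.in_beer and data.strip():
            self.beers.append(data.strip())

# A made-up page standing in for HTML fetched from a tap-list site.
page = """
<ul>
  <li class="beer">Pliny the Elder</li>
  <li class="beer">Racer 5</li>
  <li>Not a beer</li>
</ul>
"""

parser = BeerListParser()
parser.feed(page)
print(parser.beers)  # ['Pliny the Elder', 'Racer 5']
```

In rvest the same selection step is a CSS selector rather than a hand-written parser, which is what makes it so pleasant for this kind of work.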

subscribe via RSS