Hackathon Ruby

BattleHack 2013 World Finals at PayPal Headquarters

At the end of September, I won the BattleHack competition in Austin with PayPup – a mobile app for our local no-kill animal rescue, Austin Pets Alive. This was a project I’d been thinking about doing for over a year, but I needed that 24 hours to sit down and do it – and bringing PayPal into the mix for donations was the missing element to make it worthwhile to everyone.

As a result of winning the local competition, I got an entry into the BattleHack world finals in San Jose, California on November 16 and 17. This was an amazing opportunity to go head to head with some of the most talented hackathon developers in the world – competing teams would show up from ten different cities: Moscow, Tel Aviv, London, Berlin, Barcelona, New York, Miami, Seattle, Washington, DC, plus myself from Austin. All of the other teams had at least two people, and most had three or four developers or designers. I was competing solo, as I had basically known what I wanted to build at BattleHack Austin and hadn’t formed a team.

I didn’t really think too much about my idea for the world finals, except a loose concept around fundraising for disaster relief, based on the flooding that happened in southeast Austin on Halloween this year. Then the typhoon struck the Philippines, with incredible amounts of damage and loss of life, and my friend Ronnie headed over to the Philippines to organize disaster recovery for the area where her family is from. They had a ton of damage there, and it’s not one of the big cities, so it won’t be one of the first areas cleaned up by the government or outside relief organizations. She asked for money from family and friends, and then kept us informed with updates on Facebook as to how things were going in Manila and then in the disaster area. This started to crystallize the ideas I wanted to build!

After flying up from Austin to San Francisco, I had my first-ever driver waiting for me with my name on his tablet – usually I wouldn’t have even looked for those guys! The black car ride down to San Jose was pretty uneventful, and I checked in to the hotel around 3pm (meeting the great team from DC in the lobby) to get ready for the video pitch at 4:10pm. I figured that our video pitches would be shown on a screen somewhere at some point during the weekend, as we weren’t getting judged on the pitches themselves, or even on whether we built what we pitched. We had ten minutes of time in front of the camera to give a one-minute pitch (all the way through, not edited) – I used three takes, after writing down some notes in my hotel room right before the event.

The next item on the agenda was to meet all the other developers at a dinner in our hotel that night – we’d also be meeting the PayPal developer evangelists we’d be working with that weekend. Overall, it was an extremely friendly event – I was a little worried that with $100,000 on the line for the winners, the other teams would have their guards up, whether they were being rude or just unfriendly, but it was basically the opposite. That really impressed me.

After dinner, we had a little free time to get ready for the next day (which started at 9:45am with a bus over to PayPal headquarters). I was exhausted after getting up at 6am to catch my flight from Austin, and I can only imagine how tired the developers from overseas were – apparently Team Moscow felt pretty good, because they rented a car and drove up to the Golden Gate bridge that night!

Early the next morning, I woke up and went for a 4 mile run on San Jose’s brand new Guadalupe River trail, which was a great way to start the morning off. It was just above freezing, but the day was sunny and clear.


For my birthday (which was that Sunday), my wife Cheri got me a lucky Hacker Periodic Table of the Elements shirt – can’t go wrong combining programming with chemistry, in my mind.


The event really started to kick off after we made our way to PayPal headquarters – the Austin BattleHack was really well organized, but the world finals were over the top: individual video stations playing video from each city’s hackathon, a bar, a DJ station, a hack room with individual tables for each team, and an area set up for the announcements.


Once we got settled in and ready to go, I basically wasted two hours trying to decide whether I wanted to build my site with Rails 3.2 or Rails 4, and then whether or not to use Rails Composer. More to the point, I didn’t have a strong idea of what I wanted to do, or how I was going to accomplish it, so I lost a lot of time up front. This wasn’t smart! All of the other teams seemed to be building mobile apps, so I went with a responsive web app – even though I mostly do mobile app development these days, part of what I want to do at this hackathon (and every hackathon) is stretch my development skills a little bit further.

At the BattleHack in Austin, I did my first Windows Phone 8 app in C#, and used Windows Azure Mobile Services as a backend for the first time. This time around, I’d use Rails 4 – not vastly different from Rails 3.2, but little things like attr_accessible vs strong parameters were new. Luckily turbolinks appeared to cause absolutely no problems.
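To illustrate that change, here’s a rough sketch of the Rails 3.2 vs Rails 4 whitelisting styles (the Post model and its attributes are made up), along with the underlying idea in plain Ruby:

```ruby
# Rails 3.2 whitelists mass-assignable attributes on the model:
#   class Post < ActiveRecord::Base
#     attr_accessible :title, :body
#   end
#
# Rails 4 moves the whitelist into the controller with strong parameters:
#   def post_params
#     params.require(:post).permit(:title, :body)
#   end
#
# The underlying idea, in plain Ruby: filter untrusted input down to allowed keys.
def permit(params, *allowed)
  params.select { |key, _| allowed.include?(key) }
end

raw = { "title" => "Hello", "body" => "World", "admin" => true }
permit(raw, "title", "body")  # => { "title" => "Hello", "body" => "World" }
```

Either way, the point is the same: a malicious `admin=true` in the request parameters never reaches the model.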

After futzing around with Rails installs,  scaffolding, and the paperclip gem for storing images on S3, I decided to clear my head of dev problems and switch over to design. As a one-man team, I couldn’t just rely on a graphic designer or front-end developer, so I had to conceptualize what I wanted as a responsive design. I was able to build my page up using Twitter Bootstrap 3 and DesignModo’s free Flat UI Kit in the browser, getting a static HTML prototype of how I wanted the key page of my site to work. Because I designed in the browser, it wasn’t a lot of work to split this static page out into an application layout and a content page template to get it back into Rails.


PayPal did an excellent job of feeding us – the hackathon actually stopped the clock for a formal dinner at 5:30pm, where the hackathon judges and mentors rotated between tables.


In addition to this, later that night, a build-your-own pho place rolled into the hackathon area, where you could assemble your own bowl out of noodles, vegetables, broth, and sauces. Yum!


I didn’t want to sleep, but I was exhausted. Part of PayPal headquarters was turned into a sleeping area, with a cot set up for each team with a PayPal developer blanket. I wasn’t sure how the teams with more than one person were going to handle the cot situation, but it didn’t seem to be a problem for any of them.


By this point, I had the basics of the web site down, so I switched to integrating two key hackathon partner APIs: Twilio and SendGrid. I know that both of these are extremely easy to add in a superficial way, but my web app actually depended heavily on Twilio and SendGrid integration.

In a disaster area, you’re not going to have strong wifi, a reliable 4G internet network, or possibly even access to your own computer or mobile device. Because of this, I wanted users to be able to update their stories using SMS, voicemail, or email updates. This required writing hooks for incoming email using SendGrid’s webhooks, and incoming SMS and voice using Twilio. In addition, I used the Twilio Client to provide a voice recording feature directly from the web browser, for areas where there may have been available wifi (perhaps through satellite uplink) but no reliable voice infrastructure.
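The SMS hook boils down to reading Twilio’s POST parameters and answering with TwiML. A minimal sketch – the Story model and its lookup are hypothetical, while the From/Body parameters and the TwiML reply format are standard Twilio:

```ruby
# Build a TwiML reply acknowledging an incoming SMS update.
def sms_reply_twiml(message)
  '<?xml version="1.0" encoding="UTF-8"?>' \
  "<Response><Message>#{message}</Message></Response>"
end

# In the controller action Twilio POSTs to, roughly:
#   story = Story.find_by_phone(params["From"])    # hypothetical lookup by sender
#   story.updates.create(body: params["Body"])     # save the SMS text as an update
#   render xml: sms_reply_twiml("Got it - your update was posted.")
```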

Voice was a key part of the application, because I wanted people to be able to record messages for their loved ones – their friends and families would love to hear from them directly that they are ok. Twilio also does a decent job of transcribing those recordings, so I included those as status updates.
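The voicemail leg is similar TwiML – a Record verb with transcription turned on, so Twilio posts the transcription text back once it’s ready. A sketch (the greeting wording and callback path are made up; the Record attributes are real TwiML):

```ruby
# TwiML served to Twilio when someone calls in to leave a voice update.
# Twilio records up to maxLength seconds, transcribes the audio, and
# POSTs the transcription to the transcribeCallback URL.
def record_twiml
  '<?xml version="1.0" encoding="UTF-8"?>' \
  '<Response>' \
  '<Say>Please record your update after the beep.</Say>' \
  '<Record maxLength="120" transcribe="true" transcribeCallback="/transcriptions"/>' \
  '</Response>'
end
```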

Somehow I struggled through to the morning, and I basically had the core of the app done – integrations with SendGrid, Twilio, and PayPal, a Google Map on the home screen, and a compose new update feature that took images, YouTube videos, or links and inserted them on the screen using AJAX. I gave a practice pitch at 10:30 am, and got some excellent feedback on leading with the SMS feature, which I incorporated into my final pitch.

I really just worked on testing and polishing the app for the next two and a half hours. I also took several walks around the building to practice giving my pitch, first to myself, and then over the phone to my mom – I figured if she could understand what I was pitching with no visuals, it would be pretty clear to the judges and audience.

With an hour left, I was basically done. There were things I could improve with the app, but they wouldn’t fit into the demo, so I didn’t bother with them for fear of breaking the hack.

The final presentations went smoothly – I was thrilled to be third, so I wouldn’t have the pressure of leading off, but I also wouldn’t be nervous throughout the whole presentation by going last – after all, there was $100,000 on the line! Some of the teams appeared to go over their five-minute limit and were waved off the stage, but I ended my presentation with about 7 seconds to spare. The questions from the judges were all well thought out – Sarah Austin asked if you could set up a disaster story over SMS, which I hadn’t thought of before, but would actually be an excellent idea. Jeff Lawson from Twilio asked if I was going to support MMS, and the app could easily support MMS as soon as Twilio opens it up for long telephone numbers! Last, I was asked what technologies I used to build the application (Rails 4, Bootstrap 3, SendGrid, Twilio, Heroku).

The other teams all gave interesting presentations – I liked that none of the teams seemed to be doing the same things, so there was a good cross-section of ideas – NFC, 3-D technology, image recognition, cross-platform mobile apps, and hardware hacks. Some were more practical than others, and some seemed to stretch the limits of available technology. I thought that David Marcus – one of the judges, and President of PayPal – did an excellent job of asking how some of the apps would be used by customers or where they fit into the landscape.

Team London
Team Seattle (and Macklemore)
Team Moscow (with the winning project, DonateNow)


The period in between when the finalists all presented and when the prizes were announced was really stressful – finally we were called back in for the final judging, and I was thrilled to see that my project had won both the Twilio and SendGrid prizes! The Microsoft and Nokia prizes went to a strong team out of Seattle that demo’d Kinect-based technology, while the first, second, and third prizes went to Team Moscow, Team Tel Aviv, and Team Miami, respectively.

This turned out to be a very crazy weekend – out to California on Friday, back to Austin on Monday, and teaching my iOS development class Tuesday night – so it took a little while to get back up to speed. I thought the best part of the hackathon was meeting all of the developers on the other teams – I really enjoyed sitting in between two teams of college students (Team DC from Georgetown University, and Team London from the Hackathon Society at the University of Nottingham), and being able to talk about the project with the other solo developer, Jurre from Team Berlin (Team Berlin’s other member was on the business side). I also enjoyed meeting all of the PayPal developer evangelists (Robert, Jonathan, Tim, and Cristiano), along with Kunal from SendGrid and Joel from Twilio.


Quickstart: Running Rails on your own Digital Ocean VPS

I’ve been working on a couple different web application projects using Rails 3.2, and they’re now at the point where they need to go onto a publicly accessible development/staging/production platform.

The quickest, most hassle-free solution is to use Heroku for deployment – create a Heroku web application from the command line, git push heroku master, and you’re live! There’s something to be said for using Heroku for a development environment – you can easily push copies of your server, it’s free if you only use one dyno and no SSL, and they handle all of the maintenance and operations.

Unfortunately, once you look at taking your web application up just one level towards production (two dynos, so Heroku doesn’t spin down your application, a database with more than 10K rows, and SSL), you’re looking at $65/month. Not a big deal once the project gets going, but if you’re launching a lot of web services, it will add up.

I’ve had good luck with hosting on DigitalOcean, a VPS provider that uses SSD disks for its servers. The lowest cost plan is $5/month for 512 megs of RAM and a 20-gig SSD. I really like the performance of a 512-megabyte VPS I’m currently using to host a couple of PHP/MySQL projects. I moved those sites over from another VPS provider who was acquired by a larger company, and I’ve gotten much better performance for $5/month than I was for $25/month.

Previously, it had been a pain to install RVM, nginx, and Unicorn all together on Ubuntu – I’d tried messing with Vagrant, Chef recipes, Passenger, and a bunch of other half-baked devops scripts. I find it a little strange that no one’s done a good job writing a simple Ubuntu->RVM devops setup script with sensible defaults, but that’s fine. I ended up setting up my first DigitalOcean Rails VPS with Passenger, which wasn’t too bad, but it wasn’t easily scalable.

In the meantime, DigitalOcean released an RVM/Rails image (with MySQL, but you can easily install PostgreSQL instead if you want), so you can click a button and get a fully provisioned Rails/RVM/nginx/unicorn/MySQL server in 60 seconds after signing up with DigitalOcean.

What’s missing there is a deployment story – if you just want to hack on the server, great, but that doesn’t scale either. The easiest way to deploy Rails sites is Capistrano, which is covered in this fantastic tutorial (How to Deploy a Rails 4 App with git and Capistrano by Rob McLarty). Rather than recap everything in that tutorial, I’ll let you go through it.

Of course that tutorial doesn’t exactly match the setup that DigitalOcean uses for its Rails droplet, so I had to make a few changes to support RVM and the Capistrano way of deploying apps using current and releases directories.

I’m not a huge fan of RVM, and you’re going to run into problems here because RVM is loaded by a shell script on interactive logins, not on the non-interactive logins that Capistrano uses.

You’ll need to modify the deploy.rb Capistrano file from the above tutorial for the default DigitalOcean setup. I also changed the shell used in the above tutorial to bash. It’s extremely important to set the bundle command used, or you’ll get lots of errors about the bundler gem not being installed. Here’s the first part of my deploy.rb file with my changes.

require 'bundler/capistrano'
$:.unshift(File.expand_path("./lib", ENV["rvm_path"]))
require 'rvm/capistrano'

default_run_options[:shell] = 'bash'
set :rvm_ruby_string, "ruby-1.9.3-p429"
set :rvm_type, :user
set :bundle_cmd, 'source $HOME/.bash_profile && bundle'

I also included the RVM Capistrano plugin, and specified the RVM ruby string to use, though that could probably be taken out for most projects.

The other place you’ll run into problems using Capistrano and DigitalOcean’s Rails provider is in setting up the nginx and unicorn directories.

Modify the /etc/nginx/sites-enabled/default file to change the location of your Rails app from /home/rails to /home/rails/current

Modify two unicorn files:

In /home/unicorn/unicorn.conf, change the working directory to /home/rails/current

In /etc/default/unicorn, change the APP_ROOT to /home/rails/current
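If you’d rather script those three edits than open each file by hand, something like this should work on the stock droplet – but note that sed replaces every occurrence of the path, so double-check each file afterwards:

```shell
# Point the droplet's stock nginx and unicorn configs at Capistrano's current/ directory.
sed -i 's#/home/rails#/home/rails/current#g' /etc/nginx/sites-enabled/default
sed -i 's#/home/rails#/home/rails/current#g' /home/unicorn/unicorn.conf
sed -i 's#/home/rails#/home/rails/current#g' /etc/default/unicorn
```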

Run Capistrano to deploy the Rails app: cap deploy

From the server, restart unicorn and nginx, and your server should be pointing to your new Rails app!

service unicorn restart
service nginx restart

Let me know if this how-to helps you sort out deploying Rails apps to DigitalOcean! If you have questions or feedback, or if something else needs to be included, either leave a comment here or contact me directly.

Hackathon Ruby Twilio

Talxer: Talk to your e-commerce customers – AngelHack Austin 2013

I was lucky enough to enter quite a few hackathons last year – including API Hack Days in Austin and Dallas, the Via.Me Back to the Future Hackathon and the TwilioCon Hackathon (both in San Francisco), and a DrupalCon Twilio Hackathon in Denver. For the most part I enjoy them, though it is hard to bring a project out of a hackathon into an actual shipping product.

This year, I haven’t had a chance to do any, so I jumped on the opportunity to enter the AngelHack hackathon held at Mass Relevance in downtown Austin.

I actually had a somewhat well planned out idea going into the hackathon about what I wanted to accomplish – a project that would solve problems for two clients I have right now, and something I, as a developer, would pay for myself. It wouldn’t particularly involve integration with any of the sponsors of the hackathon, but for this hackathon, that wasn’t really the focus.

Instead, I ran into Saranyan from Bigcommerce, one of the sponsors of AngelHack Austin. We’d met at Circus Mashimus during SXSW this year when I’d shown up to demo another hackathon project (RV Trip Companion). He mentioned that he’d been talking to Keith Casey from Twilio about how to tie voice into Bigcommerce, so store owners can provide click-to-call to their customers. This sounded like the kind of project I could make a pretty deep dent into in 24 hours, so I switched gears.

Starting with Rails Composer, I built a Software-as-a-Service project in Rails that used Stripe for billing. I then set up the site on Heroku, as that can get a little tricky with Rails Composer web apps. Next was integrating Twilio using some hand-written TwiML as ERB files – I’ll probably replace those with their TwiML builder, but they worked for now. After that, it was time to integrate the Bigcommerce gem, and start to do some customer and order lookup – their API is pretty RESTful, and they have a developer console, so it wasn’t too hard. All of the attributes I wanted to query by were exposed. A little more TwiML, and I had a toll-free hotline ready to go for my e-commerce stores.
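As a sketch of the TwiML-as-ERB approach – the template filename, greeting text, and phone number here are made up, but the Say and Dial verbs are standard TwiML:

```ruby
require "erb"

# In the app, this template would live somewhere like app/views/calls/connect.xml.erb.
template = <<-TWIML
<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Say><%= greeting %></Say>
  <Dial><%= store_phone %></Dial>
</Response>
TWIML

greeting    = "Thanks for calling the store - connecting you now."
store_phone = "+15125550100"

# Rails would render this for you; standalone, ERB does the same job.
puts ERB.new(template).result(binding)
```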

Next to build was a click-to-call widget, powered by Twilio’s web client. I knew I’d have to serve a Twilio capability token out of my server up to a third party web site (the e-commerce store), so I built some code that provided the widget functionality as an external JavaScript file served off of my platform to be embedded into the Bigcommerce store.

Integrating with the front-end of Bigcommerce was a little tougher – they use a custom templating/theming engine that you modify chunks of and then upload to your e-commerce store over DAV (think FTP). Digging into it, it wasn’t obvious at first where to put the JavaScript, but I dug through things, and hacked in some JavaScript, some CSS, and my new call and hangup buttons. I got things to look pretty in the default theme using my CSS skills (margin-top:-7px, for instance).

At the same time, I also put together a VoiceBunny project for a full custom voice-over for all of the spoken content for my site. This made everything voice-related feel much more professional, and because VoiceBunny was one of the sponsors, they had a discount code for AngelHack. The voices came in with about 3 hours to go, and I was able to integrate them into Twilio.

I ran into a couple of problems, including wasting time trying to get Heroku to use Ruby 1.9.3 instead of 1.9.1 or 1.9.2 (I just commented out SecureRandom and moved on instead), killing my Stripe integration by accidentally putting another copy of jQuery into my Rails asset pipeline, and using rbenv with Rails Composer instead of rvm (luckily I’d already mostly battled this one out on another project). Another, bigger problem, was that I hosted my MP3 files for the voice over on a basic shared hosting provider, and Twilio’s caching algorithms would work much better if they were served out of something that properly handles HTTP caching (such as the Rails asset pipeline). These collectively ate up time that I could have spent polishing the site or adding new functionality, but that always happens in hackathons.

I ended up winning the Bigcommerce prize, and the VoiceBunny prize for best use of their respective technologies at AngelHack ATX. I’ve definitely got some hard-coding to take out, and some rough edges to polish off before the app is ready to go out to customers, but it’s a pretty good start for a 24 hour project.

Thanks to Mass Relevance for hosting the event, Damon, Wade, and the other organizers, the judges for spending their Sunday afternoon in a conference room, and to all the sponsors (TokBox, Amazon Kindle Fire, Bigcommerce, VoiceBunny, Elance) for making it happen!

Ruby Social Media Analytics Talks

Ruby Talk from 2011: Twitter Streaming API + MongoDB

I gave a talk at the Lone Star Ruby Conference in 2011 on Consuming the Twitter Streaming API with MongoDB. The talk was recorded, but just recently uploaded to YouTube. This was one of the first talks I gave at a professional conference, instead of just a meetup, so I practiced in front of a few of my software developer friends who didn’t know Ruby, MongoDB, or the Twitter Streaming API. Watching this, I’ve definitely improved my speaking style over the past two years, but that’s more than ok.

For those of you following along, the code referred to in the presentation is Tweeter Keeper, at GitHub here:

Ruby Twilio

Building an Unconference App For RailsConf with Twilio and Twitter: Guest Post at Twilio

Earlier this year, I built a little web application in Ruby to help out with RailsConf 2012 – basically a gateway between Twitter and SMS using Twilio. Twilio thought it was a pretty cool hack, and they wanted to make sure that the code didn’t just get lost in austinonrails’ GitHub account, so they asked if I could write up how I did it for their blog – read it over at Building an Unconference App For RailsConf with Twilio and Twitter.

Here’s an excerpt:

How do you keep everyone at a mega-conference up to date about what’s going on in a community-organized unconference that’s running side-by-side? That’s the problem Austin on Rails set out to solve when RailsConf 2012 came to our city.

What’s an unconference? It’s a participant-led conference – instead of having speakers apply to speak months ahead of time, and having committees set the schedule in advance, unconferences are more fluid. This is great for a fast-moving technology like Rails, where you may have put something up on GitHub last night, and you can show it off the next day! Of course, the problem with this format is – how do you let people know what’s going on?

Austin on Rails took on the responsibility of organizing the unconference, BohConf. We could have hack sessions, talks, or birds-of-a-feather going on. Based on previous BohConf/RailsConfs, one of the biggest complaints was that people would have gone to a BohConf happening if only they’d known about it!

Read more over at Twilio – Building an Unconference App For RailsConf with Twilio and Twitter


Rails equivalent of pathauto in Drupal for SEO Friendly URLS

One of the things I like about both Drupal and WordPress is that creating SEO-Friendly URLs is very easy. Instead of having your URL be (or worse,, you can have

With WordPress, it’s basically out of the box with permalinks, though I don’t usually pick the default format. Drupal requires an additional module called pathauto, but once you add it, it works pretty painlessly. I’m sure there’s some tweaking involved somewhere, but it works fine for me on a few Drupal 6 and Drupal 7 sites.

Working with Rails, it’s not quite as easy. I’m using ActiveAdmin as a back-end (instead of ActiveScaffold or RailsAdmin), so I wanted to make sure anything I picked would work with it. Friendly_id was my first choice after taking a look at a few contenders.

Installing it was pretty straightforward, just following the directions, and adding a new migration for my “slug” column. In Friendly_id, slugs are tokenized versions of any other column – in my case, name. I ran into one weird problem by not specifying the exact version of the gem in my Gemfile, but once I added that, things worked.

One nice thing is that I didn’t have to muck with my routes at all for this to work with Rails 3.1.
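For reference, with the FriendlyId 4 API, the model side is just a couple of lines (the Race model here is hypothetical), and the slug itself is essentially a parameterized version of the source column:

```ruby
# Model setup (requires the friendly_id gem and the slug column migration):
#   class Race < ActiveRecord::Base
#     extend FriendlyId
#     friendly_id :name, use: :slugged
#   end
#   Race.find("austin-marathon")  # looks up by slug instead of numeric id

# Roughly what the slug generation does, in plain Ruby:
def slugify(name)
  name.downcase.gsub(/[^a-z0-9]+/, "-").gsub(/\A-+|-+\z/, "")
end

slugify("Austin Marathon 2012")  # => "austin-marathon-2012"
```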


Deploying Tweeter Keeper on the Heroku Cedar Stack

To go along with my talk at the Lone Star Ruby Conference on Consuming the Twitter Streaming API with Ruby and MongoDB, I’m going to provide some deployment instructions for how to take the Tweeter Keeper ruby script and deploy it on Heroku.

Heroku provides a managed platform to run Ruby applications (not just Rails), which we can use to run Tweeter Keeper “in the cloud”. Currently, you can run one process on Heroku for free, and you can get a free MongoLab account through Heroku as well if you want to experiment with the Twitter Streaming API.

Your first step will be to sign up with Heroku and create an account on their web site.

Make sure you have Ruby 1.9.2 – the easiest way to do this is to set up RVM, if you haven’t already.

Next, go ahead and install the Heroku gem so you can interact with the Heroku platform from your command line.

Tweeter Keeper has a Gemfile and Gemfile.lock set up already for Bundler, so you should be able to run bundle install from your command line, and then bundle exec ruby tweeter-keeper.rb to run the app. Be sure to set up config.yaml with a valid Twitter account’s username and password.

Tweeter Keeper isn’t a web process – it’s a command line tool that you run with bundle exec. Heroku’s Cedar Stack has support for specifying processes within your Procfile that your application can execute. Here’s the Procfile for Tweeter Keeper (just one line):

tweeterkeeper: bundle exec ruby tweeter-keeper.rb

If you need to run multiple processes (for instance, a companion web application in Sinatra or Rails), add it to the Procfile.

We’re going to use the Cedar Stack from Heroku – this isn’t currently the default, so you’ll need to specify it when you create an application on Heroku. Heroku’s Cedar Stack relies on bundler, so make sure everything’s working ok on your local development environment first.

heroku create --stack cedar

Heroku will create a new application for you, with a crazy name (that you can easily rename).

You’ll need to configure a MongoDB add-on service from the Heroku add-ons – I used MongoLab, but you can also use MongoHQ.

heroku addons:add mongolab:starter

You use git to push your code to Heroku – add your Heroku git repo as a remote:

git remote add heroku

And then go ahead and push your code out!

git push heroku master

Now see if your process is running on Heroku:

heroku ps

You should see something similar to:

Process          State      Command
---------------  ---------  ----------------------------------
tweeterkeeper.1  up for 9h  bundle exec ruby tweeter-keeper.rb

To see if your tweeter keeper process is keeping tweets, go into the add-ons section for your project, pick MongoLab, and select your MongoDB collection – it should be filled with Tweets!

Ruby Talks

Speaking at the Lone Star Ruby Conference Today

I’m speaking at the Lone Star Ruby Conference today on the Twitter Streaming API.

Check out the source code on GitHub –

Here’s the slides from the presentation:

Drupal Planet iPhone Development Ruby

Content Authoring for an iPhone App with Drupal using MongoDB

I just finished an interesting project – the EveryMarathon marathon calendar iPhone app. I’m working on my Active City Guides – and as part of that project, I ended up finding every American marathon. Running marathons is one of my hobbies, as is travel, and I like to combine the two by planning trips around marathons. There are several web sites with marathon calendars, but no iPhone (or Android) apps that I could find. So why not make one of my own? I could have put a mobile theme on my Drupal site, but that’s too easy 🙂

One of the screens from the app that lets you track your favorite marathons and your finished marathons

Content Storage and Import

Because I already had the content in Drupal 6, that simplified some of my decisions. In addition, the races themselves would be read-only on the app, meaning I wouldn’t have to worry about editing them and syncing them back to the server. I’d have to get the content from Drupal into the iPhone app and store it – there are several choices I could make:

Core Data

Use Core Data with a SQLite database and import Race objects (and their dependencies) into the database. Sync the database with Drupal on load or in the background, using JSON and a library like Lidenbrock.

Plists from Drupal 6 over HTTP with Services

Use Drupal iOS SDK with Services 2.2 and the Plist Server Module. Supports Views, Nodes, Comments, Users, and Taxonomies. This is the most Drupal-centric solution.

JSON from Drupal 6 over HTTP with Caching

Use Drupal’s REST Server with JSON for Views and Nodes. Use an iPhone REST library such as RestKit.

JSON from Drupal 6 bundled with App

Pull JSON out of Drupal 6 with REST Server Nodes, and bundle it in with the iPhone app. Read JSON off of the file system when needed. Has the huge advantage of not needing sync code right away, as updates to content can be shipped with app releases – my content isn’t particularly timely, as the marathon dates get updated once a year. Also quite fast to load JSON directly into arrays or dictionaries in the iPhone app and then display them in UITableViews.

I went with the bundled JSON approach for my first release of the app. One of the great things about knowing there are different approaches is that you can make sure your app is flexible enough to change approaches in a new version if you need to – I’ve got everything abstracted away to a singleton EveryMarathonData model object that gets races for iPhone view controllers.

Using MongoDB to aggregate content for the App

Part of the app’s functionality is to display races by state and by month:

I wanted to keep the mobile experience simple – no Advanced Search features, no complicated sorting. I could certainly use Objective-C on the iPhone to display sorted arrays for me – the Races by State could be calculated on startup with the number of races in each state, and the Races by Month could be sorted out from all of the races – but these will always be the same for a given set of content. I wanted a very fast startup time, and I didn’t want a laggy, crashing iPhone app – so I went with simple. I used MongoDB, a NoSQL database that works very well with JSON, to create JSON documents that had exactly the information I needed to display, in the order it would be shown. This turned out to be simple. I have a two-step process:

1. Download JSON nodes from Drupal and store them in MongoDB
2. Create JSON to bundle with the iPhone app from MongoDB

I wrote my MongoDB integration code in Ruby, but any language would have worked. Here’s my ruby script that generates the JSON for the iPhone app screen above:

require 'rubygems'
require 'json'
require 'mongo'

states = []

db = Mongo::Connection.new.db("marathondb")
coll = db.collection("nodeData")

state_abbr = {
  'AL' => 'Alabama',
  'AK' => 'Alaska',
  # ...other states omitted...
  'WI' => 'Wisconsin',
  'WY' => 'Wyoming'
}

reduce = "function(key, values) { " +
         "var sum = 0; " +
         "values.forEach(function(f) { " +
         "  sum += f.count; " +
         "}); " +
         "return {count: sum}; " +
         "}"

map = "function() { emit(this.field_location[0].province, {count: 1}); }"
races_by_state = coll.map_reduce(map, reduce, {'query' => {"field_race_types.value" => "Marathon"}})
races_by_state.find().to_a.each do |f|
  state = {}
  state[:state] = "#{f['_id']}"
  state[:count] = "#{f['value']['count']}"
  state[:name] = state_abbr[state[:state]]
  states.push state
end

File.open("races_by_state.json", "w") { |f|
  f.puts JSON.pretty_generate states
}
Basically, we connect to MongoDB, open a database and a collection of documents, and then create an array of state names and abbreviations. MongoDB doesn’t use SQL, but we can use Map/Reduce to simulate a simple GROUP BY SQL query.

We then prepare our JSON the way we want to use it in the iPhone app – an array of Dictionary/Hash objects. Each Hash object has the state’s abbreviation, the number of races for the state, and the full name of the state.

The easiest part of the whole script is outputting the JSON to the file system – the json Ruby gem does all of the hard work!

Wrap Up

After running my Ruby scripts, the app data lives in a “magic” directory that Xcode knows to bundle into the app. From Objective-C, I can load any of these JSON files into memory, and then parse them into NSDictionary and NSArray objects.