Curious APIs

Project Description

In this project, we’ll be focusing on consuming and working with data from public APIs.

As a vehicle for learning these concepts, we’ll be selecting an API from a popular website and working to reconstruct a simplified version of the website’s existing UI using its own API. For example, you might decide to use the Twitter API to build a basic version of the Twitter feed where users can view and post tweets.

As we build these features, we’ll also be working with the OAuth protocol to authenticate our users with the third-party provider, and using various testing techniques to allow us to test against the third-party data.
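One common way to test against third-party data without calling the real service is to inject the HTTP client as a dependency, so tests can substitute a stub that returns canned responses. The sketch below illustrates the idea; `FeedService`, `StubClient`, and the endpoint path are all hypothetical names, not part of any real API.

```ruby
require "json"

# Hypothetical wrapper around a third-party feed API. The client object is
# injected, so tests can pass in a stub instead of a real HTTP client.
class FeedService
  def initialize(client)
    @client = client
  end

  # Returns an array of tweet texts from the (illustrative) feed endpoint.
  def recent_tweets
    raw = @client.get("/statuses/home_timeline.json")
    JSON.parse(raw).map { |t| t["text"] }
  end
end

# In a test, a tiny stub stands in for the real client:
class StubClient
  def get(_path)
    '[{"text": "hello"}, {"text": "world"}]'
  end
end

service = FeedService.new(StubClient.new)
puts service.recent_tweets.inspect  # prints ["hello", "world"]
```

Recording/replay libraries (e.g. VCR in Ruby) take this further by capturing real responses once and replaying them in later runs, which is what the rubric's "does not rely on external services" criterion is pointing at.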

The project requirements are listed below:


All of a sudden: a request from the president’s office and the VP of development (development as in donations, &c. – *not* software, sadly) for reports on legislative districts. For the last 2 years. For all guests and accounts (e.g. schools, organizations). Roughly 90,000 addresses total, for Washington state alone. And they want the data in 8 (business) days. My script? Still only processes 1,000–1,200 rows at a time, due to the timeout limit. Even running it on a local server where I can bump the timeout up – which I did, to a full hour – I could still only process about 4,000 rows at a time.
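The per-run row cap comes from the request timeout, so one workaround is to process in fixed-size batches and checkpoint progress, letting each run resume where the last one stopped. The original script is PHP; the idea is sketched here in Ruby, and every name in it (checkpoint file, batch size, `geocode`) is illustrative rather than taken from that script.

```ruby
# Chunked processing with a checkpoint file: each run handles one batch and
# records how far it got, so a timed-out or restarted run resumes instead of
# starting over. All names are illustrative, not from the original PHP script.
CHECKPOINT = "last_row.txt"
BATCH_SIZE = 1000

def geocode(row)
  # stand-in for the real per-row legislative-district lookup
end

# Processes the next batch of rows and returns the new checkpoint position.
def process_batch(rows)
  start = File.exist?(CHECKPOINT) ? File.read(CHECKPOINT).to_i : 0
  batch = rows[start, BATCH_SIZE] || []
  batch.each { |row| geocode(row) }
  File.write(CHECKPOINT, start + batch.size)  # progress for the next run
  start + batch.size
end
```

Run repeatedly (by cron, a queue worker, or just re-invoking before the timeout), this walks the whole file in 1,000-row steps regardless of any single run's time limit.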

A Rabbit Trail (i.e. a Side-story About ~~Failure~~ a Learning Experience)

It was admittedly a rather scrambled idea: I spent that Friday trying to reconfigure the PHP-only app to push the work into a fair number of jQuery / Ajax requests, so the client could handle more of the heavy lifting. Anything to avoid that server-side timeout. This appeared to be working, until I ran into 2 last-minute (seriously, 11th-hour) problems.

  1. Google has query limits (25,000 queries per 24 hours). Duh. I have no earthly idea – or any non-earth-bound ones, for that matter – why I didn’t bother checking this. Probably because until this surprise request I was dealing with maybe 4,000 records. After testing a 10,000-row CSV a couple of times, however, I discovered that I wasn’t getting anything back. Surprise!
  2. A test run of 4 addresses (I didn’t want to waste paid-for credits) against the Cicero API returned … well, I discovered the data was getting returned, but something about the way I (re-)created the “save to CSV” process was resetting the returned information to the placeholder values I’d set up to accommodate empty values being returned/found.
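The bug in item 2 – placeholders clobbering values the API had already returned – is usually avoided with a guarded merge that never lets a placeholder overwrite real data. This is an illustrative reconstruction, not the original code; the field names and placeholder value are made up.

```ruby
PLACEHOLDER = "N/A"

# Merge freshly returned API fields into a row, but never let a placeholder
# (or nil) overwrite a real value that was already fetched. The original
# save-to-CSV step evidently did the overwrite unconditionally.
def merge_result(row, api_fields)
  api_fields.each do |key, value|
    next if value.nil? || value == PLACEHOLDER  # keep what we already have
    row[key] = value
  end
  row
end

row = { "district" => "WA-07", "senator" => PLACEHOLDER }
merge_result(row, { "district" => PLACEHOLDER, "senator" => "Example Name" })
# row keeps "WA-07" and gains the senator, instead of being reset.
```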

With no time left to comb through my hack job, I tossed everything and went back to the original, PHP-only code.


Fortunately, we processed the Accounts records separately from the Guest records, so I’d already finished those before my unsuccessful hack job (see above). With only 87,883 Guest records to process, I … created 2 new Civic Information API keys under 2 additional Google accounts to bypass the 25,000-query limit for the majority of the list. I then set up the code on 3 different machines, each using a different Civic Information API key, broke the original spreadsheet into 6 pieces, parceled those out to the 3 machines, and got cracking.

Eventually, I had the timeout limit pushed up to a full hour, but only 2 of the machines seemed to actually pick that up. Alas, the 3rd machine was a Windows (IIS) server, and I was using the set_time_limit() function in the script itself (so it’s a temporary thing, right?). IIS, sadly, has a global PHP time limit that overrides anything set locally. Of course. Fortunately, that was only 1 machine.
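Splitting one big list into roughly equal shards – one per machine/API key – is a one-liner worth having on hand for next time. A minimal sketch (in Ruby rather than the original PHP; the numbers match the story, but the function is hypothetical):

```ruby
# Split a list of rows into `parts` roughly equal contiguous shards,
# one shard per machine / API key. Contiguous slices keep row order,
# which makes reassembling the output CSVs trivial.
def shard(rows, parts)
  size = (rows.length / parts.to_f).ceil
  rows.each_slice(size).to_a
end

shards = shard((1..87_883).to_a, 6)
puts shards.map(&:length).inspect  # six shards of at most 14,648 rows each
```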

While the process did take all day – literally – it did work! I’ve learned my lesson about hacking something together without planning for large scale use. And about checking query limits on APIs. Great reminder about remembering to branch with Git when trying a new idea that needs to be rushed, too. That rabbit trail fiasco would have been a lot easier to backtrack from if I’d remembered to branch BEFORE making changes … Thank goodness I had the original version saved elsewhere.

Up Next? Siriusware API integration!

Why yes, I will be returning to my rabbit trail idea! Using proper branching! And planning it out ahead of time! Fortunately, this kind of data pulling will most likely only be done every quarter, and I doubt we’ll ever need to run 90,000 addresses again.

More importantly, however, I’ll be integrating the Siriusware API so I can pull the Guest & Account records myself, rather than importing a CSV. Oh, and so I can then modify those records directly, instead of creating a CSV to be imported. This was something I wanted to do before the major pull, but with 8 days’ lead time (3–4 for me, since the data had to be imported & a report built afterwards) … I’ll be working on that integration now.

Available APIs

To start, you need to select an API to work with. We’ve selected the following list of applications for their well-documented public APIs and relatively straightforward UIs.

For each project, we have included a rough summary list of features to include. As with any development project, you should focus on moving iteratively through the most basic features before starting on more complex ones. During the project, the instructors will meet with you to assess progress and determine what features to focus on next.

Build a basic version of the Twitter feed. As a user, I should be able to:

  • Authenticate with my Twitter account
  • View a list of recent tweets from my feed
  • See my basic profile information (profile pic, follower count, following count, etc.)
  • Post a tweet
  • Favorite a tweet


Extensions:

  • Retweet a tweet
  • Reply to a tweet
  • Use a paginated or infinite-scroll interface to view more tweets
  • Unfollow a user


Build a basic version of the Instagram (web) UI. As a user, I should be able to:

  • Authenticate with my Instagram account
  • See my basic profile information (username, profile pic)
  • View a list of recent posts from my feed
  • View photos for each post
  • View comments for each post
  • View like count for each post


Extensions:

  • Use infinite scroll to view more photos
  • See trending posts
  • Show pictures that match a hashtag
  • Search for a user


Build a basic version of the Tumblr UI. As a user, I should be able to:

  • Authenticate with my Tumblr account
  • See my basic profile information (username, profile pic)
  • View a list of recent posts from my feed
  • View embedded photo or video content for the posts
  • Favorite a post
  • Reblog a post


Extensions:

  • Create a post (perhaps starting with just text posts and moving on to more complicated types)
  • Generate a permalink for a post
  • Follow a user whose post was reblogged into my feed


Build a basic version of the GitHub profile / feed UI. As a user, I should be able to:

  • Authenticate with my GitHub account
  • View basic information about my account (profile pic, number of starred repos, followers, following)
  • View contribution summary information (Contributions in last year, longest streak, current streak)
  • View a summary feed of my recent activity (recent commits)
  • View a summary feed of recent activity from users whom I follow
  • View a list of organizations I’m a member of
  • View a list of my repositories


Extensions:

  • View a list of my open pull requests
  • View a list of @mentions that I was included in
  • Create a new repository

Planning & Requirements

Technical Expectations

You’ll work with an instructor to define more explicitly the requirements for your specific application, but the basic requirements for this project include:

  • Use an OmniAuth authentication library to authenticate users with the third-party service.
  • Mimic the interface functionality of one online service from the list above.
  • Consume an external API to get real data and interact with a third-party service.
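For the authentication requirement, a typical OmniAuth setup in a Rails app is a one-block initializer. The provider and environment-variable names below are illustrative; swap in the strategy gem for whichever service you picked (omniauth-twitter, omniauth-github, etc.):

```ruby
# config/initializers/omniauth.rb — a typical OmniAuth configuration sketch.
# Provider and ENV names are illustrative, not prescribed by this project.
Rails.application.config.middleware.use OmniAuth::Builder do
  provider :twitter, ENV["TWITTER_CONSUMER_KEY"], ENV["TWITTER_CONSUMER_SECRET"]
end
```

The strategy then exposes a callback route where the authenticated user's details arrive in `request.env["omniauth.auth"]`.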

The authoritative project requirements will be created and maintained in collaboration with your client through meetings and your project management tool. This means that the requirements for your project could differ significantly from those of other projects.


You’ll be graded on each of the criteria below with a score of (1) well below
expectations, (2) below expectations, (3) as expected, (4) better than expected.

Feature Delivery

1. Completion

  • 4: Developer delivered all planned features plus 2 extensions.
  • 3: Developer delivered all planned features.
  • 2: Developer reduced functionality to meet the deadline.
  • 1: Developer missed major features and/or the application is not deployed to production.

2. Organization

  • 4: Developer used a project management tool and updated their progress in real-time.
  • 3: Developer used a project management tool to keep their project organized.
  • 2: Developer used a project management tool but didn’t update the progress frequently.
  • 1: Developer failed to use a project management tool to track their progress.

Technical Quality

1. Test-Driven Development

  • 4: Project demonstrates high test coverage (>90%), tests at the feature and unit levels, and does not rely on external services.
  • 3: Project demonstrates high test coverage (>80%) and tests at the feature and unit levels, but relies on external services.
  • 2: Project demonstrates high test coverage (>70%), but does not adequately balance feature and unit tests.
  • 1: Project does not have 70% test coverage.

2. Code Quality

  • 4: Project demonstrates exceptionally well factored code.
  • 3: Project demonstrates solid code quality and MVC principles.
  • 2: Project demonstrates some gaps in code quality and/or application of MVC principles.
  • 1: Project demonstrates poor factoring and/or understanding of MVC.

Product Experience

1. User Experience

  • 4: The application is a logical and easy-to-use implementation of the target application.
  • 3: The application covers many interactions of the target application, but has a few holes in lesser-used functionality.
  • 2: The application shows effort in the interface, but the result is not effective.
  • 1: The application is confusing or difficult to use.