SXSW to Go: Creating Razorfish’s iPhone Guide to Austin (Part 3)


As the Razorfish Guide to SXSW became more fully developed, we started to look at key areas where we could make performance gains and either genuinely speed up the site or at least make it appear to load more quickly. (Check out part 1 of our story to see how requirements for the site were gathered, and part 2 to learn how the site was architected.)

Cache it good

One of the earliest steps we took to optimize the application was to use server-side caching. ASP.NET allows you to cache just about anything on the server for quick retrieval. Taking advantage of this feature means that you can avoid extra trips to the database, requests to other services, and repeating other slow or resource-intensive operations. The Razorfish.Web library’s abstraction makes ASP.NET’s caching easy to use, and we quickly used it to cache all database calls and most MVC models.
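Razorfish.Web’s abstraction hides the details, but the underlying pattern is a get-or-compute wrapper around a keyed store. A minimal sketch of that pattern in JavaScript (the names and TTL handling are illustrative, not the actual library API, which sat on ASP.NET’s server-side Cache):

```javascript
// Minimal get-or-compute cache, illustrating the pattern we used for
// database calls and MVC models (not the actual Razorfish.Web API).
function createCache() {
  const store = new Map();
  return {
    getOrAdd: function (key, compute, ttlMs) {
      const hit = store.get(key);
      if (hit && hit.expires > Date.now()) return hit.value; // cache hit
      const value = compute(); // the slow operation: DB call, service request...
      store.set(key, { value: value, expires: Date.now() + ttlMs });
      return value;
    }
  };
}
```

With a wrapper like this, every expensive lookup becomes something like `cache.getOrAdd('venues', loadVenuesFromDb, 60000)`, and repeat requests within the TTL never touch the database.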

Zip it up

A second key optimization was to add GZIP compression to our assets. GZIP compression shrinks most text-based files (like HTML or JSON) to a fraction of their original size, and makes a huge difference in the time it takes a slow mobile client to download a response. IIS7 has this feature built in, but we were running the site off of an IIS6 server. Happily, Razorfish.Web.Mvc includes an action filter that supports compressing your responses with GZIP.

Strip out that whitespace

Next, we used Razorfish.Web’s dynamic JavaScript and CSS compression to strip out unnecessary characters and to compact things like variable names. Minifying your scripts and stylesheets reduces their file size dramatically. One of the nice features of Razorfish.Web is that it can also combine multiple files together, reducing the overall number of requests that a client has to make. All of this happens dynamically, so you’re free to work on your files in uncompressed form, and you don’t have to worry about going out of your way to compact and combine files.


Another key optimization was combining all of the image assets into a single file, and using CSS background positioning to choose what image to display. Doing this not only cuts the number of requests that have to be made (from 10 to 1, in our case), but also cuts the overall amount of data that needs to be loaded. Each file has its own overhead, and you can cut that overhead by combining them.
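The technique, commonly called CSS sprites, looks roughly like this (class names, dimensions, and offsets here are illustrative, not our production stylesheet):

```css
/* All icons live in one combined image; each element displays its own
   slice of it by shifting the background. Offsets are illustrative. */
.icon       { width: 32px; height: 32px; background: url(sprite.png) no-repeat; }
.icon-map   { background-position: 0 0; }
.icon-music { background-position: -32px 0; }
.icon-food  { background-position: -64px 0; }
```

One image download serves every icon on the site, and the browser simply crops a different region for each class.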

Keep it in-line

As we started testing on the actual iPhone, we still weren’t satisfied with the page’s load time. There was a significant delay between the page loading and the scripts loading over the slow EDGE network. This defeated the purpose of the JSON navigation because the user was apt to click a link before the scripts had a chance to load and execute – meaning that they’d have to load a new HTML page. If the scripts were delivered in-line with the page, there would be no additional request, and they could execute right away. Because the successive content was to be loaded with JSON, concerns about caching the scripts and styles separately from the page were moot. We set about extending Razorfish.Web so that it could insert the combined and compressed contents of script and style files directly into the page. By moving the scripts and styles in-line, we shaved off about 50% of our load time, and the scripts were now executing quickly enough that the JSON navigation mattered again.

Smoke and mirrors

A final touch was to take advantage of Safari Mobile’s CSS animation capabilities. The iPhone supports hardware-accelerated CSS transitions and animations, meaning fast and reliable animation for your pages. We added a yellow-glow effect to buttons when pressed. The glow was not only visually appealing, but its gradual appearance also helped to distract the user for the duration of the load time of the successive content.
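A sketch of the kind of rule involved (selectors, timings, and colors are illustrative, not our production stylesheet):

```css
/* Yellow glow that fades in while a button is pressed. The gradual
   appearance helps mask the load time of the next page of content. */
a.button {
  -webkit-transition: -webkit-box-shadow 0.4s ease-in;
}
a.button:active {
  -webkit-box-shadow: 0 0 12px rgba(255, 220, 0, 0.9);
}
```

Because Safari Mobile runs the transition itself, the glow stays smooth even while the JSON request for the next page is in flight.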


The team managed to pull the web application together in time for launch, and the guide was a smashing success. Over the course of SXSW, the site was visited by 2,806 people, who spent an average of 10 minutes each on the site, typically viewed about 8 pages, and often came back for second and third visits. The site attracted a large amount of buzz on Twitter and was praised as the go-to guide for the conference.

When designing for mobile, speed is key. All of the components of the site, including the design, need to work together to connect the user to the content as quickly and as efficiently as possible. In such a hyper-focused environment, the user experience, graphic design, and technology need to be unified in supporting a shared goal.

By producing a responsive, reliable, easy-to-use, to-the-point, and locally-flavored guide to the city, the team succeeded in creating a memorable and positive impression of Razorfish at SXSW.

SXSW to Go: Creating Razorfish's iPhone Guide to Austin (Part 2)

Design and Development

Up against a tight deadline, our small team was working fast and furious to create the Razorfish mobile guide to Austin in time for the SXSW Interactive conference. With our technologies determined and all eyes on the iPhone, we set out to bring the guide to life. (Check out part 1 of our story to find out more about how we set requirements and chose technologies)

The meat and potatoes

The guide is content-driven, and we knew that the site wouldn’t be given a second look without strong content to back it up. Our team decided to structure the site as nested categories, with a design reminiscent of the iPhone’s Contacts application and breadcrumb navigation (as found in the iTunes Store).

With the flow determined, the creative director started developing the content categories and soliciting suggestions from the office about their favorite Austin haunts. She enlisted an information architect to assist with the writing, and together they churned out the site’s content over the next several weeks.

Simultaneously, one of our presentation layer developers began work on graphic design, another focused on hosting and infrastructure, and I began working on database and application architecture.

Getting around

The first major issue we tackled when working on the front-end of the site was navigation. We had identified several features that were essential for the guide to perform satisfactorily:

  • Rather than load a new page, new “pages” of data should be loaded as JSON, and then have their HTML constructed on the client-side. JSON is a very compact way of moving data and is easy to support using JavaScript’s eval function. By using JSON to communicate between the server and the client, we avoided the performance hits of loading a larger request, rendering a fresh page, running scripts again, and checking cached components against the server. Those performance issues are often negligible on a PC with a fast internet connection and plenty of memory, but on a mobile device, every byte and every request makes a noticeable impact.

  • Data should be cached on the client whenever possible, avoiding repeat requests to the server for the same data.

  • The browser’s history buttons (Back and Forward) must work, and ideally work without making new requests to the server.

  • The site must be navigable in browsers that cannot properly support AJAX.

To satisfy both the first and last requirements, we effectively needed two versions of every page running in parallel (a JSON version for AJAX-ready clients and an HTML version for others). Luckily, the MVC framework makes this easy on the server. By properly defining our data model classes, we could either send the model object to a view page for each of the data points to be plugged in and rendered as HTML, or we could directly serialize the model to JSON and send it to the client. To make it easy for the client script to select the right version, all of the JSON page URLs were made identical to the HTML URLs, except with “/Ajax” prepended. With this URL scheme in place, JavaScript could simply intercept all hyperlinks on a page, add “/Ajax” to the location, and load a JSON version of the content instead of a whole new page.
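The URL rewriting itself is a one-line transformation; a sketch of the idea (helper names are hypothetical, and `loadJson` stands in for the real JSON loader):

```javascript
// Derive the JSON endpoint from a page URL by dropping the scheme and
// domain and prepending "/Ajax". Illustrative, not the production code.
function toAjaxUrl(href) {
  const path = href.replace(/^https?:\/\/[^/]+/, ''); // strip "http://host"
  return '/Ajax' + path;
}

// In the browser, intercept link clicks and load JSON instead of a new
// page. (Guarded so the sketch also runs outside a browser.)
if (typeof document !== 'undefined' && window.XMLHttpRequest) {
  document.addEventListener('click', function (e) {
    const link = e.target.closest ? e.target.closest('a') : null;
    if (!link) return;
    e.preventDefault();
    loadJson(toAjaxUrl(link.href)); // hypothetical JSON loader
  });
}
```

Because the two URL spaces differ only by the “/Ajax” prefix, no lookup table is needed on the client: any HTML URL can be mechanically converted to its JSON twin.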

To determine when to use JSON and when to use HTML, we did some simple capabilities testing. If window.XMLHttpRequest, the W3C standard AJAX class, exists, then it was safe to use JSON navigation on the client. Incidentally, Internet Explorer and many mobile browsers do not support this object, which greatly simplified later development.

Several JavaScript classes were created to support page rendering: A history class to manage caching and the forward/back buttons, a base page class that would take care of rendering JSON into HTML, and an application class that would manage the interactions between the pages, the history, and the user. A handful of page types were identified, and subclasses were created from the base page for each specialized layout and different data model.

A method called BrowseTo was defined on the application class that would handle all actions associated with the user clicking a link or going to a new URL. BrowseTo did several things:

  1. Identify the JSON URL (dropping the “http” and the domain, and adding “/Ajax”)

  2. Determine which page class to use to render the JSON data

  3. Check whether there is already cached data for the URL, and make a request for the data if there is not

  4. Instruct the page to render

  5. Instruct the history to add the new page to the list of visited sites

  6. Cache the JSON data from the response in memory if a new request was made
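The steps above can be condensed into a sketch like the following (names are hypothetical; the real class also handled page-type selection and browser history internally):

```javascript
// Sketch of the BrowseTo flow with dirty-caching. Collaborators are
// injected to keep the sketch self-contained; names are illustrative.
function createApp(fetchJson, render, pushHistory) {
  const cache = {};                      // URL -> JSON data (the "dirty cache")
  return {
    browseTo: function (url) {
      const ajaxUrl = '/Ajax' + url;     // 1. identify the JSON URL
      let data = cache[ajaxUrl];         // 3. check for cached data
      const fresh = !data;
      if (fresh) data = fetchJson(ajaxUrl); // ...request it if absent
      render(url, data);                 // 2+4. pick a page class and render
      pushHistory(url);                  // 5. record the visit
      if (fresh) cache[ajaxUrl] = data;  // 6. cache newly fetched responses
      return data;
    }
  };
}
```

Navigating to the same URL twice then costs only one network request; the second visit renders straight from memory.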

Due to time constraints, we opted to use “dirty-caching” for JSON data. When dirty-caching, you’re storing the JSON object in memory under a key. In this case, the key was the URL. There are a few downsides to this method:

  • Storage isn’t persistent, and only lasts as long as the browser is open on that page

  • You’re using up memory, not disk space, to store data, which could eventually overwhelm the client and cause it to crash

Because the size of the data that we were caching was very small, and dirty-caching is both very fast to implement and universally supported, we used it to temporarily store data. Given more time, we would have taken advantage of the iPhone’s HTML 5 local storage features. On any browser that supports this feature, you can store data in a database on the client. Many web applications take advantage of this feature to provide persistent offline access to content. The downside is that the HTML 5 local storage API is somewhat tricky to implement properly and is currently confined to a select few browsers.
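For illustration, a persistence layer along those lines might look like this sketch, which falls back to the same in-memory dirty cache when no storage backend is available (names are hypothetical; our actual site shipped with the in-memory approach only):

```javascript
// JSON cache that persists via an HTML 5 localStorage-style backend when
// one is supplied, and falls back to session-only memory otherwise.
function createStore(storage) {
  // storage: an object with getItem/setItem (e.g. window.localStorage),
  // or null to use a plain in-memory object instead.
  const memory = {};
  return {
    save: function (key, data) {
      if (storage) storage.setItem(key, JSON.stringify(data)); // survives reloads
      else memory[key] = data;                                 // session-only
    },
    load: function (key) {
      if (storage) {
        const raw = storage.getItem(key);
        return raw === null ? null : JSON.parse(raw);
      }
      return key in memory ? memory[key] : null;
    }
  };
}
```

The serialization step matters: localStorage holds strings, so JSON objects must round-trip through `JSON.stringify`/`JSON.parse`, which is part of what makes the persistent version trickier than the dirty cache.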

A little bit of history

Forward and back button support comes naturally when you’re loading new pages, but for the JSON version of the site, we implemented a solution based on URL hashes (the # data at the end of a URL). Most browsers treat a change to the URL hash as a state that can be navigated to using the forward and back buttons. By regularly scanning the URL hash, you can detect a change, update your page, and simulate forward/back button support. Our history class was designed to add the “/Ajax” path as the URL hash, making it easy to determine what JSON data to load when the hash changed.
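A sketch of the polling approach (names are hypothetical; at the time, the hashchange event was not yet widely available, so polling on an interval was the usual technique):

```javascript
// Hash-based history sketch: the "/Ajax" path lives in the URL hash, and
// a poller detects back/forward navigation. Names are illustrative.
function createHistoryWatcher(getHash, onChange) {
  let last = getHash();
  return {
    navigate: function (ajaxPath) {   // called on normal link clicks
      last = '#' + ajaxPath;
    },
    poll: function () {               // run on an interval in the browser
      const current = getHash();
      if (current !== last) {         // user pressed back or forward
        last = current;
        onChange(current.slice(1));   // strip '#' -> "/Ajax/..." path to load
      }
    }
  };
}
// In the browser this would be wired up with something like:
//   setInterval(watcher.poll, 100);
//   getHash = function () { return window.location.hash; };
```

When the poller sees a hash it didn’t set itself, the user must have used the browser’s history buttons, and the corresponding JSON page can be re-rendered (from the dirty cache, with no new request).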

With our navigation system in place, and our creative team churning out new content for the site, we took a step back and started to look at performance. Check back next week, and see how we fine-tuned the site to work quickly and responsively on the iPhone.