Keeping cool with DryIce

Tuesday, December 10th, 2013
By Daniel Cooper

At Homeflow towers we consume a lot of external APIs to build our front-end sites. We need to cache this data pretty heavily, both because we don't want page rendering to wait on external services and because we'd rather not hammer external servers with loads of estate agent traffic.

Often this meant sticking an x-hour cache on an item in order to limit extra queries. While a fine solution from a performance standpoint, it was ultimately unsatisfying: updates to the external resource had to wait for the cache to expire before we could show current data.

Luckily, some of the better APIs we deal with place cache information in their response headers using standard HTTP caching mechanisms (Cache-Control, Expires, and friends). Handling this manually for every resource sounded a bit too much like hard work for us, so we decided to write DryIce.
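To give a flavour of the signal involved: a Cache-Control header carries a max-age directive telling clients how long a response stays fresh. A rough sketch of pulling a TTL out of one (an illustrative helper, not part of DryIce itself):

```ruby
# Extract a TTL in seconds from a Cache-Control header value.
# Returns nil when the header carries no max-age directive.
def ttl_from_cache_control(header)
  match = header.to_s.match(/max-age=(\d+)/)
  match && match[1].to_i
end

ttl_from_cache_control('public, max-age=3600') # => 3600
ttl_from_cache_control('no-store')             # => nil
```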

DryIce is a small companion gem for HTTParty which caches responses in accordance with the wishes of the service author. Let’s take a look at how we’d set one up.
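Assuming the gem is required by its name, a minimal setup might look something like this (the class name and endpoint are placeholders, not from the real Homeflow codebase):

```ruby
require 'httparty'
require 'dry_ice'

class PropertyApi
  include HTTParty
  include HTTParty::DryIce

  base_uri 'https://api.example.com'

  # Point DryIce at the cache store responses should live in.
  cache Rails.cache
end

# The first call hits the network; repeat calls within the TTL the
# service advertised in its response headers come from the cache.
PropertyApi.get('/properties/123')
```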

Here we have an API service including HTTParty and HTTParty::DryIce. By calling the cache helper with our desired cache store, we instruct DryIce to intercept all calls to get and serve the resource from cache if it's available. If it's not available, DryIce will allow get to do its thing and insert the response into the cache, for whatever time is specified, ready for the next request.

You’ll notice we’re using Rails.cache here. DryIce will work with any class that quacks like an ActiveSupport::Cache. We like to use the Redis gem so we can share caches across our application servers, and because Redis supports time-based key expiry.
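To illustrate that duck typing, here's a minimal in-memory store exposing the read/write (with :expires_in) interface that an ActiveSupport-style cache provides; whether DryIce needs anything beyond these two methods is an assumption on our part:

```ruby
# A tiny in-memory cache that "quacks like" ActiveSupport::Cache:
# read(key) returns the value or nil, write(key, value, expires_in: n)
# stores it with an optional time-based expiry.
class TinyCache
  def initialize
    @store = {}
  end

  def read(key)
    value, expires_at = @store[key]
    return nil if expires_at && Time.now > expires_at
    value
  end

  def write(key, value, options = {})
    expires_at = options[:expires_in] && Time.now + options[:expires_in]
    @store[key] = [value, expires_at]
  end
end

cache = TinyCache.new
cache.write('greeting', 'hello', expires_in: 60)
cache.read('greeting') # => "hello"
```

A real Redis-backed store adds the property we care about in production: every application server sees the same keys, and Redis handles the expiry itself.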

You can find DryIce here: https://github.com/homeflow/dry_ice

Happy caching!