
How would you cache an ActiveResource model? Preferably in memcached. Right now it pulls a model from my REST API fine, but it fetches dozens of records each time, so it would be best to cache them.

Robert Ross
  • Forgive my completely noobish question, but what is an "ActiveResource model"? I think of ActiveRecord classes as model classes and ActiveResource as controller classes. I see this phrase all over the place and don't fully understand it. Sorry to piggyback on your question :( – jaydel Oct 06 '11 at 18:31
  • Does this mean a model that is backed by RESTful requests/an API rather than a database? – jaydel Oct 06 '11 at 18:36
  • An ActiveResource model is just a different way of retrieving data. ActiveRecord is to the database as ActiveResource is to external data (APIs, for example). ActiveResource doesn't have a lot of the niceties of ActiveRecord, however. – Robert Ross Oct 13 '11 at 22:07

3 Answers


I've been playing around with the same thing, and I think I've found a pretty simple way to check Redis for the cached object first. This only works when you use the find method, but for my needs I think that's sufficient.

By overriding find, I can take a checksum of the arguments and see whether I already have the response saved in Redis. If I do, I pull the JSON response out of Redis and build a new object right there. If I don't, I pass the call through to ActiveResource::Base's find and the normal request happens.

I haven't implemented saving the responses into Redis from ActiveResource yet; my plan is to populate those caches elsewhere (there's a rough sketch of that after the md5key patch below). That way I can normally rely on my caches being there, and if they aren't, I fall back to the API.

require 'digest/md5'
require 'json'

class MyResource < ActiveResource::Base
  class << self
    def find(*arguments)
      # Look for a cached response keyed by a checksum of the find arguments.
      checksum = Digest::MD5.hexdigest(arguments.md5key)
      cached = $redis.get "cache:#{self.element_name}:#{checksum}"
      if cached
        return self.new JSON.parse(cached)
      end

      # Cache miss: fall through to ActiveResource::Base's normal find.
      scope   = arguments.slice!(0)
      options = arguments.slice!(0) || {}
      super scope, options
    end
  end
end

and a little patch so we can get an md5key for our array:

require 'digest/md5'

# Give everything we might pass to find a stable string representation,
# so the arguments array can be reduced to a single cache key.
class Object
  def md5key
    to_s
  end
end

class Array
  def md5key
    map(&:md5key).join
  end
end

class Hash
  def md5key
    sort.map(&:md5key).join
  end
end
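
For completeness, here's roughly how I plan to populate the cache elsewhere. This is only a sketch, not tested code; warm_cache is a hypothetical helper, and it assumes the same Redis key scheme as the find override above (single-record finds only).

require 'digest/md5'
require 'json'

# Hypothetical cache warmer, e.g. run from a background job or rake task.
def warm_cache(resource_class, *find_arguments)
  checksum = Digest::MD5.hexdigest(find_arguments.md5key)
  key      = "cache:#{resource_class.element_name}:#{checksum}"

  record = resource_class.find(*find_arguments)  # hits the API on a cache miss
  $redis.set key, record.attributes.to_json      # store the raw attributes as JSON
  $redis.expire key, 30 * 60                     # optional: expire after 30 minutes
end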

Does that help?

Brian

I would suggest looking into https://github.com/Ahsizara/cached_resource; almost all of the work is done for you by the gem.
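
If I remember the gem's setup correctly (double-check its README; the class name and :ttl value here are just examples), enabling it is roughly:

# Gemfile
gem 'cached_resource'

# app/models/person.rb
class Person < ActiveResource::Base
  self.site = "http://api.example.com"
  cached_resource :ttl => 1.hour   # caches finds through Rails.cache by default
end

After that, repeated Person.find calls should be served from the cache instead of hitting the API.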

YaBoyQuy

Caching in Rails is configurable, and you can back the cache with memcached. Typically you cache at retrieval time. It's unclear whether you are a REST consumer or a REST service, but it doesn't really matter: if you cache on read and then read from the cache the next time, everything will work just fine. If you are pulling the data from a database, serve from the cache, and if no cache entry is available, cache the result of the database read.

I wrote a blog post about it here: http://squarism.com/2011/08/30/memcached-with-rails-3/

However, what I wrote about is really pretty simple: it just shows how to avoid an expensive operation with something similar in spirit to the ||= operator. For a better example, New Relic has a Scaling Rails episode. For instance, it shows how to cache the latest 10 posts:

def self.recent
  # Serve the cached list if present; otherwise run the query and cache it for 30 minutes.
  Rails.cache.fetch("recent_posts", :expires_in => 30.minutes) do
    self.find(:all, :limit => 10)
  end
end

Rails.cache has been configured to be a memcached cache; this is the configurable part I was talking about.
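
For reference, wiring Rails.cache to memcached is a one-line change in the environment config. Something like this (Rails 3 syntax; the host is just an example) should do it:

# config/environments/production.rb
config.cache_store = :mem_cache_store, "localhost:11211"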

squarism