2

I've built a Rails API that is hosted on Heroku. The API uses memcached to cache very large serialized objects to speed up the API response time. This works well, but as my API gets more traffic, the cost of memcached is just crazy. Right now I'm paying $160/mo for 2.5GB.

Is there a disk based solution that is more cost effective (trading off speed of course)? Has anyone tried MongoDB as their cache_store in Rails using the Mongo_Store gem? It seems the price/GB is about 3-7X cheaper for an SSD MongoDB on Heroku. For example, I can get a 40GB MongoDB cluster for $240/mo.

One big advantage of using Memcached with the dalli gem right now is that my large objects are compressed. Does MongoDB do this for me, or do I need to do this manually?
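
For reference, this is roughly how compression is turned on on my end - a minimal sketch, with the server env var as a placeholder rather than my exact add-on config:

```ruby
# config/environments/production.rb
# MEMCACHED_SERVERS is a placeholder for whatever your Heroku memcached add-on exposes.
config.cache_store = :dalli_store,
  ENV["MEMCACHED_SERVERS"],
  { compress: true }  # dalli Zlib-compresses values before writing them to memcached
```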

avian
  • Is managing your cloud an option here? You can get a lot more bang for your buck. – Omar Ali May 02 '15 at 17:24
  • That is my last resort. Heroku has made our lives easier, but we may be forced to move to Amazon as I just don't see our Heroku bill scaling very well. – avian May 02 '15 at 21:42

1 Answer

2

I think you have a couple of options here, but the simplest is that you may be able to use straight-up AWS ElastiCache memcached, which is significantly cheaper ($64/mo for 2.78GB, $164 for 30GB; bigger instances have better ratios). You'll have to work out how to grant security group access to your Heroku app, or else proxy the reads through an EC2 instance.
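
If you do go that route, pointing Rails at ElastiCache is just a cache_store change; a minimal sketch, with a made-up cluster endpoint:

```ruby
# config/environments/production.rb
# Hypothetical ElastiCache configuration endpoint - substitute your own.
config.cache_store = :dalli_store,
  "my-cache.abc123.cfg.use1.cache.amazonaws.com:11211",
  { compress: true, expires_in: 1.hour }
```

The hard part is the networking rather than the config - ElastiCache memcached has no authentication, so you're relying entirely on the security group or proxy setup.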

More broadly, if you are using the memcached layer to speed up API response times, it seems likely that storing the serialized objects on disk rather than in memory will pretty much wipe out any speed benefit - do you have benchmarks that suggest otherwise? How long does the object take to generate if you don't cache it? Even SSD reads will be an order of magnitude slower than memcached reads.
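
If you want to put numbers on it before switching, something along these lines would do - a rough sketch where the cache key and regenerate_payload are placeholders for your real key and (uncached) serialization code:

```ruby
require "benchmark"

# "api/v1/large_payload" and regenerate_payload are placeholders for
# your actual cache key and the code that rebuilds the serialized object.
key = "api/v1/large_payload"

Benchmark.bm(12) do |x|
  x.report("cache read")  { Rails.cache.read(key) }
  x.report("regenerate")  { regenerate_payload }
end
```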

Marcus Walser
  • Thanks for the response! ElastiCache looks awesome, but it doesn't seem to work well with Heroku (other SO posts say it isn't a good idea as there is no authentication mechanism: http://stackoverflow.com/questions/11042794/can-i-use-amazon-elasticache-on-heroku). As for benchmarks, before I used memcached I stored the denormalized JSON in my postgres database and used that instead of doing the complex serializations, and that saved a lot of time. – avian May 02 '15 at 20:19
  • When I added memcached to my setup and connected it with my Active Model Serializers, it made things even faster, which is great, but I could no longer use my postgres denormalized JSON as a fallback on a memcached miss. So as long as the data is in the cache, it is super fast. – avian May 02 '15 at 20:19