I'm looking for a real-time counter that updates as more database entries (sales) are created (for example, http://www.humblebundle.com). I'm trying not to put unnecessary strain on the database.

A naive way to do this would just be to ping the database for the sales count every second or so and update the number in real time with JavaScript. That would put way too much strain on the database and is unworkable, but it's the basic effect I'm trying to achieve.

Another way I can think of is to store the sales number in a cache instead of the database, and then add 1 to that cached value every time a sale is made. The counter would then be updated every second with the new value of the cached count. This seems like it would work a lot better, but I don't know specifically how to implement it in Rails.

I'm using Rails 3.1 and jQuery on Heroku, but I'm mostly just looking for a high-level way to do this. Of course, the more details you can provide the better :)

Eric Yang

2 Answers

This is not a caching problem but a data de-normalization problem. If you have a record with the correct value in it, reading that value out is trivial and will take no time even if done hundreds of times per second. You can benchmark it to be sure.

It's usually easy to do this with an after_create hook where you twiddle some value:

after_create :update_sales_total

def update_sales_total
  # Bump the single summary record by this sale's amount in one UPDATE,
  # without loading the record into memory.
  SalesTotal.update_all([ 'total = total + ?', self.total ])
end
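
The read side can then be a lightweight action that the jQuery poller hits every second or so. A sketch, with hypothetical names (SalesController#count, a single SalesTotal summary row) that aren't from the answer itself:

# app/controllers/sales_controller.rb
class SalesController < ApplicationController
  # Return the pre-computed total as JSON; no aggregate query runs here.
  def count
    render :json => { :total => SalesTotal.first.total }
  end
end

The jQuery side just polls this URL every second and swaps in the new number.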

If you need to cache because database performance isn't satisfactory, use the Rails.cache facility with Memcache as a store so this data is shared between instances of Rails. Each time you record a sale, add the appropriate amount to the cached value. If the cache isn't populated, pull the sum from the database.
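
A minimal sketch of that pattern, assuming a memcached-backed Rails.cache and a made-up 'sales_total' key (none of this comes from the answer itself):

# In the Sale model: bump a shared cached total on every sale.
after_create :bump_cached_total

def bump_cached_total
  # Populate the cache from the database on a miss; written :raw so that
  # increment works against the memcached store. Then add this sale's amount.
  Rails.cache.fetch('sales_total', :raw => true) { Sale.sum(:total).to_i }
  Rails.cache.increment('sales_total', self.total.to_i)
end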

tadman

The cache idea is best. You can easily cache the current sales count every 5 minutes or so and start building a data set of all the polled values. You could use that data set to work out an average sales rate and auto-increment the counter on the client at slightly below the estimated rate, with a real poll of the cache every 5 minutes. count(*) isn't really a slow operation anyway; in PostgreSQL, at least, I don't think it even needs to scan every row if you don't have advanced filters.
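
A rough sketch of the periodic snapshot half of this, assuming a Sale model and made-up cache keys; the task would be run every 5 minutes by cron or the Heroku Scheduler:

# lib/tasks/sales_counter.rake
namespace :sales do
  desc 'Cache the current sales count for the live counter'
  task :snapshot_count => :environment do
    Rails.cache.write('sales_count_snapshot', Sale.count)
    Rails.cache.write('sales_count_snapshot_at', Time.now.to_i)
  end
end

The client would read the snapshot (and its timestamp) and interpolate between polls.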

Jazz Man
  • Every five minutes doesn't sound very real time. – tadman Dec 16 '11 at 17:10
  • A JavaScript estimate gives a real-time feel without the cost of constant querying. But again, the querying could easily be against a file, so it's just a normal HTTP request. A cron job could easily do a count. – Jazz Man Dec 16 '11 at 17:11
  • Oh, so you're saying to extrapolate and fake it on the client-side. That wasn't clear in your answer. – tadman Dec 16 '11 at 17:12