
I have 1500 small objects to render for a web service inside a Rails 4 application. I use JSON as the format, with jbuilder for the templates. I have already switched the JSON engine to Oj in an app initializer:

require 'oj_mimic_json'
# MultiJson.use :yajl

# have Oj stand in for the stdlib JSON methods
Oj.mimic_JSON

# jbuilder JSON templates: camelCase keys
Jbuilder.key_format camelize: :lower

A single rendered JSON object looks like this:

{
  center: { lat: 45.962153536249, lon: 7.68207088549831 },
  lat: 45.962153536249,
  lon: 7.68207088549831,
  n: "Zermatt-Cervinia",
  st: 80,
  sy: 0
}
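
The partial producing one of these objects is roughly along the following lines (simplified sketch; the exact model attribute names are approximations):

# json_partials/_snow_in_resort.json.jbuilder (simplified sketch)
json.center do
  json.lat resort.lat
  json.lon resort.lon
end
json.lat resort.lat
json.lon resort.lon
json.n   resort.name
json.st  resort.st   # attribute names are approximations
json.sy  resort.sy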


Rendered json_partials/_snow_in_resort.json.jbuilder (0.5ms)
Rendered json_partials/_snow_in_resort.json.jbuilder (0.5ms)
....
Rendered resorts/find.json.jbuilder (4213.4ms)
Completed 200 OK in 4351ms (Views: 3924.3ms | ActiveRecord: 306.8ms | Solr: 0.0ms)

But it still takes 150 ms for 101 kB on my localhost, which is way too slow for what I want to accomplish in the UI. What do I have to do to speed this up? Which things should I check? I appreciate any help. Best, Philip

Update

I optimized my ActiveRecord queries down to ActiveRecord: 77.8ms, but the view rendering is still too slow.


1 Answer


You can explore HTTP caching with Varnish. Here's a great article describing the different caching techniques in Rails 4 (fragment caching, HTTP caching, etc.), with a good explanation of caching JSON responses. A minimal controller sketch follows the links below.

Rails 4 and Caching

http://www.slatestudio.com/blog/p/caching-in-rails-4-applications

Here's another on Rails and Varnish

http://www.hward.com/scale-rails-with-varnish-http-caching-layer/

lacquer is a popular gem with drop-in support for Varnish caching in Rails.

If you want to get into a little more detail on what HTTP caching is, here's a really good write-up:

https://www.mnot.net/cache_docs/

Ryan Bates has an excellent tutorial on RailsCasts, but it's a Pro episode:

http://railscasts.com/episodes/321-http-caching
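
To make the HTTP caching idea concrete, conditional GET caching in a Rails 4 controller can be as small as this (a sketch; the model, action name and query are assumptions based on your log output):

# app/controllers/resorts_controller.rb -- illustrative sketch only
class ResortsController < ApplicationController
  def find
    @resorts = Resort.all  # replace with your existing geo/Solr query

    # stale? sets ETag / Last-Modified headers; if the client (or Varnish)
    # already has a current copy, Rails answers 304 and skips the
    # expensive jbuilder rendering entirely.
    if stale?(etag: @resorts, last_modified: @resorts.maximum(:updated_at))
      render :find
    end
  end
end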

UPDATE

Geocoder Gem

Based on some comments below, I would suggest looking into the Geocoder gem. It's been around for a while and is very well optimized for points-of-interest searches like the one you are trying to do. It also does a lot more than that.
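
For illustration, a nearby-points query with Geocoder looks roughly like this (a sketch; the Resort model and its column names are assumptions):

# app/models/resort.rb -- sketch
class Resort < ActiveRecord::Base
  # tell Geocoder which columns hold the coordinates
  reverse_geocoded_by :lat, :lon
end

# all resorts within 30 km of a point, nearest first
Resort.near([45.96, 7.68], 30, units: :km)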

Spatial Index

If you have already tried it and are not satisfied, can you please post some details on what kind of optimizations you are using on your database to speed up the query? You can significantly speed up POI queries by using spatial indexes on the table.

Here's a good article on spatial indexes:

http://daniel-azuma.com/articles/georails/part-6
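
As an example, a migration adding such an index could look like this (a sketch; the table name and the geometry column lonlat are assumptions):

class AddSpatialIndexToResorts < ActiveRecord::Migration
  def up
    # GiST index lets PostGIS answer bounding-box / distance queries
    # without scanning every row
    execute "CREATE INDEX index_resorts_on_lonlat ON resorts USING GIST (lonlat);"
  end

  def down
    execute "DROP INDEX index_resorts_on_lonlat;"
  end
end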

Some performance testing ideas

You might be able to test whether it really is the rendering that is slowing you down by coming up with a good test case. Try querying for things towards the top, middle and bottom of your points table, for different numbers of response objects, and for different numbers of properties per JSON object. Right now I can see that lat and lon are redundant (they are repeated inside center). Try removing them and compare times for a large number of results; if the rendering really is the bottleneck, fewer properties should mean faster responses.

Also, if your properties (name, st, sy, etc.) come from associations rather than the same table as the points, try de-normalizing your DB to see if you get faster view rendering.
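
Related to that: if the jbuilder partial reaches through associations, make sure they are eager-loaded; otherwise every one of the 1500 rendered partials can fire its own query. A sketch, where the association names and matching_ids are placeholders:

# app/controllers/resorts_controller.rb -- sketch
# Eager-load whatever associations the partial touches, so rendering
# 1500 partials doesn't trigger 1500 extra queries (the classic N+1).
@resorts = Resort.includes(:region, :snow_report)
                 .where(id: matching_ids)  # your existing geo/Solr filter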

  • It's not a caching issue, since the response is always different, based on the request. – dc10 Dec 15 '14 at 20:26
  • Can you please provide some details on what kind of request it is? Is it some kind of geocoding or points-of-interest search? In that case you might have to explore better geo-indexing options in your database. Also, dynamic requests benefit from HTTP caching too... but to address your case specifically, more details on the kind of request, query and response would be helpful. – Shaunak Dec 15 '14 at 20:28
  • I am searching for snow_resorts based on their geo coordinates. But I already improved the query to ActiveRecord: 77.8ms. So it's not a querying problem, I think. – dc10 Dec 15 '14 at 20:33
  • I see, so it is a points-of-interest search. Have you tried the Geocoder gem? I have used it several times before on fairly huge databases and have been very satisfied with the performance. http://www.rubygeocoder.com/ – Shaunak Dec 15 '14 at 20:36
  • Yeah, it's really cool, but it doesn't exactly fit my needs. I started doing my own calculations in PostGIS, which was very fast. But the biggest boost was switching to Sunspot's built-in PostGIS search, which is blazing fast. That's why I am sure it is a rendering issue in this case. – dc10 Dec 15 '14 at 20:38
  • Probably a pointless question, but... you are not developing on Windows, are you? – Shaunak Dec 15 '14 at 20:47
  • Hehe. No, on OS X now, or on Ubuntu sometimes. But never again on Windows. – dc10 Dec 15 '14 at 20:49
  • Cool... optimization is always tricky. Take a look at this very old question of mine, on a similar task to yours, but on millions of high-density LIDAR points. It was about query optimization, and I never got a better answer, but you might get something useful out of it. I have shown how I used spatial indexes: http://stackoverflow.com/questions/6654875/optimize-nearest-neighbor-query-on-70-million-extremely-high-density-spatial-poi – Shaunak Dec 15 '14 at 20:52
  • I am not sure whether there is much more potential on the GIS side, since I am already using bounding boxes to narrow down potential result sets, and then there is Solr, which returns the required results ridiculously fast. So I am stuck, since I don't have a clue how to make the rendering faster. – dc10 Dec 15 '14 at 20:55
  • Just curious... it's been a while, did you ever get this figured out? – Shaunak Feb 26 '15 at 17:16
  • Yes, it turned out some deeply nested property in the JSON structure was hitting the DB each time. – dc10 Feb 26 '15 at 19:36