
My application suddenly became really slow, with response times that can reach 3 minutes. I have tried it on two different local environments to make sure it is not being slowed down by the CPU/RAM or any other hardware component, but the problem remains.

Some context

I am using Ruby 2.0.0 and Rails 4.0.0, with Pow as the Rack server and a PostgreSQL database. The request used for this test is a basic API call that fetches all entries from an entries table. The table itself doesn't have many records (~12k).

Postman API request

And here is what the log shows for that request:

Started GET "/en/*****/api/v1/entries.json" for 127.0.0.1 at 2014-09-11 11:42:13 +0100
  ActiveRecord::SchemaMigration Load (0.7ms)  SELECT "schema_migrations".* FROM "schema_migrations"
Processing by Api::V1::EntriesController#index as JSON
  Parameters: {"subdomain"=>"cms", "not"=>[:new, :create, :edit, :update, :destroy], "locale"=>"en", "tenant"=>"*****"}
...
...
...
...
Completed 200 OK in 309341ms (Views: 307471.4ms | ActiveRecord: 1620.3ms)

Yes, you read that right... over 300 seconds for the view rendering.

Entries action

Api::V1::EntriesController#index is fairly simple

  def index  
    # Set Status
    @status = params[:status].present? ? status_id("#{params[:status]}") : [0..3]

    # Set Promoted
    @promoted = params[:promoted].present? ? params[:promoted] : [true,false]

    # Fetch Entries   
    @entries = Entry.where(status_id: @status, promoted: @promoted)
                    .filter_by_tag(params[:tag])
                    .order("published_at DESC")
                    .paginate(page: params[:page], per_page: params[:limit])
    respond_with @entries
  end

New Relic Preview

New Relic Details

And finally the SQL reports:

New Relic SQL reports

As you can see above, the SQL query durations and timestamps do not match. Even though the queries themselves don't take long, the timestamps increase very quickly (up to 300 seconds by the end). I will spare you the details, but the long list of queries (courtesy of acts-as-taggable-on) does not show any unexpectedly slow or strange queries. The longest query takes 300ms.
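One quick way to check whether the time is really spent in the view, or only shows up there because of ActiveRecord's lazy loading, is to force the relation to load in the controller. A minimal sketch, using the same query as above and simply materializing it with `.to_a` (the comments below arrive at the same idea):

  # Force query execution here, so the log attributes the time
  # to ActiveRecord instead of Views.
  @entries = Entry.where(status_id: @status, promoted: @promoted)
                  .filter_by_tag(params[:tag])
                  .order("published_at DESC")
                  .paginate(page: params[:page], per_page: params[:limit])
                  .to_a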

development.rb

  config.cache_classes = false

  # Do not eager load code on boot.
  config.eager_load = false

  # Show full error reports and disable caching.
  config.consider_all_requests_local       = true
  config.action_controller.perform_caching = true

  # Don't care if the mailer can't send.
  config.action_mailer.raise_delivery_errors = false

  # Print deprecation notices to the Rails logger.
  config.active_support.deprecation = :log

  # Raise an error on page load if there are pending migrations
  config.active_record.migration_error = :page_load

  # Debug mode disables concatenation and preprocessing of assets.
  # This option may cause significant delays in view rendering with a large
  # number of complex assets.
  config.assets.debug = true

  # Do not compress assets
  config.assets.compress = false

  # Disable color for log file
  config.colorize_logging = false

The Pow and Rails logs don't show anything wrong. I bypassed .dev in the proxy settings of my Ethernet adapter.

Any idea what could cause my app to slow down that much?

Edit #1

Following apneadiving's reply, I removed all associations that could possibly be slowing down my request.

The action is now simplified to render a plain JSON array of IDs (~300 records).

/app/controllers/api/v1/entries_controller.rb

def index

  # Set Status
  @status = params[:status].present? ? status_id("#{params[:status]}") : [0..3]

  # Set Promoted
  @promoted = params[:promoted].present? ? params[:promoted] : [true,false]

  # Fetch entries
  @entries = Entry.where(status_id: @status, promoted: @promoted)
  respond_with @entries
end

/app/serializers/entry_serializer.rb

class EntrySerializer < ActiveModel::Serializer
  attributes :id
end

The result is still more than 3 seconds for such a simple request...

Started GET "/en/*****/api/v1/entries.json?limit=2" for 127.0.0.1 at 2014-09-12 10:07:10 +0100
Processing by Api::V1::EntriesController#index as JSON
  Parameters: {"limit"=>"2", "subdomain"=>"cms", "not"=>[:new, :create, :edit, :update, :destroy], "locale"=>"en", "tenant"=>"*****"}
  Account Load (1.3ms)  SELECT "accounts".* FROM "accounts" WHERE "accounts"."domain" = '******' ORDER BY "accounts"."id" ASC LIMIT 1
   (0.6ms)  BEGIN
  SQL (2.2ms)  INSERT INTO "sessions" ("cookie", "created_at", "domain", "locale", "updated_at") VALUES ($1, $2, $3, $4, $5) RETURNING "id"  [["cookie", "Dn/aXXbbWG8t8A5ZYGVgsQ=="], ["created_at", Fri, 12 Sep 2014 09:07:11 UTC +00:00], ["domain", "*******"], ["locale", :en], ["updated_at", Fri, 12 Sep 2014 09:07:11 UTC +00:00]]
   (0.6ms)  COMMIT
  ApiKey Load (0.6ms)  SELECT "api_keys".* FROM "api_keys" WHERE "api_keys"."account_id" = 2 AND "api_keys"."access_token" IS NULL LIMIT 1
  ApiKey Load (0.6ms)  SELECT "api_keys".* FROM "api_keys" WHERE "api_keys"."account_id" = 2 AND "api_keys"."access_token" = '****************' LIMIT 1
  Account Load (0.8ms)  SELECT "accounts".* FROM "accounts" WHERE "accounts"."domain" = '*****' ORDER BY "accounts"."id" ASC LIMIT 1
  SQL (2.6ms)  UPDATE "api_keys" SET "count" = COALESCE("count", 0) + 1 WHERE "api_keys"."account_id" = 2 AND "api_keys"."id" = 2
  Entry Load (4.9ms)  SELECT "entries".* FROM "entries" WHERE "entries"."account_id" = 2 AND "entries"."promoted" IN ('t', 'f') AND (("entries"."status_id" BETWEEN 0 AND 3 OR 1=0))
Completed 200 OK in 3558ms (Views: 3246.6ms | ActiveRecord: 27.7ms)
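To separate the serialization cost from the rest of the stack, a rough check in the Rails console may help. This is only a sketch, using Ruby's standard Benchmark and mirroring the simplified scope above:

  require 'benchmark'

  entries = Entry.where(status_id: 0..3, promoted: [true, false])

  # Time spent actually loading the rows from PostgreSQL.
  puts Benchmark.measure { entries.to_a }

  # Time spent turning the loaded records into JSON
  # (default as_json here, not the serializer).
  puts Benchmark.measure { entries.to_json }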

Indexes of the entries table:

# \d entries
                                               Table "public.entries"
      Column       |            Type             |                             Modifiers                             
-------------------+-----------------------------+-------------------------------------------------------------------
 id                | integer                     | not null default nextval('entries_id_seq'::regclass)
 title             | character varying(255)      | 
 slug              | character varying(255)      | 
 status_id         | integer                     | 
 promoted          | boolean                     | 
 published_at      | timestamp without time zone | default '2014-07-31 15:06:20.462154'::timestamp without time zone
 created_at        | timestamp without time zone | 
 updated_at        | timestamp without time zone | 
 account_id        | integer                     | 
 excerpt           | text                        | 
 user_id           | uuid                        | 
 extra             | hstore                      | 
 entry_views_count | integer                     | 
Indexes:
    "entries_pkey" PRIMARY KEY, btree (id)
    "index_entries_on_slug" UNIQUE, btree (slug)
    "entries_title" gin (to_tsvector('english'::regconfig, title::text))
    "index_entries_on_account_id" btree (account_id)
    "index_entries_on_promoted" btree (promoted)
    "index_entries_on_status_id" btree (status_id)
    "index_entries_on_user_id" btree (user_id)
Liyali
  • you have to go deep into your query, but it must be too complex and without adapted indexes. Consider using caching at least – apneadiving Sep 11 '14 at 14:30
  • But the problem doesn't seem to come from the queries though `(Views: 307471.4ms | ActiveRecord: 1620.3ms)`. Any idea what could cause such a difference between the **Duration** and **Timestamp** in the New Relic SQL reports image? – Liyali Sep 11 '14 at 14:33
  • well, because of lazy loading, views appear to be the ones responsible for the slowness but they are not. To be sure, just add `.to_a` at the end of your query. It will be executed right away in the controller, no longer in the view – apneadiving Sep 11 '14 at 14:35
  • good point. replacing `Entry.where(status_id: @status, promoted: @promoted).filter_by_tag(params[:tag]).order("published_at DESC").paginate( page: params[:page], per_page: params[:limit] )` with `Entry.where(status_id: @status, promoted: @promoted).filter_by_tag(params[:tag]).order("published_at DESC").paginate( page: params[:page], per_page: params[:limit] ).to_a` I got **Completed 200 OK in 102830ms (Views: 101043.8ms | ActiveRecord: 629.4ms)** – Liyali Sep 11 '14 at 14:47
  • interesting, how do you generate your JSON? – apneadiving Sep 11 '14 at 14:48
  • Have a look at this one -> http://stackoverflow.com/questions/10451722/what-is-the-fastest-way-to-render-json-in-rails – usha Sep 11 '14 at 14:57
  • @apneadiving using an ActiveModel serializer. There is nothing fancy in it that could slow down the app as far as I can see. – Liyali Sep 11 '14 at 14:59
  • @Vimsha thanks! I am not only trying to just improve performance but to track down a bug that is significantly slowing down my app (300k ms...) – Liyali Sep 11 '14 at 15:01
  • Ok, so something that can hurt badly in views is N+1 requests. Do you call associations? – apneadiving Sep 11 '14 at 15:05
  • yes I call one `has_many` association in the serializer. Let me try to remove it. I have not made changes to this file for ages though. – Liyali Sep 11 '14 at 15:11

1 Answer


You have to eager load data as much as possible to prevent N+1 queries from happening:

entries = Entry.where(status_id: @status, promoted: @promoted).filter_by_tag(params[:tag])
entries = entries.includes(:entry_fields).order("published_at DESC")
entries = entries.paginate( page: params[:page], per_page: params[:limit] )

includes doc is here
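For the eager load to pay off, the serializer also has to expose the association that was preloaded. A minimal sketch, assuming the `entry_fields` association mentioned in the comments and the active_model_serializers API already used in the question:

  class EntrySerializer < ActiveModel::Serializer
    attributes :id

    # Rendered from the records preloaded by includes(:entry_fields),
    # so no extra query is fired per entry.
    has_many :entry_fields
  end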

apneadiving
  • Thanks @apneadiving. I assume I need to add `def entry_fields object.entry_fields end` and a new `attributes :entry_fields` in my serializer? After doing so, the page loads faster but still takes 12 sec for only 350 records. – Liyali Sep 11 '14 at 15:48
  • why didn't you keep the has_many in the serializer? do you think it skips the eager loading? – apneadiving Sep 11 '14 at 15:49
  • just tried with `has_many :entry_fields` instead and got the same load results :\ – Liyali Sep 11 '14 at 15:52
  • weird it should work: http://stackoverflow.com/questions/18134649/eager-load-associations-with-active-model-serializers – apneadiving Sep 11 '14 at 15:57
  • Thanks! I just edited my original question to add some details (Edit section). – Liyali Sep 12 '14 at 09:24
  • I guess you should optimize your query `SELECT "entries".* FROM "entries" WHERE "entries"."account_id" = 2 AND "entries"."promoted" IN ('t', 'f') AND (("entries"."status_id" BETWEEN 0 AND 3 OR 1=0))` with indexes – apneadiving Sep 12 '14 at 09:29
  • I just added my list of indexes to the original question. – Liyali Sep 12 '14 at 09:53
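If the query itself ever becomes the bottleneck, the last comment points at indexes; a composite index covering the filter columns would be one candidate. A sketch of such a migration (the index name and column order are assumptions, not part of the original schema):

  class AddCompositeIndexToEntries < ActiveRecord::Migration
    def change
      # Covers the WHERE clause seen in the log:
      #   account_id = ? AND promoted IN (...) AND status_id BETWEEN 0 AND 3
      add_index :entries, [:account_id, :promoted, :status_id],
                name: "index_entries_on_account_promoted_status"
    end
  end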