Rails: 3.2.11 Ruby: 1.8.7 Database: Postgres
I have a custom-made search that uses ILIKE to search roughly 1 million rows, like this:
formatted_query_string = "'%this%search%string'"  # quoted, since it is interpolated into a raw SQL string
like_variable = "ILIKE"
product_feed_query_line = [
  "name " + like_variable, formatted_query_string, "OR",
  "description " + like_variable, formatted_query_string, "OR",
  "category " + like_variable, formatted_query_string
].join(" ")
@product_feed_products = FeededProduct.where(product_feed_query_line).where(:is_available => true).paginate(:page => params[:page], :per_page => 18)
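Two side issues with building the SQL string by hand: any `%` or `_` a user types is treated as a wildcard, and unescaped input opens the door to SQL injection. A minimal sketch of building the pattern safely in plain Ruby (the helper names escape_like and build_pattern are mine, not Rails APIs):

```ruby
# Escape the LIKE/ILIKE metacharacters so user input is matched literally,
# then join the words with % wildcards for a multi-word "fuzzy" match.
def escape_like(term)
  term.gsub(/[\\%_]/) { |c| "\\" + c }
end

def build_pattern(words)
  "%" + words.map { |w| escape_like(w) }.join("%") + "%"
end

pattern = build_pattern(["this", "search", "string"])
# pattern is now "%this%search%string%"

# Passing it as a bound parameter (instead of string concatenation)
# lets ActiveRecord handle the quoting:
# FeededProduct.where("name ILIKE :q OR description ILIKE :q OR category ILIKE :q",
#                     :q => pattern)
```

This does not make the query faster, but it removes the quoting and injection problems from the hand-built string.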
The database has indexes according to schema.rb:
add_index "feeded_products", ["name", "description", "category"], :name => "name"
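Note that a plain btree index like the one above cannot be used for ILIKE patterns that start with `%`, so Postgres falls back to scanning every row. A trigram index from the pg_trgm extension can serve such queries. A sketch of a migration that adds one (the class and index names are mine; this assumes Postgres 9.1 or newer and permission to create extensions):

```ruby
class AddTrigramIndexesToFeededProducts < ActiveRecord::Migration
  def up
    execute "CREATE EXTENSION IF NOT EXISTS pg_trgm"
    # GIN trigram indexes let Postgres answer ILIKE '%...%' without a full scan.
    execute "CREATE INDEX feeded_products_name_trgm ON feeded_products USING gin (name gin_trgm_ops)"
    execute "CREATE INDEX feeded_products_description_trgm ON feeded_products USING gin (description gin_trgm_ops)"
  end

  def down
    execute "DROP INDEX feeded_products_description_trgm"
    execute "DROP INDEX feeded_products_name_trgm"
  end
end
```

Running EXPLAIN ANALYZE on the query before and after would show whether the planner actually switches from a sequential scan to the new indexes.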
This takes about 4.5 seconds in the database, which is obviously a lot. But is that unreasonably long for a dataset of this size (1M rows)?
I know that switching to a dedicated search engine would be a good idea in the long run, but until I can take the time to learn one, I need to know whether there is something I can do to speed this up, or whether this kind of query is simply this slow.