In my testing with RethinkDB, I inserted 14 million documents into a table.
A sample of the inserted data:
{"name": "jason", "id": "1", "email": "jason@gmail.com", ...}
The id field was generated by a simple counter running from 1 up to 14 million.
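Roughly how the table was seeded (a simplified sketch using the RethinkDB JavaScript driver; the host/port, batch size, and the seed helper name are placeholders, not my exact setup):

const r = require('rethinkdb');

async function seed() {
  const conn = await r.connect({ host: 'localhost', port: 28015 });
  const total = 14000000;   // 14 million documents
  const batchSize = 10000;  // insert in batches to avoid huge single writes

  for (let start = 1; start <= total; start += batchSize) {
    const batch = [];
    for (let i = start; i < start + batchSize && i <= total; i++) {
      // "id" is a string produced by the counter, matching the sample row above
      batch.push({ id: String(i), name: 'jason', email: 'jason@gmail.com' });
    }
    await r.db('test').table('test_table').insert(batch).run(conn);
  }
  await conn.close();
}

seed();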
When I filter the table with this query:
r.db("test").table("test_table").filter({"id":"10000"})
it takes about 13 seconds to return a single row.
Is there a faster way to filter the table and return the row I want?