
A user has a large table (3+ billion rows) in a remote database. To speed up queries over the next few months, an index needs to be created on it.

Assuming there is a connection called conn, what is the best way to create an index so that it persists after disconnecting from the database?

e.g.,

library(DBI)
sql <- 'CREATE INDEX idx_pmid ON medcit (pmid ASC);'
dbExecute(conn,sql)
dbDisconnect(conn)

The code above seems to work, but how can the index be verified (i.e., how can one make sure it truly exists and speeds up future queries)? In other words, how can a user check the existence of the index? Also, does a COMMIT command need to be issued somehow?

userJT

  • What do you mean by *"verified"*? Existence and maintenance of indices can be found at https://wiki.postgresql.org/wiki/Index_Maintenance. – r2evans Apr 09 '18 at 19:06
  • *create an index and make it persist after disconnection from the database* ... why not save the index in the Postgres table **once** as it should have been in design of schema? – Parfait Apr 09 '18 at 19:17
  • Why are you creating indexes at the application layer in R and not the database table level in Postgres? – Parfait Apr 10 '18 at 16:22
  • @Parfait Because R is my general scripting language for everything and a single client for my project needs. – userJT Apr 13 '18 at 19:13
  • @userJT did you find some solution? – Edgar Dec 10 '20 at 16:48

1 Answer


To create an index on a table, use dbExecute() rather than dbGetQuery() (dbGetQuery() is intended for statements that return rows):

dbExecute(conn, "CREATE INDEX index_name ON public.table_name USING btree (column_name)")
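
To verify the index, here is a minimal sketch, assuming a PostgreSQL backend (as the comments suggest) and the medcit/pmid names from the question; the literal pmid value is only a placeholder:

library(DBI)

# List the indexes defined on the table via the pg_indexes catalog view
dbGetQuery(conn, "SELECT indexname, indexdef FROM pg_indexes WHERE tablename = 'medcit'")

# Confirm the planner actually uses the index for a representative lookup
dbGetQuery(conn, "EXPLAIN SELECT * FROM medcit WHERE pmid = 12345")

If the EXPLAIN output shows an Index Scan (or Bitmap Index Scan) on idx_pmid rather than a Seq Scan, the index exists and is being used.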
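As for COMMIT: DBI connections operate in auto-commit mode by default, so the dbExecute() call is committed as soon as it completes and the index survives dbDisconnect(); no explicit COMMIT is needed. If the index build needs to be grouped with other statements, a sketch using DBI's standard transaction helper:

library(DBI)

# dbWithTransaction() commits on success and rolls back on error
dbWithTransaction(conn, {
  dbExecute(conn, "CREATE INDEX idx_pmid ON medcit (pmid ASC)")
})

Note that on a table this size, CREATE INDEX CONCURRENTLY avoids blocking writes during the build, but PostgreSQL does not allow it inside a transaction block, so it has to run with auto-commit.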