
I'm pretty new to SQL, so bear with me...

I have a database with 22 tables, all connected by a primary key (product number).

I need to get all columns from all tables that match one primary key.

Today I use a query like this:

    query = """ select * from pt_MatText where artikkelnummer = ?"""
    cursor.execute(query,artNR)
    pt_MatText = cursor.fetchall()
    pt_MatText = list(pt_MatText[0]) #makes a list of the returned tuple
    pt_MatText.pop(0) #Removes the primary key, so that im left with only the columns i want

I do this the same way for every table I have (all 22), and it seems a bit slow. Is it possible to improve the way I get the data from my tables, either for speed or just general quality? Also, I've heard that using cursors is to be avoided. Why is this?
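
To show what I mean, here is roughly how the repeated blocks look when collapsed into one loop (a minimal sketch; the table names besides pt_MatText are placeholders standing in for my real 22 tables, and it assumes a DB-API cursor like the one above):

    # Sketch: one loop instead of 22 copy-pasted blocks.
    # TABLES is a placeholder list standing in for the real table names.
    TABLES = ["pt_MatText", "pt_Dimensions", "pt_Prices"]  # ...and 19 more

    def fetch_product(cursor, art_nr):
        """Collect the non-key columns for one product number from every table."""
        result = {}
        for table in TABLES:
            # Table names can't be bound as ? parameters, so they are
            # interpolated; safe only because TABLES is a fixed list.
            cursor.execute(f"SELECT * FROM {table} WHERE artikkelnummer = ?", art_nr)
            row = cursor.fetchone()
            result[table] = list(row)[1:] if row else []  # drop the primary key
        return result

This still makes 22 round trips per product, which is probably part of why it feels slow.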

    *"ive heard that using cursors is to be avoided"* This isn't quite true. Cursors should be avoided to do tasks that are far better achieved by using a dataset task. For example, using a cursor to generate the numbers 1-1M is a bad idea; using a tally table to produce them is a good idea. – Thom A Oct 29 '18 at 09:25
  • So, in my case, using cursors are considered a good idea? – Louvre Oct 29 '18 at 09:38
  • 1
    If you mean to loop through each table, that's one way. Personally, here, I would create a dynamic SQL statement and run the process in a single batch. – Thom A Oct 29 '18 at 09:40
  • Thanks @Larnu, Ill look into dynamic SQL statements – Louvre Oct 29 '18 at 09:42
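
For anyone landing here later, here is one rough reading of the single-batch idea from the comments, with the statement built dynamically on the Python side (a sketch, not a definitive implementation; it assumes every table has the artikkelnummer key column, and the table names are the same placeholders as above):

    # Sketch: build one statement that LEFT JOINs every table on the shared
    # key, so all columns come back in a single round trip instead of 22.
    # Note that each table's artikkelnummer column repeats in the output.
    TABLES = ["pt_MatText", "pt_Dimensions", "pt_Prices"]  # placeholder names

    base, *rest = TABLES
    joins = " ".join(
        f"LEFT JOIN {t} ON {t}.artikkelnummer = {base}.artikkelnummer"
        for t in rest
    )
    query = f"SELECT * FROM {base} {joins} WHERE {base}.artikkelnummer = ?"
    cursor.execute(query, artNR)
    row = cursor.fetchone()  # every table's columns, in one row

Whether one wide join or one batch of separate SELECTs ends up faster depends on the database, but either way the data comes back in a single round trip.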

0 Answers