I have a CSV file from which I am trying to load data into a pysqlite database. I cannot find a way to extract the first row of the file and use it automatically as the column headers of a table. I have to enter the names "manually" in the code itself, which is fine for one or two columns but becomes cumbersome with tens or hundreds. Here is my code:

import sqlite3
import csv

f_n = 'test_data_1.csv'
f = open(f_n, 'r')
csv_reader = csv.reader(f)
header = next(csv_reader)

sqlite_file = 'survey_test_db.sqlite'
table_name01 = 'test_survey_1'

field_01 = 'analyst_name'
field_type_01 = 'text'
field_02 = 'question_name'
field_type_02 = 'text'

conn = sqlite3.connect(sqlite_file)
c = conn.cursor()
c.execute('CREATE TABLE {tn} ({nf_01} {ft_01}, {nf_02} {ft_02})'
          .format(tn=table_name01, nf_01=field_01, ft_01=field_type_01,
                  nf_02=field_02, ft_02=field_type_02))

for row in csv_reader:
    c.execute("INSERT INTO test_survey_1 VALUES (?,?)", row)
f.close()

for row in c.execute('SELECT * FROM test_survey_1'):
    print(row)

conn.commit()
conn.close()
Alhpa Delta

2 Answers

Build the CREATE TABLE statement dynamically from the header row, giving every column the TEXT type. Quoting the names guards against headers that contain spaces or SQL keywords:

c.execute('CREATE TABLE {tn} ({fieldlist})'.format(
    tn=table_name01,
    fieldlist=', '.join('"{}" TEXT'.format(name) for name in header),
))
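
Since the column count is no longer fixed, the INSERT statement can be generated the same way. A minimal sketch, reusing c, conn, header, table_name01, and csv_reader from the question:

# One "?" placeholder per header column.
placeholders = ', '.join('?' for _ in header)
insert_sql = 'INSERT INTO {tn} VALUES ({ph})'.format(tn=table_name01, ph=placeholders)
# csv_reader yields one list per remaining row, which executemany consumes directly.
c.executemany(insert_sql, csv_reader)
conn.commit()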

Or use an ORM, which is designed to make this sort of thing easy. A SQLAlchemy example (reusing sqlite_file, table_name01, and header from the question):

from sqlalchemy import create_engine, MetaData, Table, Column, String

engine = create_engine('sqlite:///' + sqlite_file)
meta = MetaData()
t = Table(table_name01, meta, *(Column(name, String()) for name in header))
t.create(engine)
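
Rows can then be bulk-inserted through the same Table object. A minimal sketch, again assuming header and csv_reader from the question:

# Each CSV row is mapped onto the header names and inserted in one transaction.
with engine.begin() as connection:
    connection.execute(t.insert(), [dict(zip(header, row)) for row in csv_reader])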
nosklo

You can use pandas to read your CSV file into a DataFrame and then export it to SQLite.

import sqlite3
import pandas as pd

sqlite_file = 'survey_test_db.sqlite'
table_name01 = 'test_survey_1'

conn = sqlite3.connect(sqlite_file)

pd.read_csv('test_data_1.csv').to_sql(table_name01, con=conn)
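
By default to_sql also writes the DataFrame index as an extra column and raises an error if the table already exists; both behaviors are controlled by its keyword arguments:

# index=False skips the DataFrame index column;
# if_exists='replace' drops and recreates the table on reruns.
pd.read_csv('test_data_1.csv').to_sql(table_name01, con=conn,
                                      index=False, if_exists='replace')
conn.close()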
taras