I'd just like to get a quick idea of the best (meaning least coding) way to get a lot of data from a NumPy recarray into Postgres using psycopg2. I've seen some approaches using casts, but I really thought this would be straightforward and that I'd find something good on the web.
For example, I have census data with 200 variables read into a recarray, with different data types across many of the columns. I want to just sweep through the column names and data types and load everything into Postgres.
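Roughly, what I have in mind is something along these lines. This is an untested sketch: recarray_to_postgres, the _pg_type mapping, and the database/table names are just my own placeholders, not anything psycopg2 itself provides.

import psycopg2

# rough numpy dtype-kind -> Postgres type mapping (my own guess, not exhaustive)
_pg_type = {'i': 'integer', 'u': 'bigint', 'f': 'double precision',
            'b': 'boolean', 'S': 'text', 'U': 'text'}

def recarray_to_postgres(rec, table, conn):
    # build CREATE TABLE straight from the recarray's column names and dtypes;
    # table and column names go into the SQL as-is, so they must be clean identifiers
    cols = ', '.join('%s %s' % (name, _pg_type[rec.dtype[name].kind])
                     for name in rec.dtype.names)
    cur = conn.cursor()
    cur.execute('CREATE TABLE %s (%s)' % (table, cols))

    # one placeholder per column; tolist() turns each record into a tuple of
    # plain Python scalars, which psycopg2 can adapt to SQL values
    placeholders = ', '.join(['%s'] * len(rec.dtype.names))
    cur.executemany('INSERT INTO %s VALUES (%s)' % (table, placeholders),
                    rec.tolist())
    conn.commit()

# usage would be something like (hypothetical dbname and table):
# conn = psycopg2.connect('dbname=census')
# recarray_to_postgres(census_rec, 'census_2010', conn)

If executemany is the wrong tool for this much data, that is exactly the kind of suggestion I am looking for.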
Also, if there is anything better than psycopg2, I am open to suggestions.
This is what I found, although it targets sqlite3 and goes the wrong way (from the database into a recarray rather than from a recarray into the database).
elif driver == 'sqlite3':
    tups = cur.fetchall()
    if len(tups) > 0:
        # map the Python types in the first result row to numpy dtypes
        _cast = {types.BooleanType: numpy.bool,
                 types.IntType: numpy.int32,
                 types.LongType: numpy.int64,
                 types.FloatType: numpy.float64,
                 types.StringType: numpy.str,
                 types.UnicodeType: numpy.str}
        try:
            typelist = [_cast[type(tmp)] for tmp in tups[0]]
        except KeyError:
            raise Exception("Unknown datatype")
        res = numpy.core.records.array(tups)
    else:
        return None
    # split the recarray back out into one array per column name
    res = [res[tmp] for tmp in res.dtype.names]
except BaseException:  # the fragment cuts off here; the enclosing try isn't shown