I have a SQL Server table with three columns. One of them, named 'paramvalue', is defined as real; as an example, one of the values in that column might be 0.76. When I retrieve the rows with the pyodbc module's fetchall() command, I get back a number like 0.7599999904632568 instead of 0.76. I'm using Visual Studio 2017 with Python Tools for Visual Studio. I've also tried the pypyodbc module but get the same result.
The table's three columns are defined as follows:
pconfig_id [int] IDENTITY(41,1) NOT NULL,
paramname [nvarchar](50) NOT NULL,
paramvalue [real] NULL
My Python code:
import pyodbc

cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=SERVERNAME;DATABASE=DBNAME;UID=USER;PWD=PASSWORD;Connect Timeout=15')
cursor = cnxn.cursor()
dict = {}
rows = cursor.execute("SELECT * FROM mytable")
for row in cursor.fetchall():
    if row[1] not in dict:
        dict[row[1]] = {}
    dict[row[1]][row[2]] = row[0]
In the example above, row[2] for a typical row has the value 0.7599999904632568 instead of 0.76 as expected.
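For reference, the same value can be reproduced outside the database by round-tripping 0.76 through a 4-byte float with the standard struct module. This is a minimal sketch, assuming SQL Server's real is an IEEE-754 single-precision float (so 0.76, which has no exact binary representation, is stored as the nearest 4-byte value and then widened to a Python double on fetch):

```python
import struct

# Pack 0.76 into a 4-byte IEEE-754 single-precision float (the assumed
# storage format of SQL Server's 'real'), then unpack it back into a
# Python double, mimicking what the driver hands back.
single = struct.unpack('<f', struct.pack('<f', 0.76))[0]
print(single)  # 0.7599999904632568
```

If that matches, the driver is faithfully returning what the column actually stores, and the discrepancy is a property of the real type rather than of pyodbc.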