
I have to load data from Azure Data Lake into a SQL data warehouse. I have set up the external table creation; one column is of double datatype, and I used the decimal type for it in the SQL Server data warehouse external table definition. The file format is Parquet. With CSV it works, but with Parquet I get the following error:

HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: ClassCastException: class java.lang.Double cannot be cast to class parquet.io.api.Binary (java.lang.Double is in module java.base of loader 'bootstrap'; parquet.io.api.Binary is in unnamed module of loader 'app'.

Can someone help me with this issue? Thanks in advance.

CREATE EXTERNAL TABLE [dbo].[EXT_TEST1]
    (A VARCHAR(10), B DECIMAL(36,19))
WITH (DATA_SOURCE = [Azure_Datalake], LOCATION = N'/A/B/PARQUET/*.parquet/',
      FILE_FORMAT = parquetfileformat, REJECT_TYPE = VALUE, REJECT_VALUE = 1);

Column datatypes in Databricks: A string, B double

Data:
A   | B
'a' | 100.0050

1 Answer


Use FLOAT(53), which is the 8-byte double-precision type (53-bit mantissa). It matches the double column Databricks writes into the Parquet files, so PolyBase no longer tries to cast the Parquet double into a decimal while filling the record reader buffer.
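For illustration, a minimal sketch of the external table with the column switched to FLOAT(53), reusing the data source, location, and file format names from the question:

CREATE EXTERNAL TABLE [dbo].[EXT_TEST1]
    (A VARCHAR(10),
     B FLOAT(53))   -- 8-byte double, same physical type as the Parquet/Databricks double column
WITH (DATA_SOURCE = [Azure_Datalake], LOCATION = N'/A/B/PARQUET/*.parquet/',
      FILE_FORMAT = parquetfileformat, REJECT_TYPE = VALUE, REJECT_VALUE = 1);

If a decimal is still needed downstream, one option is to convert after the external read, e.g. CAST(B AS DECIMAL(36,19)) in the CTAS or INSERT that loads the internal table, or to cast the column to decimal in Databricks before writing the Parquet files.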
