I have a task to convert our jobs from the Synapse bulk insert pattern to the Synapse PolyBase pattern. When I try it, it does not work straight away: the copy activity complains about data types (error below), even though the source query sometimes has no double/float columns at all. Please help me understand whether there is a basic pattern or casting we need to apply before using PolyBase.

Here is the source SQL I used (source table data types):
SELECT TOP (1000)
     CAST([SiteCode_SourceId] AS varchar(1000))               AS [SiteCode_SourceId]
    ,CAST([EquipmentCode_SourceId] AS varchar(1000))          AS [EquipmentCode_SourceId]
    ,FORMAT([RecordedAt], 'yyyy-MM-dd HH:mm:ss.fffffff')      AS [RecordedAt]
    ,CAST([DataLineage_SK] AS varchar(1000))                  AS [DataLineage_SK]
    ,CAST([DataQuality_SK] AS varchar(1000))                  AS [DataQuality_SK]
    ,CAST([FixedPlantAsset_SK] AS varchar(1000))              AS [FixedPlantAsset_SK]
    ,CAST([ProductionTimeOfDay_SK] AS varchar(1000))          AS [ProductionTimeOfDay_SK]
    ,CAST([ProductionType_SK] AS varchar(1000))               AS [ProductionType_SK]
    ,CAST([Shift_SK] AS varchar(1000))                        AS [Shift_SK]
    ,CAST([Site_SK] AS varchar(1000))                         AS [Site_SK]
    ,CAST([tBelt] AS varchar(1000))                           AS [tBelt]
    ,FORMAT([ModifiedAt], 'yyyy-MM-dd HH:mm:ss.fffffff')      AS [ModifiedAt]
    ,FORMAT([SourceUpdatedAt], 'yyyy-MM-dd HH:mm:ss.fffffff') AS [SourceUpdatedAt]
FROM [ORXX].[public_XX].[fact_FixedXXXX]
And this is the error from the copy activity:

Operation on target cp_data_movement failed: parquet.io.api.Binary$ByteArraySliceBackedBinary cannot be cast to class java.lang.Double (parquet.io.api.Binary$ByteArraySliceBackedBinary is in unnamed module of loader 'app'; java.lang.Double is in module java.base of loader 'bootstrap')
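For context, my working assumption (please correct me if this is wrong) is that the PolyBase path stages the data as Parquet, so each source column has to produce the Parquet type that the matching destination column implies; the error reads as if a column is staged as a string (Binary) while the destination expects a DOUBLE. A sketch of the casting pattern I think is needed, with illustrative column names, would be:

```sql
-- Sketch only, assuming the destination column types shown in the comments;
-- the idea is to cast each column to the destination's exact type rather
-- than to varchar, so the staged Parquet type matches what PolyBase expects.
SELECT
     CAST([SiteCode_SourceId] AS varchar(1000)) AS [SiteCode_SourceId] -- dest varchar: string-to-string is fine
    ,CAST([tBelt]             AS float)         AS [tBelt]             -- dest float: stage as DOUBLE, not a varchar
    ,CAST([RecordedAt]        AS datetime2(7))  AS [RecordedAt]        -- dest datetime2: avoid FORMAT(), which returns nvarchar
FROM [ORXX].[public_XX].[fact_FixedXXXX];
```

Is that the right general approach, or is there a standard conversion step before PolyBase?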
Here are the destination columns; their order is the same as in the source query. Destination table columns: