I want to create a table from a CSV file with standard column data types (datetime, varchar, int, etc.), where columns can accommodate up to 30000 characters and CLOB columns are also handled.
I have CSV files which I am converting into parquet format so I can create delta tables in PySpark.

[screenshot: table ICE created in the Synapse lake database under the "default" schema]

As the image above shows, the table (table name: ICE) is created in the Synapse lake database under the "default" schema. All of the columns have been created as varchar(8000). I want to assign custom column data types instead of making everything varchar(8000). There are also a few columns longer than 300000 characters which, I believe, are getting trimmed by the varchar(8000) limitation.
I am not sure if this is the correct method for creating tables from CSV files. Your recommendations are appreciated.