I have a DataFrame with 100,000 records produced by our transformations, and I now need to load all of this data into COUNTRY_TABLE in a Synapse dedicated SQL pool. How can I achieve this in Synapse?
A few other queries:
- Is it compulsory to create a schema for our DataFrame columns in the dedicated pool table first?
- How can we overwrite the data in the dedicated pool every time from a query in a Spark notebook? Whenever new data arrives, I want to overwrite the old data with the new data.
I have also created a schema for my destination table in the dedicated pool, with all the column names that we have in our Spark DataFrame.
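Here is the sketch I was planning to use, based on the Synapse Dedicated SQL Pool connector's `synapsesql` method (the database and schema names below are placeholders, and I am not sure whether `mode("overwrite")` is the right way to replace the old data each run):

```python
# Sketch for a Synapse Spark notebook. The synapsesql method is only
# available on Synapse Spark pools, so this function assumes it runs there.
def write_to_dedicated_pool(df, table="<database>.dbo.COUNTRY_TABLE"):
    """Write a Spark DataFrame to a dedicated SQL pool table.

    mode("overwrite") is intended to replace the existing contents of the
    target table with the new data on every run.
    """
    (df.write
       .mode("overwrite")          # replace old data each time
       .synapsesql(table))         # three-part name: <database>.<schema>.<table>
```

If this is roughly right, does the connector create the target table from the DataFrame schema automatically, or must the table (which I have already created) exist beforehand?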