I'm facing an issue when trying to copy a list from SharePoint to Azure Blob Storage using Azure Data Factory, through the SharePoint Online List connector. Unfortunately, the copy activity fails with a generic error that doesn't give any specific information about the cause of the problem.
My connection is correct, because I can successfully copy other lists from the same SharePoint site. I suspect the issue is related to the size of the list in question, which contains over 10,000 records and many columns. In this case, is it possible to perform the ingestion in batches to work around this limitation?
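For reference, this is roughly what I have in mind by "batch ingestion": restricting each Copy activity run to a slice of the list through the source's OData query, for example by filtering on the indexed ID column. The activity name and the ID range below are just placeholders I made up, not my current configuration, and I haven't confirmed this is the right approach:

```json
{
    "name": "CopySharePointListBatch",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "SharePointOnlineListSource",
            "query": "$filter=ID ge 1 and ID le 5000"
        },
        "sink": {
            "type": "ParquetSink",
            "storeSettings": { "type": "AzureBlobStorageWriteSettings" },
            "formatSettings": { "type": "ParquetWriteSettings" }
        }
    }
}
```

The idea would be to wrap something like this in a ForEach, or parameterise the lower and upper ID bounds, so that each iteration copies one slice of the list. Is this the supported way to deal with large lists, or is there a better option?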
As it stands today, the pipeline is configured as follows:

[screenshot of the current Copy activity configuration]
I expected the entire SharePoint list to be copied to Azure Blob Storage in Parquet format.