I've been tasked with ingesting flat files from data lake storage.
There are multiple files, all landing in the same logical folder, and their contents and structure differ from file to file. Each time a new file with the same structure as a previous one is added, the filename is different. For example, a file named
filename_1.csv
with the column structure
col1, col2, col3
might arrive the next time the same data is uploaded under a name like
january_new-data-1.csv
I've already created a sink table for each file type. How can I build ADF pipelines that ingest these files dynamically? Is that even possible?
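To make "dynamically" concrete, the behaviour I'm picturing is roughly the logic below (a Python sketch just to illustrate the idea; the table names, column sets, and folder path are placeholders I made up): peek at each file's header row and route it to whichever sink table's columns match.

```python
# Illustration only: route each CSV to a sink table by matching its header
# columns against the known sink-table schemas.
# Table names, column sets, and the folder path are made-up placeholders.
import csv
from pathlib import Path

# Expected column sets, keyed by the sink table they should land in.
SINK_SCHEMAS = {
    "dbo.SinkTableA": {"col1", "col2", "col3"},
    "dbo.SinkTableB": {"customer_id", "amount", "invoice_date"},
}

def sink_table_for(csv_path: Path) -> str | None:
    """Return the matching sink table for a CSV file, or None if nothing matches."""
    with csv_path.open(newline="") as f:
        header = next(csv.reader(f))
    columns = {c.strip() for c in header}
    for table, schema in SINK_SCHEMAS.items():
        if columns == schema:
            return table
    return None

for path in Path("landing_folder").glob("*.csv"):
    print(path.name, "->", sink_table_for(path))
```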
I'm thinking these files first need to be separated into their own logical folders. Is that right?
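If separating them is the way to go, the layout and mapping I have in mind would look something like the sketch below (folder and table names are placeholders), with the folder name acting as the stable key so the unpredictable filenames stop mattering:

```python
# Sketch of the folder-per-feed idea: each feed lands in its own subfolder,
# and a fixed mapping ties the folder to its sink table.
# Folder and table names are made-up placeholders.
FOLDER_TO_TABLE = {
    "feed_a/": "dbo.SinkTableA",  # e.g. filename_1.csv, january_new-data-1.csv
    "feed_b/": "dbo.SinkTableB",
}

# A parameterized copy step could then loop over this mapping, taking the
# folder as the source path and the table as the sink.
for folder, table in FOLDER_TO_TABLE.items():
    print(f"copy all *.csv under {folder} into {table}")
```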