
I have a Data Flow that simply reads a Flat File and executes a stored procedure for every line in the file. Every line contains comma-separated fields. The stored procedure returns an output parameter that is later added to every line as a new field.
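Conceptually, the per-row call looks something like this (a rough sketch only; the procedure name, parameter names, and connection string below are placeholders, not my actual objects):

```csharp
using System.Data;
using System.Data.SqlClient;

// Called once per line; returns the value that gets appended as the new field.
string GetNewField(string connectionString, string field1, string field2)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.usp_GetNewField", conn))  // placeholder procedure name
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@Field1", field1);
        cmd.Parameters.AddWithValue("@Field2", field2);

        // Output parameter returned by the procedure.
        var output = new SqlParameter("@NewField", SqlDbType.VarChar, 100)
        {
            Direction = ParameterDirection.Output
        };
        cmd.Parameters.Add(output);

        conn.Open();
        cmd.ExecuteNonQuery();
        return (string)output.Value;
    }
}
```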

Therefore, I must write to the same file I am reading. For some reason, when I execute the Data Flow task, I get an error saying the file cannot be accessed because it is being used by another process. I've been doing some research on this matter, but with no success so far. Any help in this regard will be really appreciated.

dotnetspark
    Did you open your csv or xls file in Excel? When you say "I must write the same file I am reading", do you mean that you are trying to write to the same file? If so, you can't read and write the same file in one data flow. I suggest you create two different data flows: one to read and store your result in another file or in a database, and a second one to overwrite or append your data. – Arnaud Gastelblum Nov 23 '17 at 07:05
  • That'll help, but there's got to be a way to write back to the same file, which is what I'm looking for. – dotnetspark Nov 25 '17 at 04:52
  • I suggest you read the entire thing into a staging table, do your processing, then export it back out to a different file. – Nick.Mc Dec 10 '17 at 12:53

1 Answer


In order to read and write to the same file, you need to add a blocking operation to your flow (which is a little hacky), for example a Sort transformation. This way the whole file is read first and only then written back. If you are using script components, make sure you are closing your streams.
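If the file handling lives in a Script Task or script component, the same idea can be applied in code: buffer the whole file in memory, make sure the read handle is released, and only then reopen the file for writing. A minimal sketch, assuming the path and the appended value are placeholders rather than anything from the original package:

```csharp
using System.IO;
using System.Linq;

string path = @"C:\data\input.csv";  // placeholder path

// File.ReadAllLines opens, reads, and closes the file in one call,
// so no read handle is left open when we write below.
var lines = File.ReadAllLines(path);

// ... call the stored procedure per line and append the returned value ...
var updated = lines.Select(l => l + "," + "newFieldValue").ToArray();  // placeholder logic

// By now the read handle has been released, so the file is no longer locked.
File.WriteAllLines(path, updated);
```

The key point is that reading and writing never overlap: either a blocking transformation forces the full read to finish first, or the code itself closes (or never holds open) the reader before the writer touches the file.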