
I have two storage accounts on Azure:

  1. old storage
  2. new storage

Some data in the old storage account is ingested by Auto Loader, and this works well.

Now I'm moving the data from the old storage account to the new one, including the Auto Loader checkpoints, etc., but when I run the same code against the new storage, it crashes.
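Roughly, the setup looks like the sketch below after being re-pointed at the new storage account (the paths, file format, and target table name are simplified placeholders, not my real values):

```python
# A minimal sketch of the Auto Loader stream after migration.
# Account/container names, file format, and table name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Source data and checkpoint now live in the new storage account.
source_path = "abfss://data@newstorage.dfs.core.windows.net/raw/table/"
checkpoint_path = "abfss://data@newstorage.dfs.core.windows.net/.checkpoints/table/"

stream = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")              # format is an assumption
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(source_path)
)

(
    stream.writeStream
    .option("checkpointLocation", checkpoint_path)    # checkpoint was copied from the old storage with azcopy
    .trigger(availableNow=True)
    .toTable("target_table")
)
```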

While investigating, I found files in the checkpoint folder that still contain references to the old storage account, and because of this the stream does not work against the new storage.

Is there anything I can do to continue processing the data in the new storage account?

  • Is your pipeline stateful or stateless? – Alex Ott Jul 24 '23 at 15:36
  • @AlexOtt I have both scenarios: some jobs require incremental updates to intermediate state, but for now I have finished tests with jobs that only track which rows have been processed from the source to the sink. – Afonso de Paula Feliciano Jul 24 '23 at 18:01
  • @AlexOtt Just to add more details: some of the files I found are located under .checkpoints/table/sources/0/rocksdb/. This folder contains log files and zip files, and their contents reference my old storage account. I used azcopy to copy the data between the storage accounts. – Afonso de Paula Feliciano Jul 24 '23 at 18:18

0 Answers