
I want to import data in Excel, CSV, and TSV formats into my application's database. The problem is that I want to allow users to upload files containing hundreds of thousands of records at a time.

I use Azure Blob Storage to store the files, so uploading and storing them is not a problem for me. The problem starts when I want to read the records from those files row by row: currently I have to load the whole file into memory before I can read anything from it. I use a library called NPOI to read the data from these files.

Azure Blob Storage allows me to read files as a stream without loading them completely into memory, which is a very good way to handle huge text files. I don't know whether I can use this method to read Excel files as well.
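For the CSV/TSV case, row-by-row reading over a stream is straightforward, since the format is line-oriented. Here is a minimal sketch (in Python, using only the standard library, with an in-memory `BytesIO` standing in for the blob download stream, which is an assumption; the same wrapping technique should apply to any binary file-like stream your storage SDK exposes):

```python
import csv
import io

def iter_csv_rows(stream, encoding="utf-8", delimiter=","):
    """Yield parsed rows one at a time from a binary stream.

    Only one buffered chunk is held in memory at a time, so the
    whole file is never loaded at once. Pass delimiter="\t" for TSV.
    """
    # Wrap the binary stream as text; newline="" lets the csv module
    # handle embedded newlines inside quoted fields correctly.
    text = io.TextIOWrapper(stream, encoding=encoding, newline="")
    for row in csv.reader(text, delimiter=delimiter):
        yield row

# Simulated blob content; in production this would be the stream
# returned by the storage SDK's download/open-read call.
data = b"id,name\n1,alpha\n2,beta\n"
rows = list(iter_csv_rows(io.BytesIO(data)))
```

Note that this works because CSV/TSV can be parsed incrementally from the front of the stream; an xlsx file is a zip archive, so the same trick generally does not apply there.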

Is there a way to achieve this goal?

  • Excel (xlsx) files are stored in a zip file format, so it's likely you will need to load the whole file before accessing any of the content... – Tim Williams Nov 06 '15 at 00:57
  • Yes, you are correct. This is why we have decided to go on CSV only. Much easier to handle. Thanks for your reply. – iboware Nov 11 '15 at 15:06
