
I am executing an Azure Batch job that creates a zip file as its output. The batch is run by an orchestrator that is responsible for moving the output files to blob storage. I have this working, but it feels clunky: I download the file locally to the orchestrator, then upload it to blob storage:

CloudTask task; // executed task...
var node = task.GetNodeFile(fileName);

// download the task output to a local file on the orchestrator
using (var stream = File.OpenWrite(localFile))
{
  node.CopyToStream(stream);
}

// then upload the local file to blob storage
var blobRef = _blobContainer.GetBlockBlobReference(blobFileName);
blobRef.UploadFromFile(localFile, FileMode.Open);

I tried passing the blob write stream to CopyToStream directly, but nothing was written to the blob:

node.CopyToStream(blobRef.OpenWrite()); // note: the write stream is never disposed here

Is it possible to copy the output file from a Batch VM to blob storage without this extra hop?

NDJ

2 Answers


You can do this by using a memory stream; just remember to rewind it before uploading:

CloudTask task; // executed task...
var node = task.GetNodeFile(fileName);

using (var ms = new MemoryStream())
{
    node.CopyToStream(ms);

    // CopyToStream leaves the position at the end of the stream,
    // so rewind it before uploading
    ms.Seek(0, SeekOrigin.Begin);

    var blobRef = _blobContainer.GetBlockBlobReference(blobFileName);
    blobRef.UploadFromStream(ms);
}
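
Note that this buffers the entire task output in memory, which may be a problem for the multi-gigabyte zip files mentioned in the comments below; streaming straight into the blob (as in the other answer) avoids holding the whole file at once.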
Peter
  • Thanks. It did occur to me, the only issue being some of the zip files will be well over 1 GB, maybe even 3 or 4. – NDJ Feb 10 '16 at 21:11
  • I'm accepting this as it is a working solution and I didn't mention the file sizes. – NDJ Feb 11 '16 at 09:42

The answer was actually quite simple: disposing the blob write stream forces a flush, i.e.:

CloudTask task; // executed task...
var node = task.GetNodeFile(fileName);

// disposing the stream returned by OpenWrite flushes and commits the blob
using (var stream = _blobContainer.GetBlockBlobReference(blobFileName).OpenWrite())
{
  node.CopyToStream(stream);
}
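
For reference, in the classic WindowsAzure.Storage SDK the stream returned by OpenWrite is a CloudBlobStream, which also exposes an explicit Commit method. A minimal sketch, assuming that SDK, if you'd rather not rely on Dispose to do the flush:

var node = task.GetNodeFile(fileName);
var blobRef = _blobContainer.GetBlockBlobReference(blobFileName);

// OpenWrite returns a CloudBlobStream; calling Commit() finalizes
// the upload without waiting for Dispose to do it
using (var stream = blobRef.OpenWrite())
{
  node.CopyToStream(stream);
  stream.Commit();
}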
NDJ