
I have a working implementation using the Autodesk.Forge NuGet package (v1.9) where files are downloaded from Azure Blob Storage into a MemoryStream (all at once) and then uploaded with the UploadObject API method.
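For reference, that working approach looks roughly like this (a minimal sketch; StorageUtil.GetBlob is our own helper that I assume returns an Azure BlobClient, and the variable names mirror the full example further down):

// Sketch of the current working approach: buffer the entire blob in memory,
// then hand the MemoryStream to UploadObject. This is what runs out of memory on large files.
var blob = StorageUtil.GetBlob(relPath);   // our helper; assumed to return an Azure BlobClient
using (var ms = new MemoryStream())
{
    blob.DownloadTo(ms);                   // download the whole blob into memory
    ms.Position = 0;                       // rewind so the Forge SDK reads from the start
    dynamic result = apiInstance.UploadObject(bucketKey, objectName, (int)ms.Length, ms, objectName);
    urn = result.objectId;
}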

However, this approach understandably produces out-of-memory exceptions with large files. In an attempt to stream directly from the Azure blob using OpenRead(), I encounter errors thrown by the Forge API relating to stream properties not being available. When I inspect the stream in the debugger, these exceptions are evident even before calling the Forge API.

I found a suggestion to use a stream wrapper class that exposes the necessary properties, but I can't get it to work: the stream is never consumed and the API method fails with an "invalid file" error. Is there any supported/documented approach to stream a blob from Azure to the Forge UploadObject method?
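The wrapper I tried is along these lines (a minimal sketch reconstructed from memory, not the exact class from the suggestion): it delegates everything to the underlying blob stream but overrides the timeout properties so that reading ReadTimeout/WriteTimeout no longer throws. With this in place the serialization error goes away, but the upload still fails as described above.

// Sketch of the wrapper approach: forward to the blob stream, but report harmless
// values for the timeout properties that Azure.Storage's LazyLoadingReadOnlyStream throws on.
public class BlobStreamWrapper : Stream
{
    private readonly Stream _inner;
    private int _readTimeout = System.Threading.Timeout.Infinite;
    private int _writeTimeout = System.Threading.Timeout.Infinite;

    public BlobStreamWrapper(Stream inner) { _inner = inner; }

    public override bool CanRead => true;
    public override bool CanSeek => _inner.CanSeek;
    public override bool CanWrite => false;
    public override bool CanTimeout => false;
    public override int ReadTimeout { get => _readTimeout; set => _readTimeout = value; }
    public override int WriteTimeout { get => _writeTimeout; set => _writeTimeout = value; }
    public override long Length => _inner.Length;
    public override long Position { get => _inner.Position; set => _inner.Position = value; }

    public override int Read(byte[] buffer, int offset, int count) => _inner.Read(buffer, offset, count);
    public override long Seek(long offset, SeekOrigin origin) => _inner.Seek(offset, origin);
    public override void Flush() { }
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}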

Here is a basic example (the code works fine with a local file stream but fails with an Azure blob stream):

private static void TestForgeAzureStream()
{
    string relPath = "rstbasicsampleproject5.rvt";

    // Authenticate against Forge and set the default access token
    dynamic bearer = Forge.openAPI();
    Autodesk.Forge.Client.Configuration.Default.AccessToken = bearer.access_token;

    string bucketKey = "Whatever";
    var apiInstance = new ObjectsApi();
    var objectName = HttpUtility.UrlEncode(Path.GetFileName(relPath));  // URL-encoded object name
    string urn = null;

    // Open a read-only stream over the Azure blob (Azure.Storage.LazyLoadingReadOnlyStream)
    var blob = StorageUtil.GetBlob(relPath);
    using (Stream fs = blob.OpenRead())
    {
        int contentLength = (int)fs.Length;
        var contentDisposition = objectName;

        // Fails here with an Azure blob stream; works with a local FileStream
        dynamic result = apiInstance.UploadObject(bucketKey, objectName, contentLength, fs, contentDisposition);
        urn = result.objectId;
        Console.WriteLine(urn);
    }

    // Base64-encode the object URN and request a translation
    string base64Location = System.Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(urn));
    string error;
    string derivativeURN = Forge.TranslateDrawing(base64Location, out error);
    if (String.IsNullOrEmpty(derivativeURN))
        Console.Write(error);
    else
        Console.Write(derivativeURN);
}

Exception in Visual Studio showing that the stream properties are not available:

Error:  Error getting value from 'ReadTimeout' on 'Azure.Storage.LazyLoadingReadOnlyStream`1[Azure.Storage.Blobs.Models.BlobProperties]'.
   at Autodesk.Forge.Client.ApiClient.Serialize(Object obj)
   at Autodesk.Forge.ObjectsApi.UploadObjectWithHttpInfo(String bucketKey, String objectName, Nullable`1 contentLength, Stream body, String contentDisposition, String ifMatch, String contentType)
   at Autodesk.Forge.ObjectsApi.UploadObject(String bucketKey, String objectName, Nullable`1 contentLength, Stream body, String contentDisposition, String ifMatch, String contentType)
   at ConsoleApplication1.Program.TestForgeAzureStream()
Tim_Mac
  • You say "large files" - are they above 100 MB? Because then we suggest using resumable upload, where you can break the content into chunks: https://forge.autodesk.com/en/docs/data/v2/reference/http/buckets-:bucketKey-objects-:objectName-resumable-PUT/ – Adam Nagy Feb 10 '22 at 12:00
  • The out-of-memory exceptions happen sometimes on large files above 100 MB, but that is not related to the Autodesk Forge API; it's just ASP.NET not being able to grab enough memory from the VM. Even for small files the above code does not work. The API appears not to be able to read from an Azure blob stream. Is there an alternative approach to avoid the current work-around of having to download the file from blob storage to the local web server and then upload from a FileStream? I don't understand why file streaming works but blob streaming does not. Thanks for any help! – Tim_Mac Feb 11 '22 at 13:47
  • Could you try applying timeout info as done here? https://stackoverflow.com/a/57874589/4654233 – Adam Nagy Feb 11 '22 at 14:19
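Following up on Adam Nagy's suggestion about resumable uploads: the sketch below shows how I understand the chunked approach, using the SDK's UploadChunk method (which, as far as I can tell, wraps the resumable PUT endpoint linked in the comment). The chunk size, variable names and the StorageUtil helper are assumptions carried over from the code in the question. Because each chunk is copied into its own small MemoryStream, only one chunk is ever held in memory and the Azure blob stream is never handed to the Forge serializer directly.

// Hedged sketch of a resumable (chunked) upload from an Azure blob stream.
const int ChunkSize = 5 * 1024 * 1024;              // 5 MB per chunk (OSS imposes a minimum size on all but the last chunk)
var blob = StorageUtil.GetBlob(relPath);
string sessionId = Guid.NewGuid().ToString();       // the same session id ties the chunks together
dynamic lastResult = null;

using (Stream blobStream = blob.OpenRead())
{
    long total = blobStream.Length;
    long start = 0;
    var buffer = new byte[ChunkSize];

    while (start < total)
    {
        // Fill the buffer; Read may return fewer bytes than requested
        int read = 0;
        while (read < ChunkSize)
        {
            int n = blobStream.Read(buffer, read, ChunkSize - read);
            if (n == 0) break;
            read += n;
        }
        if (read == 0) break;

        string contentRange = $"bytes {start}-{start + read - 1}/{total}";
        using (var chunk = new MemoryStream(buffer, 0, read))   // only this chunk is held in memory
        {
            lastResult = apiInstance.UploadChunk(bucketKey, objectName, read, contentRange, sessionId, chunk);
        }
        start += read;
    }
}
// Intermediate chunks return 202 Accepted; the response to the final chunk carries the object details
string urn = lastResult.objectId;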

0 Answers