46

I created an Azure Storage account. I have a 400 MB .zip file that I want to put into blob storage for later use.

How can I do that without writing code? Is there some interface for that?

David Makogon
sharptooth

15 Answers

37

Free tools:

  1. Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
  2. Cloud Berry Lab's CloudBerry Explorer for Azure Blob Storage
  3. ClumsyLeaf CloudXplorer
  4. Azure Storage Explorer from CodePlex (try version 4 beta)

There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.

Out of these, I personally like CloudBerry Explorer the best.

Stephen Chung
  • +1 VS2012 with Azure tools also has this feature; it may be worth updating your answer with this – Rich O'Kelly Jul 26 '13 at 10:39
  • I just checked it and I know it's been a while, but today CloudXplorer can show blob properties, list, create and download snapshots and even browse the contents of many container files such as zips and vhds. It's awesome for that purpose, and I can't see that in the other tools. – John Feb 25 '14 at 12:15
  • Azure now has an Azure CLI package - download it from https://azure.microsoft.com/en-in/documentation/articles/storage-azure-cli/ - very well documented and it just works! – shekhar Jun 20 '16 at 15:27
  • These days Azure has a tool for such file operations: [Azure Storage Explorer](https://azure.microsoft.com/en-us/features/storage-explorer/). It has a user-friendly GUI, and login is easy with Azure username & password credentials (the above is the link for the most recent version). – Orhan Celik Oct 03 '18 at 15:38
18

The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.

For your case, you could use Set-AzureStorageBlobContent, which uploads a local file into Azure Storage as a block blob or page blob.

Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname

For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.

Yao
  • Thanks @Yao! I think you're the only one who really answered the question! – Eric Falsken Dec 03 '13 at 17:08
  • This is the best solution imho. To make it work you have to provide credentials, as described here: http://stackoverflow.com/a/18806356/897024 – kapex Mar 12 '14 at 17:40
12

If you're looking for a tool to do so, may I suggest you take a look at our tool, Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio)? It's a commercial tool for managing Windows Azure Storage and Hosted Services. You can also find a comprehensive list of Windows Azure storage management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx

Hope this helps.

Gaurav Mantri
6

The StorageClient has this built into it. No need to write much of anything:

var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");

// 1 MB seems to be a pretty good all-purpose block size
client.WriteBlockSizeInBytes = 1024 * 1024;

// number of parallel block uploads; normally set to one per CPU core
client.ParallelOperationThreadCount = 4;

// blobs larger than this are broken up into blocks automatically
client.SingleBlobUploadThresholdInBytes = 4096;

blob.UploadFile("somehugefile.zip");
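For intuition, the splitting behavior those two settings control can be sketched in Python (the sizes mirror the values above; the function name and standalone form are illustrative, not part of the StorageClient API):

```python
# Sketch of block splitting: blobs over a threshold are uploaded
# as fixed-size blocks; smaller blobs go up in a single request.
BLOCK_SIZE = 1024 * 1024             # 1 MB blocks, as set above
SINGLE_UPLOAD_THRESHOLD = 4096       # split anything larger than this

def split_into_blocks(data: bytes) -> list[bytes]:
    """Return the payload as one piece, or as a list of 1 MB blocks."""
    if len(data) <= SINGLE_UPLOAD_THRESHOLD:
        return [data]
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
```

With ParallelOperationThreadCount set to 4, the library would then upload up to four of these blocks concurrently.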
dunnry
  • Oh, and if you are looking for a nice, free program, try ClumsyLeaf CloudXplorer. Works nicely. – dunnry Jul 05 '11 at 17:35
  • The comment is actually the answer, Ryan :). The OP seems to be looking for an interface, as in a GUI. – IUnknown Jul 05 '11 at 20:24
4

There is a new open-source tool provided by Microsoft:

  • Project Deco - a cross-platform Microsoft Azure Storage account explorer.

Ivan Ignatiev
4

I use Cyberduck to manage my blob storage.

It is free and very easy to use. It works with other cloud storage solutions as well.

I recently found this one as well: CloudXplorer

Hope it helps.

noir
  • Cyberduck removed support for azure. – Marcom Jul 15 '13 at 15:02
  • Cyberduck now has support for azure again! http://betanews.com/2014/07/23/cyberduck-restores-windows-azure-support-adds-new-sshsftp-protocol-implementation/ – CodeThug Sep 19 '14 at 16:56
2

You can use Cloud Combine for reliable and quick file upload to Azure blob storage.

ezolotko
2

A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag-and-drop your files on the following batch file to upload into your blob storage container:

upload.bat

@ECHO OFF

SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>

:AGAIN
IF "%~1" == "" GOTO DONE

AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob

SHIFT
GOTO AGAIN

:DONE
PAUSE

Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.

David Yee
2

You can upload files to Azure Storage Account Blob using Command Prompt.

Install Microsoft Azure Storage tools.

Then upload the file to your blob container with the following CLI command:

AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob

Hope it helps. :)

Sarat Chandra
1

The new Azure Portal has an 'Editor' menu option (in preview) in the container view, which allows you to upload a file directly to the container from the Portal UI.

Norrec
1

You can upload large files to Azure Blob Storage directly using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:

// write up to ChunkSize of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;

    File.FileStream.Position = DataSent;

    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
        && tempTotal + bytesRead < CHUNK_SIZE 
        && !File.IsDeleted 
        && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();

        DataSent += bytesRead;
        tempTotal += bytesRead;

        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }

    requestStream.Close();

    if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}

void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);

    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }

    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
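The blockid generated in StartUpload is just a Base64-encoded GUID string. The same scheme can be sketched in Python (make_block_id is an illustrative name, and GUIDs are only one valid choice - block IDs simply need to be unique, and all IDs within one blob should have the same encoded length):

```python
import base64
import uuid

def make_block_id() -> str:
    """Mirror Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid...)) above."""
    return base64.b64encode(str(uuid.uuid4()).encode("utf-8")).decode("ascii")
```

Because every GUID string is 36 characters long, every ID produced this way encodes to the same 48-character length, which keeps the block IDs uniform across the blob.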

The UploadUrl is generated by Azure itself and contains a Shared Access Signature (SAS). The SAS URL says where the blob is to be uploaded and for how long write access is granted. You can generate a SAS URL like this:

readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Setup the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();

    // Get and create the container
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();

    serializer.WriteObject(memoryStream, url);

    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime =
            DateTime.UtcNow.AddMinutes(60)
    });
    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}

I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page

Community
0

I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.

By default, BlobTransferUtility only does block blobs. However, changing just two lines of code lets you upload page blobs as well. If, like me, you need to upload a virtual machine image, it needs to be a page blob.

(For the difference, please see this MSDN article.)

To upload page blobs just change lines 53 and 62 of BlobTransferHelper.cs from

new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob

to

new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob

The only other thing to know about this app is to uncheck HELP when you first run the program to see the actual UI.

DontVoteMeDown
crthompson
0

Check out the post Uploading to Azure Storage, where it is explained how to easily upload any file to Azure Blob Storage via PowerShell.

Jeffrey Rosselle
0

You can use the AzCopy tool to upload the required files to Azure Storage. The default destination type is block blob; you can change the /Pattern option according to your requirements.

Syntax:

AzCopy /Source:<source> /Dest:<destination> /DestKey:<key> /S
David R
-1

Try the Blob Service API

http://msdn.microsoft.com/en-us/library/dd135733.aspx

However, 400 MB is a large file, and I am not sure a single API call will deal with something of this size; you may need to split it into blocks and reconstruct it using custom code.
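The split-and-reconstruct step itself is simple. A minimal Python sketch of the idea (the chunk size and function names are arbitrary, and the actual upload of each chunk - e.g. as a block via the REST API - is omitted):

```python
def split_payload(data: bytes, chunk_size: int = 4 * 1024 * 1024) -> list[bytes]:
    """Cut the payload into fixed-size chunks; the last one may be shorter."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def reconstruct(chunks: list[bytes]) -> bytes:
    """Concatenating the chunks in order restores the original bytes."""
    return b"".join(chunks)
```

Each chunk would be uploaded separately, and committing the block list (or simply concatenating on the receiving side) yields a byte-identical copy of the original file.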

ChrisBint