
We seem to be hitting a limit when trying to add a large binary file to Azure DevOps using the REST APIs. The same file check-in works fine using the old SOAP APIs, and also using the TFVC CLI (tf.exe). But we have a use case where we occasionally need to check in large files programmatically from machines that don't have Visual Studio installed. We're trying to migrate our application from the old SOAP APIs to the REST API because we're moving to .NET Core, where the SOAP API is not supported.

A POST to the /_apis/tfvc/changesets (Create Changeset) API with large (> about 19 MB) files results in:

HTTP 400: Bad Request

This issue has been reported a couple of times on the Azure DevOps .NET Samples GitHub repo, but that isn't the right forum for this question, so it hasn't been answered there.

How do we create large files in TFVC using the REST API?

Update 2020-01-13

Here's a sample console app we've used to demonstrate the problem:

using Microsoft.TeamFoundation.SourceControl.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;

namespace ConsoleApp1
{
    internal class Program
    {
        internal static async Task Main(string[] args)
        {
            var orgUrl = new Uri(args[0]);
            string serverPath = args[1];
            string localPath = args[2];
            string contentType = args[3];
            string pat = args[4];

            // Read the file and report its size and base64-encoded length
            // before sending the TFVC request.
            byte[] fileBytes = File.ReadAllBytes(localPath);
            string base64Content = Convert.ToBase64String(fileBytes);
            Console.WriteLine($"File size: {fileBytes.Length:N0} bytes");
            Console.WriteLine($"Base64 content length: {base64Content.Length:N0} characters");

            var changes = new List<TfvcChange>()
            {
                new TfvcChange()
                {
                    ChangeType = VersionControlChangeType.Add,
                    Item = new TfvcItem()
                    {
                        Path = serverPath,
                        ContentMetadata = new FileContentMetadata()
                        {
                            Encoding = Encoding.UTF8.WindowsCodePage,
                            ContentType = contentType,
                        }
                    },
                    NewContent = new ItemContent()
                    {
                        Content = base64Content,
                        ContentType = ItemContentType.Base64Encoded
                    }
                }
            };
            var changeset = new TfvcChangeset()
            {
                Changes = changes,
                Comment = $"Added {serverPath} from {localPath}"
            };

            var connection = new VssConnection(orgUrl, new VssBasicCredential(string.Empty, pat));
            var tfvcClient = connection.GetClient<TfvcHttpClient>();
            await tfvcClient.CreateChangesetAsync(changeset);
        }
    }
}

Running this console app with a ~50 MB zip file against an Azure DevOps TFVC repository results in a VssServiceResponseException (with an inner ArgumentException) with the message: The maximum request size of 26214400 bytes was exceeded.

Update 2020-01-14

  • Added display of file size and base64 content length before sending the TFVC request in my sample code.
  • Corrected my observations about the file size limit.
  • Corrected the error code returned by Azure DevOps (400, not 413).

Originally I stated that the limit was around 13 MB. This was based on the failures I saw with files larger than 20 MB, successes with files smaller than about 10 MB, and the limits described in the linked GitHub issues.

I have run more tests of my own and have narrowed the actual limit down to about 19,200 KB. This seems right given the error message's 26,214,400-byte limit: a 19 MB file expands to about 26 MB when base64-encoded.
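The arithmetic behind that observation can be sketched quickly: base64 encodes every 3 raw bytes as 4 ASCII characters, so the 26,214,400-byte request cap corresponds to about 19,660,800 bytes (exactly 19,200 KB) of raw file content, before any JSON envelope overhead.

```python
import math

MAX_REQUEST_BYTES = 26_214_400  # limit reported in the error message

def base64_length(raw_bytes: int) -> int:
    """Base64 encodes each 3-byte group as 4 ASCII characters (with padding)."""
    return 4 * math.ceil(raw_bytes / 3)

# Largest raw file whose base64 form fits under the request cap,
# ignoring the JSON envelope around the content:
max_raw_bytes = MAX_REQUEST_BYTES * 3 // 4

print(max_raw_bytes)         # 19660800
print(max_raw_bytes / 1024)  # 19200.0 KB, matching the observed limit
```

This ignores the rest of the JSON body (paths, metadata), which is why the observed Content-Length of 26,324,162 bytes is slightly above the encoded file content alone.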

I also noticed in my most recent tests, when monitoring with Fiddler, that Azure DevOps returns a 400 status code (not 413). My notes indicated that a 413 Request Entity Too Large was observed at some point in the past; perhaps that was with an older version of Azure DevOps Server. In any case, the error we see now is 400 Bad Request.

Here is what Fiddler shows for the request headers:

POST /<...REMOVED...>/_apis/tfvc/changesets HTTP/1.1
Host: dev.azure.com
Accept: application/json; api-version=5.1
User-Agent: VSServices/16.153.29226.1 (NetStandard; Microsoft Windows 10.0.18363)
X-VSS-E2EID: 6444f0b5-57e0-45da-bd86-a4c62d8a1794
Accept-Language: en-US
X-TFS-FedAuthRedirect: Suppress
X-TFS-Session: 9f8e8272-db48-4e93-b9b0-717937244aff
Expect: 100-continue
Authorization: Basic <...REMOVED...>
Accept-Encoding: gzip
Content-Type: application/json; charset=utf-8; api-version=5.1
Content-Length: 26324162

And the raw response:

HTTP/1.1 400 Bad Request
Cache-Control: no-cache
Pragma: no-cache
Content-Length: 206
Content-Type: application/json; charset=utf-8
Expires: -1
P3P: CP="CAO DSP COR ADMa DEV CONo TELo CUR PSA PSD TAI IVDo OUR SAMi BUS DEM NAV STA UNI COM INT PHY ONL FIN PUR LOC CNT"
X-TFS-ProcessId: 7611d69f-e722-4108-8050-e55a61b1cbb4
Strict-Transport-Security: max-age=31536000; includeSubDomains
ActivityId: 15e046e5-3788-4fdb-896a-7d0482121ddd
X-TFS-Session: 9f8e8272-db48-4e93-b9b0-717937244aff
X-VSS-E2EID: 6444f0b5-57e0-45da-bd86-a4c62d8a1794
X-VSS-UserData: <...REMOVED...>
X-FRAME-OPTIONS: SAMEORIGIN
Request-Context: appId=cid-v1:e3d45cd2-3b08-46bc-b297-cda72fdc1dc1
Access-Control-Expose-Headers: Request-Context
X-Content-Type-Options: nosniff
X-MSEdge-Ref: Ref A: 7E9E95F5497946AC87D75EF3AAD06676 Ref B: CHGEDGE1521 Ref C: 2020-01-14T14:01:59Z
Date: Tue, 14 Jan 2020 14:02:00 GMT

{"$id":"1","innerException":null,"message":"The maximum request size of 26214400 bytes was exceeded.","typeName":"System.ArgumentException, mscorlib","typeKey":"ArgumentException","errorCode":0,"eventId":0}
Matt Varblow

2 Answers


So far, the most useful solution is to allocate a sufficiently large buffer by modifying the configuration in IIS (this applies to on-premises Azure DevOps Server / TFS).

On the server machine, open IIS Manager:


(1) Select the site of your collection

(2) Double-click "Configuration Editor"

(3) Enter system.webServer/serverRuntime in the Section box

(4) Increase the uploadReadAheadSize value as needed for your scenario.

Then click Apply to save the changes.
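The same setting can also be applied from an elevated command prompt with appcmd.exe instead of clicking through the UI; the site name and size below are placeholders to adjust for your installation:

```shell
REM Increase uploadReadAheadSize (in bytes) for the collection's IIS site.
REM "Team Foundation Server" is a placeholder site name; 104857600 = 100 MB.
%windir%\system32\inetsrv\appcmd.exe set config "Team Foundation Server" ^
    -section:system.webServer/serverRuntime ^
    /uploadReadAheadSize:104857600 ^
    /commit:apphost

REM Restart IIS so the new limit takes effect.
iisreset
```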


I know this would be very costly for large request bodies, but as far as I know this is the method commonly used.

Mengdi Liang
  • Thanks for that! But is there a solution for Azure DevOps (TFS Online)? – Matt Varblow Jan 09 '20 at 19:10
  • @MattVarblow What do you mean by TFS Online? Azure DevOps Services (VSTS)? Normally, TFS is server-based. – Mengdi Liang Jan 10 '20 at 00:03
  • Yes. It's changed names a few times over the years, but that is what I mean: Azure DevOps Services (VSTS). :) – Matt Varblow Jan 10 '20 at 13:56
  • As far as I can tell, there is no way to programmatically commit a > 13 MB file to VSTS from a .NET Core application. I must be missing something? – Matt Varblow Jan 10 '20 at 13:58
  • @MattVarblow My colleague and I don't hit this limitation when calling the REST API with Postman. Could you try the request once with Postman to see whether it still returns this error? Also, it seems you are using our client API for this operation. Would you mind sharing a sample of that code? – Mengdi Liang Jan 13 '20 at 07:41
  • Yes, we're trying to use the client API to do this operation. I've added some sample code that reproduces the error. – Matt Varblow Jan 13 '20 at 21:04
  • @MattVarblow I tried the same script on my side, and it works fine when I upload a file of around 18 MB. Have you used Fiddler to capture this operation? If you don't mind, could you try again, capture the trace with Fiddler, and analyze it? – Mengdi Liang Jan 14 '20 at 09:18
  • @MattVarblow Also, what does File.ReadAllBytes return when you hit the error? Maybe you should try with Postman again to see whether it fails there too. – Mengdi Liang Jan 14 '20 at 09:41
  • You are correct. An 18 MB file works fine. The limit seems to be closer to 20 MB, just above 19 MB actually. The result of File.ReadAllBytes is a byte array containing the file contents; it does not throw an exception. I updated my sample program to report the file size using File.ReadAllBytes and this works as expected. I ran some more tests, examined them in Fiddler, and updated the post with my observations. – Matt Varblow Jan 14 '20 at 14:15
  • @MattVarblow Thanks. I will also run the test with 19 MB, and will test against our org, which was migrated from TFS. – Mengdi Liang Jan 14 '20 at 14:19
  • @MattVarblow Also, the VSTS client has a maximum request limit of 20 MB. This is fixed and cannot be worked around for VSTS. – Mengdi Liang Jan 14 '20 at 14:24

I opened a support ticket with Microsoft. The support engineer indicated that this is a known limitation of the Azure DevOps service and suggested that I create a feature request, which I've done. If this limitation is problematic for you, please upvote the feature request:

https://developercommunity.visualstudio.com/idea/1130401/allow-creating-large-tfvc-changesets-via-the-api.html

Matt Varblow