
I am trying to upload large files (under 5 GB, so a normal upload rather than a multipart upload) using the Java SDK. Smaller files upload in no time, but files above 1 MB do not. My code gets stuck on the line where the actual upload happens. I tried the transfer manager (TransferManager.upload): when I check the number of bytes transferred, it reports more than 1 MB transferred and keeps running until I force-stop my Java application. What could be the reason, and where am I going wrong? The same code works for smaller files; the issue is only with larger files.

    DefaultAWSCredentialsProviderChain credentialProviderChain =
            new DefaultAWSCredentialsProviderChain();
    TransferManager tx = new TransferManager(credentialProviderChain.getCredentials());
    Upload myUpload = tx.upload(S3bucket, fileKey, file);

    while (myUpload.isDone() == false) {
        System.out.println("Transfer: " + myUpload.getDescription());
        System.out.println("  - State: " + myUpload.getState());
        System.out.println("  - Progress: "
                + myUpload.getProgress().getBytesTransferred());
    }

    s3Client.putObject(new PutObjectRequest(S3bucket, fileKey, file));

Tried both the TransferManager upload and the putObject methods. Same issue with both.
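For reference, the polling loop above spins at full speed and floods the console while the transfer runs. Below is a minimal runnable sketch of the same pattern with a sleep between checks; a plain `java.util.concurrent.Future` is used as a hypothetical stand-in for the AWS `Upload` object so the loop can run without credentials. With the real SDK, calling `waitForCompletion()` on the `Upload` blocks until the transfer finishes (or throws on failure) and is usually simpler than polling.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the progress-polling pattern with a pause between checks.
// A Future stands in for the TransferManager Upload (hypothetical
// stand-in, not the AWS API), so this runs without any AWS setup.
public class PollSketch {

    // Poll until the "transfer" is done, sleeping between checks;
    // returns how many polls were performed before completion.
    static int pollUntilDone(Future<?> transfer, long sleepMillis)
            throws InterruptedException {
        int polls = 0;
        while (!transfer.isDone()) {
            polls++;
            System.out.println("  - still transferring...");
            Thread.sleep(sleepMillis); // avoid a busy-wait
        }
        return polls;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Simulate a transfer that takes ~300 ms.
        Future<?> transfer = pool.submit(() -> {
            try {
                Thread.sleep(300);
            } catch (InterruptedException ignored) {
            }
        });
        int polls = pollUntilDone(transfer, 100);
        System.out.println("done after " + polls + " polls");
        pool.shutdown();
    }
}
```

The sleep only changes how often progress is printed; it does not affect the transfer itself, which runs on the TransferManager's own threads.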

TIA.

FunWithJava
  • If using the Java SDK is not a hard requirement, I would recommend the AWS CLI - the command `aws s3 cp /local/file s3://bucket/filename` uploads a single file, and using `sync` instead of `cp` will do a whole folder. If you definitely need a Java API, my suggestion is to look at the source code for the CLI - I was able to do this in Python by reverse engineering that code a bit, and the same strategy would probably work in Java. – Paul Siegel Jan 01 '18 at 04:23
  • It's supposed to be done in Java. I couldn't find the reason behind the additional bytes of data being uploaded while the actual size itself is 1 MB. – FunWithJava Jan 01 '18 at 04:27

0 Answers