
In my Java Spring Boot application, I first download files from an S3 bucket.

A REST client then uploads these files to Documentum by calling the Documentum REST API.

My API method for uploading a document is pasted below:

public RestObject createDocument(Linkable parent, RestObject objectToCreate, Object content,
                                 String contentMediaType, String... params) {
    return post(parent.getHref(DOCUMENTS), new JsonObject(objectToCreate), content, contentMediaType,
            JsonObject.class, params);
}

The arguments to this method are as follows:

  • parent: a link to the cabinet where the file is to be stored
  • objectToCreate: a list of metadata values to be stored with the file
  • content: the complete byte array contents of the file
  • contentMediaType: the MIME type of the file
  • params: a few name-value pairs to set additional properties

This works fine for small files, but when I attempt to upload files larger than 1 GB it fails with java.lang.OutOfMemoryError, because the entire byte array is held in memory.
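
For context, this is roughly how the content byte array is produced today (AWS SDK v1). The uploadFromS3 wrapper and the bucket/key parameters are illustrative, not my real code:

import java.io.IOException;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.util.IOUtils;

// Current approach: read the whole S3 object into a byte[] and pass it to
// createDocument. For files over 1 GB this array is what exhausts the heap.
private RestObject uploadFromS3(AmazonS3 s3, String bucket, String key,
                                Linkable parent, RestObject objectToCreate) throws IOException {
    try (S3Object s3Object = s3.getObject(bucket, key)) {
        byte[] content = IOUtils.toByteArray(s3Object.getObjectContent());
        return createDocument(parent, objectToCreate, content,
                s3Object.getObjectMetadata().getContentType());
    }
}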

I have attempted to pass the S3 object's InputStream directly to the API instead, but that results in this error:

Could not write request: no suitable HttpMessageConverter found for request type [com.amazonaws.services.s3.internal.S3AbortableInputStream]
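
The attempt that produced this error looked roughly like this; the content argument is the raw stream from getObjectContent(), which, as far as I can tell, none of the registered Spring HttpMessageConverters can write as a request body:

S3Object s3Object = s3.getObject(bucket, key);
// Passing the raw S3 stream straight through as the content argument fails:
// no converter is registered for this stream type.
createDocument(parent, objectToCreate, s3Object.getObjectContent(),
        s3Object.getObjectMetadata().getContentType());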

So I am now wondering whether it is possible to read the content in chunks from the S3AbortableInputStream in the Java code and send multiple requests to Documentum REST, uploading the file in parts rather than sending the entire content in one request.
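
In other words, something like the sketch below. The appendChunk call is a placeholder for whatever Documentum REST operation (if any) can append content to an existing document; whether such an operation exists is really the heart of my question:

import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;

// Sketch of the chunked idea: read a fixed-size buffer at a time from the S3
// stream and send each chunk to Documentum REST in its own request.
private void uploadInChunks(InputStream s3Stream, String contentMediaType) throws IOException {
    final int chunkSize = 8 * 1024 * 1024; // 8 MB per request
    byte[] buffer = new byte[chunkSize];
    try (InputStream in = s3Stream) {
        int read;
        while ((read = in.read(buffer)) != -1) {
            // Copy only the bytes actually read; the last chunk is usually shorter.
            byte[] chunk = Arrays.copyOf(buffer, read);
            appendChunk(chunk, contentMediaType); // hypothetical Documentum REST call
        }
    }
}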

I would be very grateful for any advice or examples where something similar has been done. Many thanks for reading my post.

If you can stream the content from S3, you don't need to send multiple requests at all. Can't help you further because I don't know the APIs involved. – Thomas Mar 13 '23 at 16:06
