
I am trying to upload an image to S3 through GraphQL using the apollo-upload-client library, which just gives the ability to send images through a GraphQL query. The image is being stored in the S3 bucket, but when I try to read the Location URL it doesn't seem to work. When I load the URL with an `<img src="img_url" />` it just shows a broken image icon, and when I try to open the link manually, it automatically downloads a strange text file full of weird symbols.

This is what the upload looks like:

export async function uploadImageResolver(
  _parent,
  { file }: MutationUploadImageArgs,
  context: Context,
): Promise<string> {
  // identify(context);

  const { createReadStream, filename, mimetype } = await file;

  const response = await s3
    .upload({
      ACL: 'public-read',
      Bucket: environment.bucketName,
      Body: createReadStream(),
      Key: uuid(),
      ContentType: mimetype,
    })
    .promise();

  return response.Location;
}
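For reference, the mutation is defined in the schema roughly like this (simplified sketch; the `Upload` scalar comes from graphql-upload, which is what apollo-upload-client's multipart request targets):

import { gql } from 'apollo-server-lambda'; // gql import assumed; any graphql-tag export works

export const typeDefs = gql`
  scalar Upload

  type Mutation {
    uploadImage(file: Upload!): String!
  }
`;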

An example of the File object looks like this:

{
  filename: 'Screenshot 2021-06-15 at 13.18.10.png',
  mimetype: 'image/png',
  encoding: '7bit',
  createReadStream: [Function: createReadStream]
}

What am I doing wrong? It returns an actual S3 link, but the link itself isn't displaying any image. I also tried uploading the same image to S3 manually and it works just fine. Thanks in advance for any advice!

  • uploading problems are already resolved ... use search? – xadm Jun 24 '21 at 10:08
  • I already searched posts, tutorials and documentation before posting this question. There is little information about this `graphql-upload-client`, and all the tutorials follow the same steps. The problem I have is that I cannot find the issue; I don't really know why it is not working. – Michal Ružička Ružička Jun 24 '21 at 10:46
  • why client problem? looks like a common server/API/resolver/node problem ... response returned/closed before processed ... `await s3.upload...` ? – xadm Jun 24 '21 at 11:01
  • I am calling the `await` before the `s3.upload..` I just didn't paste it in the example sorry. I am going to edit the example code. – Michal Ružička Ružička Jun 24 '21 at 11:23
  • not a big change ... there is a lot of `corrupt s3 upload` questions to dig in ... `s3 forward upload` (and related 'multipart', 'chunks'), `pipe stream`, without buffering (save on storage in the middle), etc. .. just search for better examples/answers – xadm Jun 24 '21 at 11:42
  • It's just frustrating finding only three or four examples of this usage, all of them are doing the same stuff and it's not working for me despite being only 5 lines of code. Thanks for the answer, at least now I know what to search for. – Michal Ružička Ružička Jun 24 '21 at 11:55

1 Answer


So after some deeper research, it seems that the problem is with the Serverless Framework, specifically with serverless-offline: it doesn't allow transport of binary data. So I tried converting the createReadStream output to a base64 string, but that didn't work either.

This is the attempt:

export async function uploadImageResolver(
  _parent,
  { file }: MutationUploadImageArgs,
  context: Context,
): Promise<string> {
  const { createReadStream, filename, mimetype } = await file;

  const response = await s3
    .upload({
      ACL: 'public-read',
      Bucket: environment.bucketName,
      Body: (await stream2buffer(createReadStream())).toString('base64'),
      Key: `${uuid()}${extname(filename)}`,
      ContentEncoding: 'base64',
      ContentType: mimetype // image/jpg, image/png, ...
    })
    .promise();

  return response.Location;
}

async function stream2buffer(stream: Stream): Promise<Buffer> {
  return new Promise<Buffer>((resolve, reject) => {
    const _buf: Buffer[] = [];

    stream.on('data', (chunk) => _buf.push(chunk));
    stream.on('end', () => resolve(Buffer.concat(_buf)));
    stream.on('error', (err) => reject(`error converting stream - ${err}`));
  });
}
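(As an aside, S3 stores the Body bytes verbatim and never decodes them; ContentEncoding is only echoed back as a response header. So the more usual pattern would be to upload the raw buffer without the base64 round-trip, along these lines, reusing the same helpers as above:)

// Sketch of the raw-buffer variant; S3 stores these bytes exactly as sent.
const response = await s3
  .upload({
    ACL: 'public-read',
    Bucket: environment.bucketName,
    Body: await stream2buffer(createReadStream()),
    Key: `${uuid()}${extname(filename)}`,
    ContentType: mimetype,
  })
  .promise();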

I also tried installing the serverless-apigw-binary plugin, but that didn't work either.

plugins:
  - serverless-webpack
  - serverless-offline
  - serverless-apigw-binary
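
For reference, the plugin also expects the binary content types to be declared in the service's custom section, along these lines (going by the plugin's README; apollo-upload-client sends multipart/form-data):

custom:
  apigwBinary:
    types:
      - 'multipart/form-data'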

It still uploads the same corrupted image to S3.

These are some posts with the same problem, but none of them got a solution:

https://stackoverflow.com/questions/61050997/file-uploaded-successfully-to-s3-using-serverless-api-but-it-doesnt-opencorrup

Uploading image to s3 from AWS Lambda with NodeJS results in corrupted file

UPDATE: So it is definitely not a problem with my s3.upload function or with S3 itself. It seems that the issue is getting the image to the server; I am pretty sure it has something to do with serverless. I've created a small function that just receives the image and writes it to a local folder, and even there the image comes out corrupted:

export async function uploadImageResolver(
  _parent,
  { file }: MutationUploadImageArgs,
  context: Context,
): Promise<string> {
  // identify(context);

  const { createReadStream, filename } = await file;

  createReadStream().pipe(
    createWriteStream(__dirname + `/../../../images/${filename}`),
  );

  return ''
}
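A quick way to confirm the corruption (just a diagnostic sketch; the path and filename are placeholders) is to check the written file's PNG signature:

import { readFileSync } from 'fs';

// A valid PNG always starts with the 8-byte signature 89 50 4E 47 0D 0A 1A 0A.
const header = readFileSync(__dirname + '/../../../images/test.png').subarray(0, 8);
console.log(header.toString('hex')); // prints "89504e470d0a1a0a" for an intact PNG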


UPDATE 2: I figured out that it works when deployed. So it has to be something with serverless-offline.
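
As a local workaround in the meantime, the same schema can be served in development through a plain Express server instead of serverless-offline (a sketch, assuming apollo-server-express and graphql-upload are installed, and that typeDefs/resolvers are the same modules the Lambda handler uses):

import express from 'express';
import { ApolloServer } from 'apollo-server-express';
import { graphqlUploadExpress } from 'graphql-upload';
import { typeDefs, resolvers } from './schema'; // hypothetical module path

async function startDevServer() {
  const app = express();
  // Parse multipart GraphQL requests (file uploads) before Apollo sees them
  app.use(graphqlUploadExpress());

  const server = new ApolloServer({ typeDefs, resolvers });
  await server.start();
  server.applyMiddleware({ app });

  app.listen(4000, () =>
    console.log('Dev GraphQL server at http://localhost:4000/graphql'),
  );
}

startDevServer();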