
I am facing a problem creating a .csv file in an AWS S3 bucket. I have tried both the PutObjectRequest and TransferUtility approaches.

I think the problem is how I'm writing to the csvText.csv file. The data is built in memory, so I can't store it physically in a file on the server.

Issue: The file (userData.csv) is created but doesn't contain any data.

When I run the application locally on my PC, it creates and populates the file in S3. But when the application is called through the AWS URL, the file is created but no data is populated in the CSV.

Environment and workflow: my .NET web application (GatherUserData) sits on an EC2 instance and is called by a CreateUser() method from a third-party application (CS).

My application then calls (CS) with a GET request, gets the user data, creates a CSV file, and places it in an S3 bucket.

My code:

public string ExportUserDataCSVFile(GetUserData userData)
{
    StringBuilder csvText = new StringBuilder();
    try
    {
        if (userData != null)
        {
            // Access userData only after the null check to avoid a NullReferenceException
            Data1.Customfield[] customFields = userData.data.customFields;

            // Header
            csvText.AppendLine(GetSingleUserCSVHeader());

            // UserID
            csvText.Append(userData.data.userId);
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
    }

    return csvText.ToString();
}

CreateCsvInAws method:

public static void CreateCsvInAws(string csvText)
{
    try
    {
        // Note: I originally wrote csvText into a MemoryStream via a BinaryWriter,
        // but that stream was never passed to the request, so it has been removed;
        // ContentBody carries the text directly.
        s3Client = new AmazonS3Client();

        PutObjectRequest pr = new PutObjectRequest
        {
            BucketName = bucketName,
            Key = keyName,
            ContentBody = csvText, // I can see the user data here, but it doesn't get written to the file
            ContentType = "application/octet-stream",
            CannedACL = S3CannedACL.BucketOwnerFullControl
        };

        PutObjectResponse response = s3Client.PutObject(pr);
    }
    catch (AmazonS3Exception e)
    {
        Console.WriteLine("Error encountered '{0}' when writing", e.Message);
    }
    catch (Exception e)
    {
        Console.WriteLine("Unknown error '{0}'", e.Message);
    }
}
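If the goal is to upload from an in-memory stream instead of ContentBody, here is a minimal sketch of how that could look. The bucket name and key are hypothetical placeholders; the client uses the default credential chain (on EC2 this resolves to the instance role, which must permit s3:PutObject), and text/csv is used as the content type per jarmod's comment:

```csharp
using System;
using System.IO;
using System.Text;
using Amazon.S3;
using Amazon.S3.Model;

public static class S3CsvUploader
{
    // Hypothetical values; replace with the real bucket and object key.
    private const string BucketName = "my-bucket";
    private const string KeyName = "userData.csv";

    public static void UploadCsv(string csvText)
    {
        // Build the CSV bytes entirely in memory; no temp file on the server.
        using (var ms = new MemoryStream(Encoding.UTF8.GetBytes(csvText)))
        using (var s3Client = new AmazonS3Client()) // default credential chain / EC2 instance role
        {
            var request = new PutObjectRequest
            {
                BucketName = BucketName,
                Key = KeyName,
                InputStream = ms,         // stream payload instead of ContentBody
                ContentType = "text/csv",
                CannedACL = S3CannedACL.BucketOwnerFullControl
            };
            s3Client.PutObject(request);
        }
    }
}
```

Either InputStream or ContentBody works; the key point is that the stream actually has to be attached to the request, and the EC2 instance role (not just the bucket policy) has to allow the PutObject call.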
Rick
  • Shouldn't the content type for a CSV file be text/csv? – jarmod Aug 17 '18 at 13:05
  • You can't put an object into S3 without the correct policy, i.e. your EC2 instance must have the rights (via an assumed role) to put the data into S3. It works on your local machine because the access key is present there. The S3 bucket must also have a policy allowing EC2 to PutObject and GetObject. – mootmoot Aug 17 '18 at 17:59
  • @mootmoot - I'm using the following bucket permissions: "Sid": "Stmtasdfasdsf5184", "Effect": "Allow", "Principal": "*", "Action": [ "s3:GetObject", "s3:PutObject", "s3:*" ], "Resource": "arn:aws:s3:::mybucketname/*" – Rick Aug 21 '18 at 13:11
  • Please read this documentation thoughtfully. https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2.html – mootmoot Aug 21 '18 at 13:52
  • A possible answer. https://stackoverflow.com/questions/34057679/aws-s3-bucket-access-from-ec2 – mootmoot Aug 21 '18 at 13:57

0 Answers