I sent some data using the boto3 API to a Kinesis Data Firehose delivery stream whose destination S3 bucket is preconfigured with dynamic partitioning. Here is what I set:

[screenshot of the delivery stream's dynamic partitioning settings]
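
Roughly, the relevant part of the stream setup looks like the sketch below. This is only an approximation of my settings: the bucket ARN and prefixes are placeholders, and I'm assuming the three ID fields are pulled out with the JQ metadata-extraction processor.

# sketch of the ExtendedS3DestinationConfiguration the stream was created with
extended_s3_config = {
    'BucketARN': 'arn:aws:s3:::my-destination-bucket',  # placeholder
    # dynamic partitioning has to be enabled on the S3 destination
    'DynamicPartitioningConfiguration': {'Enabled': True},
    # partition keys are extracted from each JSON record with JQ
    'ProcessingConfiguration': {
        'Enabled': True,
        'Processors': [{
            'Type': 'MetadataExtraction',
            'Parameters': [
                {'ParameterName': 'MetadataExtractionQuery',
                 'ParameterValue': '{event_id: .event_id, pbr_id: .pbr_id, pitch_id: .pitch_id}'},
                {'ParameterName': 'JsonParsingEngine', 'ParameterValue': 'JQ-1.6'},
            ],
        }],
    },
    # the extracted keys drive the S3 object prefix
    'Prefix': ('event_id=!{partitionKeyFromQuery:event_id}/'
               'pbr_id=!{partitionKeyFromQuery:pbr_id}/'
               'pitch_id=!{partitionKeyFromQuery:pitch_id}/'),
    # records that fail partitioning land under this prefix
    'ErrorOutputPrefix': 'errors/',
}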

And here is what I do to send the data:

import base64
import boto3
import json
def put_batch_data_stream(input, data_stream_name):
    '''
    params:
    * input: dict, must include event_id/pbr_id/pitch_id in the key set
    * data_stream_name: str, name of the Firehose delivery stream to write to
    '''
    client = boto3.client(
        'firehose',
        region_name='somewhere',
        aws_access_key_id='some-key',
        aws_secret_access_key='some-key',
    )
    
    client.put_record_batch(
        DeliveryStreamName=data_stream_name,
        # the data blob has to be base64-encoded
        Records=[{'Data': base64.b64encode(json.dumps(input).encode('ascii'))}]
    )

    event_id = input['event_id']
    pbr_id = input['pbr_id']
    pitch_id = input['pitch_id']
    
    print(f'Successfully sent pitch {pitch_id} of player {pbr_id} in Event {event_id} to stream {data_stream_name}!')
    return

record = {}
record['event_id'] = 12345
record['pbr_id'] = 54321
record['pitch_id'] = 49876
record['information'] = {'speed': 40, 'height': 76, 'weight': 170, 'age': 34, 'gender': 'male'}

put_batch_data_stream(record, 'my-firehose-data-stream')
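
For reference, this is what the Data blob contains for the record above (the printed values are from running these two lines locally):

payload = json.dumps(record).encode('ascii')
print(payload)                    # b'{"event_id": 12345, "pbr_id": 54321, "pitch_id": 49876, ...}'
print(base64.b64encode(payload))  # b'eyJldmVudF9pZCI6...'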

Both parts (the stream configuration and the producer code) run without complaint, but what I get is an error record in my error bucket. Its message is "Non JSON record provided".

Is there anything I'm missing?
