
I am investigating creating an embedded AWS IoT project and would like to use the google protobuf (binary serialization) for message payload - for size, ease of parsing, reliability etc.

AWS IoT speaks JSON, and Protobuf converts quite nicely to and from JSON. I'm not very familiar with how the AWS services are set up — is there any way I could convert from binary to JSON on arrival/before send? I thought I could pass the packet to Lambda and basically do what I want, but would this mean I can no longer run it through the rules engine? Most packets will end up in DynamoDB, if that makes any difference.

Any suggestions on this front would be very appreciated!

FrozenKiwi

4 Answers


Protobuf has Python support and so does Lambda, so you could unpack a protobuf message in a Lambda function and use the results to write to DynamoDB or wherever else. Good luck.
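A minimal sketch of that idea. In practice you would parse with classes generated by `protoc` (e.g. `reading_pb2.Reading().ParseFromString(raw)`); here a single varint field is decoded by hand so the example stays dependency-free, and the event shape (`"payload"` as a list of byte values) is an assumption, not an AWS-defined format:

```python
# Hypothetical Lambda handler that unpacks a tiny protobuf payload.
# Assumed message schema: one field, number 1, wire type 0 (varint).

def read_varint(buf, pos=0):
    """Decode a base-128 varint starting at pos; return (value, new_pos)."""
    result = shift = 0
    while True:
        b = buf[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not (b & 0x80):
            return result, pos
        shift += 7

def handler(event, context):
    payload = bytes(event["payload"])        # raw protobuf bytes (assumed key)
    tag, pos = read_varint(payload)          # key = (field_number << 3) | wire_type
    assert tag >> 3 == 1 and tag & 0x7 == 0  # expect field 1, varint
    value, _ = read_varint(payload, pos)
    # ...here you would write `value` to DynamoDB via boto3...
    return {"temperature": value}
```

With generated classes the hand-rolled decoding disappears: you call `ParseFromString` on the raw bytes and read named fields directly.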

guycole

This post is a little old, but the situation has evolved.

As stated in this AWS forum announcement, it is now possible to encode the whole MQTT payload as a string using base64 encoding (the only supported format) and then invoke the Lambda function. The documentation for the encode SQL function can be found here.
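A sketch of the receiving side, assuming a rule like `SELECT encode(*, 'base64') AS data FROM 'sensors/#'` (the field name `data` and topic are illustrative, not prescribed by AWS):

```python
import base64

def handler(event, context):
    # The rule delivered the binary payload base64-encoded in "data" (assumed key).
    raw = base64.b64decode(event["data"])
    # Hand `raw` to your generated protobuf class, e.g.:
    # msg = reading_pb2.Reading(); msg.ParseFromString(raw)
    return {"size": len(raw)}
```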

Giorgio Ruffa

The idea is sound, but currently there isn't support for pushing binary data directly from the IoT broker to Lambda. Possibly it could run IoT -> Kinesis -> Lambda, but this hasn't been confirmed yet.

Support for alternative message formats has been requested. See this thread for more details/updates: https://forums.aws.amazon.com/thread.jspa?messageID=682020&#682020

FrozenKiwi

AWS IoT Core's Rules Engine now supports the Google Protocol Buffers (Protobuf) message format. Please take a look here.

masaino