
I am writing data to a Kinesis stream from a Lambda that is invoked by a DynamoDB trigger. This stream is configured as the input stream of a Kinesis Analytics application. I have a Lambda preprocessor configured on the Kinesis stream that logs the data written to the stream. However, in the Source tab of the Analytics application the message "No rows in source stream" comes up, and no rows are created in the in-application SQL stream.
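For context, the producer Lambda does roughly this (a simplified sketch, not my exact handler; the stream name and DynamoDB attribute names are placeholders) -

    const AWS = require('aws-sdk');
    const kinesis = new AWS.Kinesis();

    // Invoked by the DynamoDB trigger; forwards each new item to the Kinesis input stream.
    module.exports.handler = async (event) => {
      for (const record of event.Records) {
        if (record.eventName !== 'INSERT') continue;

        const image = record.dynamodb.NewImage;
        const payload = {
          _userId: image._userId.S,
          anxiety: Number(image.anxiety.N),
        };

        await kinesis.putRecord({
          StreamName: 'RecordKinesisInputStream', // placeholder stream name
          PartitionKey: payload._userId,
          Data: JSON.stringify(payload),          // UTF-8 JSON
        }).promise();
      }
    };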

I am using Node and deploying the service using a serverless.yml file. Here are the configurations -

RecordKinesisAnalyticsApp:
  Type: AWS::KinesisAnalytics::Application
  Properties:
    ApplicationName: RecordKinesisAnalyticsApp
    ApplicationDescription: RecordKinesisAnalyticsApp
    ApplicationCode: ${file(./serverless/metadataQueries.yml):AnalyticsQuery_1}
    Inputs:
      - NamePrefix: "RecordPrefix"
        InputSchema:
          RecordColumns:
            - Name: "USER_ID"
              SqlType: "VARCHAR(20)"
              Mapping: "$._userId"
            - Name: "ANXIETY"
              SqlType: "INTEGER"
              Mapping: "$.anxiety"
          RecordEncoding: "UTF-8"
          RecordFormat:
            RecordFormatType: "JSON"
        KinesisStreamsInput:
          ResourceARN:
            Fn::GetAtt:
              - RecordKinesisInputStream
              - Arn
          RoleARN: arn:aws:iam::xxxxxxxxxxx:role/service-role/kinesis-analytics-KinesisDemo-us-east-1

This is the analytics query -

CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM " (USER_ID VARCHAR(20), ANXIETY INTEGER);

CREATE OR REPLACE PUMP "STREAM_PUMP" AS INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT STREAM USER_ID, ANXIETY
    FROM "RecordPrefix_001"
    WHERE ANXIETY >= 0;


I guess you already figured out the problem; I was facing the same today. The raw records looked fine, there was no info in the Formatted tab, and nothing in the error stream either. I am using Serverless too. The problem is that it saves the SQL query without running it; if you go to the SQL console and try to run the script, you will find the error. Hope this helps somebody else. – ferflores Nov 28 '19 at 20:12

1 Answer


I think this is because of your quotation marks. The AWS console (by default, even if you don't see it) and your code both define the columns as "USER_ID", "ANXIETY", and so forth. So your SQL code must also quote them:

CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" ("USER_ID" VARCHAR(20), "ANXIETY" INTEGER);

CREATE OR REPLACE PUMP "STREAM_PUMP" AS INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT STREAM "USER_ID", "ANXIETY"
    FROM "RecordPrefix_001" 
    WHERE "ANXIETY" >= 0;

From the docs:

Kinesis Data Analytics adds quotation marks around the identifiers (stream name and column names) when creating the input in-application stream. When querying this stream and the columns, you must specify them in quotation marks using the same casing (matching lowercase and uppercase letters exactly). For more information about identifiers, see Identifiers in the Amazon Kinesis Data Analytics SQL Reference.

– Marcin