
I am trying to run a SqlActivity on a Redshift cluster through Data Pipeline. After the SqlActivity runs, a few log values need to be written to a table in Redshift, such as the number of rows affected and the error message (if any).

Requirement: If the SqlActivity finishes successfully, the table's 'error' column should be written as null; if the SqlActivity fails with an error, that particular error message needs to be written to the 'error' column of the Redshift table.

Can this be achieved through Data Pipeline? If yes, how?

Thanks, Ravi.


1 Answer


Unfortunately you cannot do this directly with SqlActivity in Data Pipeline. The workaround is to write a Java program (or any executable) that does what you want and schedule it via Data Pipeline using ShellCommandActivity.
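
As a rough illustration of that workaround, here is a minimal Python sketch of the kind of executable a ShellCommandActivity could invoke (Java with JDBC would work the same way). The table name etl_audit_log, the connection parameters, and the SQL statement are all placeholders you would replace with your own; it assumes psycopg2 is installed on the task runner host and that the log table already exists.

```python
#!/usr/bin/env python
# Run one SQL statement against Redshift, then record the outcome
# (rows affected + error message, if any) in an audit table.
import sys
import psycopg2

SQL_TO_RUN = "UPDATE my_schema.my_table SET processed = true;"  # placeholder statement

conn = psycopg2.connect(
    host="my-cluster.xxxxxxxx.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439,
    dbname="dev",
    user="etl_user",
    password="********",
)
conn.autocommit = False

rows_affected = 0
error_message = None
try:
    with conn.cursor() as cur:
        cur.execute(SQL_TO_RUN)
        rows_affected = cur.rowcount
    conn.commit()
except Exception as exc:
    conn.rollback()
    error_message = str(exc)[:1000]  # truncate to fit the audit column

# Write the audit row; 'error' stays NULL on success, holds the message on failure.
with conn.cursor() as cur:
    cur.execute(
        "INSERT INTO etl_audit_log (run_ts, rows_affected, error) "
        "VALUES (GETDATE(), %s, %s)",
        (rows_affected, error_message),
    )
conn.commit()
conn.close()

# Exit non-zero on failure so Data Pipeline marks the activity as FAILED.
sys.exit(0 if error_message is None else 1)
```

The non-zero exit code is what lets Data Pipeline still treat the run as failed (and trigger any onFail actions) even though the error has already been logged to the table.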