
I have NiFi set up to pick up files from an SFTP server and then drop them into HDFS folders based on the filename. From there, it needs to send the command: alter table ${dbname}.${tablename} add partition (year=${year}, date='${date}');
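
For example, if the upstream attributes happened to resolve to dbname=sales, tablename=orders, year=2017 and date=2017-06-01 (illustrative values only, not my real schema), the statement sent to Hive would be:

    alter table sales.orders add partition (year=2017, date='2017-06-01');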

I get that I need to use the "PutHiveQL" processor, but I'm not sure how to feed it the 'alter table' command. I've read some threads over at Hortonworks saying to use the "ReplaceText" processor, but I'm not sure if that would work in this situation. All of my previous processors are just there to create variables that build the HDFS folder path. There's no real "text" to replace. So, any ideas how I can pass on this command to Hive? Any assistance would be appreciated.


1 Answer


Use the ReplaceText processor with the Always Replace replacement strategy.

Add these processors at the end of your flow:

...other upstream processors

PutHDFS //stores the file into HDFS

ReplaceText //Replacement Strategy set to Always Replace, Replacement Value set to the alter statement

PutHiveQL //configure a HiveConnectionPool controller service; the processor executes the HiveQL statements it receives

The ReplaceText processor builds the alter statement as the flowfile content, and its success relationship is fed to the PutHiveQL processor, which runs the statement against Hive. A sketch of the configuration follows below.
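
A minimal sketch of the two processor configurations, assuming your upstream processors have already set the dbname, tablename, year and date attributes (attribute names are taken from your question; adjust them to whatever your flow actually uses):

    ReplaceText
      Replacement Strategy : Always Replace
      Evaluation Mode      : Entire text
      Replacement Value    : alter table ${dbname}.${tablename} add partition (year=${year}, date='${date}');

    PutHiveQL
      Hive Database Connection Pooling Service : <your HiveConnectionPool controller service>

Because Replacement Value supports NiFi Expression Language, the attributes are resolved at run time and the rendered alter statement becomes the flowfile content, which PutHiveQL then executes.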
