
I am getting an Exception in thread "main" java.lang.NoClassDefFoundError: org/xerial/snappy/SnappyInputStream error in the validate-and-split-records processor of the standard-ingest template, which I am unable to resolve.

This is happening for both CSV and JSON data.

Can anyone please help me with this?

  • Please add more details to your question: in which NiFi processor does the error appear, and what are that processor's parameters? – daggett Sep 17 '17 at 21:14
  • As I mentioned in the post, the processor is "validate and split records" (which is the ExecuteSparkJob processor in NiFi). Input parameter: a JSON file (books1.json from the Kylo sample files). – Ravi Kiran Gururaja Sep 18 '17 at 03:21
  • ExecuteSparkJob? Here is the list of standard NiFi processors: https://nifi.apache.org/docs.html – daggett Sep 18 '17 at 09:21
  • Yes, it is a custom processor. The source code and NAR files can be found at the link below: https://github.com/Teradata/kylo/tree/master/integrations/nifi/nifi-nar-bundles/nifi-spark-bundle/nifi-spark-processors/src/main/java/com/thinkbiganalytics/nifi – Ravi Kiran Gururaja Sep 19 '17 at 06:19

1 Answer


Found the solution. It was caused by a version mismatch of the snappy-java jar. Initially I had added snappy-java-1.0.4.1, which produced the error above.

After adding snappy-java-1.1.0 to Spark's classpath.txt, the problem was resolved.
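Not part of the original fix, but a quick way to confirm which snappy-java jar the JVM actually loads (and so spot this kind of mismatch) is to resolve the class at runtime and print the jar it came from. This is a minimal sketch; the class name SnappyClasspathCheck is just illustrative, and it assumes snappy-java is already on the classpath of whichever process you run it in (e.g. the Spark driver):

```java
// Minimal sketch: prints the location of the jar that provided SnappyInputStream,
// e.g. .../snappy-java-1.1.0.jar, so a wrong or stale version is easy to spot.
public class SnappyClasspathCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if snappy-java is missing from the classpath
        Class<?> snappy = Class.forName("org.xerial.snappy.SnappyInputStream");
        System.out.println(snappy.getProtectionDomain().getCodeSource().getLocation());
    }
}
```

Running the same check on the driver and on an executor can also reveal cases where the two sides of the Spark job see different snappy-java versions.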