
I am stuck on a strange problem. Pentaho Data Integration provides a sample job, "Word Count Job", for understanding MapReduce jobs. I am learning MapReduce and I am really lost with one strange error.

The error is:

"Caused by: java.io.IOException: Cannot initialize Cluster. 
Please check your configuration for mapreduce.framework.name
and the correspond server addresses."

I have tried everything in my repertoire to resolve this, from changing the "plugin-properties" file in Pentaho Data Integration to re-installing the Pentaho shim, but to no avail.
As per the job's flow, the file is correctly transferred to the HDFS server from my local machine (where Pentaho Data Integration is running), but the moment the MapReduce job starts, it throws the error above.
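
For reference, the property named in the error normally lives in mapred-site.xml on the cluster/shim configuration. Below is a minimal sketch of what I understand it should contain on a Hadoop 2.x/YARN cluster; the values are illustrative, not my actual cluster settings:

    <configuration>
      <!-- Select YARN as the MapReduce execution framework (Hadoop 2.x) -->
      <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
      </property>
    </configuration>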


1 Answer


Finally cracked it. The error occurred because, in the core-site.xml file, I had specified the cluster's IP address, whereas the cluster recognized itself by its hostname. That mismatch is what caused the error.
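
For anyone hitting the same thing, the relevant entry in core-site.xml is roughly the one below. The hostname shown is only an example; the point is to use the name the cluster resolves itself by rather than the raw IP address:

    <configuration>
      <!-- Point the default filesystem at the NameNode's hostname, not its IP (example value) -->
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop-master.example.com:8020</value>
      </property>
    </configuration>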

Hurray!!
