
I am trying to do Hadoop MapReduce in Pentaho. I have a Hadoop Copy Files step in a job to specify the input path of the file. Everything works fine if my input file location is one with root access (i.e. the files were already created in the root folder). But if I give my local file location as the source file, I get the following error in the Pentaho log:

2016/01/12 11:44:57 - Spoon - Starting job...
2016/01/12 11:44:57 - samplemapjob1 - Start of job execution
2016/01/12 11:44:57 - samplemapjob1 - Starting entry [Hadoop Copy Files]
2016/01/12 11:44:57 - Hadoop Copy Files - Starting ...
2016/01/12 11:44:57 - Hadoop Copy Files - Processing row source File/folder source : [file:///home/vasanth/Desktop/my.txt] ... destination file/folder : [hdfs://WEB2131:9000/new1/]... wildcard : [null]
2016/01/12 11:45:03 - Hadoop Copy Files - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : File System Exception: Could not find files in "file:///home/vasanth/Desktop".
2016/01/12 11:45:03 - Hadoop Copy Files - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : Caused by: Invalid descendent file name "hdfs:".

I have tried running

sudo chmod 777 /home/vasanth/Desktop/my.txt

but the error is still there. How can I solve this problem?
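
For reference, the equivalent copy done directly with the Hadoop FileSystem Java API would look roughly like the sketch below (this assumes the Hadoop client libraries are on the classpath; the class name is just for illustration, and the host, port and paths are simply the ones from the log above). If a standalone copy like this succeeds as the same non-root user, that would suggest the problem lies in how the job entry resolves the source/destination URLs rather than in HDFS permissions.

// Minimal sketch: copy the same local file to the same HDFS folder outside Pentaho.
// Host, port and paths are taken from the log above; adjust as needed.
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyToHdfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the same NameNode used in the Hadoop Copy Files entry.
        FileSystem fs = FileSystem.get(URI.create("hdfs://WEB2131:9000"), conf);
        // Same source and destination as the failing job entry.
        fs.copyFromLocalFile(new Path("/home/vasanth/Desktop/my.txt"),
                             new Path("/new1/"));
        fs.close();
    }
}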

vasanth
  • What is your source and destination environment? – bolav Jan 12 '16 at 21:10
  • @bolav source environment is **** and destination environment is **hadoop cluster** – vasanth Jan 14 '16 at 05:11
  • For me it looks as if it thinks the hdfs path is a local path, that "Invalid descendent file name" comes from VFS, and that it comes from the ":". It looks as though you are not setting up the hdfs path the correct way. – bolav Jan 18 '16 at 09:37
  • @bolav I have provided the hdfs path in the cluster setup and tested it; the output is saved successfully in HDFS. The only problem is that it cannot take the input file from a non-root user's file location. – vasanth Jan 19 '16 at 07:30

0 Answers