I am fairly new to this process. I am trying to run a simple MapReduce job written in Python 3.8 against a CSV file on a local Hadoop cluster (Hadoop 3.2.1), on Windows 10 (64-bit). The goal is to process the CSV and output the top 10 salaries from the file, but the job fails.

When I enter this command:

$ python test2.py hdfs:///sample/salary.csv -r hadoop --hadoop-streaming-jar %HADOOP_HOME%/share/hadoop/tools/lib/hadoop-streaming-3.2.1.jar

The output reports an error:

No configs found; falling back on auto-configuration
No configs specified for hadoop runner
Looking for hadoop binary in C:\hdp\hadoop\hadoop-dist\target\hadoop-3.2.1\bin...
Found hadoop binary: C:\hdp\hadoop\hadoop-dist\target\hadoop-3.2.1\bin\hadoop.CMD
Using Hadoop version 3.2.1
Creating temp directory C:\Users\Name\AppData\Local\Temp\test2.Name.20200813.003240.345552
uploading working dir files to hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd...
Copying other local files to hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/
Running step 1 of 1...
  Found 2 unexpected arguments on the command line [hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd/setup-wrapper.sh#setup-wrapper.sh, hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd/test2.py#test2.py]
  Try -help for more information
  Streaming Command Failed!
Attempting to fetch counters from logs...
Can't fetch history log; missing job ID
No counters found
Scanning logs for probable cause of failure...
Can't fetch history log; missing job ID
Can't fetch task logs; missing application ID
Step 1 of 1 failed: Command '['C:\\hdp\\hadoop\\hadoop-dist\\target\\hadoop-3.2.1\\bin\\hadoop.CMD', 'jar', 'C:\\hdp\\hadoop\\hadoop-dist\\target\\hadoop-3.2.1/share/hadoop/tools/lib/hadoop-streaming-3.2.1.jar', '-files', 'hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd/mrjob.zip#mrjob.zip,hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd/setup-wrapper.sh#setup-wrapper.sh,hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd/test2.py#test2.py', '-input', 'hdfs:///sample/salary.csv', '-output', 'hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/output', '-mapper', '/bin/sh -ex setup-wrapper.sh python3 test2.py --step-num=0 --mapper', '-combiner', '/bin/sh -ex setup-wrapper.sh python3 test2.py --step-num=0 --combiner', '-reducer', '/bin/sh -ex setup-wrapper.sh python3 test2.py --step-num=0 --reducer']' returned non-zero exit status 1.

Here is the error that I exactly get from the output above:

Found 2 unexpected arguments on the command line [hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd/setup-wrapper.sh#setup-wrapper.sh, hdfs:///user/Name/tmp/mrjob/test2.Name.20200813.003240.345552/files/wd/test2.py#test2.py]
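Incidentally, the "No configs found" lines at the top of the output mean mrjob fell back on auto-configuration. A minimal mrjob.conf (a sketch; the jar path below is taken from the output above and should be adjusted to your install) would let the --hadoop-streaming-jar switch be dropped from the command:

```yaml
runners:
  hadoop:
    hadoop_streaming_jar: C:\hdp\hadoop\hadoop-dist\target\hadoop-3.2.1\share\hadoop\tools\lib\hadoop-streaming-3.2.1.jar
    python_bin: python3
```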

This is the python file test2.py:

from mrjob.job import MRJob
from mrjob.step import MRStep
import csv

cols = 'Name,JobTitle,AgencyID,Agency,HireDate,AnnualSalary,GrossPay'.split(',')

class salarymax(MRJob):

    def mapper(self, _, line):
        # Convert each line into a dictionary
        # next() (not .next(), which is Python 2 syntax) advances the reader
        row = dict(zip(cols, [a.strip() for a in next(csv.reader([line]))]))

        # Yield the salary
        yield 'salary', (float(row['AnnualSalary'][1:]), line)

        # Yield the gross pay
        try:
            yield 'gross', (float(row['GrossPay'][1:]), line)
        except ValueError:
            self.increment_counter('warn', 'missing gross', 1)

    def reducer(self, key, values):
        topten = []

        # For 'salary' and 'gross' compute the top 10
        for p in values:
            topten.append(p)
            topten.sort()
            topten = topten[-10:]

        for p in topten:
            yield key, p

    combiner = reducer


if __name__ == '__main__':
    salarymax.run()
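Independently of the streaming failure, the mapper's CSV parsing can be verified outside Hadoop. A minimal sketch (the sample row is invented for illustration; note that under Python 3 the reader must be advanced with the built-in `next()`, not the Python 2 `.next()` method):

```python
import csv

cols = 'Name,JobTitle,AgencyID,Agency,HireDate,AnnualSalary,GrossPay'.split(',')

def parse_line(line):
    # Python 3: advance the reader with next(), not the Python 2 .next() method
    return dict(zip(cols, [a.strip() for a in next(csv.reader([line]))]))

# Invented sample row in the same shape as salary.csv
sample = 'Smith,Clerk,A01,Police,10/24/2011,$55000.00,$53250.00'
row = parse_line(sample)
print(float(row['AnnualSalary'][1:]))  # '[1:]' drops the leading '$' -> 55000.0
```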

I have taken a look at this StackOverflow question: How to run a MRJob in a local Hadoop Cluster with Hadoop Streaming?, but it did not resolve my error.

I have also looked at the setup-wrapper.sh file, since it appears in the highlighted error, but nothing seemed to be wrong with it when I checked.

I don't understand what the error is. Is there a way to fix it?

Aleks

1 Answer

I had the same issue, and reinstalling the Java JDK fixed it for me. I had originally installed it to C:\Program Files\Java but moved it to C:\Java based on some instructions. I thought updating the environment variables would be enough, but apparently it wasn't. So I uninstalled Java and reinstalled it, this time to C:\Java, which fixed my issue.
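Spaces in the Java path (as in C:\Program Files\Java) are a known source of trouble for Hadoop's Windows scripts, which can split such paths into extra arguments. A small cross-platform sketch (not part of the original setup; the helper name is made up) to check for this before reinstalling:

```python
import os

def path_has_spaces(var="JAVA_HOME"):
    """True if the named environment variable is set and its value contains spaces."""
    return " " in os.environ.get(var, "")

if path_has_spaces():
    # A space here can break hadoop.cmd argument handling
    print("JAVA_HOME contains spaces - consider reinstalling Java to a path like C:\\Java")
```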

Cassova