
I am trying to import data from MySQL to Hadoop, but I am getting the exception below. Can someone please help me? Please find the stack trace below:

Command:

sqoop import --connect jdbc:mysql://localhost/sqoopdb --username 'root' -P --table 'company' --target-dir '/sqoopout' -m 1

18/05/26 00:13:25 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032
18/05/26 00:13:45 ERROR tool.ImportTool: Import failed: java.net.ConnectException: Call From java.net.UnknownHostException: host: host: unknown error to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    at org.apache.hadoop.ipc.Client.call(Client.java:1479)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    ... 40 more
  • Hi Robin, I have pasted the full stack trace. What I feel is that when it tries to connect to the resource manager, it fails; somewhere I am missing some settings in hdfs-site.xml, core-site.xml or yarn-site.xml – Saurabh P May 26 '18 at 07:19
  • It is saying "For more details see: http://wiki.apache.org/hadoop/ConnectionRefused " - have you read that page? – Robin Green May 26 '18 at 07:23
  • Yes, actually I have seen that, but I am not able to figure out the exact problem. In between, if I run the command hadoop fs -ls, I get this error as well: Call From java.net.UnknownHostException: host: host: unknown error to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; – Saurabh P May 26 '18 at 07:34
  • @SaurabhP instead of giving "localhost" can you try providing host IP or all running services – Sandeep Singh May 26 '18 at 12:27
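The `UnknownHostException: host` part of the message above suggests that, besides nothing listening on localhost:9000, the machine's own hostname does not resolve. As a sketch (the `hdfs://localhost:9000` address is taken from the error message; your install may use a different host or port), a single-node setup typically pairs a hostname entry in /etc/hosts with `fs.defaultFS` in core-site.xml:

```xml
<!-- core-site.xml: the HDFS client and the NameNode must agree on this
     address. hdfs://localhost:9000 matches the address in the error
     message above; adjust to your own setup if it differs. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

In /etc/hosts, the machine's hostname should map to a reachable address (for example a `127.0.0.1 localhost` line plus a line for the actual hostname); otherwise the client fails with exactly this kind of UnknownHostException before it even reaches the NameNode.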

1 Answer


Please make sure that HDFS and YARN are up and running.

You can use the jps command to see whether the HDFS and YARN daemons are up or not.

If they are not, run start-dfs.sh and start-yarn.sh.

Running jps again should then display something like:

5316 Jps
3704 NameNode
3984 SecondaryNameNode
3802 DataNode
4242 NodeManager
4140 ResourceManager
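If the daemons do show up in jps but hadoop fs -ls still fails, it can help to confirm that something is actually listening on the NameNode RPC port from the error message (9000 here). A minimal diagnostic sketch, using bash's built-in /dev/tcp redirection so no extra tools are needed:

```shell
# port_open HOST PORT: succeeds if a TCP connection to HOST:PORT is accepted.
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null && { exec 3>&-; return 0; }
  return 1
}

if port_open localhost 9000; then
  echo "NameNode RPC port 9000 is reachable"
else
  echo "nothing is listening on localhost:9000 - start HDFS first"
fi
```

If the port is closed even after start-dfs.sh, check the NameNode log for why it failed to start (a common cause on fresh installs is an unformatted namenode).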
sarath kumar
  • Hi Sarath, actually I have already run these services, but even if I run the command hadoop fs -ls, I still get the same connection refused error message: Call From java.net.UnknownHostException: host: host: unknown error to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused – Saurabh P May 27 '18 at 23:18
  • Have you set up the ssh-keygen key? `ssh-keygen -t rsa -P ''` `cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys` `sudo vim /etc/ssh/sshd_config` change the property -> **PasswordAuthentication yes** `sudo service ssh restart` `ssh localhost` Try this and see if you still have the same issue. Thanks!!! – sarath kumar May 28 '18 at 03:30