
I was trying to import a table from an RDBMS (SQL Server 2012 on Windows) into HDFS using the command below, but I'm getting an error. The connection itself succeeds.

sqoop import \
  --connect "jdbc:sqlserver://192.1x8.xx.1:14xx;database=AdventureWorks2012;username=hadox;password=hadxx" \
  --table Production.Product \
  --hive-import

I understood that the error was caused by the dot (.) in the table name.

I got that information from the linked question sqoop to import data to hive, but I didn't understand the details in that link.

Can anyone help please?

Thanks in advance.

Error:

ERROR manager.SqlManager: Error executing statement com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Production.Product'.
com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Production.Product'.
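As a side note, one commonly suggested workaround for dotted SQL Server table names is to pass the schema to the connector separately. A minimal sketch, reusing the masked host and credentials from the question (everything after the bare `--` is handed to the SQL Server connector, which accepts a `--schema` option):

```shell
# Sketch: import Production.Product by giving Sqoop the bare table name
# and passing the schema through the SQL Server connector's extra args.
sqoop import \
  --connect "jdbc:sqlserver://192.1x8.xx.1:14xx;database=AdventureWorks2012" \
  --username hadox --password hadxx \
  --table Product \
  --hive-import \
  -- --schema Production
```

The `-- --schema` form keeps `--table` free of dots, so Sqoop no longer mis-parses the schema prefix as part of the table identifier.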

marc_s
L.G.
  • Did you try `AdventureWorks2012.Production.Product` ? – philantrovert Sep 16 '17 at 07:02
  • Sorry philantrovert, I am really new to these applications. Where would I write this? – L.G. Sep 16 '17 at 20:56
  • With `--table`, or maybe try a query with `--query "SELECT * FROM AdventureWorks2012.Production.Product WHERE \$CONDITIONS"` – philantrovert Sep 17 '17 at 05:03
  • Thanks philantrovert. I used query sqoop import --connect "jdbc:sqlserver:// – L.G. Sep 18 '17 at 20:41
  • Thanks philantrovert. I used the query: sqoop import --connect "jdbc:sqlserver://xxx.xxx.xx.1:14xx;username=xx;password=xx" --query "SELECT * FROM AdventureWorks2012.Production.Product WHERE \$CONDITIONS" --split-by ProductID --hive-import --hive-table --target-dir /user/local/hive/Adventure2012. I was not getting any error, but it said "Missing argument for option: hive-table" and also printed the sqoop import [GENERIC-ARGS] [TOOL-ARGS] usage list. – L.G. Sep 18 '17 at 21:04
  • I don't understand what that means. – L.G. Sep 18 '17 at 21:05
  • If you want to directly import it to Hive Table you have to provide a table name after --hive-table. Like `--hive-table table_in_hive`. – philantrovert Sep 19 '17 at 06:47
  • I am getting the following error as follows: 17/09/19 09:41:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:10002. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS) 17/09/19 09:41:00 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop1 cause:java.net.ConnectException: Call to localhost/127.0.0.1:10002 failed on connection exception: java.net.ConnectException: Connection refused – L.G. Sep 19 '17 at 08:48
  • 17/09/19 09:41:00 ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call to localhost/127.0.0.1:10002 failed on connection exception: java.net.ConnectException: Connection refused at org.apache.hadoop.ipc.Client.wrapException(Client.java:1136) at org.apache.hadoop.ipc.Client.call(Client.java:1112) at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229) – L.G. Sep 19 '17 at 08:49
  • Is your cluster Kerberos-enabled? I think you should raise a separate question for this. Make sure to highlight whatever you have done till now. – philantrovert Sep 19 '17 at 09:00
  • The command used: sqoop import --connect "jdbc:sqlserver://192.168.xx.x:1xxx;username=xxx;password=xxx" --query "SELECT * FROM AdventureWorks2012.Production.Product WHERE \$CONDITIONS" --split-by ProductID --hive-table ProductionProduct --target-dir /usr/local/hive/AdventureWorks2012. Thanks for your advice and being patient. Sorry for being a pain again!! – L.G. Sep 19 '17 at 09:01
  • I don't know about Kerberos! I will google it and make sure. Thanks for your help so far. – L.G. Sep 19 '17 at 09:03
  • Please ask a new Stackoverflow question. You are unlikely to get any help here. – philantrovert Sep 19 '17 at 09:03
  • When I checked whether SQL Server was connected with the command "sqoop list-databases --connect jdbc:sqlserver://192.168.xx.1:14xx --username xxx --password xxx", I got the list below. Isn't that a sign that Kerberos is enabled? Warning: /usr/local/hbase does not exist! HBase imports will fail. Please set $HBASE_HOME to the root of your HBase installation. Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail. Please set $HCAT_HOME to the root of your HCatalog installation. Warning: $HADOOP_HOME is deprecated. – L.G. Sep 19 '17 at 09:18
  • 17/09/19 10:13:42 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 17/09/19 10:13:42 INFO manager.SqlManager: Using default fetchSize of 1000 master tempdb model msdb ReportServer$SQLSERVERBI ReportServer$SQLSERVERBITempDB AdventureWorks2012 – L.G. Sep 19 '17 at 09:18
  • As you said I may raise a separate question. – L.G. Sep 19 '17 at 09:20
  • Hi philantrovert, can I ask if kerberos is a default server in ubuntu/hadoop? – L.G. Sep 19 '17 at 11:24
  • No, forget about the Kerberos thing. You just need to mention the query that you are running, what you are trying to achieve, and the error that you have been getting. – philantrovert Sep 19 '17 at 12:05
  • Thanks for your prompt reply all the time. – L.G. Sep 19 '17 at 12:28

1 Answer


Internally, Sqoop treats Production as the schema (database) name and Product as the table name. If you want to import the Production.Product table into Hive, I would suggest using --query in your sqoop command; with a free-form query you tell Sqoop exactly which table to read.
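For example, a free-form query version of the import might look like the sketch below. Host, credentials, target directory, and the Hive table name production_product are placeholders; note that --query requires the literal \$CONDITIONS token, a --split-by column (or -m 1), and a --target-dir:

```shell
# Sketch: import via a free-form query so the schema-qualified name
# appears inside the SQL statement instead of in --table.
sqoop import \
  --connect "jdbc:sqlserver://192.1x8.xx.1:14xx;database=AdventureWorks2012" \
  --username hadox --password hadxx \
  --query 'SELECT * FROM Production.Product WHERE $CONDITIONS' \
  --split-by ProductID \
  --target-dir /user/hive/staging/production_product \
  --hive-import \
  --hive-table production_product
```

Single quotes around the query keep the shell from expanding $CONDITIONS; Sqoop substitutes its own split predicates there at runtime.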

ankush reddy
  • Thanks for this information, Ankush. Sorry, can you show it with an example, please? An example would help me understand a bit more easily, as I am a kindergarten child with any of these applications :-) – L.G. Sep 16 '17 at 20:59