When trying to import a table into Hive with Sqoop, I am getting a strange error.

Query:

sqoop import \
  --connect 'jdbc:sybase:Tds:10.100.*.***:5500/DATABASE=****' \
  --driver 'com.sybase.jdbc3.jdbc.SybDriver' \
  --username "****" --password "***" \
  --table dw.dm_court_courttype \
  --direct -m 1 \
  --hive-import --create-hive-table \
  --hive-table DM_court_courtcype \
  --target-dir "/user/hive/warehouse/DM_Court_CourtType" \
  --verbose

Error:

java.io.IOException: SQLException in nextKeyValue
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:565)
    at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:796)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:346)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1595)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: com.sybase.jdbc3.jdbc.SybSQLException: SQL Anywhere Error -131: Syntax error near '.' on line 1
    at com.sybase.jdbc3.tds.Tds.a(Unknown Source)
    at com.sybase.jdbc3.tds.Tds.nextResult(Unknown Source)
    at com.sybase.jdbc3.tds.Tds.getResultSetResult(Unknown Source)
    at com.sybase.jdbc3.tds.TdsCursor.open(Unknown Source)
    at com.sybase.jdbc3.jdbc.SybStatement.executeQuery(Unknown Source)
    at com.sybase.jdbc3.jdbc.SybPreparedStatement.executeQuery(Unknown Source)
    at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
    ... 12 more


1 Answer


Don't prefix the table name with the database name.

Use --table dm_court_courttype instead of --table dw.dm_court_courttype

Try this:

sqoop import \
  --connect 'jdbc:sybase:Tds:10.100.*.***:5500/DATABASE=****' \
  --driver 'com.sybase.jdbc3.jdbc.SybDriver' \
  --username "****" --password "***" \
  --table dm_court_courttype \
  --direct -m 1 \
  --hive-import --create-hive-table \
  --hive-table DM_court_courtcype \
  --target-dir "/user/hive/warehouse/DM_Court_CourtType" \
  --verbose
  • Aren't some of these three options redundant: --hive-import --create-hive-table --hive-table? Actually, I was about to ask a question about that topic. – Ignacio Alorre Mar 08 '17 at 10:38
  • @IgnacioAlorre I just modified the OP's command. They are not all redundant. With `--hive-table` you can set the Hive table's name. Yes, `--hive-import` will automatically create the Hive table, so the `--create-hive-table` flag can be skipped here. – Dev Mar 08 '17 at 10:42
  • @Dev Actually I am getting the same error! If I remove the schema name, a "Table not found" exception is thrown. The exception occurs only when doing a --direct import; if I use 'select * from table', the query is fine. What have I missed here? – karthee Mar 13 '17 at 10:52
  • @karthee Is the same query working if you remove `--direct` (_not select * from table_)? – Dev Mar 15 '17 at 12:51
  • @karthee I am not very familiar with Sybase. Are database and schema different things in it? Some RDBMSs have only databases, or schemas tightly coupled with usernames. – Dev Mar 15 '17 at 12:53
  • @Dev That's right... if I remove --direct, the query is fine. It's fine with select * from \$conditions! I am also new to Sybase and will try to sort this out. Thanks anyway :) – karthee Mar 16 '17 at 07:15
  • @karthee Okay. You can look into the `-- --schema custom_schema` option if you want to give a custom schema name. I have not tried it with Sybase; it works with SQL Server and Postgres. – Dev Mar 16 '17 at 08:45
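
For reference, a sketch of how the `-- --schema` suggestion from the comments could be combined with the answer's unqualified table name. This is untested with Sybase (as Dev notes, it is only confirmed for SQL Server and Postgres), and the `dw` schema name and masked connection details are taken from the question; extra connector arguments must come after a bare `--` separator at the very end of the command:

```shell
# Untested sketch: unqualified --table plus the schema passed as an
# extra connector argument after the bare "--" separator.
# Whether --schema is honored here depends on the connection manager in use.
sqoop import \
  --connect 'jdbc:sybase:Tds:10.100.*.***:5500/DATABASE=****' \
  --driver 'com.sybase.jdbc3.jdbc.SybDriver' \
  --username "****" --password "***" \
  --table dm_court_courttype \
  -m 1 \
  --hive-import \
  --hive-table dm_court_courttype \
  --target-dir "/user/hive/warehouse/DM_Court_CourtType" \
  --verbose \
  -- --schema dw
```

Note that `--direct` is dropped here, since the comments established that the import works without it; everything after `--` is forwarded to the underlying tool rather than parsed by Sqoop itself.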