I am new to Big Data and I don't know what is going on! Please note, I am learning this myself.
I imported a table called sqooptest from MySQL, from a db called sqoopex8, using this command:
sqoop import \
--connect jdbc:mysql://localhost/sqoopex8 \
--table sqooptest \
-m 1
I don't know where it goes (or gets imported to).
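From what I've read, when you don't give --target-dir, Sqoop is supposed to import into a folder named after the table inside your HDFS home directory, so I assume in my case I should be able to check with something like this (the path is my guess based on my user name):

hadoop fs -ls /user/hduser/sqooptest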
A bunch of errors are thrown, and honestly, I don't even know what to look for in them. If it's the last line that matters, it says "16/04/23 01:46:52 ERROR tool.ImportTool: Error during import: Import job failed!". Again, I am in the learning phase and teaching myself all of this, so please bear with me!
Now, I look under /user/hduser/ and there is a folder named after the table (sqooptest), but there is nothing inside it.
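One thing I read somewhere is that a Sqoop import fails if its output directory already exists in HDFS, so maybe this leftover empty folder is what makes the job fail when I re-run it. If that's the case, I assume I should remove it before retrying:

hadoop fs -rm -r /user/hduser/sqooptest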
Next, intuitively, looking around on the Internet, I found out that MySQL stores all its dbs in /var/lib/mysql. Apparently I didn't have access to it, so I had to access it from the terminal (CLI). When I did, I found all my dbs there. Then I ran this:
sqoop import \
--connect jdbc:mysql://localhost/sqoopex8 \
--table sqooptest \
--target-dir /var/lib/mysql \
-m 1
(I added --target-dir /var/lib/mysql.)
This worked, for some reason. When I do hadoop fs -ls /var/lib/mysql, I see two files: _SUCCESS and part-m-00000. Why is that? And why did it not work the first time?
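Out of curiosity, I assume I can look at the actual imported rows by cat-ing the part file (assuming part-m-00000 is where the single mapper wrote the data, since I used -m 1):

hadoop fs -cat /var/lib/mysql/part-m-00000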
Also, on the first attempt, even when I specified an HDFS target with --target-dir /user/hduser, it didn't take it for some reason. But when I give a target that is a local file system path, it takes it. Why?
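Just to show what confuses me, the same path gives me two different things depending on how I list it, so I'm assuming these are really two separate file systems:

# on HDFS: the files Sqoop just wrote
hadoop fs -ls /var/lib/mysql
# on the local disk: MySQL's own database folders
ls /var/lib/mysql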