I am new to the Spark shell and I am trying to create a new table and read it back. I have this file:
workers.txt:
1201, satish, 25
1202, krishna, 28
1203, amith, 39
1204, javed, 23
1205, prudvi, 23
and then run the following commands:
spark-shell
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("CREATE TABLE workers (id INT, name VARCHAR(64), age INT)")
sqlContext.sql("LOAD DATA LOCAL INPATH 'workers.txt' INTO TABLE workers")
>> res5: org.apache.spark.sql.DataFrame = []
val resultW = sqlContext.sql("FROM workers SELECT *")
>> resultW: org.apache.spark.sql.DataFrame = [id: int, name: string ... 1 more field]
resultW.show()
>>
+----+----+----+
| id|name| age|
+----+----+----+
|null|null|null|
|null|null|null|
|null|null|null|
|null|null|null|
|null|null|null|
+----+----+----+
But as you can see, the table contains only nulls (one row per line of the file). Why is that? The file workers.txt is in the same working directory I started spark-shell from.
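In case it is relevant, here is a sanity check that can confirm the file is actually readable from the working directory in the same spark-shell session (just a sketch; `sc` is the SparkContext the shell provides, and "workers.txt" is the same relative path used above):

```scala
// Sanity check inside the same spark-shell session: print the raw
// lines of workers.txt to confirm the relative path resolves and
// the file contents are what I expect
val raw = sc.textFile("workers.txt")
raw.collect().foreach(println)
```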