
I'm using an HDP 3.x cluster and running Spark SQL through spark_llap. Is there a way to create an external Hive table using hive.createTable? The example on the Hortonworks website uses the following code, but it creates a managed table, whereas I need an external table.

hive.createTable("web_sales").ifNotExists().column("sold_time_sk", "bigint").column("ws_ship_date_sk", "bigint").create()
1 Answer


You can use the Spark session directly to create a table.

Example 1:

    // drop the table if it already exists
    spark.sql("drop table if exists my_table");
    // create the table with the required schema (columns elided here)
    spark.sql("create table my_table(....) row format delimited fields terminated by '|' location '/my/hdfs/location'");

Example 2:

    # create the movies table, stored as text files
    spark.sql('create table movies \
               (movieId int, title string, genres string) \
               row format delimited fields terminated by "," \
               stored as textfile')
    # create the ratings table, stored as ORC
    spark.sql('create table ratings \
               (userId int, movieId int, rating float, timestamp string) \
               stored as ORC')
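
If you specifically need an external table, as asked in the question, the same spark.sql approach can be used with an explicit create external table statement. The sketch below reuses the columns from the question; the table name and HDFS path are only placeholders:

    // minimal sketch: external table with the question's columns;
    // the table name and HDFS location are placeholders, not required values
    spark.sql("""create external table if not exists web_sales_ext
                 (sold_time_sk bigint, ws_ship_date_sk bigint)
                 stored as orc
                 location '/my/hdfs/web_sales_ext'""")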