
Can anyone explain how to load data from HDFS into a Hive external table without deleting the source file? If I use

LOAD DATA INPATH '/user/root/cards/deckofcards.txt' INTO TABLE deck_of_cards;

will the file under /user/root/cards be deleted?

OneCricketeer
User1
  • You may check this http://stackoverflow.com/questions/7567605/how-to-load-data-to-hive-from-hdfs-without-removing-the-source-file – anandan Mar 12 '17 at 00:15

1 Answer


To load data into Hive tables, there are two options:

  1. Use external tables when the files are already present in HDFS and should remain even if the table is dropped.

Example :-

create external table table_name (
  id int,
  field_name string
)
row format delimited
fields terminated by <any delimiter>
location '/hdfs_location';
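Note that LOAD DATA INPATH moves (rather than copies) an HDFS file into the table's directory, so the source path ends up empty regardless of table type. One way to keep the source file where it is (a sketch; the column names and the '|' delimiter are assumptions, since the original post does not show the file layout) is to point the external table's location at the existing directory and skip LOAD DATA entirely:

```sql
-- Hypothetical columns and delimiter; adjust to match deckofcards.txt.
-- Pointing LOCATION at the existing directory leaves the file in place:
-- no LOAD DATA, and hence no move, is needed.
create external table deck_of_cards (
  color string,
  suit string,
  pip string
)
row format delimited
fields terminated by '|'
location '/user/root/cards';
```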
  2. Use managed tables when Hive should manage the full lifecycle of the table and its data, or for temporary tables.

Example :-

create table table_name (
  id int,
  field_name string
)
row format delimited
fields terminated by <any delimiter>;
-- without a LOCATION clause, the data lives under Hive's warehouse directory
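The practical difference shows up on DROP TABLE: for a managed table, Hive deletes the metadata and the underlying data files; for an external table, only the metadata goes away. A minimal sketch (table names are placeholders):

```sql
-- Managed table: removes the metadata AND the data files in HDFS.
drop table managed_table_name;

-- External table: removes only the metadata;
-- the files under the table's LOCATION stay in HDFS.
drop table external_table_name;
```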

To find out what kind of table you have, run: describe formatted table_name;
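In the describe formatted output, the Table Type line distinguishes the two kinds (abbreviated output sketch):

```sql
describe formatted table_name;
-- Look for the "Table Type:" line in the output, e.g.
--   Table Type:   MANAGED_TABLE
-- or
--   Table Type:   EXTERNAL_TABLE
```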

OneCricketeer
Deepan Ram