I am storing data in Hadoop according to the date it was created, so I have multiple directories on HDFS that follow the format /data/{year}/{month}/{day}.
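For example, the directory tree looks roughly like this (2014/10/13 is one of the actual days; the neighbouring date is only for illustration):

/data/2014/10/12/<data files>
/data/2014/10/13/<data files>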
I want to load this data into Hive (periodically) and create the corresponding partitions. For the moment I am experimenting with approaches like the one below:
CREATE EXTERNAL TABLE tablename (...)
PARTITIONED BY (year STRING, month STRING, day STRING)
LOCATION '/data';
ALTER TABLE tablename ADD PARTITION(year='2014', month='10', day='13') LOCATION '/data/2014/10/13';
With this approach, however, I have to create each partition manually with an ALTER command. Is there a way to automate and parameterize this process (and put it in a workflow) so that data is loaded dynamically into Hive partitions for each of the sub-directories?
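To make concrete what I mean by parameterizing: the closest I have come up with is a small Hive script using variable substitution, roughly like the sketch below (add_partition.hql is just an illustrative name), which a scheduler would run once per directory with something like hive --hivevar year=2014 --hivevar month=10 --hivevar day=13 -f add_partition.hql:

-- add_partition.hql (sketch): register one day's directory as a partition,
-- assuming year/month/day are passed in via --hivevar
ALTER TABLE tablename ADD IF NOT EXISTS
  PARTITION (year='${hivevar:year}', month='${hivevar:month}', day='${hivevar:day}')
  LOCATION '/data/${hivevar:year}/${hivevar:month}/${hivevar:day}';

This still requires one invocation per sub-directory, so I am wondering whether Hive offers a cleaner or more native way to achieve the same thing.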