I am creating an external Hive table from a CSV file located on IBM Cloud Object Storage. I am using the beeline client while ssh'd into the cluster as the clsadmin user. I was able to make the JDBC connection, but I get the error below when creating the table.

The CSV file is located in the bucket bucket-name-masked, and I have named the fs.cos parameter set 'hivetest'.
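For reference, an fs.cos parameter set like 'hivetest' corresponds to Stocator connector properties keyed by that service name. A minimal sketch of the configuration, assuming HMAC-style credentials and the standard fs.cos property names (shown in shorthand; on IBM Analytics Engine these are normally set as property entries in core-site.xml, and the values here are hypothetical placeholders):

fs.cos.hivetest.access.key=<your-cos-access-key>
fs.cos.hivetest.secret.key=<your-cos-secret-key>
fs.cos.hivetest.endpoint=<your-cos-regional-endpoint>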

0: jdbc:hive2://***hostname-masked***> CREATE EXTERNAL TABLE NYC311Complaints (UniqueKey string, CreatedDate string, ClosedDate string, Agency string, AgencyName string, ComplaintType string, Descriptor string, LocationType string, IncidentZip string, IncidentAddress string, StreetName string, CrossStreet1 string, CrossStreet2 string, IntersectionStreet1 string, IntersectionStreet2 string, AddressType string, City string, Landmark string, FacilityType string, Status string, DueDate string, ResolutionDescription string, ResolutionActionUpdatedDate string, CommunityBoard string, Borough string, XCoordinateStatePlane string, YCoordinateStatePlane string, ParkFacilityName string, ParkBorough string, SchoolName string, SchoolNumber string, SchoolRegion string, SchoolCode string, SchoolPhoneNumber string, SchoolAddress string, SchoolCity string, SchoolState string, SchoolZip string, SchoolNotFound string, SchoolorCitywideComplaint string, VehicleType string, TaxiCompanyBorough string, TaxiPickUpLocation string, BridgeHighwayName string, BridgeHighwayDirection string, RoadRamp string, BridgeHighwaySegment string, GarageLotName string, FerryDirection string, FerryTerminalName string, Latitude string, Longitude string, Location string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LOCATION 'cos://*bucket-name-masked*.hivetest/IAE_examples_data_311NYC.csv';

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:cos://bucket-name-masked.hivetest/IAE_examples_data_311NYC.csv is not a directory or unable to create one) (state=08S01,code=1)

This looks like a permissions issue, but I have provided all the credentials for the relevant user IDs in HDFS as well as COS.

Chris Snow

1 Answer


The issue was with the COS URL. The filename should not be provided; only the bucket is to be named, and the objects in it will be read. With the filename included, the whole path gets treated as the location to read and Hive looks for objects in there, which fails because the path points to a single object rather than a directory.
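A minimal sketch of the corrected statement, with the column list abbreviated here (the full list from the question stays the same) and LOCATION pointing at the bucket root instead of the object:

CREATE EXTERNAL TABLE NYC311Complaints (
  UniqueKey string,
  CreatedDate string,
  ClosedDate string,
  -- ... remaining columns exactly as in the question ...
  Location string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 'cos://bucket-name-masked.hivetest/';

Hive will then read every object under that location, so the CSV should either sit alone in the bucket or under a dedicated prefix that is used as the LOCATION instead.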