I have a simple pipe-delimited file (newfile.txt) on HDFS, and I have configured PolyBase correctly. However, I am having a tough time importing this file into SQL Server using PolyBase. Here are the queries:
First, an external file format is created:
CREATE EXTERNAL FILE FORMAT TextFile
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (
        FIELD_TERMINATOR = '|',
        USE_TYPE_DEFAULT = TRUE
    )
);
Second, the data source for the Hadoop cluster is created:
CREATE EXTERNAL DATA SOURCE HadoopCluster
WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://10.153.14.11:8020'
);
Finally, the external table is created, which imports the data from HDFS:
CREATE EXTERNAL TABLE tmpExternal
(
    patientEncounter VARCHAR(8000),
    PtAcctNo VARCHAR(200)
)
WITH (
    LOCATION = '/user/newfolder/',
    DATA_SOURCE = HadoopCluster,
    FILE_FORMAT = TextFile,
    REJECT_TYPE = VALUE,
    REJECT_VALUE = 0
);
After running the above query, here is the error I end up getting:
Msg 596, Level 21, State 1, Line 26
Cannot continue the execution because the session is in the kill state.
Msg 0, Level 20, State 0, Line 26
A severe error occurred on the current command. The results, if any, should be discarded.
I am sure that there is no issue with disk space. Please assist.
UPDATE: HDP 2.5 is being used. I have installed PolyBase successfully, and this is the first external table I am trying to import from HDFS.
The text file has 2 rows and 2 columns:
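For reference, the Hadoop connectivity level was set roughly as follows (a sketch of my configuration; option 7 is the value Microsoft's documentation lists for Hortonworks HDP 2.x on Linux):

```sql
-- Map PolyBase to the Hadoop distribution (7 = Hortonworks HDP 2.x on Linux, per the docs).
EXEC sp_configure @configname = 'hadoop connectivity', @configvalue = 7;
RECONFIGURE;
-- SQL Server and the PolyBase services were restarted afterwards, as required.
```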
1234|abcd
5676|efgh
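For what it's worth, the content itself parses cleanly as pipe-delimited text. A quick local sanity check (plain Python over an in-memory copy of the data, not part of the PolyBase setup):

```python
import csv
import io

# In-memory copy of the same two-row, pipe-delimited content as newfile.txt.
data = "1234|abcd\n5676|efgh\n"

# Parse with '|' as the delimiter, mirroring FIELD_TERMINATOR = '|'.
rows = list(csv.reader(io.StringIO(data), delimiter="|"))
print(rows)  # [['1234', 'abcd'], ['5676', 'efgh']]
```

So the file yields two rows of two fields each, matching the external table's two-column definition.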