
I'm trying to load data from AWS S3 into a Snowflake table using the following COPY INTO command:

COPY INTO "TABLE_NAME"
FROM @<stage>/<file>/
FILE_FORMAT = (type=CSV SKIP_HEADER = 1 
RECORD_DELIMITER="\n" 
FIELD_DELIMITER="," 
FIELD_OPTIONALLY_ENCLOSED_BY = '"')

However, I'm getting the following error message:

Field delimiter ',' found while expecting record delimiter '\n'
File '<file>', line 59994, character 1638
Row 59993, column ""<file>""["SOURCE_GROUP_ID":27]
If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.

Any ideas?

1 Answer


To keep the backslash in the data and avoid the error, use the file format option ESCAPE_UNENCLOSED_FIELD = NONE:

COPY INTO "TABLE_NAME"
FROM @<stage>/<file>
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1
RECORD_DELIMITER = '\n'
FIELD_DELIMITER = ','
FIELD_OPTIONALLY_ENCLOSED_BY = '"'
ESCAPE_UNENCLOSED_FIELD = NONE);

Note: ESCAPE_UNENCLOSED_FIELD defaults to a backslash ('\'), which means a backslash in an unenclosed field escapes the character that follows it, including a field delimiter. Setting the option to NONE makes Snowflake load the backslash as literal data instead.
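The effect of the escape character can be reproduced outside Snowflake with Python's csv module (a sketch for illustration only; Snowflake's parser is not Python's, but the escaping behavior is analogous). A backslash immediately before a comma swallows the delimiter, so two fields collapse into one and the row ends up shorter than expected, which is how the column count drifts until the "found field delimiter while expecting record delimiter" error appears:

```python
import csv
import io

# A CSV line whose second field ends in a backslash: a,b\,c
line = 'a,b\\,c\n'

# With backslash as the escape character (analogous to Snowflake's
# default ESCAPE_UNENCLOSED_FIELD = '\'), "\," is read as a literal
# comma, so the row has only two fields.
with_escape = next(csv.reader(io.StringIO(line), escapechar='\\'))
print(with_escape)  # ['a', 'b,c']

# With no escape character (analogous to ESCAPE_UNENCLOSED_FIELD = NONE),
# the backslash is plain data and the row splits into three fields.
no_escape = next(csv.reader(io.StringIO(line)))
print(no_escape)  # ['a', 'b\\', 'c']
```

If your data legitimately contains backslashes and you are not using them to escape anything, NONE is the safer setting.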