
I am trying to load data into AWS Redshift using the following command:

copy venue from 's3://mybucket/venue'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
delimiter '\t';

but the data load is failing. When I checked the Queries section for that specific load, I noticed it failed with "Bad UTF8 hex sequence: a4 (error 3)".

Is there a way to skip bad records when loading data into Redshift?


1 Answer


Yes, you can use the MAXERROR parameter. This example allows up to 250 bad records to be skipped (the errors are written to the stl_load_errors system table):

copy venue
from 's3://mybucket/venue'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
delimiter '\t'
maxerror as 250;
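
Afterwards you can inspect which records were skipped and why by querying stl_load_errors. A minimal sketch (the columns shown are standard fields of that Redshift system table):

-- show the most recent load errors, including the offending file,
-- line, column, and the reason Redshift rejected the record
select query, filename, line_number, colname, err_reason
from stl_load_errors
order by starttime desc
limit 10;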