
Several table columns in our SQL Server database have the TEXT data type, and when AWS DMS runs to load data from SQL Server into S3, those columns are dropped. I've tried following the instructions below, but the columns are still missing from the files in the target S3 bucket.

From the AWS DMS documentation, the SQL Server TEXT data type maps to the AWS DMS CLOB data type:
To use this data type with AWS DMS, you must enable the use of CLOB data types for a specific task.

For SQL Server tables, AWS DMS updates LOB columns in the target even for UPDATE statements that don't change the value of the LOB column in SQL Server.

During CDC, AWS DMS supports CLOB data types only in tables that include a primary key.

Documentation
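
For what it's worth, enabling CLOB/LOB data types is done in the task settings rather than in the table mappings. A minimal sketch of the relevant TargetMetadata section of the task settings JSON is below; the specific values are illustrative, not recommendations:

```json
{
  "TargetMetadata": {
    "SupportLobs": true,
    "FullLobMode": false,
    "LimitedSizeLobMode": true,
    "LobMaxSize": 32
  }
}
```

In limited LOB mode, LobMaxSize is in KB, and LOB values larger than that are truncated.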

I've tried variations of the transformation rules below in the console.

Results:

  • data-type nclob: the task loads successfully, but the columns are still dropped
  • data-type clob: the task loads successfully, but the columns are still dropped
  • data-type string: cannot be saved because the length is unspecified, and there is no field to set the length in the console
  • Terraform: adding both source and target data types causes a Terraform error

For example:

table_mappings = jsonencode({
  "rules" : [
    {
      "rule-type" : "transformation",
      "rule-id" : "<RULE_ID>",
      "rule-name" : "ConvertDataType",
      "rule-action" : "change-data-type",
      "rule-target" : "column",
      "object-locator" : {
        "schema-name" : "<TARGET_SCHEMA>",
        "table-name" : "<TARGET_TABLE>",
        "column-name" : "%"
      },
      "data-type" : {
        "type" : "nclob"
      }
    }
  ]
})
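
For completeness, the string variant with an explicit length (which I couldn't set in the console) would presumably look like this in raw table-mapping JSON; the length value here is an assumption for illustration:

```json
{
  "rules": [
    {
      "rule-type": "transformation",
      "rule-id": "<RULE_ID>",
      "rule-name": "ConvertDataType",
      "rule-action": "change-data-type",
      "rule-target": "column",
      "object-locator": {
        "schema-name": "<TARGET_SCHEMA>",
        "table-name": "<TARGET_TABLE>",
        "column-name": "%"
      },
      "data-type": {
        "type": "string",
        "length": 8000
      }
    }
  ]
}
```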

How do I properly set the transformation rule for SQL Server so that columns with the TEXT data type are not dropped when DMS loads the data? The goal is to read the data in Snowflake with all columns as VARCHAR.

Thank you!

Young