
We need to load large pipe-delimited files into a SQL Server database. When loading these with Rhino ETL (which relies on FileHelpers), is it mandatory to provide a record class? We have to load files into several tables, each with dozens of columns, so writing the record classes by hand could take us a whole day. I guess we could write a small tool to generate the record classes from the SQL Server tables.

Another approach would be to write an IDataReader wrapper around a FileStream and then pass it to SqlBulkCopy.

SqlBulkCopy requires column mappings as well, but it accepts column ordinals, so that part is easy.
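To make that second idea concrete, here is a rough, untested sketch of what such a wrapper could look like (the class name, the hard-coded field count, the file name and connectionString are only placeholders for illustration; a real version would need proper splitting and null handling):

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

// minimal IDataReader over a pipe-delimited file: each Read() pulls the next
// line, so the whole file never has to be held in memory
public class PipeDelimitedDataReader : IDataReader
{
    private readonly StreamReader _file;
    private readonly int _fieldCount;
    private string[] _current = new string[0];

    public PipeDelimitedDataReader(string path, int fieldCount)
    {
        _file = new StreamReader(path);
        _fieldCount = fieldCount;
    }

    // members SqlBulkCopy needs when the mappings are added by ordinal
    public int FieldCount { get { return _fieldCount; } }

    public bool Read()
    {
        string line = _file.ReadLine();
        if (line == null) return false;
        _current = line.Split('|');
        return true;
    }

    public object GetValue(int i) { return _current[i]; }
    public bool IsDBNull(int i) { return string.IsNullOrEmpty(_current[i]); }
    public string GetName(int i) { return "Column" + i; }
    public Type GetFieldType(int i) { return typeof(string); }

    public int GetValues(object[] values)
    {
        int n = Math.Min(values.Length, _fieldCount);
        for (int i = 0; i < n; i++) values[i] = _current[i];
        return n;
    }

    public bool IsClosed { get { return false; } }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } }
    public bool NextResult() { return false; }
    public void Close() { _file.Close(); }
    public void Dispose() { _file.Dispose(); }

    // the remaining IDataRecord members shouldn't be needed for this scenario
    public object this[int i] { get { return GetValue(i); } }
    public object this[string name] { get { throw new NotImplementedException(); } }
    public int GetOrdinal(string name) { throw new NotImplementedException(); }
    public DataTable GetSchemaTable() { throw new NotImplementedException(); }
    public string GetDataTypeName(int i) { throw new NotImplementedException(); }
    public string GetString(int i) { return _current[i]; }
    public bool GetBoolean(int i) { throw new NotImplementedException(); }
    public byte GetByte(int i) { throw new NotImplementedException(); }
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferOffset, int length) { throw new NotImplementedException(); }
    public char GetChar(int i) { throw new NotImplementedException(); }
    public long GetChars(int i, long fieldOffset, char[] buffer, int bufferOffset, int length) { throw new NotImplementedException(); }
    public Guid GetGuid(int i) { throw new NotImplementedException(); }
    public short GetInt16(int i) { throw new NotImplementedException(); }
    public int GetInt32(int i) { throw new NotImplementedException(); }
    public long GetInt64(int i) { throw new NotImplementedException(); }
    public float GetFloat(int i) { throw new NotImplementedException(); }
    public double GetDouble(int i) { throw new NotImplementedException(); }
    public decimal GetDecimal(int i) { throw new NotImplementedException(); }
    public DateTime GetDateTime(int i) { throw new NotImplementedException(); }
    public IDataReader GetData(int i) { throw new NotImplementedException(); }
}

// then, somewhere in the load routine (connectionString is our SQL Server connection string):
// stream the file straight into SQL Server, mapping columns by ordinal
using (PipeDelimitedDataReader reader = new PipeDelimitedDataReader("testCustomers.txt", 5))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "Customers";
    for (int i = 0; i < 5; i++)
        bulkCopy.ColumnMappings.Add(i, i);
    bulkCopy.WriteToServer(reader);
}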

Any ideas/suggestions?

Thanks.

Alberto

1 Answer


I don't know much about Rhino ETL, but FileHelpers has a ClassBuilder which allows you to generate the record class at run time. See the documentation for some examples.

So it would be easy to generate a class with something like the following:

// requires: using System.Data; using System.Data.SqlClient; using FileHelpers;
// (DelimitedClassBuilder lives in the FileHelpers dynamic/runtime namespace)
SqlCommand command = new SqlCommand("SELECT TOP 1 * FROM Customers;", connection);
connection.Open();

// get the schema for the Customers table
SqlDataReader reader = command.ExecuteReader();
DataTable schemaTable = reader.GetSchemaTable();
reader.Close();

// create the FileHelpers record class
// "|" matches your pipe-delimited files; alternatively there is a
// FixedLengthClassBuilder for fixed-width files
DelimitedClassBuilder cb = new DelimitedClassBuilder("Customers", "|");
cb.IgnoreFirstLines = 1;
cb.IgnoreEmptyLines = true;

// add one field per column, using the column's name and CLR type
foreach (DataRow row in schemaTable.Rows)
{
    cb.AddField(row.Field<string>("ColumnName"), row.Field<Type>("DataType"));
    cb.LastField.TrimMode = TrimMode.Both;
}

// load the dynamically created class into a FileHelpers engine
FileHelperEngine engine = new FileHelperEngine(cb.CreateRecordClass());

// import your records
DataTable dt = engine.ReadFileAsDT("testCustomers.txt");
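
If you then need to push the resulting DataTable into SQL Server, something along these lines should do it (connectionString and the destination table name are placeholders for your own values):

// bulk-load the DataTable produced by FileHelpers
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "Customers";

    // map source to destination columns by ordinal
    for (int i = 0; i < dt.Columns.Count; i++)
        bulkCopy.ColumnMappings.Add(i, i);

    bulkCopy.WriteToServer(dt);
}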
shamp00