
How to pass an Excel file from a WinForms client to a WCF service and into an SQL Server table?

Can anyone provide any guidance, code, or advice?

  1. WCF service contract and implementation that will take an Excel file as a parameter
  2. Contract implementation should insert that Excel file into a varbinary(MAX) column in SQL Server.
Sam

2 Answers

  • Here is a post that addresses the WCF portion of your question.
  • Once you get the file to the server you can use FileHelpers.net to parse that file into an object.

I don't care what your requirement says: do not insert an Excel document into a SQL Server varbinary(max) column. The requirement should say "Upload an Excel document and insert its contents into a database. However, we need to relate the original Excel file to the data within the database so we can repudiate any claims that our process failed, as well as have a verification mechanism."

  • Create another table called EXCEL_IMPORT or something that has more
    or less the following columns

Check the extended properties I put on there for column clarifications:

create table EXCEL_IMPORT 
(
     ImportID   int identity(1,1) NOT NULL CONSTRAINT [PK_EXCEL_IMPORT] PRIMARY KEY,
     FileName_Incoming  varchar(max),
     FilePath_Internal  varchar(max),
     FileName_Internal  varchar(max),
     FileRowCount   int NOT NULL CONSTRAINT [CK_EXCEL_IMPORT_FileRowCount] CHECK  (FileRowCount >= 0),
     ImportDate datetime NOT NULL CONSTRAINT [DF_EXCEL_IMPORT_ImportDate] DEFAULT(getdate())
)

EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'The location on the client computer i.e. C:\Users\jimmy\Desktop\iHeartExcel.xls' , @level0type=N'SCHEMA',@level0name=N'dbo', @level1type=N'TABLE',@level1name=N'EXCEL_IMPORT', @level2type=N'COLUMN',@level2name=N'FileName_Incoming'
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'The folder on your fileshare the file is in (this is in case you decide to change the fileshare name)' , @level0type=N'SCHEMA',@level0name=N'dbo', @level1type=N'TABLE',@level1name=N'EXCEL_IMPORT', @level2type=N'COLUMN',@level2name=N'FilePath_Internal'
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'The unique filename that you decided on in the fileshare i.e. 2012_04_20_11_34_59_0_71f452e7-7cac-4afe-b145-6b7557f34263.xls' , @level0type=N'SCHEMA',@level0name=N'dbo', @level1type=N'TABLE',@level1name=N'EXCEL_IMPORT', @level2type=N'COLUMN',@level2name=N'FileName_Internal'
  • Next, write a process that copies the Excel file to a location on your fileshare and creates a unique filename. At this point you have an object that represents the file, the file itself, and all the information about where you are putting the file.
  • Create a table that mimics the columns of the Excel spreadsheet, and add an ImportID on the end of it that references back to the EXCEL_IMPORT table defined above, as well as an identity primary key.
  • After that, write a process that inserts the objects into the database with the specified relationships. This will vary based on your current setup.
  • Finally, you should have a keyed table with all the Excel data in it that references a row in an import table that points to a file on disk.
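As a concrete sketch of the second bullet, here is what such a data table might look like. The spreadsheet columns (CustomerName, OrderTotal) are made up for illustration; substitute the columns your actual workbook has:

```sql
-- Hypothetical example: mirrors a two-column spreadsheet.
-- Swap CustomerName/OrderTotal for your real spreadsheet columns.
create table EXCEL_DATA
(
     RowID        int identity(1,1) NOT NULL CONSTRAINT [PK_EXCEL_DATA] PRIMARY KEY,
     CustomerName varchar(max),
     OrderTotal   decimal(18,2),
     ImportID     int NOT NULL CONSTRAINT [FK_EXCEL_DATA_EXCEL_IMPORT]
                      FOREIGN KEY REFERENCES EXCEL_IMPORT(ImportID)
)
```

Every row imported from a given file carries that file's ImportID, so you can always walk from a data row back to the EXCEL_IMPORT row and from there to the original file on the share.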

Some other thoughts

  • I would think about not allowing the Excel data within the table to be modified. Copy it in a calculated form to another table. Sometimes this doesn't make sense because of the volume of data, but what you are doing here is building a provable chain of data back to the original source, and sometimes it makes sense to have an untainted file copy as well as a SQL copy.
  • The first response you're going to hear is "But all the Excel files are different!" If that is the case, you just create the import table that points to a file on disk (assuming they are supposed to be different). If they are supposed to be the same, you just need more error checking.
  • Saving the binary file within the database is going to have consequences, namely the backup size, and that SQL Server can't really index those kinds of columns. Your traversals of that table will also get slower with every file insert, and as a general rule you don't want that. (You can't do any more or less with the file on disk than you can with the binary column.)
  • Use a GUID with a prepended date in your filename on the share. You will never hunt through those files anyway, and if you need to do it by date you can use the filename. This makes the names globally unique in case other processes need to write here as well.
  • I know this isn't what you asked for, but I have been down the store-it-in-the-database path before with terrible consequences. I've imported millions of files with the method I described above and had no major process issues.
  • Speak up when requirements are infeasible. Suggest alternatives that do the same thing cheaper or easier (use words like testable/scalable).
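To make the date-plus-GUID naming and copy-to-share steps concrete, here is a minimal sketch. The class and method names are my own invention, not part of any library; only System.IO is used:

```csharp
using System;
using System.IO;

static class ExcelImportStaging
{
    // Builds a share-unique name like 2012_04_20_11_34_59_0_71f452e7-....xls:
    // date first so you can locate files by date from the name alone,
    // GUID after so the name is globally unique.
    public static string BuildInternalFileName(string originalPath)
    {
        string stamp = DateTime.Now.ToString("yyyy_MM_dd_HH_mm_ss_f");
        string ext = Path.GetExtension(originalPath); // keep .xls / .xlsx
        return stamp + "_" + Guid.NewGuid() + ext;
    }

    // Copies the incoming file to the fileshare and returns the internal
    // name, which you would record in EXCEL_IMPORT.FileName_Internal.
    public static string StageFile(string originalPath, string sharePath)
    {
        string internalName = BuildInternalFileName(originalPath);
        File.Copy(originalPath, Path.Combine(sharePath, internalName));
        return internalName;
    }
}
```

After StageFile returns, you have everything the EXCEL_IMPORT row needs: the original client path, the share path, and the internal name.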
Jeremy Gray
  • Hi Jeremy, the reason we want to store the entire Excel file in the DB is so that we can download it later, whereas if we load its contents only then we would have to reconstitute the Excel file before it's downloaded. By saving the entire file we make our lives easier. BTW, Team Foundation Server uses a SQL Server DB to save *files*, not their contents, so there is nothing wrong with storing files in databases. Indeed that's what an operating system does too, though it may not be a relational database. :-) – Sam Apr 21 '12 at 03:10
  • The method I posted just stores the contents on a file server instead of in the database, and uses the database to point to the file on disk. I assumed you needed the data in some way as well, which is why I suggest storing the data in a table too. You can still download it this way, faster and easier. – Jeremy Gray Apr 23 '12 at 13:42
  • You are correct, TFS does store files in the DB. They also support functionality that the file system does not: versioning of files. I don't mean to say there are no cases for storing files in databases, but this functionality comes at a cost. I stand by my answer. – Jeremy Gray Apr 23 '12 at 13:52
  • Also Jeremy, we don't really know how to reconstitute an Excel file. – Sam Apr 23 '12 at 14:18
  • There is no reconstituting, you save the actual file on disk then serve the same file over and over again. Hope it all works out. – Jeremy Gray Apr 23 '12 at 18:14

I'm sure the experts out there can improve upon this, but here are the basics ...

On the Server

1a. Add a new OperationContract to your Interface (e.g. IService.cs)

[OperationContract]
string UploadBinaryFile(byte[] aByteArray);

1b. Insert into SQL Server table, in your contract Implementation (e.g. Service.cs)

public string UploadBinaryFile(byte[] aByteArray)
{
    try
    {
        // using-block ensures the connection is closed/disposed even on error
        using (SqlConnection conn = new SqlConnection(MyConnectionString)) // from saved connection string
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand("INSERT INTO MyTestTable(BinaryFile) VALUES (@binaryfile)", conn))
            {
                // size -1 maps to varbinary(max)
                cmd.Parameters.Add("@binaryfile", SqlDbType.VarBinary, -1).Value = aByteArray;
                cmd.ExecuteNonQuery();
            }
        }

        return "1"; // to me, this indicates success
    }
    catch (Exception ex)
    {
        return "0: " + ex.Message; // return an error indicator + ex.Message
    }
}

On the Client

2a. Use the OpenFileDialog component to browse for files on your filesystem using the standard dialogue box that's used by most Windows applications.

if (openFileDialog1.ShowDialog() == DialogResult.OK)
{
    txtUploadFilePath.Text = openFileDialog1.FileName;
}

2b. Load the file's contents into a byte array

byte[] BinaryFile = System.IO.File.ReadAllBytes(txtUploadFilePath.Text);

2c. Call your WCF contract, passing in the byte array

string UploadResponse = client.UploadBinaryFile(BinaryFile);

It's working... YAY :-)
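One caveat worth knowing: with the default WCF binding settings this will fail for anything but small files, because basicHttpBinding's maxReceivedMessageSize defaults to 65,536 bytes and the reader quota maxArrayLength defaults to 16,384. You'll need to raise both on the service (and client) binding; the binding name and the 10 MB figure below are just placeholders to adjust for your largest expected file:

```xml
<basicHttpBinding>
  <binding name="LargeFileUpload" maxReceivedMessageSize="10485760">
    <readerQuotas maxArrayLength="10485760" />
  </binding>
</basicHttpBinding>
```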

Sam