
I'm building a .net web app that will involve validating an email input field against a list of acceptable email addresses. There could be up to 10,000 acceptable values and they will not change very often. When they do, the entire list would be replaced, not individual entries.

I'm debating the best way to implement this. We have a SQL Server database, but since these records will be relatively static and only replaced in bulk, I'm considering just referencing/searching text files containing the string values. It seems like that would make the upload process easier, and there is little benefit to having this info in an RDBMS.

Feedback appreciated.

    It's typically just much easier to work with only one datastore, and since you're already using SQL Server, I really don't see any benefit in putting something into the file system in a web app..... even if the list of e-mails doesn't change often - just put it into a table and be done with it! – marc_s May 20 '13 at 16:44

2 Answers


If the database is already there, then use it. What you are talking about is exactly what databases are designed to do. If down the road you decide you need to do something slightly more complex, you will be very glad you went with the DB.
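As a minimal sketch of what that might look like (the table and column names here are illustrative, not from the question):

```sql
-- Hypothetical lookup table; the primary key gives an index for fast point lookups
CREATE TABLE dbo.AcceptedEmail
(
    Email nvarchar(254) NOT NULL,
    CONSTRAINT PK_AcceptedEmail PRIMARY KEY (Email)
);

-- Validation check: an indexed seek, effectively instant at 10,000 rows
SELECT CASE WHEN EXISTS (
           SELECT 1 FROM dbo.AcceptedEmail WHERE Email = @InputEmail
       ) THEN 1 ELSE 0 END AS IsAccepted;
```

With a text file you would be re-reading and scanning 10,000 lines on every validation (or building your own in-memory index); here the engine does that work for you.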

Abe Miessler
    +1 exactly - if that database is already there and being used - what's the point of ignoring it and using something else? – marc_s May 20 '13 at 16:46

I'm going to make a few assumptions about your situation.

  1. Your data set contains more than 1 column of usable data.
  2. Your record set contains more rows than you will display at any one time.
  3. Your data will need to be formatted into some kind of output view (e.g. HTML).

Here are some specific reasons why your semi-static data should stay in SQL and not in a text file.

  1. You will not have to parse the text data every time you wish to read and process it. (String parsing is a relatively heavy memory and CPU load.) SQL stores your columns as structured, pre-parsed data.
  2. You will not have to develop your own row-filtering or searching algorithm (or implement a library that does it for you). SQL is already a sophisticated engine that applies caching and query optimization, with many underlying algorithms for seek/search/scan/index/hash operations.
  3. You have the option with SQL to expand your solution's robustness and integration with other tools over time. (Putting the data into a text or XML file will limit future possibilities).
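Since the question notes that the whole list is replaced at once rather than edited row by row, the upload side can stay simple too. One common pattern, sketched here with an illustrative table name and assuming the new list has already been loaded into a staging table (e.g. via BULK INSERT):

```sql
-- Replace the entire list atomically; readers never see a half-loaded list
BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.AcceptedEmail;              -- hypothetical table name
    INSERT INTO dbo.AcceptedEmail (Email)
    SELECT Email FROM dbo.AcceptedEmail_Staging;   -- hypothetical staging table
COMMIT TRANSACTION;
```

In SQL Server, TRUNCATE TABLE participates in the transaction, so if the insert fails the old list is rolled back intact, which is harder to guarantee when overwriting a text file.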

Caveats:

  • Your particular SQL Server implementation can be affected by disk IO and network latency. To tune disk IO performance, a well-constructed SQL Server places the data files (.mdf) on fast multi-spindle disk arrays tuned for fast reads, and separates them from the log files (.ldf) on spindles tuned for fast writes.
  • You might find that the latency and busyness of the SQL Server affect performance. If you're running on a very busy or slow SQL Server, you might be in a situation where you'd look to a local-file alternative, in which case I would recommend a structured format such as XML. (However, if you find yourself looking for workarounds to avoid using your SQL Server, it would probably be best to invest some time/money into improving your SQL implementation.)