
I have an idea (I don't know whether it's good or bad). I have a utility that connects to SQL Server on a schedule and fetches some data into an application. The data is simple (2 varchar text attributes), but it is roughly 3 million rows, so my application uses the network very intensively. Can I programmatically decrease (limit, throttle, etc.) the network bandwidth used by the SqlDataReader? Let it work more slowly, but without stressing either the server or the client. Is this idea good? If not, what should I do instead?

Here is code, so far:

using (SqlConnection con = new SqlConnection("My connection string here"))
{
    con.Open();
    using (SqlCommand command = new SqlCommand(query, con))
    {                        
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                yield return new MyDBObject()
                {                               
                    Date = (DateTime)reader["close_date"],                                
                    JsonResult = (string)reader["json_result"]
                };
            }
        }
    }
}
L. Guthardt
  • Yes - Data Readers can be used to lazily evaluate large data sets, without requiring materialization of all data on the client side. If you use async/await in your data access, you will likely need to move the yield return generator into a separate function (or perhaps a local function); see the sketch after these comments. [Example here](https://stackoverflow.com/a/47376646/314291) – StuartLC Nov 21 '17 at 10:58
  • You can't throttle effectively on the client. If you do, all you do is hold up locks and buffers on the server end and block threads that can't proceed to produce rows; even if it "works", in the sense that you reduce bandwidth, the server's wait stats will spike, which can get you some attention from a worried DBA who sees a lot of `ASYNC_NETWORK_IO` waits. You can reduce the amount of data you fetch per call to leave more room for other queries, but always process whatever data you are fetching as fast as possible. – Jeroen Mostert Nov 21 '17 at 11:01
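
A minimal sketch of the lazy-streaming pattern the first comment describes, assuming C# 8 / .NET Core 3.0+ where IAsyncEnumerable<T> lets async/await and yield return live in the same method; on earlier language versions the synchronous iterator would stay in its own method, as the comment suggests. The query and column names are taken from the question; the method name is illustrative.

// Assumed namespaces: System, System.Collections.Generic, System.Data.SqlClient, System.Threading.Tasks
public async IAsyncEnumerable<MyDBObject> ReadResultsAsync(string connectionString, string query)
{
    using (SqlConnection con = new SqlConnection(connectionString))
    {
        await con.OpenAsync();
        using (SqlCommand command = new SqlCommand(query, con))
        using (SqlDataReader reader = await command.ExecuteReaderAsync())
        {
            // Rows are streamed one at a time; nothing is materialized on the client.
            while (await reader.ReadAsync())
            {
                yield return new MyDBObject()
                {
                    Date = (DateTime)reader["close_date"],
                    JsonResult = (string)reader["json_result"]
                };
            }
        }
    }
}

The caller would consume it with await foreach (var row in ReadResultsAsync(connectionString, query)) { ... }.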

1 Answer


Making the server buffer data, or holding a query open for longer, could actually increase the load on the server significantly. Ultimately, the only way to do what you're after would be to apply "paging" to your query and access the data in successive pages, perhaps with pauses between pages. The pages could still be pretty big, 100k rows for example. You can achieve this relatively easily with OFFSET/FETCH in SQL Server.
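
For illustration, a rough sketch of that paging approach, assuming a hypothetical dbo.results table holding the close_date and json_result columns from the question; the page size, ordering column and pause length are placeholders to tune.

// Assumed namespaces: System, System.Collections.Generic, System.Data.SqlClient, System.Threading
const int PageSize = 100000;

public IEnumerable<MyDBObject> ReadInPages(string connectionString)
{
    int offset = 0;
    while (true)
    {
        var page = new List<MyDBObject>();

        using (SqlConnection con = new SqlConnection(connectionString))
        {
            con.Open();

            // OFFSET/FETCH requires an ORDER BY; ideally order by a unique key column
            // so that pages do not overlap or skip rows.
            const string query =
                @"SELECT close_date, json_result
                  FROM dbo.results
                  ORDER BY close_date
                  OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY";

            using (SqlCommand command = new SqlCommand(query, con))
            {
                command.Parameters.AddWithValue("@offset", offset);
                command.Parameters.AddWithValue("@pageSize", PageSize);

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        page.Add(new MyDBObject()
                        {
                            Date = (DateTime)reader["close_date"],
                            JsonResult = (string)reader["json_result"]
                        });
                    }
                }
            }
        }

        // The query has completed before the rows are handed out, so the server
        // is not kept waiting while the caller processes the page.
        foreach (var row in page)
            yield return row;

        if (page.Count < PageSize)
            yield break;            // last page reached

        offset += PageSize;
        Thread.Sleep(500);          // optional pause between pages to ease the load
    }
}

Because each page is a short, self-contained query, the server releases its locks and buffers between pages instead of holding a reader open while the client deliberately slows down, which is the concern raised in the comments.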

Marc Gravell