public DataTable FetchData(string sQuery)
{
    DataTable dtable = new DataTable();
    using (SqlConnection conn = new SqlConnection(conString))
    {
        conn.Open();
        using (SqlCommand sqlCmd = new SqlCommand(sQuery, conn))
        using (SqlDataReader sdRead = sqlCmd.ExecuteReader())
        {
            // Load the whole result set into the DataTable in one go
            dtable.Load(sdRead);
        }
    }
    return dtable;
}
DataTable dt = FetchData(sQuery);
foreach (DataRow row in dt.Rows)
{
    ClassA obj = new ClassA(row);
    // Some manipulations
    // .....
}
class ClassA
{
    int id;
    string name;

    public ClassA(DataRow dr)
    {
        id = Convert.ToInt32(dr["ID"]);
        name = Convert.ToString(dr["Name"]);
    }
}
I need to retrieve nearly 1,500,000 (1.5 million) rows from the database.
I need suggestions for two scenarios.
I call the above method 1-5 times, so it obviously opens 1-5 connections. If that grows to 10-20 calls, what happens to performance? (Or should I create one shared connection, open it once, run all the queries through it, and close it at the end, along the lines of the sketch below?)
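To be concrete, this is roughly what I mean by the single-connection variant. The FetchData overload that takes an already-open SqlConnection is hypothetical, not something I have written yet:

// Sketch only: open one connection, run all queries through it, dispose once at the end
using (SqlConnection conn = new SqlConnection(conString))
{
    conn.Open();
    DataTable dt1 = FetchData(query1, conn);  // hypothetical overload
    DataTable dt2 = FetchData(query2, conn);
    // ... further calls ...
}

// Hypothetical overload that reuses the caller's open connection
public DataTable FetchData(string sQuery, SqlConnection conn)
{
    DataTable dtable = new DataTable();
    using (SqlCommand sqlCmd = new SqlCommand(sQuery, conn))
    using (SqlDataReader sdRead = sqlCmd.ExecuteReader())
    {
        dtable.Load(sdRead);
    }
    return dtable;
}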
What about the DataTable? Is there a better alternative? I think I need a disconnected approach for this many rows. I need to populate my own class objects with the retrieved data (or iterate the SqlDataReader and build a
List<ClassA>
inside FetchData(), roughly as sketched below).
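For the List alternative, this is the kind of thing I have in mind. It is a rough, untested sketch: it assumes System.Collections.Generic and System.Data.SqlClient are imported, and that ClassA gets an extra (int, string) constructor, which the class above does not have yet:

public List<ClassA> FetchObjects(string sQuery)
{
    List<ClassA> results = new List<ClassA>();
    using (SqlConnection conn = new SqlConnection(conString))
    {
        conn.Open();
        using (SqlCommand sqlCmd = new SqlCommand(sQuery, conn))
        using (SqlDataReader sdRead = sqlCmd.ExecuteReader())
        {
            while (sdRead.Read())
            {
                // Build objects straight from the reader instead of loading a DataTable first
                int id = Convert.ToInt32(sdRead["ID"]);
                string name = Convert.ToString(sdRead["Name"]);
                results.Add(new ClassA(id, name));  // hypothetical constructor taking the values directly
            }
        }
    }
    return results;
}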
Any suggestions?