
I have a table of about 60 GB with these columns:

ColumnA: type: Int
ColumnB: type: Int
ColumnC: type: Real
ColumnD: type: Real
ColumnE: type: Real
ColumnF: type: Real
ColumnG: type: Real

I have an index on [ColumnA, ColumnB].

I have a laptop with 16 GB of RAM, Windows 10 Pro, SQL Server 2017 Enterprise, and an SSD with 1 GB/s read/write.

Can I query this table like this:

```sql
SELECT TOP 250000 * FROM table
WHERE ColumnA > @parameter AND ColumnA < @parameter2
  AND ColumnB > @parameter3 AND ColumnB < @parameter4
```

without it taking minutes/hours to return a result?
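Since the query selects all seven columns, a plain index on (ColumnA, ColumnB) forces a key lookup per matching row; a covering index avoids that. A sketch, assuming the table is named MyTable (the real table name is not given in the question):

```sql
-- Covering index sketch: the two range columns form the key, and the five
-- real columns are carried in the leaf level via INCLUDE, so SELECT * can
-- be answered entirely from the index without lookups into the base table.
CREATE NONCLUSTERED INDEX IX_MyTable_ColumnA_ColumnB
ON MyTable (ColumnA, ColumnB)
INCLUDE (ColumnC, ColumnD, ColumnE, ColumnF, ColumnG);
```

Note this roughly doubles the storage for those columns, since the index holds its own copy of the included data.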

Should I use partitioned tables with filegroups stored on other SSD disks to increase query performance?
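For reference, table partitioning in SQL Server is set up with a partition function and scheme. A sketch of range partitioning on ColumnA, where the boundary values (1000000, 2000000) and filegroup names (FG1–FG3) are purely hypothetical:

```sql
-- Partition function: splits rows into ranges by ColumnA value.
CREATE PARTITION FUNCTION pf_ColumnA (int)
AS RANGE LEFT FOR VALUES (1000000, 2000000);

-- Partition scheme: maps each range to a filegroup, which can live on a
-- separate disk to spread I/O.
CREATE PARTITION SCHEME ps_ColumnA
AS PARTITION pf_ColumnA TO (FG1, FG2, FG3);
```

Partitioning mainly helps manageability and I/O spreading; for a single range query like this one, a well-chosen index usually matters more.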

Dee_wab
  • A proper index would help: `CREATE INDEX my_idx ON my_tab_name(ColumnA, ColumnB)`, assuming your WHERE conditions are SARGable. – Lukasz Szozda May 28 '18 at 10:02
  • I already have that index; that's not enough. – Joao Rafael May 28 '18 at 10:09
  • Then standard procedure: please post the query execution plan (I guess you have a full-scan issue). Also, for your own sake, avoid the real data type. – Lukasz Szozda May 28 '18 at 10:10
  • Any reason to avoid the real data type? I want to reduce data storage, and I think real is better for that, but I don't know if it is worse for performance. – Joao Rafael May 28 '18 at 10:25

0 Answers