Hello SQL gurus out there... I have a month-long problem that I can't seem to solve. I have a really wide reporting table with 100+ columns. Right now it has around 2M rows, and at the rate data is coming in, that count will probably double in the next 1-2 months. It runs acceptably at the moment, but I'm not so sure it will in the next couple of months. Basically, I just want to know the best indexing plan for this type of table. The following is not the actual table, but it is very close; I just want to illustrate the various data types used.
ID bigint IDENTITY (UNIQUE CLUSTERED INDEX)
Column1 varchar(4000) NULL
...
Column50 varchar(1000) NULL
Column60 int NULL
...
Column100 nvarchar(max) NULL
Now, this table is currently queried in this manner:
SELECT <ALL COLUMNS EXCEPT ID>
FROM ThisTable WITH (NOLOCK) -- I know how this works, don't warn me :)
WHERE
    (@Column1 IS NULL OR CHARINDEX('|' + [Column1] + '|', @Column1) > 0)
    ...
    AND (@Column99 IS NULL OR CHARINDEX('|' + [Column99] + '|', @Column99) > 0)
Possible values for each parameter/column pair look like this. The parameter packs a pipe-delimited list:

@Column1 = '|value1|value2|value3|'

while the Column1 values stored in the table are single plain strings, e.g.:

value1
value2
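To make the matching concrete, here's a quick self-contained sketch of how a row qualifies against that kind of parameter (the values are made up):

DECLARE @Column1 varchar(4000) = '|value1|value2|value3|';

-- A row whose Column1 = 'value2' matches, because '|value2|' occurs in the list:
SELECT CHARINDEX('|' + 'value2' + '|', @Column1); -- returns 8, i.e. > 0, so the row passes

-- A row whose Column1 = 'value9' is filtered out:
SELECT CHARINDEX('|' + 'value9' + '|', @Column1); -- returns 0

-- And when @Column1 IS NULL, the (@Column1 IS NULL OR ...) branch lets every row through.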
I can read execution plans at a basic level, and from what I can see there's a lot of SCANS going on, which I suspect is because the (@Param IS NULL OR CHARINDEX(...)) pattern isn't sargable, so the optimizer can't seek on any index. But as I've said, it seems like I've hit a wall, which is why I'm here seeking help :). I know how to index small tables based on what they are intended for, but a table THIS wide?? I can just LOL at myself :D
Any ideas, guys? I've read a little about columnstore indexes, and that's the most viable solution I can think of, but there's very little information available about this approach at this point.
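In case it helps, this is roughly what I had in mind (just a sketch; the index name and column list are made up, and as far as I understand the nvarchar(max) columns can't be included in a columnstore index, and a nonclustered columnstore makes the table read-only on SQL Server 2012, which might actually be fine here since the table is only loaded by a nightly batch):

CREATE NONCLUSTERED COLUMNSTORE INDEX IX_ThisTable_ColumnStore
ON dbo.ThisTable (Column1, Column50, Column60); -- the columns hit by the WHERE clause; max-typed columns excluded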
If you're wondering how I came up with this table: its contents come from several different tables, flattened. Processing is done on a nightly basis, and the result is exported to a CSV file which then serves as the data source for another reporting application.
Thanks in advance!