I have a full-text index on a column in a table that contains data like this:
searchColumn
90210 Brooks Diana Miami FL diana.brooks@email.com 5612233395
The column is an aggregate of ZIP code, last name, first name, city, state, e-mail address and phone number.
I use this column to search for a customer based on any of this possible information.
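For context, the column is maintained roughly like this (table, key and index names here are illustrative, not my exact schema):

```sql
-- Sketch of how searchColumn is built and indexed.
-- CONCAT_WS requires SQL Server 2017+; on older versions the
-- concatenation is done with + and ISNULL instead.
ALTER TABLE CustomerInformation
ADD searchColumn AS (
    CONCAT_WS(' ', Zip, LastName, FirstName, City, State, Email, Phone)
) PERSISTED;

CREATE FULLTEXT INDEX ON CustomerInformation (searchColumn)
    KEY INDEX PK_CustomerInformation  -- unique key index required for full-text
    WITH STOPLIST = SYSTEM;
```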
The issue I am worried about is the high number of reads that occur when querying this column. The query I am using is:
declare @searchTerm varchar(100) = ' "FL" AND "90210*" AND "Diana*" AND "Brooks*" '
select *
from CustomerInformation c
where contains(c.searchColumn, @searchTerm)
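The read counts can also be reproduced without Profiler, via SET STATISTICS IO (this is how I double-checked the numbers; the figures land in the same ballpark):

```sql
SET STATISTICS IO ON;

declare @searchTerm varchar(100) = ' "FL" AND "90210*" AND "Diana*" AND "Brooks*" '

select *
from CustomerInformation c
where contains(c.searchColumn, @searchTerm)
-- The Messages tab then reports logical reads per table accessed.
```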
Now, when running Profiler I can see that this search does about 50,000 page reads to return a single row. By contrast, a different approach using regular indexes and the search term broken down into separate variables (@FirstName, @LastName, etc.), like below:
WHERE C.FirstName like coalesce(@FirstName + '%' , C.FirstName)
AND C.LastName like coalesce(@LastName + '%' , C.LastName)
etc.
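Spelled out in full, the regular-index version looks something like this (parameter list trimmed to three columns for brevity; column names are illustrative):

```sql
declare @FirstName varchar(50) = 'Diana',
        @LastName  varchar(50) = 'Brooks',
        @Zip       varchar(10) = '90210';

-- Each predicate degrades to a no-op (C.Col like C.Col)
-- when its parameter is NULL.
select *
from CustomerInformation c
where c.FirstName like coalesce(@FirstName + '%', c.FirstName)
  and c.LastName  like coalesce(@LastName  + '%', c.LastName)
  and c.Zip       like coalesce(@Zip       + '%', c.Zip);
```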
Using this approach I get only around 140 page reads. I know the two approaches are quite different, but I'm trying to understand why the full-text version does so many more reads, and whether there is any way to bring that number down closer to what I get with regular indexes.