
I have a client who is using full-text indexing in their application. They rebuild the indexes via the application: when they do this, they manually remove all columns from the catalog, then add them all back, and after that they start a new incremental population.

After this process they often end up with corrupted indexes. Searches that should return data return nothing, even after the population is complete, so the index is corrupt. The only way they can get it back is to rebuild the indexes in SQL Server Management Studio.

They have many (20-50 or more) columns and tables in a catalog. Is there a recommended limit to the number of columns or tables in a catalog? Should they break this up into smaller catalogs?

Should they be rebuilding the indexes by removing all columns and adding them back, or is there a better way? Is simply issuing an `ALTER ... REBUILD` command the better approach?

Thanks in advance.

Bridge
Dan
  • `Or is there a better way?` Use [Lucene.Net](http://incubator.apache.org/lucene.net/) ;) – L.B Jun 18 '12 at 19:51
  • From what I see, Lucene.Net indexes documents stored in a folder. My client is using SQL Server full-text search to search rows in a table. Can Lucene.Net do that? – Dan Jun 19 '12 at 00:21
  • `From what I see, Lucene.Net indexes documents stored in a folder` No, Lucene.Net indexes `strings`. Where you feed it from depends on your application http://incubator.apache.org/lucene.net/links.html – L.B Jun 19 '12 at 06:01

1 Answer


There is no need to delete the columns; you just have to rebuild the full-text indexes. You can find the solution here: http://msdn.microsoft.com/en-us/library/bb326034(v=sql.105).aspx
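For reference, a minimal T-SQL sketch of both approaches, instead of dropping and re-adding columns (the catalog and table names `MyCatalog` and `dbo.MyTable` are placeholders for your own objects):

```sql
-- Rebuild the entire full-text catalog (drops and re-creates the
-- underlying index data for every table in the catalog):
ALTER FULLTEXT CATALOG MyCatalog REBUILD;

-- Or repopulate the full-text index on a single table without
-- touching its column definitions:
ALTER FULLTEXT INDEX ON dbo.MyTable START FULL POPULATION;
```

Either way the indexed columns stay defined on the index, which avoids the remove-then-re-add cycle the question describes.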

Buzz