
I was trying to optimize my MySQL queries, but found out that I'm actually doing this wrong. I changed my query from using

SELECT * FROM test WHERE tst_title LIKE '1%'

To:

SELECT * FROM `test` WHERE MATCH(tst_title) AGAINST("+1*" IN BOOLEAN MODE)

And the runtime for the FULLTEXT version was terrible. See the timings below:

USING LIKE:

Showing rows 0 - 24 (1960 total, Query took 0.0004 sec)

USING FULLTEXT:

Showing rows 0 - 24 (1960 total, Query took 0.0033 sec)

I've read many tutorials explaining why you should use FULLTEXT (since it actually searches via an index). So how can it be a slower way to retrieve data than the LIKE statement, given that LIKE has to go through every single record to check whether it matches?

I literally can't figure out why this is happening. Help with optimization would be appreciated a lot!
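For reference, the table setup is roughly the following (a simplified sketch: tst_title is a VARCHAR, and a FULLTEXT index on it must already exist for the MATCH query to run at all; the id column and index name here are illustrative):

-- Sketch only: the real table has more columns.
CREATE TABLE test (
  tst_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  tst_title VARCHAR(255) NOT NULL,
  FULLTEXT KEY ft_title (tst_title)
) ENGINE=InnoDB;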

2 Answers

  • Unless you set min_word_len to a smaller number than the default, FULLTEXT cannot find all the values starting with 1, because words shorter than the minimum are not indexed at all.

  • If tst_title is a numeric value (INT, FLOAT, etc.), then both LIKE and FULLTEXT are terrible ways to do the query.

  • Given INDEX(tst_title) (and it being VARCHAR or TEXT), LIKE is likely to run faster, since it only has to scan the entries starting with 1.

  • The timings you list smell like the Query Cache took over. For timing purposes, use SELECT SQL_NO_CACHE ... to avoid the QC (see the sketch after this list).

  • If you use MATCH or LIKE without having FULLTEXT or INDEX, respectively, then the query has no choice but to scan all rows in the table.

  • Where did 1960 total come from? Does the timing include computing that?

  • Is the table MyISAM or InnoDB? There are differences in their FULLTEXT handling that may be a factor here.
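Roughly, the INDEX and SQL_NO_CACHE suggestions look like this (a sketch; the index name is made up, and EXPLAIN simply shows whether the optimizer picks the index):

ALTER TABLE test ADD INDEX idx_tst_title (tst_title);

-- Re-time both variants with the Query Cache bypassed:
SELECT SQL_NO_CACHE * FROM test WHERE tst_title LIKE '1%';
SELECT SQL_NO_CACHE * FROM `test` WHERE MATCH(tst_title) AGAINST("+1*" IN BOOLEAN MODE);

-- Check whether the LIKE query now does a range scan over the new index:
EXPLAIN SELECT * FROM test WHERE tst_title LIKE '1%';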

Rick James
  • To answer all your questions: yes, my table is InnoDB and has the variable min_word_len set to 1. I also tried disabling the QC, but the FULLTEXT query was still slower. Besides that, I'd also like to mention that the data type is VARCHAR. I'll be trying out INDEX() now. P.S.: the 1960 were the affected rows including '1'. – Testuser070 Sep 04 '15 at 07:11
  • I assume you set `innodb_ft_min_token_size`. What got the 1960? `SQL_CALC_FOUND_ROWS`? A separate `COUNT(*)`? Something else? How many rows in the table? – Rick James Sep 04 '15 at 16:12
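For what it's worth, checking and applying that setting looks roughly like this (a sketch only: innodb_ft_min_token_size is read-only at runtime, so it has to be changed in the config file and the server restarted, and the FULLTEXT index must be rebuilt afterwards; the index name is illustrative):

-- Check the current InnoDB full-text minimum token size (default is 3):
SHOW VARIABLES LIKE 'innodb_ft_min_token_size';

-- After lowering it under [mysqld] in my.cnf (e.g. innodb_ft_min_token_size=1)
-- and restarting, rebuild the FULLTEXT index so one-character tokens get indexed:
ALTER TABLE test DROP INDEX ft_title, ADD FULLTEXT INDEX ft_title (tst_title);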

From what I've read, if you were using ...tst_title LIKE '%1%', it would be slower, since the leading wildcard prevents any index use and forces a full table scan. The pattern you currently have, with the wildcard only on the right, can use an index, and that is probably why it is faster than FULLTEXT.
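A quick way to see the difference is to compare the execution plans (a sketch; it assumes an ordinary index exists on tst_title):

-- With an index on tst_title, the prefix pattern can use a range scan on it:
EXPLAIN SELECT * FROM test WHERE tst_title LIKE '1%';

-- The leading wildcard defeats the index and forces a full scan:
EXPLAIN SELECT * FROM test WHERE tst_title LIKE '%1%';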

I'm not too sure about it myself, but this is what I've read, and I hope it helps.

EDIT: You may want to read this answer for a full explanation of FULLTEXT vs LIKE.

SeanS
  • Even when using both wildcards, LIKE is faster than FULLTEXT. Using LIKE '%1%' gave me more rows and a faster runtime. – Testuser070 Sep 03 '15 at 08:11
  • Have you tried using CONTAINS as well and compared that to LIKE and FULLTEXT? – SeanS Sep 03 '15 at 08:14
  • I did, but the runtime got even worse. LIKE seems to be the fastest, using just one wildcard. Not sure why, though. – Testuser070 Sep 03 '15 at 08:24
  • I included a link to another question on SO similar to yours; the first answer describes in depth why there isn't a real performance improvement. It's an interesting read. Sorry I could not provide further insight. – SeanS Sep 03 '15 at 08:56
  • I read it; thank you for the link. I'm doing some research myself now. Hopefully I can find a way to optimize my query as well as possible. – Testuser070 Sep 03 '15 at 08:57