explain select id, nome from bea_clientes where id in (
     select group_concat(distinct(bea_clientes_id)) as list
     from bea_agenda
     where bea_clientes_id>0
     and bea_agente_id in(300006,300007,300008,300009,300010,300011,300012,300013,300014,300018,300019,300020,300021,300022)
)

When I run the above (without the EXPLAIN), MySQL simply goes busy: the inner query is executed as a DEPENDENT SUBQUERY, which makes this slow as hell. What I don't understand is why the optimizer evaluates the subquery for every id in bea_clientes. I even wrapped the IN argument in a GROUP_CONCAT, believing that would be the same as passing the result in as a plain string, to avoid the repeated scanning.

I thought this was no longer supposed to be a problem on MySQL 5.5+? Testing on MariaDB does the same.

Is this a known bug? I know I can rewrite this as a join (sketched below), but it still seems terrible.
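
This is roughly the join rewrite I mean, using the same tables and columns as above (a sketch, not tuned or tested for performance here):

```sql
-- Sketch of the join rewrite (same tables and columns as the query above).
-- DISTINCT because a client can appear in bea_agenda more than once.
SELECT DISTINCT c.id, c.nome
FROM bea_clientes AS c
JOIN bea_agenda AS a ON a.bea_clientes_id = c.id
WHERE a.bea_clientes_id > 0
  AND a.bea_agente_id IN (300006,300007,300008,300009,300010,300011,300012,
                          300013,300014,300018,300019,300020,300021,300022);
```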

Generated by: phpMyAdmin 4.4.14 / MySQL 5.6.26


| id | select_type        | table        | type  | possible_keys                 | key           | key_len | ref  | rows  | Extra                              |
|----|--------------------|--------------|-------|-------------------------------|---------------|---------|------|-------|------------------------------------|
| 1  | PRIMARY            | bea_clientes | ALL   | NULL                          | NULL          | NULL    | NULL | 30432 | Using where                        |
| 2  | DEPENDENT SUBQUERY | bea_agenda   | range | bea_clientes_id,bea_agente_id | bea_agente_id | 5       | NULL | 2352  | Using index condition; Using where |
Miguel
  • I updated the question with the explain result – Miguel Apr 28 '16 at 17:40
  • `id IN ... GROUP_CONCAT(...)` -- Are you sure you want that? `123 IN ('123,456,789')` does _not_ succeed. This does: `123 IN ('123','456','789')`, but that is not what you have, nor can you get it. Start over. – Rick James Apr 29 '16 at 04:45
  • Because these are just numbers I believe you can leave out the quotes... – Miguel Apr 29 '16 at 09:55
  • But `GROUP_CONCAT` will generate _one string_. What is needed is several numbers. It will not produce that, unless you construct the `IN` clause in code or a Stored Procedure. – Rick James Apr 30 '16 at 03:53
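
To illustrate the point in the last two comments, with made-up values:

```sql
-- Made-up values; the point is that one concatenated string is ONE value, not a list.
SELECT 456 IN ('123,456,789');      -- 0: the right-hand side is a single string
SELECT 456 IN ('123','456','789');  -- 1: three separate string values
SELECT 456 IN (123, 456, 789);      -- 1: three separate numbers
```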

1 Answer


Obviously hard to test without the data, but try something like the query below. Subqueries are just not handled well in MySQL (though it's my preferred engine). I would also recommend indexing the relevant columns, which will improve performance for both queries. For clarity I would also advise writing queries out over multiple lines.

select t2.id, t2.nome from (
    (select group_concat(distinct bea_clientes_id) as list
     from bea_agenda
     where bea_clientes_id > 0
       and bea_agente_id in (300006,300007,300008,300009,300010,300011,300012,
                             300013,300014,300018,300019,300020,300021,300022)
    ) as t1
    join
    (select id, nome from bea_clientes) as t2
    on t1.list = t2.id
)
Datadimension
  • The table has 3k records, and those fields are indexed already. It's really the subquery that messes things up; I did solve it with a join, but that's wrong: MySQL should accept a logical query like that. It's not my fault that the optimizer optimizes this badly. – Miguel Apr 28 '16 at 12:46
  • And in the query in your answer, remember that group_concat already returns a comma-separated id list, so you cannot use t1.list=t2.id, because that would compare a full string of ids against a single id in t2.id. In that join one has to remove the group_concat to make it work. But the thing is that the query I wanted to run looks very logical to me; it's sad MySQL cannot handle it. – Miguel Apr 28 '16 at 12:49
  • Well spotted re group_concat. I totally agree with you about the subquery but we are stuck with it !!! – Datadimension Apr 29 '16 at 09:22
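
Following the comments above, a corrected sketch of the answer's approach would drop the GROUP_CONCAT and join on the id directly (untested against the real data):

```sql
-- Corrected sketch per the comments: no GROUP_CONCAT, join the distinct ids directly.
SELECT t2.id, t2.nome
FROM (
    SELECT DISTINCT bea_clientes_id
    FROM bea_agenda
    WHERE bea_clientes_id > 0
      AND bea_agente_id IN (300006,300007,300008,300009,300010,300011,300012,
                            300013,300014,300018,300019,300020,300021,300022)
) AS t1
JOIN bea_clientes AS t2 ON t2.id = t1.bea_clientes_id;
```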