I have a database with listings on quite a good server (quad-core Xeon 2.0 GHz, 16 GB RAM, SSD drives). The database has about 180,000 listings. There's no traffic on the server yet; I'm just testing it with a lot of listings to make sure there are no problems later on, when there really are that many live listings and actual traffic.
But even with no traffic yet, I feel like the queries should return faster than they do.
From the slow query log, I pulled this:
# Query_time: 1.575742 Lock_time: 0.000113 Rows_sent: 12 Rows_examined: 549024
There are 180,000 records, the query only needs to return 12, yet it examines over 500,000 rows and takes over 1.5 seconds? Something has to be wrong, right? :(
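For reference, the slow query log is enabled with settings along these lines in my.cnf (the log file path and the exact threshold here are from memory, I'd have to double-check):

[mysqld]
# slow query logging; threshold is in seconds, so this ~1.5 s query gets caught
slow_query_log = 1
slow_query_log_file = /var/log/mysql/mysql-slow.log
long_query_time = 1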
The actual query is:
SELECT a.listing_id, a.name, a.item_price, a.max, a.nb, a.currency,
       a.end_time, a.closed, a.bold, a.hl, a.buy_price, a.is_offer, a.reserve,
       a.owner_id, a.postage_amount, a.fb_current_bid, a.type, a.start_time,
       a.is_relisted_item, a.enable
FROM db_listings a
LEFT JOIN db_users u ON u.user_id = a.owner_id
WHERE a.active = 1 AND a.approved = 1 AND a.deleted = 0
  AND a.creation_in_progress = 0 AND a.closed = 0
  AND (a.list_in = 'store' OR u.shop_active = '1')
GROUP BY a.listing_id
ORDER BY a.list_in ASC, a.end_time ASC
LIMIT 0, 12;
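If it helps, I can also post the EXPLAIN output for this query. This is what I'd run (the select list is cut down here just to keep the post short, the join, filters, grouping and ordering are identical):

-- same join, WHERE, GROUP BY and ORDER BY as the real query above
EXPLAIN
SELECT a.listing_id
FROM db_listings a
LEFT JOIN db_users u ON u.user_id = a.owner_id
WHERE a.active = 1 AND a.approved = 1 AND a.deleted = 0
  AND a.creation_in_progress = 0 AND a.closed = 0
  AND (a.list_in = 'store' OR u.shop_active = '1')
GROUP BY a.listing_id
ORDER BY a.list_in ASC, a.end_time ASC
LIMIT 0, 12;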
Indexes are already set on listing_id in db_listings as well as on user_id in db_users. I don't think the db_users join is a problem, because there are only 2 users in there right now.
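In case the exact table and index definitions matter, this is what I'd run to grab them, and I can paste the output here:

-- full table definitions, including all indexes
SHOW CREATE TABLE db_listings;
SHOW CREATE TABLE db_users;
-- lists every index on the tables, including the ones on listing_id and user_id
SHOW INDEX FROM db_listings;
SHOW INDEX FROM db_users;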
If you need any additional information to tackle this problem, just let me know.
Any help is greatly appreciated :)