I am experiencing slow queries on a MySQL database. The schema seems simple: words, sets of words, and related data such as images, recordings, and sentences attached to words. Bluehost claims I have huge overhead and slow queries and that I need to optimize the database.
What are typical ways to optimize such a database? Is there a list of standard steps to check for each table (relation) or for heavily used queries?
Example queries found in tmp/logs:
SELECT transcription FROM english_details WHERE translation_id = '2216'
SELECT recording FROM word_recordings_de WHERE translation_id = '3342'
SELECT english_sentences.sentence_id, english_sentences.sentence_eng,
english_sentences.sentence_pl,english_sent_recordings.recording
FROM english_sentences LEFT JOIN english_sent_recordings USING (sentence_id)
WHERE english_sentences.translation_id = '2239'
SELECT post_it FROM post_it WHERE user_id = '' AND translation_id = 10562 LIMIT 1
I am wondering whether the problem is the database, the queries, or simply too many users. Is 480,000 sessions per month (according to Google Analytics) too big a number for Bluehost shared hosting?
UPDATE: Do you think the crucial operation of loading the words of a given word set, implemented like this:
$arWordIds = $this->getWordsIds();
foreach ($arWordIds as $wordId) {
    switch ($this->wordset_type) {
        case WordSet::TYPE_BASIC:
            $col->addItem(new Word($this->dbc, $wordId), $wordId);
            break;
        case WordSet::TYPE_DETAILED:
            $col->addItem(new WordDetails($this->dbc, $wordId), $wordId);
            break;
    }
}
may be the problem? There is one query for the word IDs, followed by a separate query for each word by its ID (and possibly several more per word, run independently, for sentences, recordings, etc.). Would it be better to run one bigger query that fetches all the word data at once, e.g. WHERE translation_id IN (id1,id2,id3,...) with many joins, returning the data for 100 words in one run?
But then how would I populate the Word objects manually? Each Word is a PHP DAO object that loads its own data for a supplied ID, so loading all the data (word, sentences, comments) in one request seems better suited to procedural programming than to this object-oriented design. Is it even possible to load, in one request, the words for many IDs together with each word's many associated sentences or images?
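For illustration, here is a rough sketch of the batched 1:N query I have in mind (the IDs are only examples taken from the log); the rows would then be grouped by translation_id in PHP before being handed to the Word objects:

```sql
-- one round trip for the sentence data of a whole word list,
-- instead of one sentence query per word
SELECT s.translation_id, s.sentence_id, s.sentence_eng,
       s.sentence_pl, sr.recording
FROM english_sentences s
LEFT JOIN english_sent_recordings sr USING (sentence_id)
WHERE s.translation_id IN (2216, 2239, 3342)
ORDER BY s.translation_id;
```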
UPDATE 2: I have indexes, unique constraints, etc. in many places, and I have added more where needed. Primary keys are everywhere. I have added indexes on attributes used in WHERE clauses, though of course some index may still be missing if I forgot to add it.
I have been running **EXPLAIN SELECT ...** and trying to achieve an access type other than ALL, such as const, ref, eq_ref, or system.
Example table (there are, I suppose, 50-100 tables in the database):
CREATE TABLE `post_it` (
  `post_it` int(10) unsigned NOT NULL AUTO_INCREMENT,
  `user_id` int(10) unsigned NOT NULL,
  `translation_id` int(10) unsigned NOT NULL,
  `text` tinytext NOT NULL,
  `lang` varchar(5) NOT NULL DEFAULT 'plen',
  PRIMARY KEY (`post_it`),
  KEY `user_id` (`user_id`,`translation_id`)
) ENGINE=MyISAM AUTO_INCREMENT=235 DEFAULT CHARSET=utf8
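For instance, the post_it lookup from the logs should be able to use the composite (user_id, translation_id) index, since it filters on both columns with equality; this is how I check it:

```sql
-- expected: access type ref on key user_id, not ALL
EXPLAIN SELECT post_it FROM post_it
WHERE user_id = '' AND translation_id = 10562 LIMIT 1;
```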
I have also changed how Word objects are constructed. Previously, each Word property was lazily loaded on first access, so in a loop (e.g. 100 words in a list) printing each property to XML caused 5-10 queries per word to fetch the native word, foreign word, transcription, etc. Now the Word constructor eagerly preloads all 1:1 properties, and only the 1:N properties remain lazily loaded. I am wondering whether such a joined query will be better than several simple queries without joins:
SELECT nt.french_id, nt.article, n.french_word, ft.english_id, '-' AS 'english_article', en.english_word, p.part,
ed.transcription, ed.definition, r.recording
FROM translation ft
INNER JOIN translation_enfr nt ON nt.translation_id = ft.translation_id
INNER JOIN french n ON n.french_id = nt.french_id
INNER JOIN english en ON en.english_id = ft.english_id
INNER JOIN parts p ON p.part_id = ft.part_id
LEFT JOIN english_details ed ON ed.translation_id = ft.translation_id
LEFT JOIN word_recordings r ON r.translation_id = ft.translation_id
WHERE ft.translation_id = 3