
I have searched here but can't find anything using the search terms I can come up with. There is probably another post somewhere that answers my question, but I have failed to find it.

The PHP manual states:

Using this method to fetch large result sets will result in a heavy demand on system and possibly network resources.

It also goes on to say that you should use WHERE and ORDER BY clauses to lighten the load, which makes perfect sense. But this still does not answer the question: how much is too much? How would you define a result set that is too large?

For example, I wrote this bit of code for a class I am creating:

public function getQuestions($quiz_id){
    // Strip anything that isn't a digit (helper from the parent class).
    $quiz_id = parent::onlyNums($quiz_id);
    $stmt = $this->db->prepare(
        "SELECT quiz.title, quiz.description, questions.question,
                questions.answer, questions.wrong_one, questions.wrong_two,
                questions.wrong_three
         FROM quiz, questions
         WHERE questions.quiz_id = :quiz_id
           AND quiz.id = questions.quiz_id");
    $stmt->bindValue(':quiz_id', $quiz_id, PDO::PARAM_INT);
    try {
        $stmt->execute();
        if ($stmt->rowCount() > 0) {
            return $stmt->fetchAll(PDO::FETCH_ASSOC);
        } else {
            echo "No items match the query";
            exit();
        }
    } catch (PDOException $e) {
        return $e->getMessage();
    }
}

Now, for the application I am building I would not expect this query to retrieve a massive number of results, because due to the nature of the app there should never be that many. But if someone went crazy at some point and added millions of questions to a single quiz, would this script fail?

Thanks for any info or advice.

codeguerrilla
It's all about the memory setting. Try to brute-force your application, and soon you'll see PHP complaining about not having enough memory; only then can you calculate the approximate maximum number of records for your case. – Hawili Jun 09 '13 at 00:29
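For instance, a minimal sketch of that brute-force check, assuming an existing PDO connection $db and the questions table from the question:

    <?php
    // Hypothetical sketch: see how close one big fetchAll() comes to PHP's
    // memory ceiling.
    $rows = $db->query("SELECT * FROM questions")->fetchAll(PDO::FETCH_ASSOC);
    echo "rows fetched: " . count($rows) . "\n";
    echo "peak memory:  " . memory_get_peak_usage(true) . " bytes\n";
    echo "memory_limit: " . ini_get('memory_limit') . "\n";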

3 Answers


I don't think there is a fixed size that is too much. It would really depend on the hardware/network that you are running on. You would be able to handle a much larger dataset on a dedicated server with 100GB of RAM than you would with a small EC2 instance. However, you should always strive to be as efficient as possible with your queries and only return the rows that you need for the current operation.
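For example, one common way to return only the rows you need is to paginate with LIMIT and OFFSET. A sketch, assuming a PDO connection $db and that the questions table from the question has a numeric id primary key; the function name is hypothetical:

    <?php
    // Hypothetical pagination sketch: fetch one page of questions at a time
    // instead of the whole result set.
    function getQuestionsPage(PDO $db, $quiz_id, $page, $perPage = 50) {
        $offset = ($page - 1) * $perPage;
        $stmt = $db->prepare(
            "SELECT question, answer FROM questions
             WHERE quiz_id = :quiz_id
             ORDER BY id
             LIMIT :lim OFFSET :off");
        $stmt->bindValue(':quiz_id', (int)$quiz_id, PDO::PARAM_INT);
        // The (int) casts matter: LIMIT/OFFSET placeholders are only bound
        // correctly when the values are real integers.
        $stmt->bindValue(':lim', (int)$perPage, PDO::PARAM_INT);
        $stmt->bindValue(':off', (int)$offset, PDO::PARAM_INT);
        $stmt->execute();
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }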

adear11

This warning is actually not about fetchAll() in particular but about sanity in general.

As a matter of fact, this function is just syntactic sugar for looping over the query results and adding each row to an array. So if there were millions of questions, your page would certainly crash, though not because of fetchAll() but because of the sheer amount of data.

So, for the average web page it's all right to use this function.
But for something like a cron-based data-mining script it would be unwise; instead, loop over the statement and process each row one by one without storing them all in an array, as sketched below.
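A minimal sketch of that row-by-row approach, assuming a PDO connection $db and the questions table from the question; processRow() is a hypothetical per-row handler:

    <?php
    // Process each row as it arrives instead of collecting them all with
    // fetchAll(); memory use stays roughly flat regardless of row count.
    $stmt = $db->prepare("SELECT question, answer FROM questions
                          WHERE quiz_id = :quiz_id");
    $stmt->bindValue(':quiz_id', $quiz_id, PDO::PARAM_INT);
    $stmt->execute();
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        processRow($row); // e.g. write to a file or aggregate a counter
    }

Note that with the MySQL driver, results are still buffered client-side by default, so for a truly streaming loop you would also disable PDO::MYSQL_ATTR_USE_BUFFERED_QUERY.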

Your Common Sense

One way to check is to use LIMIT: run ten scripts, each with a different limit (the first 1000, the second 2000, then 3000, and so on) and see when you start getting errors. That will give you an idea.
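A rough sketch of that test, assuming a PDO connection $db and the questions table from the question; the step sizes are arbitrary:

    <?php
    // Step the LIMIT up and watch peak memory climb until PHP complains.
    foreach (array(1000, 2000, 3000, 4000, 5000) as $limit) {
        // (int) cast so the limit can be interpolated safely.
        $rows = $db->query("SELECT * FROM questions LIMIT " . (int)$limit)
                   ->fetchAll(PDO::FETCH_ASSOC);
        printf("%d rows -> %.1f MB peak\n", count($rows),
               memory_get_peak_usage(true) / 1048576);
        unset($rows);
    }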

  • That theoretically would work if the hardware never changed, but the point at which errors start to occur will vary depending on the server hardware and configuration. – adear11 Jun 09 '13 at 00:45