
I need further help with my SQL problem.

In this database on movies, ratings and actors: https://i.stack.imgur.com/qFIbC.jpg

I am required to find the actor who has the largest difference between their best and their worst rated movie.

The condition is that only ratings greater than 3 may be considered (rating > 3). My current SQL looks as follows:

SELECT * FROM stars
JOIN ratings ON stars.movie_id = ratings.movie_id
WHERE ratings.movie_id = (
    SELECT MAX(rating) - MIN(rating) FROM ratings
    WHERE rating > 3);

My expectation was that I would get some kind of result in my GitHub terminal that I could use to adjust my SQL query.

But I seem to have reached a dead end and I'm not sure how to solve this problem.

  • Please don't upload images, but provide both sample input and expected outcome as tables. – Jonas Metzler Oct 26 '22 at 06:04
  • To solve this you need to follow two logical steps: 1) for each actor, find the difference between their best and worst rating; 2) from this information, find which actor has the greatest difference. Your existing SQL is a step in the right direction to solving 1), so it shouldn't take much to get that to work; then you can solve 2). – NickW Oct 26 '22 at 09:59
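A minimal sketch of step 1 on its own, assuming the stars and ratings tables from the linked schema (with person_id, movie_id and rating columns):

-- Step 1 (sketch): each actor's difference between best and worst rating
SELECT person_id, MAX(rating) - MIN(rating) AS rating_range
FROM stars
JOIN ratings ON stars.movie_id = ratings.movie_id
WHERE rating > 3
GROUP BY person_id;

Step 2 then reduces this result to the single actor with the greatest rating_range, which the answer below does with ORDER BY and LIMIT.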

1 Answer


You need to GROUP BY actor to calculate everyone's rating range. Then, take the actor with the largest range. Something like this:

SELECT
    person_id,
    MAX(rating) - MIN(rating) AS rating_range -- spread between best and worst rating
FROM
    stars
    JOIN ratings ON stars.movie_id = ratings.movie_id
WHERE
    rating > 3 -- per the requirement, only consider ratings above 3
GROUP BY
    person_id -- one rating range per actor
ORDER BY
    rating_range DESC -- largest range first
LIMIT
    1 -- keep only the top actor
;
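If you also want the actor's name rather than just their id, you can join in a people table. This is only a sketch, assuming the linked schema includes a people table with id and name columns (as in CS50's movies database):

-- Sketch: same query, extended with a join to an assumed people table
SELECT
    people.name,
    MAX(rating) - MIN(rating) AS rating_range
FROM
    people
    JOIN stars ON people.id = stars.person_id
    JOIN ratings ON stars.movie_id = ratings.movie_id
WHERE
    rating > 3
GROUP BY
    people.id
ORDER BY
    rating_range DESC
LIMIT
    1;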
John K.