Hello, everyone, this is my first post here.
So today during my university class, our professor gave us a task to write an algorithm:
Write a function that returns the number of steps you need to take in order to get the highest score in a board game (I've sketched how I picture the setup right after the rules).
Rules of the game:
- you throw a die and move accordingly (1-6 steps).
- the number of tiles on the board can range anywhere between 2 and 99 999.
- when you step on a tile you receive or lose points (the points on each tile range from -99 999 to 99 999).
- if you are at the end of the board and your dice throw would take you out of its boundaries, you don't move.
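This is roughly how I picture the task in Python; the function name, the signature, and the assumption that you start on the first tile and finish on the last one are my own guesses, not part of the assignment:

```python
def steps_for_best_score(tiles: list[int]) -> int:
    """Return how many moves the highest-scoring path takes.

    My reading of the constraints:
    - 2 <= len(tiles) <= 99_999
    - -99_999 <= tiles[i] <= 99_999  (points gained/lost when you land on tile i)
    - each move advances 1-6 tiles; a move past the last tile is not allowed
    - (my assumption) you start on the first tile and the game ends on the last one
    """
    ...  # this is the part I'm stuck on
```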
My approach
It's sort of a greedy algorithm:
- at each step, if the next tile is greater than or equal to 0, move onto it,
- if it's negative, check the next 6 reachable tiles and move to the one with the highest score, to lose the least amount of points (rough code sketch right after this list).
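In (untested) Python it would look something like this; greedy_score, returning both the step count and the score, and the exact tie-breaking are just how I picture it:

```python
def greedy_score(tiles: list[int]) -> tuple[int, int]:
    """My greedy idea: returns (steps taken, total score). Sketch only."""
    pos = 0
    score = tiles[0]                       # start on the first tile and collect its points
    steps = 0
    while pos < len(tiles) - 1:
        window = tiles[pos + 1 : pos + 7]  # tiles reachable with one throw (stays in bounds)
        if window[0] >= 0:
            pos += 1                       # next tile is non-negative, just take one step
        else:
            # next tile is negative: jump to the reachable tile with the highest value
            best = max(range(len(window)), key=lambda i: window[i])
            pos += best + 1
        score += tiles[pos]
        steps += 1
    return steps, score
```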
I realized that my approach is wrong after I came up with this example:
So imagine an array of {1, -40, -40, -40, -40, -1, -38, -40, -40, -40, -40, -40, 1}
My greedy algorithm starts at 1 and sees four -40's, one -1 and one -38. It chooses -1 because it's the best option, but then we end up with a result of 1 + (-1) + (-38) + 1 = -37. However, if we choose -38 instead of -1, we end up with 1 + (-38) + 1 = -36.
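Running my sketch from above on this array (assuming I wrote the greedy down correctly) shows exactly that:

```python
tiles = [1, -40, -40, -40, -40, -1, -38, -40, -40, -40, -40, -40, 1]
print(greedy_score(tiles))   # -> (3, -37), while the best possible path scores -36 in 2 steps
```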
This is just a simple example of the problems that can come up. I imagine I'd have to check every possible path, because greedy algorithms don't look for the best path overall, only the best option at some particular moment (something like the sketch below).
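To make that concrete, I think "checking every path" would look roughly like this; best_score_brute is just a name I made up, it only tracks the score (not the step count), and it's obviously way too slow for big boards:

```python
def best_score_brute(tiles: list[int], pos: int = 0) -> int:
    """Try every possible sequence of moves and return the best total score.
    Purely illustrative: the number of paths grows exponentially with the board size."""
    if pos == len(tiles) - 1:
        return tiles[pos]                  # (my assumption) the game ends on the last tile
    best = float("-inf")
    for step in range(1, 7):               # each possible dice move
        nxt = pos + step
        if nxt < len(tiles):               # moves past the end of the board are not allowed
            best = max(best, best_score_brute(tiles, nxt))
    return tiles[pos] + best
```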
I was wondering if a graph with all the possibilities could be an option here, but if the array contained only negative numbers, we would end up with a graph with a maximum size of something around (99 999^6?), which would take up too much memory.
I'm a newbie and I've run out of ideas. Could anyone point me in the right direction?