I have a difficulty selector set as an enum (None = 0, Easy = 1 << 0, Medium = 1 << 1, Hard = 1 << 2, Expert = 1 << 3). Along with this, I have an array of point values I want to assign to these difficulties, indexed like so: [0, 100, 133, 166, 200].
The tricky bit is this: I want to grab the index of the array that corresponds to which bit is set in the difficulty. So None = 0 (0000) -> index 0, Easy = 1 (0001) -> index 1, Medium = 2 (0010) -> index 2, Hard = 4 (0100) -> index 3, Expert = 8 (1000) -> index 4.
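For context, the declarations look roughly like this (simplified from my actual code; the enum name Difficulty is made up here, but LevelSetup.difficulty and ScorePerQuestion below are the real names):

    // Rough shape of the setup:
    public enum Difficulty
    {
        None   = 0,
        Easy   = 1 << 0,  // 1  (0001)
        Medium = 1 << 1,  // 2  (0010)
        Hard   = 1 << 2,  // 4  (0100)
        Expert = 1 << 3   // 8  (1000)
    }

    // Point values per difficulty, indexed 0..4 as described above.
    int[] ScorePerQuestion = { 0, 100, 133, 166, 200 };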
I tried doing the square root originally, as I thought the values were powers of two in the squared sense, but quickly realized they're not raised TO the power of two, they're powers OF two (base two). Square root inverts squaring, not base-two exponentiation, so that would never work.
I also know I can get this value via a loop, where I start at 8 (1000) for instance, and keep a counter as I shift right, until the value hits 0:
    int difficulty = (int)LevelSetup.difficulty;
    int difficultyIndex = 0;

    // Count how many shifts it takes to clear the value:
    // 8 -> 4 -> 2 -> 1 -> 0 is four shifts, so Expert maps to index 4.
    while (difficulty != 0)
    {
        difficultyIndex++;
        difficulty = difficulty >> 1;
    }

    currScorePerQuestion = ScorePerQuestion[difficultyIndex];
I.e.: counter = 0, val = 8 |SHIFT| counter = 1, val = 4 |SHIFT| counter = 2, val = 2 |SHIFT| counter = 3, val = 1 |SHIFT| counter = 4, val = 0 |END| and we end with a counter of 4.
The problem with this is that it seems really messy and overkill, especially if you wanted to go up to 64 bits and have lots of indices (see the sketch below for what that version would look like). I just know that there is some kind of algorithm I could use to convert these very simply; I am just struggling to come up with what that equation is exactly.
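For illustration, here is the same counting loop generalized to a 64-bit value (just a sketch; DifficultyToIndex is a made-up helper name). In the worst case it spins 64 times, which is exactly what feels heavy:

    // Same counting loop, generalized to 64 bits (hypothetical helper).
    // Worst case this iterates 64 times before the value clears.
    static int DifficultyToIndex(ulong difficultyBits)
    {
        int index = 0;
        while (difficultyBits != 0)
        {
            index++;
            difficultyBits >>= 1;
        }
        return index;
    }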
Any help is super appreciated, thanks!