I know that at first glance this will get flagged as a duplicate, and it very well might be, but I imagine most people are going to think the answer is the same as in this question: Division returns zero
That is not my case. That case is dividing a smaller integer by a larger integer and getting 0, which makes sense. What's happening for me does not make sense.
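For clarity, the case those duplicates cover looks something like this (a minimal sketch of that scenario, not my code):

using System;

class IntegerDivisionExample
{
    static void Main()
    {
        int smaller = 3;
        int larger = 1024;

        // Integer division truncates toward zero, so 3 / 1024 is 0.
        int quotient = smaller / larger;
        Console.WriteLine(quotient); // prints 0
    }
}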
Here's my code:
decimal correctedImageWidth = screenWidth / maxColumnsWithMaxSizeImages;
The value for screenWidth, which is an int, is 1024. The value for maxColumnsWithMaxSizeImages, also an int, is 3.
Somehow, correctedImageWidth becomes 0, which is odd, because 1024/3 does not equal 0, nor would the rounded-off result be 0 like in the other SO questions. You'd think I'd get something like 341.
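For comparison, here's what I'd expect from dividing those exact values, as ints and as decimals (a quick standalone sketch, not my actual code):

int screenWidth = 1024;
int maxColumnsWithMaxSizeImages = 3;

// Plain int division truncates, but even that gives 341, not 0.
int truncated = screenWidth / maxColumnsWithMaxSizeImages;          // 341

// Making either operand a decimal gives the full quotient.
decimal exact = (decimal)screenWidth / maxColumnsWithMaxSizeImages; // 341.333...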
Here's proof that the numbers are what I say they are:

[screenshot of the debugger watch window]

As you can see in my watch, screenWidth is 1024 and maxColumnsWithMaxSizeImages is 3. However, dividing these two into correctedImageWidth gives 0? Is there any reason why I would get 0 from this? I have shown it to colleagues, who are equally confused. Here's my entire code; perhaps there's something I'm doing wrong? Unlikely, seeing as they're both ints and they both have valid integer values, but obviously there must be something:
int maxColumnsWithMaxSizeImages = (int)System.Math.Ceiling(decimal.Divide(1024, 480));
if (maxColumnsWithMaxSizeImages < _minimumImagesPerRow)
{
.....
} else if (maxColumnsWithMaxSizeImages > _maximumImagesPerRow)
{
....
} else
{
//between 2 and 4 columns
var screenWidth = App.ScreenWidth;
decimal correctedImageWidth = (decimal)((decimal)screenWidth / (decimal)maxColumnsWithMaxSizeImages);
decimal test2 = 1024 / 3;
decimal test3 = (decimal)1024 / (decimal)3;
var test = correctedImageWidth;
}
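For reference, here's what I'd expect each of those lines to evaluate to when pulled out into a standalone snippet with fresh names (my own working, not the watch output):

using System;

class Repro
{
    static void Main()
    {
        // ceil(1024 / 480) = ceil(2.133...) = 3
        int maxColumns = (int)Math.Ceiling(decimal.Divide(1024, 480));
        Console.WriteLine(maxColumns);      // 3

        int screenWidth = 1024;
        decimal correctedWidth = (decimal)screenWidth / maxColumns;
        Console.WriteLine(correctedWidth);  // 341.333...

        // Note: 1024 / 3 is evaluated as int division (341) before being
        // widened to decimal, so test2 is 341, not 341.333...
        decimal test2 = 1024 / 3;
        decimal test3 = (decimal)1024 / (decimal)3;
        Console.WriteLine(test2);           // 341
        Console.WriteLine(test3);           // 341.333...
    }
}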
UPDATE For some reason, it appears there was a conflict in my variable declarations. Even though I declared them in different scopes, it was causing a conflict. After renaming the variable, I get the correct value.
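I don't fully understand the mechanism, but a hypothetical shape of the conflict (illustrative names, not my actual code) would be a same-named symbol in an enclosing scope, e.g. a field that defaults to 0m, with the watch resolving the bare name to the wrong symbol:

using System;

class LayoutPage
{
    // Hypothetical field sharing the local's name; a decimal field defaults to 0m.
    public decimal correctedImageWidth;

    public void Layout()
    {
        int screenWidth = 1024;
        int maxColumns = 3;

        // The local shadows the field. A watch entered as "correctedImageWidth"
        // could plausibly resolve to the field (still 0) depending on where the
        // debugger is stopped, which would match the symptom above.
        decimal correctedImageWidth = (decimal)screenWidth / maxColumns;
        Console.WriteLine(correctedImageWidth); // 341.333...
    }
}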