I'm attempting to write an image processing library in Clojure, but I've run into a problem while writing my tests.
The image data is stored as a 2D array of integers, which are signed (Java, and by extension Clojure, has no unsigned integer types). I have a function that gets the pixel at a given pair of coordinates. My test of that function looks something like this:
(is (= (get-pixel image 0 (dec width)) 0xFFFF0000))
That is, check that the pixel at (0, width-1) is red. The problem is that get-pixel returns a signed int, but Clojure treats the literal 0xFFFF0000 as a long. In this case, get-pixel returns -65536 (0xFFFF0000 as a signed 32-bit int), and Clojure checks whether it equals 4294901760 (the same hex literal read as a long).
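
For reference, here's a stripped-down sketch of the kind of setup I mean (the array layout, the get-pixel signature, and the coordinate order here are just illustrative, not my actual code), plus the comparison that's tripping me up:

(def width 4)
(def height 4)

;; image data: a 2D array of signed Java ints, one ARGB value per pixel
(def image (make-array Integer/TYPE width height))

(defn get-pixel
  "Return the ARGB value at the given coordinates as a signed int."
  [img x y]
  (aget img x y))

;; Clojure reads the hex literal as a long, so the comparison fails:
0xFFFF0000             ;=> 4294901760
(= -65536 0xFFFF0000)  ;=> false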
Now, I have a few options. I could rewrite the test using the signed decimal equivalent of the hex value (-65536), but I think that makes the intent of the test less clear. I could write a helper function to convert the negative int into a positive long, but that's an added layer of complexity. The simplest approach is probably to bitwise AND the two numbers and check whether the result still matches, but even that seems more complicated than it should be.
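
Concretely, the options would look something like this (->unsigned is a made-up helper name):

(require '[clojure.test :refer [is]])

;; option 1: use the signed decimal value directly; the intent is less obvious
(is (= (get-pixel image 0 (dec width)) -65536))

;; option 2: a helper that masks the signed int up to a non-negative long
(defn ->unsigned [n]
  (bit-and n 0xFFFFFFFF))

(is (= (->unsigned (get-pixel image 0 (dec width))) 0xFFFF0000))

;; option 3: compare with a bitwise AND; this passes when the expected bits
;; are all set, but it isn't a real equality check
(is (= (bit-and (get-pixel image 0 (dec width)) 0xFFFF0000) 0xFFFF0000))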
Is there any built-in way to force 0xFFFF0000 to be evaluated as a signed integer rather than a long, or to do a bitwise comparison of two arbitrary numbers? The int function doesn't work, as the number is too large to be represented as a signed int.
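
For the record, this is what I mean about int not working:

(int 0xFFFF0000)
;; => IllegalArgumentException: Value out of range for int: 4294901760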
Thanks!