The function that encodes a Unicode code point (an int) to a char array in Java is basically this:
return new char[] { (char) codePoint };
which is just a cast from the int value to a char.
I would like to know how this cast actually works, i.e. the code behind it that converts an int value to a character encoded in UTF-16. I tried to find it in the Java source code, but had no luck.
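For context, here is a sketch of what I understand so far (the class and method names are mine; the surrogate arithmetic follows the UTF-16 scheme from the Unicode standard, which I believe is what `Character.toChars` implements). The cast itself seems to just keep the low 16 bits, so real UTF-16 encoding of code points above U+FFFF would need something like the second method:

```java
public class Utf16Sketch {
    // What the (char) cast appears to do: a narrowing primitive conversion
    // that keeps only the low 16 bits of the int. No surrogate logic here.
    static char castToChar(int codePoint) {
        return (char) (codePoint & 0xFFFF); // same result as (char) codePoint
    }

    // Sketch of full UTF-16 encoding: BMP code points (< 0x10000) map to a
    // single char; supplementary code points become a surrogate pair.
    static char[] encodeUtf16(int codePoint) {
        if (codePoint < 0x10000) {
            return new char[] { (char) codePoint };
        }
        int offset = codePoint - 0x10000;              // 20-bit value
        char high = (char) (0xD800 + (offset >>> 10)); // top 10 bits
        char low  = (char) (0xDC00 + (offset & 0x3FF)); // bottom 10 bits
        return new char[] { high, low };
    }

    public static void main(String[] args) {
        // U+0041 fits in one char
        char[] bmp = encodeUtf16(0x41);
        System.out.printf("U+0041 -> %04X%n", (int) bmp[0]);
        // U+1F600 needs a surrogate pair: D83D DE00
        char[] pair = encodeUtf16(0x1F600);
        System.out.printf("U+1F600 -> %04X %04X%n", (int) pair[0], (int) pair[1]);
        // The plain cast would silently truncate U+1F600 to F600
        System.out.printf("cast     -> %04X%n", (int) castToChar(0x1F600));
    }
}
```

So if this is right, the cast alone only handles the Basic Multilingual Plane, and the surrogate-pair math is where the actual encoding work happens.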