I'm trying to figure out how to work with a buffer's contents as integers. Here is some simple code:
var buffer = Buffer.alloc(4, 'a');          // 4 bytes, each 0x61 ('a')
var interface16 = new Uint16Array(buffer);
var interface8 = new Uint8Array(buffer);
console.log(interface16);                   // Uint16Array(4) [ 97, 97, 97, 97 ]
console.log(interface8);                    // Uint8Array(4) [ 97, 97, 97, 97 ]
Why does the log show the same arrays (same values and same length)? After all, I'm asking for views of the data with 1 and 2 bytes per integer. The manuals say that Uint8Array takes 1 byte per element from the buffer, which I think of as a sequence of bits, in this case 32 (4 * 8). So I can accept that it ends up as an array with 4 elements. But Uint16Array takes 2 bytes per integer, so the values should be different and the array should have 2 elements.

What am I wrong about, and what kind of buffer do I need to pass to these constructors to actually see the difference? I suspect that either the returned array is always the same length as the number of bytes, or it's just a matter of how the console formats the output. Probably. But I don't have enough knowledge to understand why. Thank you for your attention.
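For comparison, here is a minimal sketch of the behaviour I expected, using a plain ArrayBuffer instead of a Node.js Buffer (whether a Buffer is supposed to behave the same way is exactly what I'm unsure about):

var ab = new ArrayBuffer(4);                // 4 raw bytes
new Uint8Array(ab).fill(0x61);              // fill all 4 bytes with 0x61 ('a')
console.log(new Uint8Array(ab));            // Uint8Array(4) [ 97, 97, 97, 97 ]
console.log(new Uint16Array(ab));           // Uint16Array(2) [ 24929, 24929 ], i.e. 2 elements of 2 bytes each

Here the Uint16Array really does have 2 elements of 2 bytes each (0x6161 = 24929), which is what I expected to get from the Buffer as well.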
P.S. If you can also recommend some reading on this topic that won't make my brain explode, that would be great.