
I'm trying to figure out how to work with a buffer's integer representation. Here is some simple code:

var buffer = Buffer.alloc(4, 'a');
var interface16 = new Uint16Array(buffer);
var interface8 = new Uint8Array(buffer);
console.log(interface16);
console.log(interface8);

Why does the log output the same arrays (by value and length)? After all, I'm asking for a representation of the data as 1 and 2 bytes per integer. At least the manuals say that Uint8Array takes 1 byte from the buffer, which I think of as a sequence of 0s and 1s, in this case 32 (4 * 8). So I can accept that it will be an array with 4 elements. But Uint16Array takes 2 bytes per integer, so the numbers should be different and the array size should be 2. What am I getting wrong, and which buffer should I pass to these constructors to actually see the difference? I suspect that either the returned array is always the same length as the number of bytes in the source, or it is a matter of how the console generates output. Probably. But I don't have enough knowledge to understand why. Thank you for your attention.

P.S. If you can also recommend some non-brain-exploding literature directly on this topic, that would be great.

Vixer
  • https://nodejs.org/api/buffer.html#buffers-and-typedarrays – Bergi Jan 16 '22 at 23:24
  • A nodejs `Buffer` is, despite the name, not an `ArrayBuffer` but an `Uint8Array` subclass. To access its underlying buffer (and construct the typed array as a view on it), you need to write `new Uint16Array(buffer.buffer);` and `new Uint8Array(buffer.buffer);`. – Bergi Jan 16 '22 at 23:29

2 Answers


First you create a buffer with four 'a's inside.

Then you call the Uint16Array constructor with the buffer.

The docs say: new Uint16Array(object); when called with an object argument, a new typed array is created as if by the TypedArray.from() method.

See more: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray/from

It creates an array of 4 uint16 values by using every byte of the buffer as one element.

'a' = 0x61 = 97, so you get four 97s.

Here is a Stack Overflow question showing a successful buffer-to-Uint16Array conversion: convert nodejs buffer to Uint16Array

Raphael PICCOLO

In your example, both typed arrays end up having 4 elements, but the Uint8Array only uses 4 bytes to represent its contents, while the Uint16Array uses 8. You can see this by inspecting the byteLength property:

var buffer = Buffer.alloc(4, 'a');
var interface16 = new Uint16Array(buffer);
var interface8 = new Uint8Array(buffer);
console.log('Uint16 length: %d', interface16.length);
console.log('Uint16 byteLength: %d', interface16.byteLength);
console.log('Uint8 length: %d', interface8.length);
console.log('Uint8 byteLength: %d', interface8.byteLength);

Output

Uint16 length: 4
Uint16 byteLength: 8
Uint8 length: 4
Uint8 byteLength: 4

So the typed array constructors create typed arrays with the same number of elements as the source buffer, but the Uint16Array constructor uses two bytes instead of one for each element, filling the high-order byte of each element with zero.

GOTO 0