Any of these formats could work; I've seen APIs use both arrays and base64 encoding. You could also use hex encoding, base58, or any other binary-to-text encoding.
While you say you're not concerned about size, I would note that a byte array encoded as JSON uses around 2.7 times as much data as base64 encoding: each byte averages roughly 3.6 characters (up to three digits plus a comma), versus base64's ~1.33 characters per byte. Multiply this by thousands of requests and you have a lot more data going over the wire.
Any modern client should be able to decode any of these formats. I don't think decoding base64 will be substantially more difficult than consuming an array, and it will be much more resource efficient.
You can demonstrate this easily enough:
function getRandomBuffer(length) {
  // Fill a Buffer with `length` random bytes (0–255).
  return Buffer.from(Array.from({ length }, () => Math.floor(Math.random() * 256)));
}

function getJSONLength(obj) {
  return JSON.stringify(obj).length;
}

const buf = getRandomBuffer(200 * 1024);

const jsonLengths = {
  Array: getJSONLength({ pdf: [...buf] }),
  Hex: getJSONLength({ pdf: buf.toString("hex") }),
  Base64: getJSONLength({ pdf: buf.toString("base64") }),
};

console.table(jsonLengths);
JSON data size in bytes for 200 KiB of random binary data:
┌─────────┬────────┐
│ (index) │ Values │
├─────────┼────────┤
│ Array │ 731143 │
│ Hex │ 409610 │
│ Base64 │ 273078 │
└─────────┴────────┘