The code above generates one contiguous buffer of a billion elements, with `[]` access that lets you treat it as a 3-dimensional cube, 1000 elements on a side. A vector of vectors of vectors would instead be a whole pile of non-contiguous buffers linked together by pointers and ownership semantics.
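To make the contrast concrete, here is a rough sketch of the two layouts (the exact code above may differ; the type alias and names here are mine):

```
#include <array>
#include <memory>
#include <vector>

// One contiguous 1000x1000x1000 buffer, heap-allocated behind a unique_ptr.
using cube_t = std::array<std::array<std::array<int, 1000>, 1000>, 1000>;

int main() {
    auto cube = std::make_unique<cube_t>();  // a single ~4 GB allocation (assuming 4-byte int)
    (*cube)[1][2][3] = 42;                   // plain [] access into contiguous memory

    // The vector-of-vectors-of-vectors alternative: about a million separate
    // inner buffers, none guaranteed to be anywhere near each other.
    std::vector<std::vector<std::vector<int>>> nested(
        1000, std::vector<std::vector<int>>(1000, std::vector<int>(1000)));
    nested[1][2][3] = 42;
}
```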
I suspect you are suggesting `using arr_t = std::vector<std::array<std::array<int,1000>,1000>>;`, then resizing said `arr_t` to 1000 once created. This has the modest cost of an extra two pointers of overhead in the handle object. It also permits variable size, which means keeping it at the intended fixed size becomes the user code's responsibility. You'd want to block a pile of methods, basically everything `unique_ptr` doesn't expose, to ensure safety, or audit that your code doesn't use any of them.
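If that is the suggestion, a minimal sketch of it would be something like:

```
#include <array>
#include <vector>

// The suggested alternative: a resizable outer dimension over fixed 1000x1000 planes.
using arr_t = std::vector<std::array<std::array<int, 1000>, 1000>>;

int main() {
    arr_t cube;
    cube.resize(1000);      // now 1000x1000x1000, still one contiguous buffer
    cube[1][2][3] = 42;

    // Nothing in the type stops user code from later doing:
    //   cube.resize(500);     // silently breaks the fixed-size invariant
    //   cube.push_back({});   // may reallocate and copy the entire buffer
}
```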
Some of those operations could be very expensive; `.push_back({})` would reallocate and copy gigabytes of data.
Now, maybe you intend never to call such operations; but if you have generic code that processes vectors, you'd have to audit all of it to ensure that none of it ever does. It isn't possible, for example, to have a non-const handle to a vector that cannot resize it, short of rolling your own span class at this point.
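A rough sketch of what rolling your own span class would mean here, just enough to hand out mutable element access without exposing any resizing operations (names are mine):

```
#include <array>
#include <cstddef>
#include <vector>

using plane = std::array<std::array<int, 1000>, 1000>;

// A non-owning view over the planes: hands out mutable element access but
// has no resize/push_back/clear, so it cannot change the cube's shape.
class cube_view {
public:
    cube_view(plane* data, std::size_t size) : data_(data), size_(size) {}
    plane&       operator[](std::size_t i)       { return data_[i]; }
    const plane& operator[](std::size_t i) const { return data_[i]; }
    std::size_t  size() const { return size_; }
private:
    plane*      data_;
    std::size_t size_;
};

int main() {
    std::vector<plane> cube(1000);
    cube_view view(cube.data(), cube.size());
    view[1][2][3] = 42;     // element access is fine
    // view.push_back({});  // does not compile: no such member
}
```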
We could block the methods we do not want to expose with private inheritance and `using` declarations, but at this point we end up doing most of the work just to get back to the `unique_ptr` solution.
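Roughly what that would look like, as a sketch only:

```
#include <array>
#include <vector>

using plane = std::array<std::array<int, 1000>, 1000>;

// Privately inherit from std::vector and re-expose only the members we
// consider safe; resize, push_back, clear, etc. remain inaccessible.
class fixed_cube : private std::vector<plane> {
    using base = std::vector<plane>;
public:
    fixed_cube() : base(1000) {}   // always exactly 1000 planes
    using base::operator[];
    using base::size;
    using base::begin;
    using base::end;
};

int main() {
    fixed_cube cube;
    cube[1][2][3] = 42;
    // cube.push_back({});  // does not compile: push_back is inaccessible
}
```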