It's easier to understand the memory use of arrays:
In [100]: p = np.arange(10)
In [101]: sys.getsizeof(p)
Out[101]: 176
In [102]: p.itemsize*p.size
Out[102]: 80
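A quick way to check these numbers yourself (exact byte counts vary with NumPy and Python versions; the dtype is pinned to int64 to match the transcript):

```python
import sys
import numpy as np

p = np.arange(10, dtype=np.int64)
# nbytes is exactly itemsize * size: the data buffer alone
assert p.nbytes == p.itemsize * p.size == 80
# getsizeof counts the buffer plus the ndarray object's fixed overhead
overhead = sys.getsizeof(p) - p.nbytes
assert overhead > 0
```

Note that getsizeof only counts the buffer when the array owns its data; a view (e.g. a slice) reports just the object overhead.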
The data buffer of p is 80 bytes long. The rest of p is object overhead: attributes like shape, strides, etc.
An indexed element of the array is a numpy scalar object.
In [103]: q = p[0]
In [104]: type(q)
Out[104]: numpy.int64
In [105]: q.itemsize*q.size
Out[105]: 8
In [106]: sys.getsizeof(q)
Out[106]: 32
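The same check for a single indexed element, as a sketch (the 32-byte figure from the transcript is version-dependent, so only the inequality is asserted):

```python
import sys
import numpy as np

p = np.arange(10, dtype=np.int64)
q = p[0]
# indexing a 1-d array returns a numpy scalar, not a Python int
assert type(q) is np.int64
# the scalar wraps 8 bytes of data...
assert q.itemsize * q.size == 8
# ...inside a larger object (32 bytes in the transcript above)
assert sys.getsizeof(q) > 8
```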
So this multiplication doesn't tell us anything useful:
In [109]: sys.getsizeof(p[3])*len(p)
Out[109]: 320
Though it may help us estimate the size of this list:
In [110]: [i for i in p]
Out[110]: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
In [111]: type(_[0])
Out[111]: numpy.int64
In [112]: sys.getsizeof(__)
Out[112]: 192
The list of 10 int64 objects occupies 320+192 bytes, more or less: the list overhead and its pointer buffer, plus the size of the objects pointed to.
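That accounting can be done directly: size the list container (object plus pointer buffer) and add the sizes of the objects it points to. A minimal sketch:

```python
import sys
import numpy as np

p = np.arange(10, dtype=np.int64)
lst = [i for i in p]                  # 10 np.int64 scalar objects
container = sys.getsizeof(lst)        # list object plus its pointer buffer
contents = sum(sys.getsizeof(x) for x in lst)
total = container + contents          # rough footprint of the whole list
assert len(lst) == 10
assert isinstance(lst[0], np.int64)
```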
We can extract a plain Python int object from the array with item():
In [115]: p[0].item()
Out[115]: 0
In [116]: type(_)
Out[116]: int
In [117]: sys.getsizeof(p[0].item())
Out[117]: 24
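To confirm the conversion (the 24-byte figure above is version-dependent; current CPython reports 28 for a small int, still less than the numpy scalar):

```python
import sys
import numpy as np

p = np.arange(10, dtype=np.int64)
x = p[0].item()        # unwrap the numpy scalar into a plain Python int
assert type(x) is int
# the Python int is a smaller object than the numpy scalar
assert sys.getsizeof(x) < sys.getsizeof(p[0])
```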
Lists of the same len can have differing sizes, depending on how much growth space they have:
In [118]: sys.getsizeof(p.tolist())
Out[118]: 144
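Comparing the two ways of building the list makes both differences visible: tolist() produces Python ints in a list sized to fit, while the comprehension grows its list by appending and yields numpy scalars (exact container sizes vary by version):

```python
import sys
import numpy as np

p = np.arange(10, dtype=np.int64)
a = p.tolist()       # built in one step: list of Python ints, sized to fit
b = [i for i in p]   # grown by appending: may carry extra growth space
assert a == b                        # equal values...
assert type(a[0]) is int             # ...but tolist() gives Python ints
assert isinstance(b[0], np.int64)    # iteration yields numpy scalars
print(sys.getsizeof(a), sys.getsizeof(b))
```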
Further complicating things is the fact that small integers are stored differently than large ones: CPython caches the integers from -5 through 256, so those objects are shared rather than allocated anew.
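The caching is easy to observe (the int("...") calls build the large values at runtime so the compiler cannot fold them into one shared constant):

```python
import sys

# CPython caches small ints (-5 through 256): one shared object per value
a = 100
b = 100
assert a is b
# larger ints are allocated per use
c = int("1000")
d = int("1000")
assert c is not d
# ints are arbitrary precision, so their size grows with magnitude
assert sys.getsizeof(10**100) > sys.getsizeof(1)
```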