
How can I see the size (i.e. memory usage) of all objects currently in memory?

Running the following does not provide correct figures for memory usage:

    [(m, sys.getsizeof(m)) for m in dir() if not m.startswith('__')]

    # Returns:
    # [('ElasticNet', 59),
    #  ('ElasticNetCV', 61),
    #  ('In', 51),
    #  ...]

Whereas, for example, `sys.getsizeof(ElasticNet)` shows me that ElasticNet has a size of 1064.

Additionally, is there a convenient tool for assessing which objects are taking up large amounts of RAM, so as to delete and garbage collect them during the script or session?


NB: [Total memory used by Python process?](https://stackoverflow.com/questions/938733/total-memory-used-by-python-process) shows how to profile memory at the level of the whole Python process, whereas I want to determine memory usage for each object separately, and moreover do so conveniently by retrieving the size of the object that `m` (inside the list comprehension) is referencing.
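
For instance, I imagine the lookup would need to resolve each name to the object it references before measuring it, along these lines (using globals(), which presumably only covers names bound at the top level of a script or notebook session):

    import sys

    # Rough sketch of the intent: measure the object each name refers to,
    # not the name string itself. The globals() lookup is an assumption and
    # only covers names bound at the top level of the script/session.
    [(name, sys.getsizeof(globals()[name]))
     for name in dir()
     if not name.startswith('__')]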

Anil
  • Does this answer your question? [Total memory used by Python process?](https://stackoverflow.com/questions/938733/total-memory-used-by-python-process) – Olvin Roght Mar 17 '22 at 09:55
  • @OlvinRoght No, my question is about determining memory usage for all objects separately, and moreover how to get the size of the object that `m` inside the comprehension is referencing – Anil Mar 17 '22 at 09:59

1 Answer


sys.getsizeof is not recursive: it only reports the "intrinsic" size of the object you pass in. For an instance of a regular pure-Python class it will pretty much always give you 48 bytes (on recent Python versions), and with __slots__ it gives 32 + n_slots * 8. That is because a regular instance is really just an empty shell holding a __dict__ pointer and a __weakref__ pointer, whereas with __slots__ the fields are stored "inline" (not through a dict) and the __weakref__ pointer is only present if you explicitly add it.

If you want to know the "full" size of an object you have to recurse into it, but the insight you get from that is complicated by things like shared ownership: if a 1 KB list is referenced by two different objects, do you count it twice? If not, to which object's tally does it contribute?
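
To make the non-recursive behaviour concrete, here is a small illustration plus a rough "deep" sizer. The deep_getsizeof helper is not part of the standard library, just one common sketch of how to recurse while counting each object only once:

    import sys
    import types
    from gc import get_referents

    # getsizeof only reports the object's own header and pointer array:
    big = ["x" * 1_000_000 for _ in range(3)]      # ~3 MB of string data
    print(sys.getsizeof(big))                      # small: just the list object
    print(sum(sys.getsizeof(s) for s in big))      # ~3 MB once you look inside

    # Rough recursive sizer: walk the reference graph and count each object
    # once, which also "answers" the shared-ownership question by crediting a
    # shared child to whichever parent reaches it first. Types, modules and
    # functions are skipped so the walk doesn't wander into the whole program.
    def deep_getsizeof(obj):
        skip = (type, types.ModuleType, types.FunctionType)
        seen = set()
        total = 0
        stack = [obj]
        while stack:
            o = stack.pop()
            if id(o) in seen or isinstance(o, skip):
                continue
            seen.add(id(o))
            total += sys.getsizeof(o)
            stack.extend(get_referents(o))
        return total

    print(deep_getsizeof(big))   # the list plus the three strings it owns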

Python classes are themselves pretty large objects, so each class is about 1 KB (with some variation depending on the class's specifics), which is what your sys.getsizeof(ElasticNet) figure of 1064 reflects.
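
For example (the exact number varies across CPython versions, but the order of magnitude holds):

    import sys

    class Empty:
        pass

    # Even an empty class is a sizeable object: this prints a figure in the
    # region of 1 KB on recent CPython versions.
    print(sys.getsizeof(Empty))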

If you want more visibility into object sizes, the tools for that live in the gc module, which lets you iterate over essentially everything the interpreter tracks in various ways. That is very low-level and not exactly convenient; a memory profiler of some sort (pympler, guppy, memory_profiler) will usually offer higher-level tools, though even those don't make the job trivial.
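
As a minimal sketch of the low-level gc route (shallow sizes only, so it inherits all the sys.getsizeof caveats above):

    import gc
    import sys

    # Rank everything the garbage collector currently tracks by its shallow
    # size; useful for spotting individual large containers in a session.
    biggest = sorted(
        ((sys.getsizeof(o), type(o).__name__, id(o)) for o in gc.get_objects()),
        key=lambda t: t[0],
        reverse=True,
    )[:10]
    for size, type_name, obj_id in biggest:
        print(f"{size:>12}  {type_name}  (id={obj_id:#x})")

pympler's muppy/summary modules, for example, wrap essentially this kind of walk and aggregate the results per type.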

Masklinn