
I get a MemoryError when I run np.arange() with large numbers like 1e10. How can I fix the MemoryError for np.arange(0.01*1e10, 100*1e10, 0.5)?

SuperKogito

2 Answers


np.arange returns a NumPy array, which is allocated in full, up front.

If we go from 0.01e10 to 100e10 in steps of 0.5, there are approximately 2e12 items in your array. As these numbers are double precision (64 bits, or 8 bytes per item), you would need roughly 16 terabytes of RAM.
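
You can check this arithmetic yourself before calling np.arange (a quick sketch; the 8-byte figure assumes the default float64 dtype):

start, stop, step = 0.01 * 1e10, 100 * 1e10, 0.5
n = (stop - start) / step     # number of elements np.arange would produce
print(n)                      # ~2e12 elements
print(n * 8 / 1e12, "TB")     # ~16 TB at 8 bytes per element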

The best idea would be to change your algorithm. For instance, if you are using it in a for loop, e.g.:

for t in np.arange(0.01*1e10,100*1e10,0.5):
  do_simulationstep(t)

Changing to range in Python 3 (or xrange in Python 2) means the values are produced lazily, one at a time, instead of being materialized in memory all at once.

for t in range(0.01*1e10,100*1e10,0.5):
  do_simulationstep(t)

As noted in the comments, however, this will not work: range only accepts integers, so we have to scale the bounds to integers and then rescale the result:

# iterate over integers 2e8 .. 2e12 and rescale each by the 0.5 step
for t in (x * 0.5 for x in range(int(1e8 / 0.5), int(1e12 / 0.5))):
  do_simulationstep(t)
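
Another option, if the work per step can be batched, is to keep numpy but generate the range in fixed-size chunks, so only one small block lives in memory at a time (a rough sketch; the chunk size is an arbitrary choice you would tune to your available RAM):

import numpy as np

start, stop, step = 0.01 * 1e10, 100 * 1e10, 0.5
chunk = 10_000_000  # elements per block (~80 MB of float64); arbitrary choice

lo = start
while lo < stop:
  hi = min(lo + chunk * step, stop)
  block = np.arange(lo, hi, step)  # small array that fits comfortably in RAM
  for t in block:                  # or process the whole block vectorized
    do_simulationstep(t)
  lo = hi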

However, should you really need that much memory, Amazon rents out servers that might support it: EC2 In-Memory Processing Update: Instances with 4 to 16 TB of Memory + Scale-Out SAP HANA to 34 TB

Lanting
    `range` doesn't support float steps, so you would do `for t in range(int(2e8), int(2e12)): do_simulationstep(0.5*t)` instead. – user2653663 Apr 11 '19 at 10:02

You are trying to create an array of roughly 2e12 elements. If every element were a single byte, you would need approximately 2 TB of free memory to allocate it. You almost certainly do not have that much RAM available, which is why you get the MemoryError.

Note: the array you are trying to allocate contains floats (8 bytes each), so it is even bigger, around 16 TB. Do you really need that many elements?
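
You can confirm the per-element size from numpy itself (a small sketch; np.arange with float arguments defaults to float64):

import numpy as np

sample = np.arange(0.0, 10.0, 0.5)       # tiny array with the same dtype
print(sample.dtype, sample.itemsize)     # float64, 8 bytes per element
n = (100 * 1e10 - 0.01 * 1e10) / 0.5
print(n * sample.itemsize / 1e12, "TB")  # ~16 TB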

Hope it helps,

Tom