I'm trying to create a large file with numpy memmap:
big_file = np.memmap(fnamemm, dtype=np.float32, mode='w+', shape=(np.prod(dims[1:]), len_im), order='F')
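For context, here is a self-contained sketch of what I am doing; fnamemm, dims and len_im below are placeholders standing in for my real values (the real array is several GB):

import numpy as np

# Placeholder values for illustration only -- my real dims/len_im are larger
fnamemm = 'big_file.mmap'
dims = (1000, 512, 512)    # (frames in chunk, height, width), example values
len_im = 10000             # total number of frames, example value

# shape = (pixels per frame, number of frames), stored in Fortran order
big_file = np.memmap(fnamemm, dtype=np.float32, mode='w+',
                     shape=(np.prod(dims[1:]), len_im), order='F')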
The system is Windows 10 64-bit, running 64-bit Python:
In [2]: sys.maxsize
Out[2]: 9223372036854775807
The machine also has plenty of virtual memory (a maximum of 120,000 MB).
However, every time I try to create a file whose resulting size should exceed 2 GB, I get a runtime error:
In [29]: big_file = np.memmap(fnamemm, dtype=np.int16, mode='w+', shape=(np.prod(dims[1:]), len_im), order=order)
C:\Users\nuria\AppData\Local\Continuum\anaconda3\envs\caiman\lib\site-packages\numpy\core\memmap.py:247: RuntimeWarning: overflow encountered in long_scalars
bytes = long(offset + size*_dbytes)
---------------------------------------------------------------------------
OSError Traceback (most recent call last)
<ipython-input-29-66578da2d3f6> in <module>()
----> 1 big_file = np.memmap(fnamemm, dtype=np.int16, mode='w+', shape=(np.prod(dims[1:]), len_im), order=order)
~\AppData\Local\Continuum\anaconda3\envs\caiman\lib\site-packages\numpy\core\memmap.py in __new__(subtype, filename, dtype, mode, offset, shape, order)
248
249 if mode == 'w+' or (mode == 'r+' and flen < bytes):
--> 250 fid.seek(bytes - 1, 0)
251 fid.write(b'\0')
252 fid.flush()
OSError: [Errno 22] Invalid argument
This error does not happen when the file size is under 2 GB.
I have reproduced the same problem on another machine running Windows 7, also 64-bit.
Have I forgotten something? Why is memmap acting as if I had a 32-bit system?
EDIT: The error is not exactly a runtime error. The variable "bytes" gets a runtime warning while computing the length of the file, which I guess produces a bad argument that then raises the Errno 22.
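To illustrate my guess with made-up numbers: the "long_scalars" in the warning makes me think np.prod(dims[1:]) comes back as a 32-bit integer on Windows, so the byte count computed inside memmap wraps around before it ever reaches the seek call:

import numpy as np

# Made-up shape values, purely to show the suspected wrap-around
n_pixels = np.int32(512 * 512)   # what np.prod(dims[1:]) would give as a 32-bit scalar
len_im = np.int32(10000)

size = n_pixels * len_im         # RuntimeWarning: overflow encountered...
print(size)                      # -1673527296 instead of the expected 2621440000

If a wrapped (negative) value like that is what ends up in fid.seek(bytes - 1, 0), that would explain the Errno 22.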