
I am struggling with using the train_test_split function from scikit-learn with 3D NumPy arrays.

I have a feature array with shape (1860000, 144, 12) and a label array with shape (1860000,). In other cases train_test_split works well, but when I try to split this data, my kernel shuts down. I have already updated the related packages in my Conda environment.
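A minimal sketch of the call I am making, with the shapes scaled down here so it runs in little memory (the real arrays have the shapes above, which is where the crash happens):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Scaled-down stand-ins; the real arrays are (1860000, 144, 12) and (1860000,).
X = np.zeros((1000, 144, 12), dtype=np.float32)
y = np.zeros(1000, dtype=np.int64)

# With the full-size arrays, this line is where the kernel shuts down.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
print(X_train.shape, X_test.shape)  # (800, 144, 12) (200, 144, 12)
```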

I am grateful for any advice.

Milo
MadPhil
    Looks like this has been answered here https://stackoverflow.com/questions/31467487/memory-efficient-way-to-split-large-numpy-array-into-train-and-test – nithin Nov 19 '19 at 13:48
  • It is a memory error, see nithin's post for help – PV8 Nov 20 '19 at 07:08
  • Possible duplicate of [Memory efficient way to split large numpy array into train and test](https://stackoverflow.com/questions/31467487/memory-efficient-way-to-split-large-numpy-array-into-train-and-test) – PV8 Nov 20 '19 at 07:08

0 Answers