
I'm using SVM-light for my research and it works fine (at least it's still processing):

svm_learn.exe -t 3 -m 4000 learn_data model

learn_data here is about 14,000,000 lines of data with 20,000 features.

But with the -z p flag for ranking mode, it crashes on 1,000,000 lines of data:

svm_learn.exe -t 3 -z p -m 4000 learn_data_1mil model

As a result:

 OK. (10000000 examples read)
 Constructing 1380570988 rank constrains...Out of memory!: Not enough space

My current setup has 64 GB of RAM, and SVM-light doesn't seem to try to use all of it. I've tried raising the cache from 4000 MB up to 20000 MB, but it didn't help; if it were a cache problem, SVM-light should report a cache error rather than a plain out-of-memory error. I haven't found a documented way to solve this. What can I do to get my data processed?
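
For scale (this is my own back-of-the-envelope arithmetic, not anything from the SVM-light documentation): in ranking mode SVM-light builds one constraint for every pair of examples with different target values inside the same query, so the constraint count grows roughly quadratically with query size. The query count in the sketch below is purely hypothetical; it only shows how a file of a million lines can easily produce constraint counts of the order the error reports.

    /* Back-of-the-envelope sketch: how many pairwise rank constraints a
     * ranking data set can generate, and a rough lower bound on the memory
     * needed just to store them. The query count and the 24 bytes per
     * constraint figure are assumptions for illustration only. */
    #include <stdio.h>

    int main(void)
    {
      unsigned long long lines   = 1000000ULL;     /* lines in learn_data_1mil */
      unsigned long long queries = 500ULL;          /* hypothetical query count */
      unsigned long long per_q   = lines / queries; /* ~2000 lines per query */

      /* worst case: every pair inside a query has different target values */
      unsigned long long constraints = queries * per_q * (per_q - 1) / 2;

      double gib = (double)constraints * 24.0 / (1024.0 * 1024.0 * 1024.0);
      printf("%llu constraints (the run above reported 1380570988), "
             ">= %.1f GiB just to hold them\n", constraints, gib);
      return 0;
    }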

aromatvanili
  • I got the same error under quite different conditions: Ubuntu 14, 1400 lines, 1e10 features. It ran out of memory while using only 100 MB of the computer's 32 GB. – Camille Goudeseune Jul 29 '16 at 15:54
  • It OOMed when it tried to allocate 8 * 1e10 bytes (80 GB) for the array of weights used by `update_linear_component()`. In the source code, look for `totwords` and `my_malloc`. There are a few dozen more mallocs there. One of those may be failing for your large number of lines. – Camille Goudeseune Jul 29 '16 at 19:39
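
To make the comment above concrete, here is a minimal sketch of the failure it describes. It is not a verbatim excerpt from SVM-light: my_malloc, totwords and update_linear_component are names from the actual source, but the allocation line is my guess at the pattern, a dense double array sized by the number of features, which at 1e10 features needs roughly 80 GB.

    /* Sketch only, not SVM-light's actual code: it mimics a malloc wrapper
     * that prints "Out of memory!" when a dense weight vector over ~1e10
     * features (~80 GB) cannot be allocated. Assumes a 64-bit build. */
    #include <stdio.h>
    #include <stdlib.h>

    static void *my_malloc(size_t size)  /* SVM-light wraps malloc roughly like this */
    {
      void *ptr = malloc(size);
      if (!ptr) {
        perror("Out of memory!");        /* same message prefix as the reported error */
        exit(1);
      }
      return ptr;
    }

    int main(void)
    {
      size_t totwords = 10000000000ULL;  /* 1e10 features, as in the comment above */

      /* sizeof(double) * 1e10 is about 80 GB, far beyond 32 GB of RAM, so this
       * call aborts even though the process was only using ~100 MB beforehand */
      double *weights = (double *)my_malloc(sizeof(double) * (totwords + 1));

      weights[0] = 0.0;
      printf("allocated %zu doubles\n", totwords + 1);
      free(weights);
      return 0;
    }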

0 Answers