I am building a big pyomo model with over 1 million constraints and 2 million variables.
And I am looking for suggestions to reduce the memory requirements of the model that I am building.
At the moment it requires over 20 GB of RAM.
How would I reduce this?
I've never tested defining variables with or without within=pyomo.NonNegativeReals, but I am assuming it would reduce the amount of memory required for a given variable. Are there other things I could do without reducing the number of variables or constraints?
E.g., the following variable will need X bytes of memory:

    m.var = pyomo.Var(m.index)

and maybe the following will need X-1 bytes of memory:

    m.var = pyomo.Var(m.index, within=pyomo.NonNegativeReals)
Of course this is speculation; without testing one cannot be sure. However, I am willing to try anything if someone has an idea or more experience regarding this issue. Any ideas?
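One way to compare such variants empirically is to snapshot Python-level allocations around the model build with the standard-library tracemalloc module. This is only a sketch: the toy builder below is a hypothetical stand-in for the real Pyomo model construction, to be replaced with your own build function (e.g. a closure that constructs the model for a given index size).

```python
import tracemalloc


def peak_build_memory(build_fn):
    """Run a model-building callable and return (result, peak_bytes).

    Measures peak Python-level allocations via tracemalloc, so two
    variants (e.g. Var with and without within=...) can be compared
    by calling this once per variant.
    """
    tracemalloc.start()
    result = build_fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, peak


# Hypothetical stand-in for the real Pyomo build; in practice pass
# something like: lambda: build_model(index=1000)
def toy_build():
    return {i: float(i) for i in range(100_000)}


model, peak = peak_build_memory(toy_build)
print(f"peak Python allocations: {peak / 1024:.0f} KB")
```

Note that tracemalloc only tracks allocations made through Python's allocator; it will not match OS-level numbers like commit size or working set, for which a process monitor (or a library such as psutil) is needed.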
Some tests:
Keep in mind this is not the real model but an example built with different data, using the same script.

index=1000 // full constraints        // 347580 KB (commit)  // 370652 KB (working set)
index=1000 // 0 constraints, full rules // 282416 KB (commit)  // 305252 KB (working set)
index=1000 // 0 constraints, 0 rules    // 282404 KB (commit)  // 305200 KB (working set)
index=1000 // 1 constraint, 1 rule      // 290408 KB (commit)  // 313136 KB (working set)
index=8760 // full constraints        // 1675860 KB (commit) // 1695676 KB (working set)