
I have a SWIG-generated C++ source file of 24 MB, nearly 500,000 lines of code. I can compile it when the compiler optimization level is set to -xO0, but compilation fails as soon as I add any other C++ compiler flags (like -xprofile ...). I am using the Solaris Studio 12.3 C++ compiler.
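
For reference, a minimal sketch of the kind of invocation involved; the file name big_wrap.cxx is a placeholder and the second command is only an illustration (only -xO0 and -xprofile come from the question):

    # compiles successfully at the lowest optimization level
    CC -xO0 -c big_wrap.cxx

    # adding further options, e.g. profile collection, makes the
    # back end (ir2hf) run out of memory on this file
    CC -xO0 -xprofile=collect -c big_wrap.cxx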

Below is the console error:

    Element size (in bytes):        48
    Table size (in elements):  2560000
    Table maximum size:        134217727
    Table size increment:         5000
    Bytes written to disk:           0
    Expansions required:             9
    Segments used:                   1
    Max Segments used:               1
    Max Segment offset:        134217727
    Segment offset size::           27
    Resizes made:                    0
    Copies due to expansions:        4
    Reset requests:                  0
    Allocation requests:       2827527
    Deallocation requests:      267537
    Allocated element count:      4086
    Free element count:        2555914
    Unused element count:            0
    Free list size (elements):       0

ir2hf: error: Out of memory

Thanks in Advance.

    Sorry, my magic wand isn't working, and I doubt anyone else's is either. Either install more memory on the host machine or (if running in a virtual machine) allocate more to the virtual machine. If a compiler runs out of memory with some options - compilation level is an option - that's a fair sign it needs more memory to work. There aren't generally ways to make a compiler do things with less memory than it needs. So you're left with making more memory available for it to use. Either that, or edit the file and create a set of smaller files from it, and compile them separately. – Peter Feb 11 '19 at 09:37
    My crystal ball tells to try this: as root: `usermod -K defaultpriv=basic,sys_resource `, then as user: `ulimit -d unlimited` – rustyx Feb 11 '19 at 09:39
  • I don't want to split it into smaller files, as it is an auto-generated file. – Prajwal Kumar Feb 11 '19 at 10:01
  • 12.3 is old. IIRC, the tools were still 32-bit at the time, so they couldn't use more than 4G of memory. You may need a more recent compiler. – Marc Glisse Feb 11 '19 at 10:24
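
Following up on the last comment: one way to check whether the 12.3 back-end tools are 32-bit executables (and therefore limited to a 4 GB address space) is to inspect them with file. The install path below is an assumption; adjust it to wherever Solaris Studio 12.3 lives on your machine:

    # inspect the code generator that reported the error
    file /opt/solarisstudio12.3/prod/bin/ir2hf
    # "ELF 32-bit ..." in the output means the tool cannot use more than 4 GB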

1 Answer


I found this article suggesting that it has to do with the fact that Solaris limits the amount of memory available for a process's data segment.

Following the steps in the blog, try to remove the limit. As root, grant the privilege to raise resource limits (replace karel with your user name):

$ usermod -K defaultpriv=basic,sys_resource karel

Now log off and log back on, then change the limit:

$ ulimit -d unlimited

Then check that the limit has changed:

$ ulimit -d

The output should be `unlimited`.
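
If ulimit already reports unlimited, it may also be worth looking at the matching Solaris resource control in the shell that will run the compiler. This is a small sketch under the same assumption that a data-segment limit is the cause:

    # show the data-segment size resource control for the current shell
    prctl -n process.max-data-size $$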

Edgar H
  • The memory limit is already set to "unlimited"; it's still not working. – Prajwal Kumar Feb 11 '19 at 10:00
    Are you sure that you have enough memory on the machine? – Edgar H Feb 11 '19 at 10:03
  • I am working on a server; it has enough memory. load averages: 3.41, 4.07, 4.48 17:00:36; 1211 processes: 1199 sleeping, 3 zombie, 6 stopped, 3 on cpu; CPU states: 92.2% idle, 3.5% user, 4.3% kernel, 0.0% iowait, 0.0% swap; Memory: 160G real, 41G free, 102G swap in use, 20G swap free – Prajwal Kumar Feb 11 '19 at 10:40
  • Ok, looks like plenty of memory. Sorry, don't know how to solve this. – Edgar H Feb 11 '19 at 10:51