We have a fairly large codebase that compiles and runs under FlasCC. Just opening the .swf puts the player's memory usage at ~300MB. That is more or less fine, since there still seems to be around 300MB left for dynamic allocation from the C++ code.
The problems start when we create threads. According to the documentation, every thread copies the .swf in memory and runs in its own sandbox. Does that mean every pthread will eat up roughly the same ~300MB the player needed to open the .swf in the first place?
It seems so. I did a simple test: spawn pthreads and dump memory usage after each one (both what flash.system.System reports and CModule.ram.length). Here's the log:
Starting 10 threads.
Memory usage: total=288MB private=335MB free=2MB CModule=33MB
Thread 0 started.
Memory usage: total=683MB private=732MB free=1MB CModule=36MB
Thread 1 started.
Memory usage: total=1071MB private=1121MB free=1MB CModule=37MB
Thread 2 started.
Memory usage: total=1459MB private=1510MB free=1MB CModule=38MB
At that point flash_player_debugger exited (crashed) without any error message.
This basically rules out threading for us: each thread adds roughly 390MB of total usage (683 → 1071 → 1459MB in the log above), so after starting just 2 pthreads only ~50MB of memory is left for the C++ code.
Adobe Scout gives a somewhat deeper breakdown of memory usage. Here's what it reports while the .swf is running with 2 background threads: (a picture from the same question on the Adobe forums)
The "Other" block inflated from 11MB to 800MB after spawning those 2 idle pthreads; the memory went into "Other Players" and "Uncategorized".
So the main question is: how do we work around this? Is there, perhaps, a way to make AS3 workers consume less memory?