I'm running Apache Spark in standalone mode, and when I connect to myip:8080 I always see something like "Memory in use: 120.0 GB Total, 1.0 GB Used". Why is only 1 GB used when much more memory is available? Is it possible (or desirable) to increase the amount of memory that is actually used?
- `spark.executor.memory`? – zero323 Jul 30 '16 at 14:53
- It works, thanks. I thought it could only be used for YARN, but it works on Standalone too. – Ric Jul 30 '16 at 15:24
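
For anyone landing here later: a minimal sketch of setting `spark.executor.memory` when building the `SparkConf`. The master URL, app name, and the `16g` figure are placeholders; pick a size that fits your workers (executors default to 1 GB, which is why the UI shows "1.0 GB Used").

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Request 16 GB per executor instead of the 1 GB default.
// "spark://myip:7077" is the standard standalone master URL (placeholder host).
val conf = new SparkConf()
  .setMaster("spark://myip:7077")
  .setAppName("MemoryDemo") // hypothetical app name
  .set("spark.executor.memory", "16g") // illustrative value

val sc = new SparkContext(conf)
```

The same setting can also be passed on the command line with `spark-submit --executor-memory 16g ...` or made the cluster-wide default in `conf/spark-defaults.conf`.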