
Why does Java not expand the heap size until it hits the OS-imposed process memory limit, in the same way the .NET CLR does?

Is it just a policy decision made by the JVM developers, or is it an advantage of the .NET CLR's architecture over the JVM's? In other words, if Oracle engineers want to implement automatic heap expansion for the JVM, are they able to do that?

Thanks

EDIT: I really think this is a bad design choice for Java. It is not safe to set Xmx as high as possible (e.g. 100 GB!). If a user needs to run my code on bigger data, he may run it on a system with more available RAM. Why should I, as the developer, set the maximum available memory of my program? I do not know how big the data is!

Ali

2 Answers


The JVM increases the heap size when it needs to, up to the maximum heap size you set. It doesn't take all the memory by default, as it has to reserve that much on startup and you might want to use some memory for other things, such as thread stacks, shared libraries, off-heap memory, etc.
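
For example, here is a minimal sketch (the class name and allocation sizes are just illustrative) that you can run with something like java -Xms64m -Xmx512m HeapGrowthDemo to watch the committed heap grow from around -Xms towards -Xmx as memory is retained:

    // HeapGrowthDemo.java - illustrative only: prints the committed heap
    // growing towards the -Xmx limit as allocations are held.
    import java.util.ArrayList;
    import java.util.List;

    public class HeapGrowthDemo {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            List<byte[]> retained = new ArrayList<>();
            for (int i = 0; i < 40; i++) {
                retained.add(new byte[10 * 1024 * 1024]); // keep 10 MB per iteration
                System.out.printf("committed heap: %d MB, max heap: %d MB%n",
                        rt.totalMemory() >> 20, rt.maxMemory() >> 20);
            }
        }
    }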

Why does Java not expand the heap size until it hits the OS-imposed process memory limit, in the same way the .NET CLR does?

If you set the maximum heap size large enough, or use off-heap memory, it will. It just won't do this by default. One reason is that heap memory has to be in main memory and cannot be swapped out without killing the performance of your machine (if not killing your machine). This is not true of C programs, and expanding that far is worse than failing to expand.
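
As a sketch of the off-heap option (the buffer size here is arbitrary), direct buffers live outside the Java heap and are bounded by -XX:MaxDirectMemorySize (which defaults to roughly the -Xmx value) rather than by the heap itself:

    import java.nio.ByteBuffer;

    public class DirectMemoryDemo {
        public static void main(String[] args) {
            // 256 MB allocated outside the Java heap; it does not count
            // against -Xmx, only against -XX:MaxDirectMemorySize.
            ByteBuffer offHeap = ByteBuffer.allocateDirect(256 * 1024 * 1024);
            offHeap.putLong(0, 42L); // use it like any other buffer
            System.out.println("direct buffer capacity: " + offHeap.capacity());
        }
    }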

If you have a JVM with a heap size 10% larger than main memory and you actually use that much, then as soon as you perform a GC, which has to touch every page more than once, you are likely to find you need to power-cycle the box.

Linux has a process killer (the OOM killer) for when resources run out; if it triggers, you might be lucky enough to only have to restart the process rather than the machine.

Is it just a policy made by JVM developers, or is it an advantage of the .NET CLR's architecture over the JVM's?

A key feature of the JVM is that it is platform independent, so it has its own controls. A JVM running at the limit of your process space is likely to prevent your machine from working (from heavy swapping). I don't know how .NET avoids this from happening.

In other words, if Oracle engineers want to implement automatic heap expansion for the JVM, are they able to do that?

It does this already, as I have said; it's just not a good idea to allow it to use too much memory.

Peter Lawrey

It is the developer's decision how much heap memory should be allowed for the Java process. It is based on various factors such as the project design, the platform on which it is going to run, etc.

We can set the heap size properties:

-Xms<size>        set initial Java heap size
-Xmx<size>        set maximum Java heap size
-Xss<size>        set java thread stack size

As you can see, we set the initial heap size, and if the JVM later finds that more is needed, it can increase the heap size up to the maximum specified limit. In fact, the size typically changes when we do a GC (though that is not a mandate). I had posted a question on similar grounds; you can refer to it. So increasing/decreasing the heap size is done by the JVM. All we have to do as developers is specify the limits based on our requirements.
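
For example (the values here are just placeholders; pick them to suit your application):

    java -Xms256m -Xmx2g -Xss512k -jar myapp.jar

starts the JVM with a 256 MB heap, lets it grow as needed, but never lets it exceed 2 GB, no matter how much RAM the machine has.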

Aniket Thakur
  • Yes, but one cannot always guess how much memory the code requires. Also, it is not safe to set Xmx as high as possible (e.g. 100 GB!). If a user needs to run my code on bigger data, he may possibly supply more RAM. Why should I, as the developer, set the maximum available memory of my program? I do not know how big the data being processed is. – Ali Sep 12 '13 at 11:21
  • If you do not know how big the data being processed is, then it is a design flaw. Before the product is shipped to clients, edge cases are tested (part of the SDLC). And you ask **should I, as the developer, set the maximum available memory of my program?** because you have to tell your client (and document) that this is the maximum load the program can take (or generally it is the other way around, meaning it depends on the client's requirements, but nonetheless we know the estimates). – Aniket Thakur Sep 12 '13 at 12:13
  • For example, if you are making some text-based app on Android and the data exceeds the max memory you have set (if it is not set, some defaults are used, which BTW is not good programming practice if you are making a client product), then the app will simply crash (which you obviously don't want). Instead, if you have the estimates, you can warn the user about the load increasing. – Aniket Thakur Sep 12 '13 at 12:15
  • I agree, but only to some extent! I think the JVM's behavior is OK in most cases, e.g. in a usual web application or Android game. However, in some applications it is inevitable to use the maximum memory size. As an example, if you develop a scientific computing tool, a video processing library, or a data mining application, the user potentially wants the program to use the maximum available RAM, according to the data being parsed! It is not possible to expect the programmer to define the maximum required memory! – Ali Sep 12 '13 at 18:41
  • Assume you are writing video converter software. Which of these errors, in your opinion, is more acceptable to the user when he is trying to convert a very big file: _"File too big to fit in the memory"_ or _"File too big to keep in the JVM heap size!"_? I am sure the second one is just a joke! – Ali Sep 12 '13 at 18:45
  • haha :) For Java, the heap is the memory. What you display to the user should be what makes sense to them ("file too big" in the example you gave). Even if I have 3 GB of RAM free, it is very much possible to get that error for a 1 GB file because the video converter software has a 1 GB limit. – Aniket Thakur Sep 12 '13 at 19:05
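
A minimal sketch of the point in the last few comments (the file argument and the safety margin are hypothetical): check the input size against Runtime.getRuntime().maxMemory() up front and report a user-friendly message, instead of letting the JVM die with an OutOfMemoryError:

    import java.io.File;

    public class VideoLoader {
        public static void main(String[] args) {
            File input = new File(args[0]);
            long maxHeap = Runtime.getRuntime().maxMemory(); // effectively -Xmx
            if (input.length() > maxHeap / 2) { // the 1/2 margin is arbitrary
                System.err.println("File too big to fit in memory: " + input.getName());
                return;
            }
            // ... load and convert the file here ...
        }
    }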