No, that's not an acceptable compromise. All modern operating systems use virtual memory, which lets you allocate more memory than there is RAM. I want my Java application to have access to this.
I can hear it now: but you'll be grinding the hard drive and your application will slow to a crawl.
Yes, that's a possibility. But:
A) An application doesn't access every bit of its memory at once. My apps usually allow multiple documents to be open at a time, so this hit might only occur when the user switches documents.
B) When users notice this, they'll realize they're pushing their system, but instead of losing their data, they are just slowed down a little. How is data loss acceptable and preferable to a little slowdown?
Let me repeat: how is data loss acceptable and preferable to a little slowdown?
I think this problem only really affects a small minority of applications: specifically, those where heap requirements approach the amount of contiguous address space available to a client, or where the Java heap plus the varying memory requirements of native code in the same process approach this limit. Where this does not apply, you can use something like -Xmx800m and your clients will be able to work on some pretty big documents without concern.
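For what it's worth, an application can check at runtime what ceiling it actually got. A minimal sketch: Runtime.maxMemory() reports the effective -Xmx value (or the JVM's default when none was given).

```java
// Minimal sketch: report the heap ceiling this JVM was started with.
// Runtime.maxMemory() returns the -Xmx value, or the default if none was set.
public class HeapLimit {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Run with `java -Xmx800m HeapLimit` and you can see the ceiling your clients are actually working under.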
So we are concerned with processes whose heap requirements are in the 1GB+ range. Given address-space fragmentation (due to DLLs and other items), this is starting to push the limit of what fits at all, under Windows at least. The easy answer is that such processes are good candidates for a 64-bit JVM, which avoids the problem by allowing much larger values for the -Xmx parameter.
If you are stuck on a 32-bit machine, then address-space fragmentation means you will have trouble with larger values of maximum heap: the current implementation requires a contiguous address space for the heap, and I've seen a number of PCs where the max contiguous space was little more than 1GB. So to reliably get a bigger heap we would need a garbage collector that supported a discontiguous heap address space --- exactly the same change as needed to remove the requirement for the -Xmx parameter.
If you really need such a big heap, one approach is to split the task into multiple processes, with a 'server' process that does the heavy lifting. This can be started dynamically with a suitable -Xmx parameter, and by restricting the scope of its work it might load fewer address-space-fragmenting DLLs.
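The server-process idea above can be sketched roughly as follows; this is just an illustration, and the main class name "HeavyLiftingServer" is a made-up placeholder, not anything from a real library.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the 'server process' idea: the front-end JVM builds
// a command line that launches a worker JVM with its own, larger -Xmx.
public class WorkerLauncher {
    static List<String> workerCommand(String xmx, String mainClass) {
        // Locate the java launcher of the currently running JVM.
        String javaBin = new File(new File(System.getProperty("java.home"), "bin"),
                "java").getPath();
        List<String> cmd = new ArrayList<>();
        cmd.add(javaBin);
        cmd.add("-Xmx" + xmx);                               // heap ceiling chosen at launch time
        cmd.add("-cp");
        cmd.add(System.getProperty("java.class.path"));      // reuse our classpath
        cmd.add(mainClass);                                  // placeholder main class
        return cmd;
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = workerCommand("2g", "HeavyLiftingServer");
        // new ProcessBuilder(cmd).inheritIO().start();      // would launch the worker
        System.out.println(String.join(" ", cmd));
    }
}
```

Because the worker is a separate process, killing or restarting it also hands all of its memory back to the OS, which sidesteps the release-memory problem discussed below.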
I agree this is an issue that needs to be solved. It's also really bad that Java currently can't free memory back to the OS without killing the VM.
I don't think eliminating -Xmx is a good idea: it would let Java applications engulf enough memory to start hindering other processes. This happens with non-Java applications, and it is not desirable behaviour as far as I'm concerned.
But if Java can RELEASE memory back to the OS, instead of monotonically increasing its memory footprint over time, the default -Xmx can then be set to a much higher value. So in effect it will achieve the same aim as you have proposed here ;)
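The gap between what the JVM has committed and what it is allowed to grow to is visible from within the program. A minimal sketch: totalMemory() is the committed heap, maxMemory() is the -Xmx ceiling, and whether committed pages are ever returned to the OS depends on the collector and flags such as -XX:MaxHeapFreeRatio.

```java
// Minimal sketch: the JVM's committed heap (totalMemory) can sit well below
// the ceiling (maxMemory). Whether committed-but-free pages are returned to
// the OS depends on the GC and tuning flags like -XX:MaxHeapFreeRatio.
public class HeapFootprint {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long committed = rt.totalMemory();        // what the heap currently occupies
        long used = committed - rt.freeMemory();  // live + garbage data
        long max = rt.maxMemory();                // the -Xmx ceiling
        System.out.printf("used=%dMB committed=%dMB max=%dMB%n",
                used >> 20, committed >> 20, max >> 20);
    }
}
```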
So you're saying that because some programs are ill-behaved and use too much memory (garbage collection does not eliminate memory leaks), well-behaved programs should not have the opportunity to use as much memory as they need?
Of course the VM should release memory back to the OS, but I see no valid reason for limiting a program's maximum amount of memory beyond technical issues in the VM. And end users don't want to hear that technical issues with the language are causing them problems.
Also, what other language that is in widespread use has such a memory limitation?
> Also, what other language that is in widespread use
> has such a memory limitation?
AFAIK, there doesn't seem to be one - and that is why we should keep this feature in Java, because it's a good one.
What if the ability to limit your memory usage were left in for whatever purposes (backward compatibility if nothing else), but a flag is made available allowing the application to use as much memory as needed?
I'm not following what's a use-case for wanting to limit memory. Yes, a program COULD use all your memory. So? Maybe that's what my program is supposed to do. That's why I have 2 GB in my machine: so my apps can use it.
I do image editing with large files at home. This consumes a lot of memory, so I've upgraded my memory. I bought it so my apps can use it.
I've also used programs that had memory leaks. If it was bad enough that it caused me problems, I quit using it.
I'm not against Java helping the developer whenever possible, but forcing me to know my memory usage up front is just bad. What if you made a Java-based word processor? Are you going to arbitrarily limit how long a document the user can write? You have no way of knowing how much memory they will need.
Yes, I've got tons of RAM on my desktop as well - but I want myself, the user, to decide how much I'm going to allow that program that you have written to use.
There are times (quite often, in fact) when "if it was bad enough that it caused me problems", I will simply end up losing data (due to your program dragging down the whole system) - irreversible damage before I can "quit using it".
And while we're at it, shall we bring in pointers so we can malloc(100000000)?
This would definitely be nice.