Posted by javaben
on October 21, 2006 at 8:59 AM PDT
Is there a way to determine the largest set of strongly referenced heap data a process creates at any one time?
While profiling a Java EE web app, I found myself wondering how much memory a process required. I attached a low-intrusion profiler to the JVM and watched a pattern repeat itself several times: the heap grew by ~15 MB, got collected, grew again by ~15 MB, got collected, etc.
My first reaction was to conclude that the process required 15 MB of heap in order to complete. But of course, that's an erroneous conclusion. With different settings or constraints, perhaps the process would have added only 7 MB of garbage before the collector cleaned the heap.
What I really need to know is the size of the largest set of strongly referenced object data placed on the heap as a result of the process at any one time. Because I can reasonably isolate the process, I'd settle for a definite measurement of the strongly referenced objects on the heap at any one time.
So how do I figure this out? I've used a number of tools that show me the size of the heap over time. Some let me take a snapshot of the heap and remove all weakly referenced data. But I have yet to find a tool that will tell me, over a period of time, the largest amount of uncollectable heap data (in bytes).
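Lacking such a tool, one crude approximation is to poll the heap yourself: request a collection, then read the used-heap figure, and track the maximum seen. This is only a sketch under a big assumption (a `System.gc()`-style request is advisory, so the reading may still include garbage), and the class name `LiveSetSampler` is my own invention, not an existing API:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

// Hypothetical helper: approximates the live set by requesting a
// collection and then reading used heap. Because the GC request is
// advisory, this is an upper-bound estimate, not an exact measurement.
public class LiveSetSampler {
    private static final MemoryMXBean MEMORY = ManagementFactory.getMemoryMXBean();
    private long peakLiveBytes = 0;

    // Call periodically (or at interesting points) while the process runs.
    // Returns the used-heap reading taken just after the GC request.
    public long sample() {
        MEMORY.gc();  // behaves like System.gc(): a request, not a guarantee
        long used = MEMORY.getHeapMemoryUsage().getUsed();
        if (used > peakLiveBytes) {
            peakLiveBytes = used;
        }
        return used;
    }

    // Largest post-GC used-heap reading observed so far.
    public long peak() {
        return peakLiveBytes;
    }
}
```

Sampling from inside the process does perturb it a little, but for an isolatable workload the peak reported here should bound the live set reasonably well.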
Anyone know of a tool or process to figure this out? I know I could reduce the heap size until the process throws an OutOfMemoryError. I'm looking for something a little more automated.
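One semi-automated angle: the post-collection occupancy printed by `-verbose:gc` is itself an estimate of the live set at that moment, so the maximum of those figures over a run approximates the number I'm after. A minimal sketch of a log scraper, assuming the classic `[Full GC 487K->312K(1984K), 0.004 secs]` line shape (formats vary between JVM versions, so the regex is an assumption, and `GcLogPeak` is a made-up name):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Scans -verbose:gc output for "before->after" pairs and reports the
// largest post-collection heap occupancy seen, in kilobytes. Assumes
// log lines like "[Full GC 487K->312K(1984K), 0.004 secs]".
public class GcLogPeak {
    // Captures the number after "->" and before the trailing "K".
    private static final Pattern POST_GC = Pattern.compile("->(\\d+)K");

    public static long maxLiveKb(Iterable<String> logLines) {
        long max = 0;
        for (String line : logLines) {
            Matcher m = POST_GC.matcher(line);
            while (m.find()) {
                max = Math.max(max, Long.parseLong(m.group(1)));
            }
        }
        return max;
    }
}
```

Restricting the scan to full-collection lines would tighten the estimate, since post-minor-GC occupancy can still include dead tenured objects.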