
Problems with memory use, many large files

rossputin
Joined: 2006-05-05

Hi.

For the first time I am trying to write many, many large-ish files (on average about 10 MB PDFs) from a servlet environment. The memory my Java webapp uses seems to stabilise at around 800 MB. If I produce 100 files, the free memory on my MacBook Pro drops to just over 100 MB (I have 4 GB). Once my servlet has finished producing the PDF files, if I delete them, completely independently of stopping the servlet container, I instantly free up well over a gigabyte of memory. I am not sure what is causing this behaviour, as the memory is freed not when I stop the servlet container but when I delete the files.

Can any tuning and performance experts out there give me a little advice?

Thanks in advance for your help.

- Ross

peter__lawrey
Joined: 2005-11-01

It sounds like you are including the file cache in the memory used. Note that the file cache holds every file you have written or read (as long as there is enough memory for it), so Unix-like systems tend to appear fully memory-utilised just from disk activity.

The only way the file cache is freed up is if you unmount a volume which has cached files, or if you delete a file (i.e. the OS doesn't cache deleted files).

This is why it appears there is free memory after deleting a file.

I suggest you also look at how much memory is used without the file cache.
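For the JVM side of that breakdown, here is a minimal sketch: it prints the heap figures from Runtime and, where the HotSpot-specific com.sun.management extension is available, the OS-level free physical memory. The class name and output labels are invented for illustration.

```java
import java.lang.management.ManagementFactory;

public class MemoryBreakdown {
    public static void main(String[] args) {
        // Heap actually held by the JVM, independent of the OS file cache.
        Runtime rt = Runtime.getRuntime();
        long heapUsed = rt.totalMemory() - rt.freeMemory();
        System.out.println("JVM heap used: " + heapUsed / (1024 * 1024) + " MB");
        System.out.println("JVM heap max:  " + rt.maxMemory() / (1024 * 1024) + " MB");

        // HotSpot-specific extension; assumes a Sun/Oracle-style JVM.
        Object bean = ManagementFactory.getOperatingSystemMXBean();
        if (bean instanceof com.sun.management.OperatingSystemMXBean) {
            com.sun.management.OperatingSystemMXBean os =
                    (com.sun.management.OperatingSystemMXBean) bean;
            System.out.println("OS free physical: "
                    + os.getFreePhysicalMemorySize() / (1024 * 1024) + " MB");
        }
    }
}
```

Comparing the heap figure against what top or Activity Monitor report for "free" memory should make the file-cache portion visible.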

rossputin
Joined: 2006-05-05

Hi Peter.

Thanks for that, I will try to analyse the breakdown of memory use in a little more detail. Can you recommend a tool for this? I am only familiar with top and Activity Monitor on OS X. I suppose this cache behaves differently on, say, a Linux server box? Is there a JVM setting to avoid this when writing large numbers of files? Or is it in fact quite harmless, and will never completely deplete the memory on the box? Sorry for all the questions; this is not something I have encountered before.

Thanks in advance for your help,

-- Ross

peter__lawrey
Joined: 2005-11-01

> Can you recommend a tool to do this?
I am not familiar with Mac OS; a forum for that OS might be able to help you.
> I suppose this cache behaves differently on say a linux server box?
Mac OS is basically Unix (with a lot of extras), and even Windows XP works much the same way.
> Is there a setting in the JVM to avoid this if I am writing large numbers of files?
Do you mean you want to cache files which have been deleted, or not to cache files you have written? To what end? I think you are trying to solve a problem which isn't really a problem. If a program needs the memory, it always takes priority, so the overhead of having a disk cache is notional.
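To illustrate why the writes themselves are not the problem: a small sketch (file name and sizes invented for illustration) that streams 10 MB through a fixed 8 KB buffer. The Java heap barely moves; the written bytes live in the OS file cache, not in the JVM.

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamingWriteDemo {
    public static void main(String[] args) throws IOException {
        Path out = Files.createTempFile("demo", ".pdf");
        Runtime rt = Runtime.getRuntime();
        long heapBefore = rt.totalMemory() - rt.freeMemory();

        byte[] buffer = new byte[8 * 1024];            // fixed 8 KB buffer
        try (OutputStream os = new BufferedOutputStream(
                new FileOutputStream(out.toFile()))) {
            for (int i = 0; i < (10 * 1024 * 1024) / buffer.length; i++) {
                os.write(buffer);                      // 10 MB written in total
            }
        }

        long heapAfter = rt.totalMemory() - rt.freeMemory();
        System.out.println("file size:  " + Files.size(out) + " bytes");
        System.out.println("heap delta: " + (heapAfter - heapBefore) + " bytes");
        Files.delete(out);
    }
}
```

The heap delta stays tiny (and may even be negative if a GC runs) regardless of how many such files are produced, which is consistent with the 800 MB figure being the webapp's working set rather than the PDFs themselves.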
> Or is it in fact quite harmless, it will never completely deplete the memory on the box?
It doesn't matter if it does; what would be the point of having memory you could never use in full?
> Sorry for the questions, this is not something I have encountered before.
No problem, I have been performance-tuning systems for over 15 years, so it's no big deal to me.