Accessing a large number of BufferedImages
I am sampling 128 x 128 points spread across about 500 JPEG images (each 600x600) to build a composite image. This means I must open those 500 files and read RGB values at various (x, y) locations with image.getRGB(x, y). I keep a cache of the BufferedImage objects so I don't have to re-create them too often. However, iterating over all the (x, y) values takes too long (over 2 minutes), and I run into memory problems once the cache holds more than 200 entries (~200 MB of used memory).
My current logic is:
1) Look up the BufferedImage in the cache. If found, return it.
2) Create a new BufferedImage from the disk file.
3) Add the new BufferedImage to the cache.
4) If the number of entries in the cache exceeds max_cache_size, remove the least recently accessed entry.
5) Return the BufferedImage.
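To make the steps above concrete, here is a simplified sketch of the kind of cache I mean (the class and field names are illustrative, not my real code). An access-ordered LinkedHashMap gives the LRU eviction in step 4 almost for free:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.LinkedHashMap;
import java.util.Map;

import javax.imageio.ImageIO;

// Illustrative sketch of the caching logic described above.
// super(16, 0.75f, true) makes the map access-ordered, so the
// "eldest" entry is the least recently used one.
class ImageCache extends LinkedHashMap<File, BufferedImage> {
    private final int maxEntries;

    ImageCache(int maxEntries) {
        super(16, 0.75f, true); // true = access order (LRU)
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<File, BufferedImage> eldest) {
        return size() > maxEntries; // step 4: evict least recently accessed entry
    }

    BufferedImage getImage(File file) throws IOException {
        BufferedImage img = get(file);   // step 1: cache lookup (also marks it recently used)
        if (img == null) {
            img = ImageIO.read(file);    // step 2: decode the file from disk
            put(file, img);              // step 3: insert; step 4 fires inside put
        }
        return img;                      // step 5
    }
}
```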
My question is: would it be faster, and probably less memory-hungry, to replace the raster data inside an existing BufferedImage rather than create a new BufferedImage and let the garbage collector reclaim the old one? In other words, a pool of BufferedImages to reuse. Also, I am using ImageIO.read(...) to create the BufferedImages. Is this the fastest and best method?
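For what it's worth, I believe the reuse idea could be expressed with ImageReadParam.setDestination, which asks the decoder to write into an existing, compatible BufferedImage instead of allocating a new one. This is a sketch of what I have in mind, not something I have benchmarked; readInto is a hypothetical helper, and the destination's size and type must match what the decoder produces (typically TYPE_3BYTE_BGR for an RGB JPEG):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.Iterator;

import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class ReusableRead {
    // Hypothetical helper: decode `file` directly into `dest`, reusing its
    // raster instead of allocating a fresh BufferedImage per read.
    static BufferedImage readInto(File file, BufferedImage dest) throws IOException {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext()) {
                throw new IOException("No ImageReader for " + file);
            }
            ImageReader reader = readers.next();
            try {
                reader.setInput(in);
                ImageReadParam param = reader.getDefaultReadParam();
                param.setDestination(dest);   // reuse the caller's buffer
                return reader.read(0, param); // returns dest itself
            } finally {
                reader.dispose();
            }
        }
    }
}
```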