J2K Lossy Decomposition Levels
I'm using the J2K image writer (codec) to perform lossy compression on uncompressed medical images. I have found that the images are often so over-compressed as to be unusable. This seems to depend on the number of decomposition levels specified. For example, things generally work well for CT images (512x512, 16 bit, MONOCHROME2) when I use a value of 8 for decomposition levels. However, if I use a value of 10, about a third of the images are over-compressed by a factor of 2 and contain visible artifacts. When I use CR images (2048x2500, 16 bits allocated, 12 bits stored, MONOCHROME2) it appears to work better with a value of 12 for decomposition levels.
Is there a rule of thumb for determining the appropriate number of decomposition levels to use when encoding an image?
Would such a rule be based on image size? Bits per pixel? Something else?
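For what it's worth, the sizes involved suggest a log2 relationship: each decomposition level halves the image, so one heuristic sometimes cited (an assumption on my part, not something I've confirmed for this codec) is to cap the levels at floor(log2(min(width, height))) so the lowest-resolution subband never shrinks below one pixel. A minimal sketch, with a hypothetical helper name:

```java
public class DecompositionLevels {
    // Hypothetical helper: cap decomposition levels at
    // floor(log2(min(width, height))), i.e. the number of times the
    // smaller dimension can be halved before dropping below 1 pixel.
    static int maxDecompositionLevels(int width, int height) {
        int minDim = Math.min(width, height);
        int levels = 0;
        while (minDim > 1) {
            minDim >>= 1; // each wavelet level halves each dimension
            levels++;
        }
        return levels;
    }

    public static void main(String[] args) {
        System.out.println(maxDecompositionLevels(512, 512));   // CT example: 9
        System.out.println(maxDecompositionLevels(2048, 2500)); // CR example: 11
    }
}
```

By this heuristic, 10 levels would exceed the bound for a 512x512 CT image, which might explain the artifacts I'm seeing there, though it doesn't obviously account for 12 levels working on the CR images.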
Why does the file size vary drastically with the number of decomposition levels even though compressionQuality remains constant? Shouldn't a compressionQuality of 0.2 mean the compressed file will be roughly 20% the size of the original uncompressed image?
Any input would be appreciated.