Pixel values are defined at integer coordinate locations. This means that when an image is rendered in a scaled, rotated, or otherwise transformed coordinate system, an interpolation algorithm must be used to provide a pixel value at any continuous coordinate.
Currently, Firefly sets RenderingHints.KEY_INTERPOLATION to RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR, which means that when an image is rendered in a transformed coordinate system, the value of the nearest integer-coordinate sample in the source image is used.
"As the image is scaled up, it will look correspondingly blocky. As the image is scaled down, the colors for source pixels will be either used unmodified, or skipped entirely in the output representation."
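The blocky nearest-neighbor behavior described above can be reproduced with a minimal, self-contained sketch (the class and method names here are illustrative, not Firefly's actual code):

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class NearestNeighborDemo {

    // Scale an image 2x using the nearest-neighbor hint that Firefly
    // currently sets on its Graphics2D.
    static BufferedImage scale2x(BufferedImage src) {
        BufferedImage dst = new BufferedImage(src.getWidth() * 2, src.getHeight() * 2,
                BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR);
        g.drawImage(src, 0, 0, dst.getWidth(), dst.getHeight(), null);
        g.dispose();
        return dst;
    }

    public static void main(String[] args) {
        // A 1x1 red source pixel: nearest neighbor simply replicates the
        // source sample into every output pixel, giving a solid red block.
        BufferedImage src = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
        src.setRGB(0, 0, 0xFF0000);
        BufferedImage dst = scale2x(src);
        System.out.println(Integer.toHexString(dst.getRGB(1, 1) & 0xFFFFFF));
    }
}
```

Every pixel of the 2x2 output carries the unmodified source color, which is exactly the block replication the quoted documentation describes.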
Jon Thaler's team would like to be able to choose a different interpolation algorithm depending on the situation.
As an example, see the various resize algorithms in [http://stackoverflow.com/questions/4756268/how-to-resize-the-buffered-image-n-graphics-2d-in-java].
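One possible shape for the requested feature is a resize helper that accepts the interpolation hint as a parameter, so callers can select nearest-neighbor, bilinear, or bicubic per situation. This is only a sketch; the helper name and signature are hypothetical, not part of Firefly:

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class InterpolationChooser {

    // Hypothetical helper: resize 'src' to w x h using a caller-chosen
    // interpolation hint (VALUE_INTERPOLATION_NEAREST_NEIGHBOR,
    // VALUE_INTERPOLATION_BILINEAR, or VALUE_INTERPOLATION_BICUBIC).
    static BufferedImage resize(BufferedImage src, int w, int h, Object interpolationHint) {
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, interpolationHint);
        g.drawImage(src, 0, 0, w, h, null);
        g.dispose();
        return dst;
    }

    public static void main(String[] args) {
        // A 2x1 black/white image: bilinear upscaling should introduce
        // intermediate gray values that nearest neighbor never produces.
        BufferedImage src = new BufferedImage(2, 1, BufferedImage.TYPE_INT_RGB);
        src.setRGB(0, 0, 0x000000);
        src.setRGB(1, 0, 0xFFFFFF);

        BufferedImage out = resize(src, 8, 1, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        boolean hasGray = false;
        for (int x = 0; x < out.getWidth(); x++) {
            int rgb = out.getRGB(x, 0) & 0xFFFFFF;
            if (rgb != 0x000000 && rgb != 0xFFFFFF) {
                hasGray = true;
            }
        }
        System.out.println("bilinear produced intermediate values: " + hasGray);
    }
}
```

Because the hint is just a value on the Graphics2D, exposing it as a parameter (or a user preference) is a small change; the same approach extends to any future VALUE_INTERPOLATION_* constants.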