Astrometric calibration can be systematically offset in the presence of DCR if the effective wavelength, weighted by the SEDs of the stars selected for fitting, does not match the nominal effective wavelength of the filter. In that case a constant offset should be added to each DCR subfilter shift; otherwise the recovered flux in each subfilter will be biased toward the blue or the red, depending on the SEDs of the selected stars.
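As a rough illustration of the mechanism (not the pipeline's actual atmospheric model: the dispersion coefficients, helper names, and toy filter/SED below are all invented for this sketch), the following estimates how much the SED-weighted effective wavelength of the fit stars can differ from the filter's nominal effective wavelength, and the residual DCR offset that mismatch implies:

```python
import numpy as np

def effective_wavelength(wavelengths, throughput, sed):
    """SED-weighted effective wavelength of a filter (hypothetical helper).

    wavelengths : array of wavelengths in nm
    throughput  : filter transmission at those wavelengths
    sed         : source spectral energy distribution at those wavelengths
    """
    weights = throughput * sed
    return np.sum(wavelengths * weights) / np.sum(weights)

def dcr_offset_arcsec(wavelength_nm, zenith_angle_deg):
    """Approximate differential refraction relative to 500 nm, in arcseconds.

    Uses a simplified dry-air Cauchy-style dispersion for (n - 1) as a
    stand-in for the full atmospheric model, with R ~ (n - 1) tan(z).
    """
    n_minus_1 = 1e-6 * (287.6 + 1.335e4 / wavelength_nm**2)
    n_minus_1_ref = 1e-6 * (287.6 + 1.335e4 / 500.0**2)
    tan_z = np.tan(np.radians(zenith_angle_deg))
    return (n_minus_1 - n_minus_1_ref) * tan_z * 206265.0  # radians -> arcsec

# Toy g-band-like top-hat filter, and two SEDs: flat vs. a red star.
wl = np.linspace(400.0, 550.0, 151)
throughput = np.ones_like(wl)
flat_sed = np.ones_like(wl)
red_sed = (wl / 400.0) ** 3  # rises toward the red

lambda_filter = effective_wavelength(wl, throughput, flat_sed)
lambda_stars = effective_wavelength(wl, throughput, red_sed)

# The astrometric offset silently absorbed by calibration when the fit
# stars are redder than the filter's nominal SED, at zenith angle 45 deg.
offset = (dcr_offset_arcsec(lambda_stars, 45.0)
          - dcr_offset_arcsec(lambda_filter, 45.0))
print(f"effective wavelength mismatch: {lambda_stars - lambda_filter:.1f} nm")
print(f"residual DCR offset: {offset * 1000:.2f} mas")
```

With these toy inputs the red-star effective wavelength lands redward of the filter's, and the corresponding offset is negative (redder light refracts less), which is the sense of the bias described above.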
This bias can be seen in simulations, where we can achieve "perfect" astrometric calibration by skipping the calibration step altogether, since the simulated source positions are exact. We can then compare the measured color of sources to the true color known from the simulation input. Note that these simulations include only point sources (stars, plus quasars shown in red), and that the reference catalog for calibration includes only stars. With no calibration errors we recover the input colors accurately:
However, if we include astrometric calibration, the slight shifts in the detected centroid locations of the stars result in a small overall offset in the astrometric solution. The resulting plot of measured color versus true color shows increased scatter and a systematic bias:
This ticket is to measure this shift in simulations, and to correct for it by adding a constant offset to the DCR subfilter shifts.
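The correction could be sketched roughly as below. This is a toy mock-up, not the pipeline implementation: the helper names are hypothetical, and it assumes we have matched measured and true positions from the simulation. The idea is to measure the median centroid offset of the calibration stars and subtract that constant from every subfilter's DCR shift:

```python
import numpy as np

def measure_median_shift(measured_xy, true_xy):
    """Median centroid offset (dx, dy) of matched stars, in pixels.

    measured_xy, true_xy : (N, 2) arrays of matched star positions
    (hypothetical helper; in practice this would come from matching the
    calibrated catalog against the simulation input).
    """
    return np.median(measured_xy - true_xy, axis=0)

def correct_subfilter_shifts(dcr_shifts, offset):
    """Subtract a constant offset from the per-subfilter DCR shifts.

    dcr_shifts : (n_subfilters, 2) array of (dx, dy) shifts
    offset     : (2,) constant shift measured from the calibration stars
    """
    return dcr_shifts - offset

# Toy example: three subfilters whose shifts share a common bias
# introduced by the astrometric fit.
rng = np.random.default_rng(42)
true_xy = rng.uniform(0.0, 100.0, size=(50, 2))
bias = np.array([0.03, -0.01])  # systematic calibration shift, pixels
measured_xy = true_xy + bias + rng.normal(0.0, 0.005, size=(50, 2))

offset = measure_median_shift(measured_xy, true_xy)
dcr_shifts = np.array([[0.10, 0.0], [0.0, 0.0], [-0.10, 0.0]]) + bias
corrected = correct_subfilter_shifts(dcr_shifts, offset)
```

The median is used rather than the mean so that a few poorly measured stars do not dominate the recovered offset; after subtraction the corrected shifts recover the unbiased per-subfilter values to within the centroiding noise.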