  RFC-416

Preliminary interface for transmission curves

    Details

    • Type: RFC
    • Status: Implemented
    • Resolution: Done
    • Component/s: DM
    • Labels:
      None

      Description

      On DM-12367, I've put together a proposal for a TransmissionCurve abstract base class for reporting the spatially- and wavelength-dependent throughput of an image; see in particular the headers linked there for the full proposal.

      I am under no illusions that this design will survive unchanged to operations, and it may just be a throwaway prototype in the end, but I think it gives us enough to start propagating wavelength-dependent throughput information through the pipelines and at least some parts of photometric calibration now.

      A few notes on the design, mostly on how I expect these objects to be used and interpreted (which cannot be seen in the files linked above):

      • We're not exposing this functionality as an expansion of the Filter API, because a TransmissionCurve is more general than a Filter - it could represent throughput variations due to CCD AR coatings, for instance, or the product of that with an actual Filter.
      • I am not currently planning to have Filter hold a TransmissionCurve, though I could be persuaded otherwise on this point. The problem is that the TransmissionCurve associated with a Filter would naturally be defined in focal plane coordinates, because the same Filter object is held by Exposures corresponding to different sensors. I'm worried that users of those Exposure objects will naively do exposure.getFilter().getTransmissionCurve() and pass pixel-coordinate points to that object. However, I do expect to persist the TransmissionCurve for a filter as a (butler) dataset that can be retrieved with a filter-only data ID.
      • The TransmissionCurve for an Exposure will be attached directly to the Exposure object and persisted with it (for both CCD images and coadds). This TransmissionCurve's spatial coordinate system will be the pixel coordinate system of that Exposure, and it should represent the full wavelength-dependent throughput of the system for that Exposure, even if that throughput comes from multiple physical components (see the first sketch after this list).
      • The fact that TransmissionCurve objects are composable and can be transformed to different coordinate systems does not mean they should be backed by AST; AST composes coordinate system transforms, which combine as function compositions, whereas TransmissionCurves combine by simple multiplication. TransmissionCurves can be transformed to different spatial coordinate systems via afw::geom::Transform objects (though only the Point2 endpoint is supported), as shown in the headers linked above, and Transform does of course use AST under the hood (see the second sketch after this list). While we could conceivably support transforming the wavelength axis with AST as well, I do not think we have a use case for doing so (at least right now).
      • Someday, TransmissionCurve and PhotoCalib probably ought to be integrated. I'd like to mostly punt on this so that work that depends on each can proceed uninterrupted for the moment, and simply say that I think we should be able to replace the BoundedField held by PhotoCalib with a TransmissionCurve in the future. For now (and possibly for quite a while into the future), we will interpret the PhotoCalib of an Exposure as corresponding to some fiducial wavelength-dependent transmission, and the TransmissionCurve as a multiplicative correction to that.
      • I'd also like to punt for now on the throughput units and normalization for TransmissionCurve in general, as noted in the docstrings in the header; I'd like to see what different stages of the pipeline that work with TransmissionCurves know about the overall normalization or flux units before committing to anything. As noted in the previous bullet, the TransmissionCurve attached to an Exposure is interpreted as a multiplicative correction to the Calib (soon to be PhotoCalib), and is hence unitless.
      • In contrast, I believe we can just pick a wavelength unit and be done with it, and I have done so in this proposal (Angstroms). I don't care all that deeply about that particular choice, but I don't think this is a case where we need a type system (like Angle or astropy.units) that lets developers use different units in different places.
      • I know other parts of the project (the Sims team in particular) have had Python objects that fill something like this role for a long time, with (I believe) a much richer interface for high-level operations (e.g. integral products between transmission curves and SEDs; see the third sketch after this list). The interface proposed here is really just the low-level part that different implementations of TransmissionCurve will have to support; we can add convenience methods to the base class in a future RFC. Because the low-level interface here is still a superset of what Eli Rykoff came up with (in Python) for his FGCM work, I have some confidence that it is sufficiently flexible for that. Checking whether the higher-level interfaces the Sims team has found useful can be implemented on top of these lower-level ones is certainly in scope, however (pinging Scott Daniel!).
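
      As a concrete illustration of the per-Exposure usage above, here is a minimal Python sketch (the first sketch referenced from the list). The accessor and method names (getTransmissionCurve, sampleAt) follow the intent of the proposal but are illustrative rather than a committed API, and afwGeom.Point2D is just the existing afw point type.

          import numpy as np
          import lsst.afw.geom as afwGeom

          def throughput_at_source(exposure, x, y, wavelengths):
              """Sample an Exposure's attached throughput at a pixel position.

              The curve attached to an Exposure is defined in that Exposure's
              own pixel coordinate system, so pixel coordinates can be passed
              directly.  Wavelengths are in Angstroms and the returned
              throughputs are unitless.
              """
              curve = exposure.getInfo().getTransmissionCurve()  # assumed accessor
              return curve.sampleAt(afwGeom.Point2D(x, y), wavelengths)

          # e.g. sample every 10 Angstroms across the optical range at a source:
          # throughput = throughput_at_source(exposure, 1023.5, 511.0,
          #                                   np.arange(3000.0, 11000.0, 10.0))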
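
      The second sketch illustrates composition and coordinate transformation. The method names (transformedBy, multipliedBy) are again assumptions drawn from the description above, with transformedBy assumed to apply the given transform to spatial points before evaluating the original curve; pixelToFocalPlane stands for an afw::geom::Transform with Point2 endpoints mapping this detector's pixel coordinates to focal plane coordinates.

          def per_detector_curve(detectorCurve, filterCurve, pixelToFocalPlane):
              """Combine a detector curve (defined in pixel coordinates) with a
              filter curve (defined in focal plane coordinates) into a single
              curve in this detector's pixel coordinate system.
              """
              # Re-express the filter curve so it accepts pixel coordinates:
              # the transform maps pixel points to the focal plane points at
              # which the filter curve is actually defined.
              filterCurveInPixels = filterCurve.transformedBy(pixelToFocalPlane)
              # Throughputs multiply, so composing two curves is just a product.
              return detectorCurve.multipliedBy(filterCurveInPixels)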
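
      Finally, the third sketch shows the kind of higher-level convenience operation (an integral product of a transmission curve with an SED) that could later be layered on top of the low-level interface; sampleAt is the same assumed method as above, and the rest is plain numpy.

          import numpy as np

          def synthetic_flux(curve, position, wavelengths, sed_flux_density):
              """Integrate an SED against the throughput sampled at one position.

              Wavelengths are in Angstroms (the proposal's convention); the
              sampled throughput is unitless, so the result carries whatever
              flux units the caller's SED uses.
              """
              throughput = curve.sampleAt(position, wavelengths)  # assumed method
              return np.trapz(throughput * sed_flux_density, wavelengths)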


              People

              • Assignee: Jim Bosch (jbosch)
              • Reporter: Jim Bosch (jbosch)
              • Watchers: Ian Sullivan, Jim Bosch, Merlin Fisher-Levine, Scott Daniel
