
Details

• Type: RFC
• Status: Implemented
• Resolution: Done
• Component/s:
• Labels:
None

Description

The cp_verify package is being developed to calculate quality metrics for calibration products and to compare those metrics against the thresholds set for the various tests described in DMTN-101. To be clear, this is not a verification of the stack code, but a check that the input raw exposures have produced a calibration product that will serve its intended purpose.
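The core pattern described above can be sketched as a simple metric-vs-threshold check. This is only an illustration of the idea; the function name, metric, and threshold value are hypothetical and do not come from cp_verify or DMTN-101.

```python
# Hypothetical sketch of the metric-vs-threshold pattern; names and
# values are illustrative, not the actual cp_verify or DMTN-101 tests.

def verify_metric(measured: float, threshold: float, comparison: str = "<=") -> bool:
    """Return True if the measured metric satisfies its threshold."""
    if comparison == "<=":
        return measured <= threshold
    if comparison == ">=":
        return measured >= threshold
    raise ValueError(f"Unknown comparison: {comparison}")

# Example: check a (hypothetical) bias residual mean against a 1 ADU threshold.
bias_residual_mean = 0.42  # illustrative measured value, in ADU
print(verify_metric(bias_residual_mean, 1.0))  # True
```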

There are currently tests for biases, darks, and defects, with flats and brighter-fatter kernels in active development. The remaining calibrations will be added as their metrics and tests are defined, and DMTN-101 will be updated to document the tests that are added.

Although the package is not yet complete, adding it to lsst_distrib now will ensure that it is available on the summit computers without requiring a separate installation. Since all daily calibration data are expected to be run through cp_verify to monitor the stability of the camera, this will ensure that workflows at the summit are well defined.

If there is concern that this package doesn't match the meaning of "verify" used elsewhere, I am open to renaming the package to remove those concerns.

Activity

Tim Jenness added a comment -

Where are the metrics sent? Are they butler datasets?

Christopher Waters added a comment -

They are stored in butler datasets and accumulated at both the exposure and the full-processing-run level, so there is both a boolean "did all tests pass" and a way to drill down to the value at a particular [exposure, detector, amplifier, test] level. I'm working on ways to visualize these results, with a set of notebooks handling the butler retrieval of the cp_verify results and associated images to investigate why something failed.
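The two levels of accumulation described above can be sketched in plain Python. The nesting, key names, and pass/fail values below are assumptions chosen for illustration, not the actual cp_verify storage schema or butler dataset layout.

```python
# Illustrative sketch of per-test results keyed by
# (exposure, detector, amplifier, test); values are pass/fail booleans.
results = {
    ("exp001", "det0", "C00", "BIAS_MEAN"): True,
    ("exp001", "det0", "C01", "BIAS_MEAN"): False,
    ("exp002", "det0", "C00", "BIAS_MEAN"): True,
}

def exposure_passed(results, exposure):
    """Roll the per-test booleans up to a single exposure-level boolean."""
    return all(ok for (exp, *_), ok in results.items() if exp == exposure)

# Roll up to the full processing run: the single "did all tests pass" boolean.
run_passed = all(results.values())

print(exposure_passed(results, "exp002"))  # True
print(run_passed)                          # False (one amplifier test failed)
```

Storing the fine-grained booleans and computing the rollups from them is what allows both the quick "did everything pass" check and the drill-down to a single failing amplifier.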

Some subset that are useful for determining camera stability will likely be published to SQuaSH in the future to take advantage of their time domain plotting.

Kian-Tat Lim added a comment -

To be clear, the DMCCB was OK with the name.


People

Assignee:
Christopher Waters
Reporter:
Christopher Waters
Watchers:
Andrés Alejandro Plazas Malagón, Christopher Waters, Colin Slater, Jim Bosch, John Parejko, Kian-Tat Lim, Leanne Guy, Michelle Butler [X] (Inactive), Robert Lupton, Tim Jenness, Wil O'Mullane, Yusra AlSayyad

Dates

Created:
Updated:
Resolved:
Planned End:

Jenkins

No builds found.