Data Management / DM-11155

Implement object count metrics

    Details

    • Type: Story
    • Status: Done
    • Resolution: Done
    • Fix Version/s: None
    • Component/s: Alert Production
    • Labels: None
    • Story Points: 6
    • Sprint: Alert Production F17 - 7, Alert Production F17 - 8, Alert Production F17 - 11, AP S18-1
    • Team: Alert Production

    Description

      By the All Hands Meeting, we should be able to demonstrate verification metrics by reporting object counts for pipeline stages. This ticket covers writing code for applying the verify framework to output data, analyzing each stage, and supporting SQuaSH export (though the export itself is the responsibility of verify_ap).

      The following metrics are covered (a sketch of how one such count might be reported through the verify framework follows the list):

      • Fraction of science image sources that become DiaSources after image differencing.
      • Fraction of previously-known DiaObjects that have detections in a new difference image.
      • Number of DiaSources that create new DiaObjects in a difference image.
      • Number of DiaObjects associated with only one DiaSource.
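
      As an illustration of the kind of code this ticket calls for, here is a minimal sketch of reporting one such count through the lsst.verify framework. The metric name and the value are placeholders; the real metric definitions and the SQuaSH upload belong to the verification packages, not to this sketch.

        import astropy.units as u
        from lsst.verify import Job, Measurement

        # Placeholder metric name and count; real definitions live in the
        # verification metrics packages.
        job = Job()
        num_new_dia_objects = Measurement("ap_association.numNewDiaObjects",
                                          42 * u.count)
        job.measurements.insert(num_new_dia_objects)
        # Persist in the lsst.verify JSON format for a later SQuaSH upload.
        job.write("ap_verify.verify.json")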

    Activity

            John Swinbank added a comment - edited

            I see the deadline of "by the All Hands Meeting", which is only a week and a half away, so I'm a bit worried that there doesn't seem to be anybody assigned to work on this ticket. Can somebody fill me in on what the plan is here, please?

            Eric Bellm added a comment

            This is probably going to be assigned to Krzysztof Findeisen, but I think it's unlikely we'll meet the AHM goal unless DM-11040 turns out to be faster than expected. Since we have found some sticking points in the MVS development, we will prioritize discussing them with SQuARE, DRP, and DAX at the AHM.

            Krzysztof Findeisen added a comment - edited

            Chris Morrison, measurers live in the lsst.ap.verify.measurements package. The intended architecture is that each set of "related" metrics (so probably everything on this ticket) lives in a module in measurements, and the measurers get called by code in compute_metrics.py. The functions in compute_metrics.py are the only ones visible to ap_verify, insulating the main module from the details of exactly which metrics are defined.
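
            For concreteness, a sketch of that structure; the module names, function names, and metadata keys below are illustrative assumptions, not the actual ap_verify code.

              # measurements/association.py -- one module per set of related metrics
              import astropy.units as u
              from lsst.verify import Measurement

              def measure_number_new_dia_objects(metadata, task_name, metric_name):
                  """Build a Measurement from task metadata, or return None if the key is absent.

                  `metadata` is assumed to behave like lsst.daf.base.PropertySet.
                  """
                  key = task_name + ".numNewDiaObjects"
                  if not metadata.exists(key):
                      return None
                  return Measurement(metric_name, metadata.getAsInt(key) * u.count)

              # compute_metrics.py -- the only functions ap_verify itself calls
              def measure_from_metadata(metadata):
                  """Return every Measurement derivable from the combined pipeline metadata."""
                  measurements = []
                  meas = measure_number_new_dia_objects(
                      metadata, "association", "ap_association.numNewDiaObjects")
                  if meas is not None:
                      measurements.append(meas)
                  return measurements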

            As you can probably tell from the measure_from_metadata function, my initial assumption was that all measurements that couldn't be provided by tasks would be computed from metadata. I see two ways to quickly adapt the package for Butler/database-oriented measurements (a sketch of the second option follows the list):

            • Since our current handling of the output repository and L1 database is already full of temporary hacks, we could just hard-code the locations within, e.g., measure_from_database.
            • We could pass in the repository location and/or database, the way measure_from_metadata currently accepts the combined metadata produced by ap_pipe. The least disruptive way to get that information from the current design would be to modify ap.pipe.doAssociation (and the other pipeline step functions?) to return the final output repo and/or database file, which would then be propagated up to ap.verify._measure_final_properties. (Note that doAssociation does not tie the DB sub-repository to those from earlier pipeline steps; this is probably a bug.)
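
            A sketch of the second option, assuming the L1 database is a SQLite file whose path ap.pipe hands back up the call chain; the table and column names are placeholders for whatever the prompt products schema actually uses.

              import sqlite3

              import astropy.units as u
              from lsst.verify import Measurement

              def measure_total_unassociated_dia_objects(db_path, metric_name):
                  """Count DiaObjects associated with exactly one DiaSource."""
                  with sqlite3.connect(db_path) as conn:
                      # Placeholder schema; adjust to the real L1 database tables.
                      (count,) = conn.execute(
                          "SELECT COUNT(*) FROM dia_objects "
                          "WHERE n_dia_sources = 1").fetchone()
                  return Measurement(metric_name, count * u.count)
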
            Chris Morrison added a comment

            Created ticket branch on ap_verify.

            Ian Sullivan added a comment

            Looks good, and everything built on Jenkins.

            Chris Morrison added a comment

            Merged.


    People

    • Assignee: Chris Morrison
    • Reporter: Krzysztof Findeisen
    • Reviewers: Ian Sullivan
    • Watchers: Chris Morrison, Eric Bellm, Ian Sullivan, John Swinbank, Krzysztof Findeisen
