  Data Management / DM-21916

SQuaSH upload of Gen 3 Measurements


    Details

      Description

      The existing upload script (dispatch_verify.py) assumes that metric values have been persisted in Job objects. Once we can store metric values in Gen 3 repositories, we need to be able to read those as part of the upload process.

      The story point value assumes that using the Gen 3 Butler inside something that's not a PipelineTask is difficult.
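      As a rough sketch of what the Gen 3 read step could look like (assuming, as metric-pipeline-tasks does, that metric values are persisted as lsst.verify Measurement datasets whose dataset type names start with "metricvalue_"; the repository path and collection name are placeholders):

{code:python}
from lsst.daf.butler import Butler

# Placeholder repository and collection; real values would come from the
# command line of the upload script.
butler = Butler("/repo/main", collections=["u/someone/metric-run"])

# Find every persisted metric value in the collection.  The "metricvalue_*"
# dataset type naming is an assumption borrowed from metric-pipeline-tasks.
refs = butler.registry.queryDatasets("metricvalue_*", findFirst=True)
measurements = [butler.get(ref) for ref in refs]
{code}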


            Activity

            krzys Krzysztof Findeisen added a comment -

            I assumed the script in metric-pipeline-tasks was a workaround for this issue. Or do you plan for the Job class to always be required as a sort of intermediate step?

            afausti Angelo Fausti added a comment -

            I think reconstructing the Job as an intermediate step is a possible approach for using dispatch_verify.py to submit to SQuaSH. This code could live in the lsst.verify package and be used by dispatch_verify.py when pointed at a Butler Gen 3 registry. I haven't thought much about this yet, but it seems promising.
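            As a minimal sketch of that intermediate step (the repository path, collection, metadata key, and output file name are all illustrative; the dataset-type pattern again follows metric-pipeline-tasks):

{code:python}
from lsst.daf.butler import Butler
from lsst.verify import Job

# Placeholder repository and collection, as in the sketch under the Description.
butler = Butler("/repo/main", collections=["u/someone/metric-run"])
refs = butler.registry.queryDatasets("metricvalue_*", findFirst=True)

# Rebuild a Job from the persisted Measurement datasets.
job = Job()
for ref in refs:
    job.measurements.insert(butler.get(ref))

# Job-level metadata that SQuaSH can display; the key/value here is illustrative.
job.meta.update({"instrument": "HSC"})

# Write the JSON format that dispatch_verify.py already understands.
job.write("gen3_metrics.verify.json")
{code}
            The resulting file could then be submitted with the existing script, e.g. dispatch_verify.py --url <SQuaSH API endpoint> --user ... --password ... gen3_metrics.verify.json (exact options as documented for lsst.verify).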

            krzys Krzysztof Findeisen added a comment - edited

            I've been working on updating our SQuaSH stuff for DM-24262, and I noticed we don't seem to have a tag(?) that distinguishes between Gen 2 and Gen 3 runs. This would be very useful once we have more data from both.

            krughoff Simon Krughoff added a comment -

            That is a good point. I hope to have validate_drp running on HSC in nightlies using both gen2 and gen3 in the next few weeks. It would be very good to be able to differentiate between the two using tags.

            I was initially thinking about starting another database, but I think separating by tags is probably better.
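            If the Job-reconstruction route above is taken, one lightweight way to carry that distinction would be to stamp the middleware generation into the job metadata before dispatch; the key name below is an illustrative assumption, not an agreed SQuaSH field:

{code:python}
# "job" is the lsst.verify.Job assembled in the earlier sketch.
# Illustrative metadata key; the actual name would need to be agreed upon
# so that SQuaSH dashboards can filter on it consistently.
job.meta.update({"butler_generation": "gen3"})  # or "gen2" for Gen 2 runs
{code}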

            ebellm Eric Bellm added a comment -

            Here is a Slack discussion of the faro implementation (https://github.com/lsst-dmsst/metric-pipeline-tasks/blob/master/python/metric_pipeline_scripts/jobReporter.py).

              People

              Assignee:
              krzys Krzysztof Findeisen
              Reporter:
              krzys Krzysztof Findeisen
              Watchers:
              Angelo Fausti, Eric Bellm, Ian Sullivan, Jim Bosch, Krzysztof Findeisen, Simon Krughoff
              Votes:
              0

                Dates

                Created:
                Updated: