RFC-542: Reorganizing ci_hsc (and organizing ci_lsst?)


    Details

    • Type: RFC
    • Status: Withdrawn
    • Resolution: Done
    • Component/s: DM
    • Labels: None

      Description

      Last week, the DRP team started to prototype a refactoring of ci_hsc on DM-9059 with the goal of adding test coverage for pipe_drivers and ultimately (on different tickets) PipelineTasks.  That would involve something like the following packages:

      • ci_hsc_data: (or just ci_hsc; name TBD) a git-lfs package (the only one here) containing all raw data, reference catalogs, master calibrations, etc.  "Building" this package essentially just runs (Gen2) ingest.  Installing the package copies the data repository to the installation directory, so it can be picked up by downstream packages in lsstsw/Jenkins.  This package also contains common Python validation code used by downstream code to test the existence and properties of expected outputs; a minimal sketch of such a helper appears after this list.
      • ci_hsc_cmdLineTasks: contains the remainder of the current ci_hsc scripts, and hence runs single-frame processing, warping, coadding, and multi-band coadd processing, by calling the CmdLineTasks individually.  Outputs go into a new repo with the installed ci_hsc_data repo as its parent.  Invokes ci_hsc_data's validation scripts to check that everything we expected to see is present.
      • ci_hsc_drivers: contains almost the same algorithmic content as ci_hsc_cmdLineTasks, but invoked via the higher-level scripts in pipe_drivers, which sometimes have slightly different behavior and certainly involve different code paths.  Also runs the validation scripts.  Depends only on ci_hsc_data.
      • ci_hsc_gen3: Depends on ci_hsc_drivers and ci_hsc_cmdLineTasks, and creates a Gen3 repo view into their output repos.  Uses Gen3 tools to compare the outputs and ensure they're the same in the ways that matter to us.
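
      To make the split concrete, here is a minimal sketch of what the shared validation helper in ci_hsc_data could look like. The function name, signature, dataset types, and data IDs are illustrative assumptions rather than an existing ci_hsc interface; the only API relied upon is the Gen2 Butler's datasetExists.

          import lsst.daf.persistence as dafPersist


          def checkExpectedOutputs(repoPath, expected):
              """Check that every expected (datasetType, dataId) pair exists.

              Hypothetical shared helper for downstream ci_hsc_* packages; the
              name and signature are assumptions for illustration only.
              """
              # Gen2 Butler pointed at the output repository to validate; parent
              # repositories (e.g. the installed ci_hsc_data repo) are searched
              # automatically on read.
              butler = dafPersist.Butler(repoPath)
              missing = [(datasetType, dataId)
                         for datasetType, dataId in expected
                         if not butler.datasetExists(datasetType, dataId)]
              if missing:
                  raise RuntimeError("Missing expected outputs: {}".format(missing))


          # Placeholder usage from a downstream package such as ci_hsc_cmdLineTasks;
          # the repository path, dataset types, and data IDs are invented for this sketch.
          checkExpectedOutputs("DATA/rerun/ci-outputs",
                               [("calexp", dict(visit=903334, ccd=16)),
                                ("deepCoadd_meas", dict(tract=0, patch="5,4", filter="HSC-I"))])

      A package like ci_hsc_cmdLineTasks or ci_hsc_drivers would call this helper after its processing steps finish, so that both flavors check the same list of expected outputs.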

      In the future I expect to add:

      • Gen3 raw ingest directly to ci_hsc_data.
      • A new package to run the same code again in PipelineTask form.  This will start out depending on ci_hsc_gen3 in order to use Gen2 outputs that we can't yet produce via PipelineTasks, but when complete it should depend only on ci_hsc_data, and eventually replace both ci_hsc_cmdLineTasks and ci_hsc_drivers.  And, ultimately, I'm hoping we'll have better ways of running Gen3 pipelines in CI than building them as if they were uncompiled code - but I don't think we should treat full retirement of Gen2 and a Pipeline-based test harness as right around the corner, and in the meantime we need this expanded test coverage.

      This is, of course, an initial factor of two and an eventual factor of three more processing than ci_hsc does today.  But I think that's inevitable; we have 2-3x more stuff that we should be testing than we are currently testing.  It's also an even larger expansion in disk space, as ci_hsc doesn't install anything today, and with installs we'd essentially have two copies of all data products.

      That said, a big goal of this refactoring is to make it possible to run individual pipeline flavors without running all of them.  Once ci_hsc_drivers is in good shape, I think it's all most Science Pipelines developers will need to run on most tickets, because there is a substantial overlap in coverage between _drivers and _cmdLineTasks.  Gen3 middleware and PipelineTask developers would probably find it most efficient to install and infrequently update copies of ci_hsc_data, ci_hsc_cmdLineTasks, and ci_hsc_drivers, and rely on local runs of ci_hsc_gen3 under the usually-reasonable assumption that they aren't breaking Gen2 pipelines.  And of course we can build all of these packages to ensure complete coverage on a timer or on ticket branches that are particularly far-reaching.

      As (I believe) we're about to create one or more ci_lsst packages for various kinds of simulated, test-data, and eventually on-sky LSST data, it might be worth discussing here as well whether they should follow a similar pattern.  I'm not convinced the answer is yes, as I think it's much more important right now to organize LSST data in a way that makes sense for more complete testing of the earlier stages of the pipeline, rather than for end-to-end testing of the DRP pipelines and Gen3 middleware.  But it's a discussion we should have.


            Activity

            hchiang2 Hsin-Fang Chiang added a comment -

            Will both ci_hsc_* and ci_lsst be part of the lsst_ci, or do they stay independent?

            jbosch Jim Bosch added a comment - - edited

            Will both ci_hsc_* and ci_lsst be part of the lsst_ci, or do they stay independent?

            It's hard for me to make a recommendation for ci_lsst without knowing what its scope will be.  I would prefer that ci_hsc_* remain distinct, and hence opt-in; while I think most developers should be running ci_hsc on a healthy fraction of their tickets prior to merge (and I expect that to apply to at least one of ci_hsc_cmdLineTasks or ci_hsc_drivers going forward), I'm happy relying on developer judgement as to when to do more or less than that.  It's hard for me to imagine us being able to put together tooling that would rigorously answer the question of whether a particular changeset could affect a particular integration test, and hence I think developer judgement is all we've got.

            jhoblitt Joshua Hoblitt added a comment - - edited

            Is ci_hsc_data being proposed to subsume validation_data_hsc or exist in parallel?

            tjenness Tim Jenness added a comment -

            It has to be separate because, if I recall correctly, validation_data_hsc is huge and we don't want it on the Jenkins nodes. I do wonder whether ci_hsc and testdata_subaru could be combined, though.

            Another comment I have is that it would be better if ci_hsc_data did not depend on sconsUtils. Testdata packages don't, so there is precedent for this. afwdata does, and it's really annoying: you end up with lots of installed copies of afwdata in your stack tree because they get recopied every time sconsUtils is tweaked.

            jhoblitt Joshua Hoblitt added a comment -

            The multiple installations happen because sconsUtils provides an install target, but the same behavior should occur with any package that lsst-build/eupspkg.sh can default_install. What is probably needed is some way of flagging that changes to the dependencies, or at least to selected dependencies, don't cause a rebuild/re-installation of "data" packages. Alternatively, pure data packages could be considered the user's responsibility to manage, with eups-level dependencies removed completely. This is essentially how the validate_drp Jenkins job functions.

            tjenness Tim Jenness added a comment -

            The multiple installations occur when a package depends on sconsUtils and sconsUtils is changed. The testdata_X packages only ever get installed once (assuming they don't change themselves) because they have no dependencies listed in the table file that can cause them to be reinstalled.
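
            For concreteness, the table-file point can be illustrated with two hypothetical EUPS table files; the contents below are assumptions for illustration and are not copied from any real package. A pure data package's table file can declare no dependencies at all, so nothing upstream can ever mark it for reinstallation:

                # ups/ci_hsc_data.table (hypothetical)
                # Intentionally empty: with no setupRequired lines, no change to
                # sconsUtils or anything else triggers a rebuild or reinstall.

            whereas a package whose table file declares a build dependency gets reinstalled every time that dependency changes:

                # ups/afwdata.table (illustrative contents)
                setupRequired(sconsUtils)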

            tjenness Tim Jenness added a comment -

            Jim Bosch what do you want to do about this RFC?

            jbosch Jim Bosch added a comment -

            Withdrawing this, at least for now; those who would have done the work will be focused more directly on Gen3 conversion work for the next couple of months, and once that has landed, the way we'll want to approach this will probably change.


              People

              Assignee:
              jbosch Jim Bosch
              Reporter:
              jbosch Jim Bosch
              Watchers:
              Hsin-Fang Chiang, Jim Bosch, John Parejko, Joshua Hoblitt, Krzysztof Findeisen, Tim Jenness
              Votes:
              0


                  Jenkins

                  No builds found.