An interesting update on this issue: I don't think it's quite as simple as we first thought. Since I couldn't figure out why I didn't see the same failure when working on DM-33911, I did another test: I set up weekly 25 (which has the fix from 35162 but not this one) and ran exactly what Meredith Rawls tried, modulo the output collection name and a restriction to a single patch for speed:
pipetask --long-log run -j 12 -b /repo/dc2 -d "skymap='DC2' AND tract=4431 AND patch IN (9)" -i 2.2i/defaults -o u/kherner/DM-35359_w25 -p $AP_PIPE_DIR/pipelines/LsstCamImSim/ApTemplate.yaml --register-dataset-types --clobber-outputs
Sure enough, I could reproduce the problem. Then, without any changes, I tried something very similar on some of the data that goes into the diffim sprint datasets:
pipetask run -j8 -i HSC/runs/RC2/w_2022_22/DM-34983 -b /repo/main -d "tract=9813 AND instrument='HSC' AND skymap='hsc_rings_v1' AND patch=40" -p $AP_PIPE_DIR/pipelines/HyperSuprimeCam/ApTemplate.yaml -o u/kherner/DM-35359_w25 --register-dataset-types --clobber-outputs
but that worked! You'll immediately see two differences:
1) Different pipeline YAML files
2) Different input collections (from different repos)
As for the first, they do indeed pull in (very) different makeWarp.py files ($AP_PIPE_DIR/config/LSSTCam-imSim/makeWarp.py and $AP_PIPE_DIR/config/HSC/makeWarp.py), but neither one mentions doApplyFinalizedPsf at all! Maybe it's being set indirectly by some other setting that differs between the two (I would have to look at the code again), but I'm not sure what that mechanism would be. As for the second, it's not clear why the input collection would matter.
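In case it helps whoever digs into this next, here is a minimal, untested sketch of how one could trace where doApplyFinalizedPsf actually ends up being set when each pipeline is expanded, using lsst.pipe.base.Pipeline and pex_config's formatHistory. The "makeWarp" task label and the exact Pipeline methods (fromFile / toExpandedPipeline) are my recollection of the API rather than something I've verified here, so adjust as needed:

import os

from lsst.pipe.base import Pipeline

ap_pipe_dir = os.environ["AP_PIPE_DIR"]
for yaml_path in ("pipelines/LsstCamImSim/ApTemplate.yaml",
                  "pipelines/HyperSuprimeCam/ApTemplate.yaml"):
    # Expanding the pipeline should apply the instrument and pipeline config overrides
    pipeline = Pipeline.fromFile(os.path.join(ap_pipe_dir, yaml_path))
    for task_def in pipeline.toExpandedPipeline():
        if task_def.label == "makeWarp":  # assumed task label in ApTemplate.yaml
            print(yaml_path, task_def.config.doApplyFinalizedPsf)
            # formatHistory lists every assignment to the field and where it came from
            print(task_def.config.formatHistory("doApplyFinalizedPsf"))

One could also pull the persisted makeWarp config datasets out of the two output collections with the butler and diff them, to confirm what each run actually used.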
Finally, I don't think any of that is a reason not to go ahead and merge this change, since it certainly does what we want, but I wanted to bring it up in case we're missing something more sinister going on behind the scenes.
Would you be willing to review this small config update, Ken? I'll make the PR momentarily.