Data Management / DM-14282

IndexError in detectCoaddSources scaleVariance


    Details

    • Type: Story
    • Status: Done
    • Resolution: Done
    • Fix Version/s: None
    • Component/s: None
    • Story Points: 4
    • Sprint: DRP S18-6
    • Team: Data Release Production

      Description

      While running coaddDriver on the PDR1 data with w_2018_15, I got 8 cases of the following IndexError:

      coaddDriver WARN: lsst-verify-worker23:129753: Caught IndexError while detection on DataId(initialdata={'tract': 16972, 'filter': 'HSC-G', 'patch': '0,7'}, tag=set()): cannot do a non-empty take from an empty axes.
      coaddDriver INFO: lsst-verify-worker23:129753: Traceback:
      Traceback (most recent call last):
        File "/software/lsstsw/stack3_20171023/stack/miniconda3-4.3.21-10a4fa6/Linux64/ctrl_pool/15.0-1-ga91101e+7/python/lsst/ctrl/pool/parallel.py", line 512, in logOperation
          yield
        File "/software/lsstsw/stack3_20171023/stack/miniconda3-4.3.21-10a4fa6/Linux64/pipe_drivers/15.0-3-ga03b4ca+9/python/lsst/pipe/drivers/coaddDriver.py", line 321, in coadd
          detResults = self.detectCoaddSources.runDetection(coadd, idFactory, expId=expId)
        File "/software/lsstsw/stack3_20171023/stack/miniconda3-4.3.21-10a4fa6/Linux64/pipe_tasks/15.0-5-g389937dc+5/python/lsst/pipe/tasks/multiBand.py", line 301, in runDetection
          varScale = self.scaleVariance.run(exposure.maskedImage)
        File "/software/lsstsw/stack3_20171023/stack/miniconda3-4.3.21-10a4fa6/Linux64/pipe_tasks/15.0-5-g389937dc+5/python/lsst/pipe/tasks/scaleVariance.py", line 116, in run
          factor = self.pixelBased(maskedImage)
        File "/software/lsstsw/stack3_20171023/stack/miniconda3-4.3.21-10a4fa6/Linux64/pipe_tasks/15.0-5-g389937dc+5/python/lsst/pipe/tasks/scaleVariance.py", line 155, in pixelBased
          q1, q3 = np.percentile(snr[isGood], (25, 75))
        File "/software/lsstsw/stack3_20171023/python/miniconda3-4.3.21/lib/python3.6/site-packages/numpy/lib/function_base.py", line 4269, in percentile
          interpolation=interpolation)
        File "/software/lsstsw/stack3_20171023/python/miniconda3-4.3.21/lib/python3.6/site-packages/numpy/lib/function_base.py", line 4011, in _ureduce
          r = func(a, **kwargs)
        File "/software/lsstsw/stack3_20171023/python/miniconda3-4.3.21/lib/python3.6/site-packages/numpy/lib/function_base.py", line 4386, in _percentile
          x1 = take(ap, indices_below, axis=axis) * weights_below
        File "/software/lsstsw/stack3_20171023/python/miniconda3-4.3.21/lib/python3.6/site-packages/numpy/core/fromnumeric.py", line 134, in take
          return _wrapfunc(a, 'take', indices, axis=axis, out=out, mode=mode)
        File "/software/lsstsw/stack3_20171023/python/miniconda3-4.3.21/lib/python3.6/site-packages/numpy/core/fromnumeric.py", line 57, in _wrapfunc
          return getattr(obj, method)(*args, **kwds)
      IndexError: cannot do a non-empty take from an empty axes.
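
      The failing call is np.percentile(snr[isGood], (25, 75)) in ScaleVarianceTask.pixelBased: when the isGood mask rejects every pixel in the patch, snr[isGood] is an empty array, and the NumPy bundled with this stack raises exactly this IndexError. A minimal sketch in plain NumPy, with no LSST dependencies (the array names mirror scaleVariance.py but are otherwise illustrative):

      import numpy as np

      # Stand-in for the per-pixel signal-to-noise image built in pixelBased.
      snr = np.random.normal(size=(100, 100))

      # If every pixel is masked (e.g. the patch has no usable data),
      # boolean selection leaves nothing behind.
      isGood = np.zeros(snr.shape, dtype=bool)

      selected = snr[isGood]  # empty array, shape (0,)
      q1, q3 = np.percentile(selected, (25, 75))
      # IndexError: cannot do a non-empty take from an empty axes.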
      

      The traceback is similar to, but not identical to, those in DM-13563 and DM-13517. The 8 cases comprise 2 in UDEEP, 5 in DEEP, and 1 in WIDE. To reproduce them, the data IDs are:

      UDEEP: --id tract=8766 filter=HSC-G patch=8,3 --selectId ccd=0..8^10..103 visit=9830^9832^9834^9836^9874^9878^11626^11628^11642^11644^11660^11662^42346^42348^42350^42352^42354^42356
      UDEEP: --id tract=9571 filter=HSC-Y patch=7,7 --selectId ccd=0..8^10..103 visit=318^322^324^326^328^330^332^340^344^346^348^350^352^354^356^358^360^362^1856^1868^1870^1872^1874^1876^1880^1882^11716^11718^11720^11722^11724^11726^11728^11730^11732^11734^11736^11740^22604^22606^22608^22626^22628^22630^22632^22642^22644^22646^22648^22658^22660^22662^22664
      DEEP: --id tract=9708 filter=HSC-G patch=7,6 --selectId ccd=0..8^10..103 visit=9796^9798^9812^9820^34488^34496^34504^34512
      DEEP: --id tract=9812 filter=HSC-G patch=5,3 --selectId ccd=0..8^10..103 visit=29308^29310^29316^29318^29328^29332^29342^29346^29354^29358
      DEEP: --id tract=8766 filter=HSC-I patch=0,5 --selectId ccd=0..8^10..103 visit=46860^46864^46866^14220^14222^14224^14226^14246^14248^14252^14254^14264^14266^14268
      DEEP: --id tract=9707 filter=HSC-I patch=6,6 --selectId ccd=0..8^10..103 visit=7240^7242^7244^7246^7256^7258^7264^7266^35978^35982^35984^35988^35990^35996^35998^36004^36006
      DEEP: --id tract=9707 filter=NB0816 patch=6,6 --selectId ccd=0..8^10..103 visit=37318^37320^37322^37324^37326^37328^37330^37796^37798^37800^38474
      WIDE: --id tract=16972 filter=HSC-G patch=0,7 --selectId ccd=0..8^10..103 visit=29530^29536^29538^34210^34212^34214^34216^34218
      

      One can run

      coaddDriver.py /datasets/hsc/repo --calib /datasets/hsc/repo/CALIB --rerun DM-13666/[DEEP|UDEEP|WIDE]:private/username/abcd  .... 

      with the above IDs to reproduce.
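
      Whatever the final fix looks like, the empty selection has to be handled before np.percentile is called. The helper below is a hypothetical sketch of such a guard; the name robustPercentiles and the RuntimeError are illustrative, not necessarily the change on the GitHub PR:

      import numpy as np

      def robustPercentiles(snr, isGood):
          """Return the (25th, 75th) percentiles of snr over good pixels,
          guarding against an empty selection. Hypothetical helper, not
          the code merged on this ticket.
          """
          selected = snr[isGood]
          if selected.size == 0:
              # No usable pixels: let the caller decide how to recover
              # (e.g. skip the pixel-based scaling) instead of crashing
              # deep inside NumPy.
              raise RuntimeError("No good pixels for pixel-based variance scaling")
          return np.percentile(selected, (25, 75))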


            Activity

            nlust Nate Lust added a comment -

            Can you take a look at this change? The pull request is not showing up in Jira yet, but it is up on GitHub.

            price Paul Price added a comment -

            This is a good fix, but I have suggested changes to the implementation in the GitHub PR.


              People

              Assignee: nlust Nate Lust
              Reporter: hchiang2 Hsin-Fang Chiang
              Reviewers: Paul Price
              Watchers: Hsin-Fang Chiang, Nate Lust, Paul Price

