Data Management / DM-17423

Mask ghosts in HSC

    Details

    • Type: Story
    • Status: In Progress
    • Resolution: Unresolved
    • Fix Version/s: None
    • Component/s: None
    • Labels:
    • Team:
      External

      Description

      Hisanori Furusawa is developing code that uses an optical model to identify and mask ghosts. This needs to be integrated and tested.

        Attachments

          Issue Links

            Activity

            Hisanori Furusawa added a comment (edited)

            Simple conversion from Python 2 to Python 3 - done.

            API change for extracting the reference catalog for ghost prediction - tentatively done; to be verified.

            API change for deriving and calculating CCD positions on the focal plane - coding done; being verified. (2/27/2019)

            Setting mask planes dedicated to ghosts

            Adaptation to the change in handling on-memory CCD coordinates between hscPipe 4 and 5

            Experiments to validate and improve the functionality of the prediction engines

            Cleaning/rewriting code borrowed from elsewhere to get the modules ready for release
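            The "dedicated mask plane" item can be illustrated with a minimal bitmask sketch. This uses plain integers in place of an afw Mask; the plane name "GHOST" and the bit assignments are hypothetical, not the actual hscPipe implementation.

```python
# Toy model of mask planes: each named plane owns one bit of a pixel's
# integer mask value. Plane names and bit indices here are illustrative.
mask_planes = {"BAD": 0, "SAT": 1, "DETECTED": 5}

def add_mask_plane(planes, name):
    """Register a new plane on the next free bit and return its bitmask."""
    if name not in planes:
        planes[name] = max(planes.values()) + 1
    return 1 << planes[name]

# Register a plane dedicated to predicted ghosts.
ghost_bit = add_mask_plane(mask_planes, "GHOST")

# Flag a pixel that is both detected and inside a predicted ghost footprint.
pixel_mask = 1 << mask_planes["DETECTED"]
pixel_mask |= ghost_bit
```

            The real pipeline would do the equivalent via the afw Mask API rather than raw integers; the sketch only shows the bookkeeping.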

            Hisanori Furusawa added a comment

            We had a much higher density of reference sources than in hscPipe 4, but it turned out that the new reference sources are recorded in units of Jy. I adapted the code accordingly, and now the source distributions in hscPipe 4 and 6 are very similar.
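            For orientation, the unit adaptation above amounts to converting the new Jy fluxes to the scale the selection cuts expect; a flux in Jy maps to an AB magnitude via m = -2.5 log10(f / 3631 Jy). A minimal sketch (not the actual adaptation code):

```python
import math

def jy_to_ab_mag(flux_jy):
    """Convert a flux in janskys to an AB magnitude (zero point 3631 Jy)."""
    return -2.5 * math.log10(flux_jy / 3631.0)

# A 1 mJy source comes out at roughly AB 16.4 mag.
mag = jy_to_ab_mag(1e-3)
```

            Misreading Jy-valued fluxes on the old scale would shift every magnitude cut, which would explain the apparent jump in catalog density.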

            Hisanori Furusawa added a comment

            The prediction codes for the two kinds of ghosts, mustache and caustic, now work with a testing script, just as the hscPipe 4-based codes did. Integrated tests with the task runner for processing an exposure remain to be done.

            Hisanori Furusawa added a comment (edited)

            mustache example (left: CORR w/ prediction, right: mask layer)

            caustic example (left: CORR w/ prediction, right: mask layer)

            Hisanori Furusawa added a comment

            An integration test with BatchPoolTask over PBS is under way. The error message below needs to be addressed.

            CameraMapper INFO: Loading calib registry from /lfs/Subaru/HSC/CALIB_s18a/calibRegistry.sqlite3
            NoResults on hda-ana05:147053 in run: No locations for get: datasetType:ref_cat_config dataId:DataId(initialdata={'name': 'cal_ref_cat'}, tag=set())
            Traceback (most recent call last):
              File "/ana/products.6.7/stack/miniconda3-4.3.21-10a4fa6/Linux64/ctrl_pool/6.0-hsc+4/python/lsst/ctrl/pool/pool.py", line 113, in wrapper
                return func(*args, **kwargs)
              File "/data_qa/furu/work/furu/ghosts/hscGhost/python/hsc/ghost/hscGhost.py", line 1391, in run
                self.makeSubtask('refObjLoader', butler=butler)
              File "/ana/products.6.7/stack/miniconda3-4.3.21-10a4fa6/Linux64/pipe_base/6.0-hsc+4/python/lsst/pipe/base/task.py", line 299, in makeSubtask
                subtask = taskField.apply(name=name, parentTask=self, **keyArgs)
              File "/ana/products.6.7/stack/miniconda3-4.3.21-10a4fa6/Linux64/pex_config/6.0b1-hsc+9/python/lsst/pex/config/configurableField.py", line 83, in apply
                return self.target(*args, config=self.value, **kw)
              File "/ana/products.6.7/stack/miniconda3-4.3.21-10a4fa6/Linux64/meas_algorithms/6.7-hsc/python/lsst/meas/algorithms/loadIndexedReferenceObjects.py", line 49, in __init__
                dataset_config = butler.get("ref_cat_config", name=self.config.ref_dataset_name, immediate=True)
              File "/ana/products.6.7/stack/miniconda3-4.3.21-10a4fa6/Linux64/daf_persistence/6.7-hsc/python/lsst/daf/persistence/butler.py", line 1378, in get
                raise NoResults("No locations for get:", datasetType, dataId)
            lsst.daf.persistence.butlerExceptions.NoResults: No locations for get: datasetType:ref_cat_config dataId:DataId(initialdata={'name': 'cal_ref_cat'}, tag=set())
            application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
            

            Hisanori Furusawa added a comment

            The error reported above was due to a failure to set up the proper config for LoadIndexedReferenceObjectsTask.
            In addition, integrating the masking-engine tasks with BatchPoolTask is done. The status is as follows:

            Simple conversion from Python 2 to Python 3 - done.
            API change for extracting the reference catalog for ghost prediction - done.
            API change for deriving and calculating CCD positions on the focal plane - basically done; to be checked against various cases.
            Setting mask planes dedicated to ghosts - done.
            Adaptation to the change in handling on-memory CCD coordinates between hscPipe 4 and 5 - done.
            Experiments to validate and improve the functionality of the prediction engines - TBD.
            Cleaning/rewriting code borrowed from elsewhere to get the modules ready for release - TBD.
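            A config fix of the kind implied would be a pex_config override that points the loader at the indexed reference catalog. This is only a sketch: the field path `refObjLoader.ref_dataset_name` follows the loader config in meas_algorithms of that era, and `cal_ref_cat` is taken from the error message above; both should be verified against the actual task.

```python
# Sketch of a pex_config override file for the ghost-masking task.
# The dataset name "cal_ref_cat" is the one appearing in the NoResults
# error; the exact config field path depends on the task's config tree.
config.refObjLoader.ref_dataset_name = "cal_ref_cat"
```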

            Hisanori Furusawa added a comment (edited)

            note: ds9 (even the latest version) cannot decode the compressed mask plane in FITS files written by dataRef.put() or butler.put(). The CORR file itself seems fine, but once I read a CORR into memory and write it to a new file with the butler, the new file has this problem. I do not understand the difference.

            Per investigation by Sogo Mineo, we strongly suspect this is due to improper handling of FITS headers in calexp files with tiled-image compression, both when the LSST stack writes them and when ds9 decodes them, in violation of the FITS Standard. 1) The LSST stack should not write ZQUANTIZ = 'NONE' in the primary HDU, which is not associated with any pixel data; moreover, the value 'NONE' is not allowed by the standard. For compression with quantization enabled, write 'NO_DITHER'. For lossless compression without quantization, simply dropping the keyword would be appropriate (when neither ZZERO nor ZSCALE exists, no quantization is assumed, so ZQUANTIZ is irrelevant). Otherwise, widely used tools such as FITSIO and ds9 will not decode the data properly. 2) ds9 should not interpret ZQUANTIZ in the primary HDU as the default value for other HDUs. Tiled-image compression of integer images should not carry ZQUANTIZ at all, but the current ds9 code seems to mistakenly decode compressed integer values into floats when ZQUANTIZ exists in the primary HDU, which is wrong. The ZQUANTIZ in the same HDU as the pixel data should be used in all cases.
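            The two header rules above can be sketched with plain dicts standing in for FITS headers. This is a simplified illustration of the logic, not real cfitsio or ds9 code:

```python
# Simplified model of the two proposed fixes, with dicts in place of
# FITS headers.

def fix_primary_header(primary):
    """Fix 1: the primary HDU carries no pixel data, so ZQUANTIZ does
    not belong there (and 'NONE' is not a legal value anyway); drop it."""
    primary.pop("ZQUANTIZ", None)
    return primary

def effective_zquantiz(primary, image_hdu):
    """Fix 2: a reader must take ZQUANTIZ only from the HDU holding the
    compressed pixel data, never inherit it from the primary HDU."""
    return image_hdu.get("ZQUANTIZ")  # primary is deliberately ignored

primary = {"SIMPLE": True, "ZQUANTIZ": "NONE"}   # as written by the stack
mask_hdu = {"ZBITPIX": 32}  # integer plane: carries no ZQUANTIZ at all

fix_primary_header(primary)
```

            Under this model, an integer mask HDU yields no quantization setting regardless of what the primary HDU said, which is the behavior ds9 currently gets wrong.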

             


              People

              • Assignee:
                Hisanori Furusawa
              • Reporter:
                Paul Price
              • Watchers:
                Hisanori Furusawa, Masayuki Tanaka, Paul Price
              • Votes:
                0
