  Data Management / DM-38800

bias stability study


    Details

    • Type: Story
    • Status: Done
    • Resolution: Done
    • Fix Version/s: None
    • Component/s: None
    • Labels: None
    • Story Points: 2
    • Epic Link:
    • Team: Data Release Production
    • Urgent?: No

      Description

      Movie of bias evolution spanning 20210908 to 20230316

      [BiasEvolution.mov]

      This is row 500 of the bias frames (offsets of 50 ADU introduced for clarity):

      [image (8).png]

      Zooming in near column 1000, there are tens of ADU of variation at the amp boundaries:

      [image (9).png]

      Row 2500:

      [image (10).png]

      (The variable SNR is due to the differing number of frames stacked to make the bias images.)
      There are many tens of ADU of variation in the sloping bias structure.
      Extracting a single column and plotting bias level vs. row number for column 2500, we see amp-boundary variation in some cases but not in others (a minimal plotting sketch follows the figure):

      [image (11).png]
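      As a rough illustration of how these row and column profiles can be produced, here is a minimal matplotlib sketch. It assumes the stacked bias frames have already been written out as 2-D images; the file names and the 50 ADU offset are placeholders matching the plots above, and the actual analysis used the LSST butler and notebook tooling rather than this exact script.

          import matplotlib.pyplot as plt
          from astropy.io import fits

          # Hypothetical list of stacked bias images, one per epoch (placeholders).
          bias_files = ["bias_20210908.fits", "bias_20230316.fits"]

          row, column, offset = 500, 2500, 50.0  # cuts to plot and per-epoch offset (ADU)

          fig, (ax_row, ax_col) = plt.subplots(1, 2, figsize=(12, 5))
          for i, filename in enumerate(bias_files):
              image = fits.getdata(filename).astype(float)
              # Row profile: bias level vs. column number, offset by 50 ADU per epoch for clarity.
              ax_row.plot(image[row, :] + i * offset, label=filename)
              # Column profile: bias level vs. row number for a single extracted column.
              ax_col.plot(image[:, column] + i * offset, label=filename)

          ax_row.set_xlabel("column")
          ax_row.set_ylabel("bias level + offset (ADU)")
          ax_row.set_title(f"row {row}")
          ax_col.set_xlabel("row")
          ax_col.set_title(f"column {column}")
          ax_row.legend(fontsize="small")
          plt.show()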

      I claim this validates the assertion that we should construct biases daily. That is currently the fastest cadence at which we can do so, and there is no SNR penalty as long as we stack roughly 11 or more frames.
      (Legend order: the top legend entry corresponds to the line plotted on the bottom.)
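
      For the SNR claim, a back-of-the-envelope sketch of the noise arithmetic (plain statistics, not pipeline code): subtracting a bias built from N frames adds sigma/sqrt(N) of residual noise in quadrature to a science frame's read noise sigma, so the penalty at N ≈ 11 is only a few percent.

          import math

          def bias_noise_penalty(n_frames: int) -> float:
              """Fractional increase in effective read noise after subtracting a bias
              stacked from n_frames frames, relative to an ideal noiseless bias.

              The stacked bias carries sigma/sqrt(N) of residual noise, which adds in
              quadrature to the science frame's read noise sigma.
              """
              return math.sqrt(1.0 + 1.0 / n_frames) - 1.0

          for n in (1, 5, 11, 25):
              print(f"N = {n:2d}: noise penalty = {100 * bias_noise_penalty(n):.1f}%")
          # N = 11 gives ~4.4%, i.e. effectively no SNR penalty from daily biases.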

        Attachments

        1. BiasEvolution.mov (24.94 MB)
        2. biasNoise-20230421.png (114 kB)
        3. biasNoise-20230421-cpVerifyBias.ipynb (2.09 MB)
        4. biasNoise-20230421-SEQ.png (188 kB)
        5. biasNoise-20230421-tile.png (1.03 MB)
        6. image (10).png (416 kB)
        7. image (11).png (398 kB)
        8. image (8).png (1.74 MB)
        9. image (9).png (601 kB)

          Activity

          czw Christopher Waters added a comment -

           I'm adding a notebook with the results for a selection of biases, along with the individual plots, so I can share them on Slack as well. To check these results, I ran a set of three biases from each of the days listed in Chris Stubbs's plots through the cp_verify code as a proxy for the full bias-construction results. This will use the appropriate bias from the butler.
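
           For reference, selecting a few bias exposures per day from the butler could look roughly like the following (a hypothetical sketch: the repo path, collection, instrument, and day list are assumptions, not the exact query used here).

               from lsst.daf.butler import Butler

               # Placeholder repo and collection; the real check used the appropriate LATISS data.
               butler = Butler("/repo/embargo", collections=["LATISS/raw/all"])

               days = [20210908, 20230316]  # placeholder subset of the days in the plots
               for day in days:
                   refs = sorted(
                       butler.registry.queryDatasets(
                           "raw",
                           where=("instrument='LATISS' AND exposure.day_obs=day "
                                  "AND exposure.observation_type='bias'"),
                           bind={"day": day},
                       ),
                       key=lambda ref: ref.dataId["exposure"],
                   )
                   # Take three biases from each day as a proxy for full bias construction.
                   for ref in refs[:3]:
                       raw = butler.get(ref)
                       print(day, ref.dataId["exposure"], raw.getBBox())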

           I've plotted the profiles from the cpVerifyProc images (bias frames that have had only overscan and bias corrections applied) for both rows and columns 500 and 2500 (full-size versions are available in the attachments). The row profiles (left panels) show a similar pattern to those above, and the column profiles (right panels) show the midline break (exposures sorted such that dateobs increases in the -y direction):

          There's a change in the pattern that corresponds to the updated sequencer file, also plotted as a function of exposure in the -SEQ.png image.  This also corresponds with the construction of calibrations with the parallel overscan enabled.  The existence of the patterns indicates that while new data is being treated correctly with parallel overscan, the older data is still using a previous bias that is "re-printing" the pattern back into the images.  I've filed DM-38828 to fix that.

           I've also attached a plot of the difference between the measured and nominal read noise (the nominal value comes from the detector object), as the verification notebook (also attached) indicates that all amplifiers are failing the NOISE test, i.e. the measured noises differ from the nominal values by more than 5%. The plots show that the measured read noises are smaller than the nominal values, which I thought was no longer flagged as a failure.
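
           To make the 5% criterion concrete, here is a tiny sketch of the comparison being described (plain Python, not the actual cp_verify test): a two-sided check flags any measured read noise that differs from the detector's nominal value by more than 5%, while a one-sided check would only flag values that exceed the nominal noise.

               def noise_test_fails(measured: float, nominal: float,
                                    threshold: float = 0.05, two_sided: bool = True) -> bool:
                   """Return True if the measured read noise fails the NOISE-style check.

                   two_sided=True  : fail if |measured - nominal| / nominal > threshold
                   two_sided=False : fail only if measured exceeds nominal by > threshold
                   """
                   frac = (measured - nominal) / nominal
                   return abs(frac) > threshold if two_sided else frac > threshold

               # A measured noise ~9% below nominal fails the two-sided check,
               # but would pass a one-sided "only worse than nominal is bad" check.
               print(noise_test_fails(8.0, 8.8))                   # True
               print(noise_test_fails(8.0, 8.8, two_sided=False))  # False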


            People

            Assignee:
            czw Christopher Waters
            Reporter:
            czw Christopher Waters
            Watchers:
            Christopher Waters, Kian-Tat Lim

              Dates

              Created:
              Updated:
              Resolved:

                Jenkins

                No builds found.