Data Management / DM-9303

Provide feedback on SUIT code

    Details

    • Type: Story
    • Status: Done
    • Resolution: Done
    • Fix Version/s: None
    • Component/s: None
    • Labels:
      None

      Description

      David Shupe has provided me with some code he's been experimenting with to explore interfaces to the Science Pipelines and has asked for feedback. Provide it to him (or find somebody else who can).

      He writes:

      Thanks, this would be very helpful! I've pushed the current working version to https://github.com/stargaser/forced_phot . The goal is to return a table of photometry at a specified location on a specified set of single-epoch exposures.

      The README has some information. The main script is "get_forcedphot.py" and the arguments are:
      ------------------
      usage: get_forcedphot.py [-h]
                               input_repo output_repo json_input coord_str
                               filter_name output_table

      Run forced photometry on specified images and output a time-series table

      positional arguments:
        input_repo    input repository directory path for images
        output_repo   dummy repository directory for schema
        json_input    json file as returned by imgserv
        coord_str     coordinates string as name,ra,dec
        filter_name   name of filter to subset input table [ugriz]
        output_table  output table path in FITS format

      optional arguments:
        -h, --help    show this help message and exit
      ------------------
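The command-line interface above can be sketched with a minimal argparse setup. This is only an illustration reconstructed from the usage text, not the script's actual source; restricting filter_name to the ugriz choices is an assumption.

```python
import argparse

def build_parser():
    # Hedged sketch: mirrors the usage text in the README excerpt.
    parser = argparse.ArgumentParser(
        prog="get_forcedphot.py",
        description="Run forced photometry on specified images and "
                    "output a time-series table")
    parser.add_argument("input_repo", help="input repository directory path for images")
    parser.add_argument("output_repo", help="dummy repository directory for schema")
    parser.add_argument("json_input", help="json file as returned by imgserv")
    parser.add_argument("coord_str", help="coordinates string as name,ra,dec")
    parser.add_argument("filter_name", choices=list("ugriz"),
                        help="name of filter to subset input table")
    parser.add_argument("output_table", help="output table path in FITS format")
    return parser

# Illustrative invocation with made-up argument values.
args = build_parser().parse_args(
    ["in_repo", "out_repo", "cutouts.json", "src1,150.1,2.2", "r", "out.fits"])
print(args.filter_name)  # "r"
```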

      To interface with Firefly, this will be modified to take a JSON input from Firefly and to return a JSON file specifying the output table location.
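The planned Firefly handoff might look something like the sketch below: read a JSON request, run the photometry, and write a JSON response naming the output table. Every key here ("coord", "filter", "table_path") is an illustrative assumption, since no Firefly JSON schema is specified in the ticket.

```python
import json
import os
import tempfile

# Hypothetical request as Firefly might send it (keys are assumptions).
request = {"coord": {"name": "src1", "ra": 150.1, "dec": 2.2}, "filter": "r"}

with tempfile.TemporaryDirectory() as tmp:
    req_path = os.path.join(tmp, "request.json")
    resp_path = os.path.join(tmp, "response.json")

    # Write, then re-read, the request file (stands in for Firefly's input).
    with open(req_path, "w") as f:
        json.dump(request, f)
    with open(req_path) as f:
        req = json.load(f)

    # ... forced photometry would run here, producing a FITS table ...

    # Respond with the location of the output table.
    response = {"table_path": os.path.join(tmp, "forced_phot.fits"),
                "filter": req["filter"]}
    with open(resp_path, "w") as f:
        json.dump(response, f)
    with open(resp_path) as f:
        result = json.load(f)

print(result["filter"])
```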

      Some problems I had include:

      • couldn’t get the schema right to add PSF magnitudes
      • couldn’t use the writeOutput method for each catalog because of schema problem
      • calib.getMagnitudes couldn’t convert the ndarrays for psfFlux to C++ (non-aligned arrays?)
      • multiple exposures can be specified with multiple '--id' arguments (very cool!). Is there a way to gather up all the individual results into one table in the same pipeline task?
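On the third point: wrapped C++ routines often reject ndarrays that are not C-contiguous, such as the strided views produced by indexing a record or structured array, which may be what "non-aligned arrays" refers to here. A common workaround is to copy the column into a contiguous buffer before passing it across the boundary. The sketch below demonstrates this with plain NumPy; the column name is illustrative.

```python
import numpy as np

# Simulate a catalog-like structured array; indexing one field yields a
# strided (non-contiguous) view, which C++ bindings commonly reject.
records = np.zeros(5, dtype=[("base_PsfFlux_flux", "f8"), ("other", "f8")])
records["base_PsfFlux_flux"] = np.linspace(1.0, 5.0, 5)

flux_view = records["base_PsfFlux_flux"]   # strided view into the records
flux = np.ascontiguousarray(flux_view, dtype=np.float64)  # contiguous copy

print(flux_view.flags["C_CONTIGUOUS"], flux.flags["C_CONTIGUOUS"])
```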

      To get something working, I have fallen back to Astropy to manipulate the catalog table files.

      For now, I’ve been running this on lsst-dev since the data are available there, in /home/shupe/projects/forced_phot.

              People

              • Assignee: John Swinbank (swinbank)
              • Reporter: John Swinbank (swinbank)
              • Watchers: John Swinbank
