Data Management / DM-20910

Design salobj to Kafka feeder


    Details

    • Type: Story
    • Status: Done
    • Resolution: Done
    • Fix Version/s: None
    • Component/s: ts_middleware
    • Labels:
    • Story Points:
      2
    • Sprint:
      TSSW Sprint - Aug 5 - Aug 17, TSSW Sprint - Aug 19 - Aug 31
    • Team:
      Telescope and Site

      Description

      Design the salobj to Kafka feeder and negotiate with Angelo Fausti and Jonathan Sick about details of the schema. I hope they will be willing to change the Avro schema to match the DDS topics.

      Another point to discuss is whether to use the IDL or Python topic data classes, instead of the XML files, to generate the Avro schema. However, unless making such a change requires me to write code, this is a side issue.

      Figure out how to write unit tests for this code.

      The product will be a preliminary implementation, not necessarily tested.


            Activity

            Russell Owen added a comment -

            The basic design (as realized in a preliminary implementation that has since been superseded by more complete code):

            • A command-line script accepts a list of one or more SAL components to listen to.
            • For each SAL component, make a ComponentProducer that listens to all messages from that component (using index 0, so all instances are included).
            • For each topic, make a TopicProducer that listens to that one topic. Its callback function produces one Kafka message for each DDS sample.
            • The Avro schema is generated by introspecting the DDS sample (with one field added for the time at which the Kafka message is produced). Once SAL 4 is released we can parse the IDL files to generate a description and units for each field, though we may well want to continue generating the rest of the schema the same way we do now.
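            The schema-introspection step above might be sketched as follows. This is a simplified illustration, not the actual feeder code: the function name `make_avro_schema`, the type map, and the `kafka_timestamp` field name are all assumptions; it assumes the sample's public attributes are simple scalars.

```python
# Hypothetical mapping from Python scalar types to Avro primitive types.
_AVRO_TYPES = {bool: "boolean", int: "long", float: "double", str: "string"}


def make_avro_schema(topic_name, sample):
    """Build an Avro record schema by introspecting the public
    attributes of a DDS sample, adding one extra field for the time
    at which the Kafka message is produced.
    """
    fields = [
        {"name": name, "type": _AVRO_TYPES[type(value)]}
        for name, value in vars(sample).items()
        if not name.startswith("_")
    ]
    # Extra field: production time of the Kafka message
    # (the field name is illustrative, not from the source).
    fields.append({"name": "kafka_timestamp", "type": "double"})
    return {"type": "record", "name": topic_name, "fields": fields}
```

            A real implementation would also have to handle arrays and other non-scalar field types, which this sketch omits.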

            One detail: the TopicProducer contains an instance of lsst.ts.salobj.ReadTopic. We cannot use ControllerCommand to listen to commands, because we cannot afford to have an acknowledgement sent when a command is received. We could use RemoteEvent and RemoteTelemetry for the other topics, but it is simpler to treat all topics the same way.
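            The per-topic callback pattern could look roughly like this. It is a self-contained sketch: the injected `send` callable stands in for a real Kafka producer's send method, and JSON serialization stands in for Avro encoding; in the actual code the callback would be attached to an lsst.ts.salobj.ReadTopic.

```python
import json
import time


class TopicProducer:
    """Sketch of a per-topic producer: a callback that turns each
    DDS sample into one Kafka message.

    ``send(topic_name, payload)`` is a stand-in for a real Kafka
    producer; JSON encoding here is a stand-in for Avro encoding.
    """

    def __init__(self, topic_name, send):
        self.topic_name = topic_name
        self.send = send

    def __call__(self, sample):
        # Copy the sample's public fields into the message payload.
        data = {
            name: value
            for name, value in vars(sample).items()
            if not name.startswith("_")
        }
        # Record the time at which the Kafka message is produced.
        data["kafka_timestamp"] = time.time()
        self.send(self.topic_name, json.dumps(data).encode())
```

            Injecting `send` keeps the sketch testable without a running Kafka broker; the real feeder would pass the producer's send method instead.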


              People

              Assignee:
              Russell Owen
              Reporter:
              Russell Owen
              Watchers:
              Angelo Fausti, Jonathan Sick, Russell Owen

