Details
- Type: Story
- Status: Done
- Resolution: Done
- Fix Version/s: None
- Component/s: alert_stream
- Labels:
- Story Points: 1
- Epic Link:
- Sprint: AP S20-6 (May), AP F20-1 (June)
- Team: Alert Production
- Urgent?: No
Description
Provide a consumer application that simply counts the number of alerts it has received and prints the running count to stdout every second. This is intended to be scaffolding code for users to build on.
Issue Links
- blocks DM-25226 Provide alert stream simulator for distribution to potential community broker authors (Done)
Comments
Hi Spencer Nelson,
Since the sample client application is intended to give users code to build from, I'm concerned that we've got a lot of code in streamsim that might reside more naturally elsewhere. In particular, it seems like basically all of serialization.py could live in lsst/alert_packet, and the _KafkaClient abstraction could similarly move to a separate package.
That way, users could build client applications against a more permanent Rubin interface (acknowledging that we're a long way from having that be stable). Leaning harder on alert_packet would also let us avoid hard-coding schema versions, as the code does currently.
Might be worth talking about this live so I can understand better where you're coming from. I would welcome John Swinbank's thoughts here also.