Details
Type: Story
Status: Done
Resolution: Done
Fix Version/s: None
Component/s: None
Labels: None
Story Points: 0
Sprint: TSSW Sprint - Oct 24 - Nov 07
Team: Telescope and Site
Urgent?: Yes
Description
On Slack, Michael Reuter reports:
Looks like we are having an issue with ts_salkafka making Avro schemas from the IDL. Here's the link to the traceback (I hope): https://lsstc.slack.com/files/U2JPDUE86/F04985AHV7T/untitled.cpp. Is it possible for you to look into this when you get in, as I know you're busy with the TMA? If not, I'll try to dig at it when I get in.
Traceback (most recent call last):
  File "/opt/lsst/software/stack/miniconda/bin/run_salkafka_producer", line 11, in <module>
    sys.exit(run_salkafka_producer())
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/lsst/ts/salkafka/component_producer_set.py", line 693, in run_salkafka_producer
    asyncio.run(ComponentProducerSet.amain())
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/asyncio/base_events.py", line 646, in run_until_complete
    return future.result()
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/lsst/ts/salkafka/component_producer_set.py", line 234, in amain
    await producer_set.run_producers(components=args.components)
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/lsst/ts/salkafka/component_producer_set.py", line 556, in run_producers
    await self._interruptable_start_task
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/lsst/ts/salkafka/component_producer.py", line 192, in start
    await asyncio.gather(
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/lsst/ts/salkafka/topic_producer.py", line 60, in start
    self.kafka_producer = await self.kafka_factory.make_producer(
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/lsst/ts/salkafka/kafka_producer_factory.py", line 198, in make_producer
    serializer = await Serializer.register(
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/kafkit/registry/serializer.py", line 112, in register
    id_ = await registry.register_schema(schema, subject=subject)
  File "/opt/lsst/software/stack/miniconda/lib/python3.10/site-packages/kafkit/registry/sansio.py", line 433, in register_schema
    schema = fastavro.parse_schema(schema)
  File "fastavro/_schema.pyx", line 121, in fastavro._schema.parse_schema
  File "fastavro/_schema.pyx", line 322, in fastavro._schema._parse_schema
  File "fastavro/_schema.pyx", line 381, in fastavro._schema.parse_field
  File "fastavro/_schema.pyx", line 184, in fastavro._schema._parse_schema
  File "fastavro/_schema.pyx", line 132, in fastavro._schema._raise_default_value_error
fastavro._schema_common.SchemaParseException: Default value <0> must match schema type: double
I suspect the issue is that the defaults for private_sndStamp and private_rcvStamp are 0 instead of 0.0, and fastavro (via the Kafka schema registration path) is no longer willing to cast integer defaults to double.
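A minimal sketch of the kind of fix this suggests (this is an illustration, not the actual ts_salkafka patch; the helper name and the example schema are made up): walk the generated Avro schema and coerce integer defaults on double-typed fields to floats before the schema is handed to fastavro.parse_schema.

```python
def coerce_double_defaults(schema):
    """Recursively convert int defaults to float on fields typed 'double'.

    Operates in place on an Avro schema expressed as nested dicts/lists,
    so that fastavro's strict default-value type check will accept it.
    """
    if isinstance(schema, dict):
        for field in schema.get("fields", []):
            default = field.get("default")
            # bool is a subclass of int in Python; exclude it explicitly.
            if (
                field.get("type") == "double"
                and isinstance(default, int)
                and not isinstance(default, bool)
            ):
                field["default"] = float(default)
            # Recurse into nested record types.
            coerce_double_defaults(field.get("type"))
    elif isinstance(schema, list):
        # A union type: check each branch.
        for branch in schema:
            coerce_double_defaults(branch)
    return schema


# Hypothetical example mirroring the failing fields from the traceback.
schema = {
    "type": "record",
    "name": "Example",
    "fields": [
        {"name": "private_sndStamp", "type": "double", "default": 0},
        {"name": "private_rcvStamp", "type": "double", "default": 0},
    ],
}
fixed = coerce_double_defaults(schema)
```

After this normalization, both defaults are 0.0 rather than 0, which matches the `double` schema type that fastavro enforces. The cleaner long-term fix, as the description implies, is to emit 0.0 defaults when generating the schemas from the IDL in the first place.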
Pull request: https://github.com/lsst-ts/ts_salkafka/pull/new/tickets/DM-36810