Fix Version/s: None
We now have an initial/preliminary obscore configuration for the embargo repo, so we can go ahead with adding an obscore table based on it. The actual job is done using migration scripts in daf_butler_migrate to create the table and a butler command in dax_obscore to populate it with pre-existing data.
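The two-step process described above could look roughly like the sketch below. The exact subcommand names, arguments, and the repo path are assumptions and should be checked against the daf_butler_migrate and dax_obscore documentation before use; the live commands are left commented out because they require a real repo.

```shell
# Hedged sketch of the workflow; command names are approximations, not
# verified invocations.
#
# 1. Create the obscore table with a schema migration
#    (repo path and revision id are placeholders):
#
#      butler migrate upgrade /repo/embargo <obscore-revision>
#
# 2. Back-fill the new table from pre-existing data:
#
#      butler obscore update-table /repo/embargo
#
# This line only summarizes the order of operations.
steps="migrate-then-populate"
echo "$steps"
```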
I want to test this on a copy of the embargo repo to make sure (once more) that the migration scripts work OK. I also think that the migration scripts were written for namespace=oga, while the new config uses namespace=embargo (I think embargo is the better choice), so I'll likely need to update the migration scripts as well.
Do you have a suggestion for how to poke at the data that are accumulating? I could probably figure out how to do something with SQLAlchemy from a USDF-RSP notebook, if that connection is allowed.
A direct SQL query is probably the only way until ObsTAP is there. SQLAlchemy should work OK from USDF; I think authentication should already be configured for you (if not, you need to create a .pgpass file). The embargo repo is in the oga schema; other than that, it should be easy to write a simple select(). Let me know if you need an example.
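A minimal sketch of such a query, assuming the oga schema mentioned above; the column names are assumptions based on the IVOA ObsCore standard, and the host/database names in the commented part are placeholders, not real USDF endpoints.

```python
# Direct SQL against the obscore table; schema name "oga" is from the
# comment above, column names are ObsCore-standard assumptions.
sql = """
SELECT obs_id, dataproduct_type, s_ra, s_dec
FROM oga.obscore
WHERE dataproduct_type = 'image'
LIMIT 10
"""

# With a ~/.pgpass file in place, the engine URL needs no password:
#
#   import sqlalchemy
#   engine = sqlalchemy.create_engine("postgresql://user@db-host/repo-db")
#   with engine.connect() as conn:
#       for row in conn.execute(sqlalchemy.text(sql)):
#           print(row)

print(sql.strip())
```

The execution part is commented out because it needs a live database connection; from a USDF-RSP notebook you would uncomment it and fill in the real host and database names.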
And another approach, which I frequently use personally, is some other database tool, e.g. DBeaver, connecting to the server directly and running SQL queries. You may need to set up an SSH tunnel for that if you are outside SLAC.
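A sketch of what that SSH tunnel could look like; the host names below are placeholders, not the actual USDF endpoints, and the real command is commented out because it needs valid credentials.

```shell
# Forward a local port through a bastion host so DBeaver can reach the
# Postgres server as if it were local:
#
#   ssh -N -f -L 5433:db-host.example.org:5432 user@bastion.example.org
#
# Then point DBeaver at host=localhost, port=5433 with your usual database
# credentials; traffic is forwarded through the bastion to the server.
local_endpoint="localhost:5433"
echo "connect DBeaver to $local_endpoint"
```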
Closing this, as obscore is created and is regularly updated. We'll need to extend the ingest process (it needs to update raw regions after ingesting them), but that will need a separate ticket.
A small change was needed to a migration script in daf_butler_migrate to change the namespace from "oga" to "embargo". Self-reviewed and merged.