It has come to my attention that we have under-documented a planned capability of the LSP, and I would like to add corresponding requirements to LDM-554. If that is unacceptable given present scope concerns, I would like at least to add tickets recognizing the specific work needed.
The capability, defined in the TAP and ADQL standards, is the ability to upload an explicitly ephemeral table to the TAP service for use in a single query. This is distinct from the "user database workspace" foreseen in the LSP requirements, in which a user can create their own persistent, "MyDB"-style tables.
The relevant references are:
- DALI standard section 3.4.5 (UPLOAD service parameter): http://www.ivoa.net/documents/DALI/20170517/REC-DALI-1.1.html#tth_sEc3.4.5
- TAP standard section 2.7.6 (UPLOAD service parameter): http://www.ivoa.net/documents/TAP/20190927/REC-TAP-1.1.html#tth_sEc2.7.6
- TAP standard section 5.1.2 (example of using UPLOAD in an asynchronous query): http://www.ivoa.net/documents/TAP/20190927/REC-TAP-1.1.html#tth_sEc5.1.2
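As a concrete illustration of the mechanism these sections define, here is a sketch of the request parameters for a synchronous TAP query using an inline upload (the table name and query are placeholders, not the actual LSST service or schema):

```python
# Sketch of the request parameters for a synchronous TAP query using the
# DALI/TAP UPLOAD parameter with an inline table (TAP 1.1 sec 2.7.6).
# The uploaded table is referenced in ADQL as TAP_UPLOAD.<name>.
params = {
    "REQUEST": "doQuery",
    "LANG": "ADQL",
    "QUERY": "SELECT * FROM TAP_UPLOAD.mine",
    # UPLOAD takes a comma-separated (name, URI) pair; the "param:" URI
    # scheme points at a named part of the multipart POST body that
    # carries the VOTable itself.
    "UPLOAD": "mine,param:t1",
}
```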
This capability is used, for instance, when a user has a list of N objects and wishes to perform cone searches around every one of them in a single bulk operation.
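Such a batched cone search can be expressed in a single ADQL query along these lines, shown here as a Python string (the catalog table, column names, and search radius are illustrative stand-ins, not the actual LSST schema):

```python
# Illustrative ADQL: one query performing a cone search around every row
# of an uploaded target list (schema and radius are hypothetical).
radius_deg = 1.0 / 3600.0  # 1 arcsecond
query = f"""
SELECT t.target_name, o.objectId, o.ra, o.decl
FROM TAP_UPLOAD.targets AS t
JOIN Object AS o
  ON CONTAINS(POINT('ICRS', o.ra, o.decl),
              CIRCLE('ICRS', t.ra, t.decl, {radius_deg})) = 1
"""
```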
Many legacy pre-TAP query interfaces have this capability (e.g., the legacy IRSA query services). It is important to provide it in order to reduce the likelihood that users will submit thousands of trivial queries instead of batching them.
There is a Portal Aspect requirement, DMS-PRTL-REQ-0021, to support an interface to such queries, and the Discussion for this requirement states:
Efficient implementation of list-based queries requires a corresponding API aspect / Data Access Web API service, to avoid the submission of large numbers of separate queries.
(Firefly supports such queries, though the capability was not exposed in the TAP query UI due to the absence of the underlying feature in the LSST TAP service.)
Temporary-table-upload queries are already supported by PyVO, and are therefore immediately germane to the Notebook Aspect environment as well. They could easily be used to retrieve light curves for multiple objects in a single operation, to perform cone searches around a user's list of favorite AGNs, etc.
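A minimal sketch of the light-curve use case from the Notebook Aspect, using PyVO's `uploads` argument (the service URL is a placeholder and the table/column names are illustrative; since pyvo and astropy are assumed rather than guaranteed, the actual call is shown commented out):

```python
# Sketch: retrieving light curves for a user's object list in one query
# via TAP UPLOAD. Table and column names are illustrative only.
ids = [(12345,), (67890,)]  # hypothetical object IDs
query = """
SELECT s.objectId, s.midPointTai, s.psFlux
FROM TAP_UPLOAD.ids AS u
JOIN ForcedSource AS s ON s.objectId = u.objectId
"""
# With PyVO this would be submitted roughly as:
#   import pyvo
#   from astropy.table import Table
#   tbl = Table(rows=ids, names=("objectId",))
#   svc = pyvo.dal.TAPService("https://example.org/tap")  # placeholder URL
#   result = svc.run_sync(query, uploads={"ids": tbl})
```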
Largely through an oversight in the LSP requirements-generation process, the corresponding DAX requirement was never included in LDM-554.
There is a related requirement, DMS-API-REQ-0032, which states:
The API Aspect shall provide a capability for users to upload catalog data products (formatted as VOTables) residing within their allocated VOSpace, such that the catalog products after upload may be joined in queries against data release catalog products, subject to limitations of a resource quota system.
but this has been interpreted (including by me) as referring to the persistent User Database Workspace functionality.
I would like to suggest that we accept the following requirement, as a child of DMS-API-REQ-0006 "TAP Service for Tabular Queries":
DMS-API-REQ-xxx1 "TAP service temporary table upload":
Specification: The API Aspect TAP service shall support the standard UPLOAD parameter, permitting the use of temporary, user-uploaded tables in ADQL expressions. It shall be possible to join such temporary tables (via both ID-equality and spatial joins) against the principal LSST catalog data products.
Discussion: This requirement is distinct from requirements for a User Database Workspace for persistent, user-created databases.
As part of the discussion of this RFC, we should determine whether this needs to be expressed more clearly in the database requirements, LDM-555, as well. The existing requirement DMS-DB-REQ-0014 "Cross-matching with external/user data" could be pressed into service to support both temporary tables and the User Database Workspace, but it's fairly vague:
Users shall be able to cross-match the LSST catalogs with external catalogs. Some catalogs shall be provided by LSST, whilst other catalogs can be uploaded by the user. Results from these cross-matches can be used in subsequent queries.