...
- Alarm Handling
- The Fast Monitoring process should output an XML file listing the alarms/warnings/errors detected in the produced monitoring histograms
- This file should be registered with the data catalog (is it there already?)
- A notification should be added to the Logging application which should point to the Data Quality Monitoring application (is it possible to add the desired target link as part of the notification's metadata?)
- The file needs to be ingested producing summary and detailed information on the alarms/warnings/errors
- Data Trending
- Design tables to ingest the trending data: some 20K quantities sampled at cadences between 10 seconds and 5 minutes
- We might have several copies of the same tables to accumulate data at different frequencies
- Given the volume of data, a database-table-only approach might be insufficient. We might have to consider a hybrid solution that involves reading data straight from tuple files (less efficient than reading from a database)
- Write the code to ingest the data.
- Find somebody responsible for the ingestion code. It should be somebody who understands the data! (in other words, not me)
- Write the code to produce trends
- Improve the application's UI
- User preferences
- Improvements based on user feedback
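The alarm file's schema is still open, but the ingestion step above (summary plus detailed information) can be sketched. Everything below is an assumption for illustration: the `<alarm>` element layout, the attribute names, and the severity levels are made up, not the agreed format.

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical alarm-file layout -- the real schema is still to be defined.
SAMPLE = """<alarms>
  <alarm level="error"   histogram="ACD_hitmap" message="empty bins"/>
  <alarm level="warning" histogram="CAL_energy" message="mean shifted"/>
  <alarm level="warning" histogram="TKR_strips" message="noisy channel"/>
</alarms>"""

def ingest_alarms(xml_text):
    """Return (summary counts by severity, detailed records) from an alarm file."""
    root = ET.fromstring(xml_text)
    details = [dict(a.attrib) for a in root.iter("alarm")]
    summary = Counter(d["level"] for d in details)
    return summary, details

summary, details = ingest_alarms(SAMPLE)
print(dict(summary))                 # severity counts for the summary view
print(details[0]["histogram"])       # per-alarm records for the detailed view
```

The summary counts would feed the notification in the Logging application, while the detailed records back the Data Quality Monitoring display.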
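For the trending tables, a minimal relational sketch of the "several copies at different frequencies" idea, using an in-memory SQLite database. Table and column names here are assumptions, not the final design: one raw table keyed by (quantity, time) and one coarser copy accumulated in 5-minute bins.

```python
import sqlite3

# Illustrative schema only: one row per (quantity, timestamp) sample, plus a
# second table holding the same quantities accumulated at a coarser cadence.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trending_raw (
    quantity_id INTEGER NOT NULL,   -- one of the ~20K monitored quantities
    t_unix      INTEGER NOT NULL,   -- sample time, 10 s to 5 min cadence
    value       REAL    NOT NULL,
    PRIMARY KEY (quantity_id, t_unix)
);
CREATE TABLE trending_5min (
    quantity_id INTEGER NOT NULL,
    t_bin       INTEGER NOT NULL,   -- start of the 5-minute bin
    mean_value  REAL    NOT NULL,
    PRIMARY KEY (quantity_id, t_bin)
);
""")

# Ingest ten minutes of 10-second samples for one made-up quantity.
samples = [(42, t, float(t % 7)) for t in range(0, 600, 10)]
conn.executemany("INSERT INTO trending_raw VALUES (?,?,?)", samples)

# Accumulate the raw samples into the 5-minute copy.
conn.execute("""
INSERT INTO trending_5min
SELECT quantity_id, (t_unix / 300) * 300, AVG(value)
FROM trending_raw GROUP BY quantity_id, t_unix / 300
""")
n_bins = conn.execute("SELECT COUNT(*) FROM trending_5min").fetchone()[0]
print(n_bins)  # ten minutes of samples -> 2 five-minute bins
```

Older raw rows could then be dropped while the coarse copies are kept, which is one way to contain the volume problem noted above; the hybrid tuple-file option remains for anything the tables cannot hold.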
Scope
Display ASP data products for a pool of sources.
Status
This is the second implementation of this application. We have developed a set of databases to keep a list of sources and to ingest data to be trended at different frequencies. The sources database can be loaded from the XML files used by ASP. Scripts are available to ingest the data from the pipeline at the end of ASP processing.
This is still a preliminary version of the application and needs feedback to better define its scope and use.
To-Do List
- Improve the application's UI
Scope
This application should provide a quick and intuitive look at the status of data processing, from the moment the data are Fast Copied through the various processing steps that lead to the final data products.
Status
Only a mock-up version is ready. It was meant to be used to prompt a discussion.
To-Do List
- Downlink-RunId database table
- We need a database table to extract which run numbers are contained in each downlink (Bryson will write and populate this database table?)
- Define the progress mechanism
- Each of the processing steps has to provide some feedback on its progress status. It might be possible to extract it from the Pipeline. Otherwise we have to define a mechanism for it.
- Meet with Tony and Warren to talk about this
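The Downlink-RunId table above is still unassigned, so the following is only a placeholder sketch: the table name, column names, and sample numbers are all made up for illustration.

```python
import sqlite3

# Hypothetical schema: maps each downlink to the run numbers it contains.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE downlink_run (
    downlink_id INTEGER NOT NULL,
    run_id      INTEGER NOT NULL,
    PRIMARY KEY (downlink_id, run_id)
)""")
conn.executemany(
    "INSERT INTO downlink_run VALUES (?, ?)",
    [(1001, 77000), (1001, 77001), (1002, 77002)],  # invented sample data
)

def runs_in_downlink(conn, downlink_id):
    """Return the run numbers contained in one downlink."""
    cur = conn.execute(
        "SELECT run_id FROM downlink_run WHERE downlink_id = ? ORDER BY run_id",
        (downlink_id,),
    )
    return [r[0] for r in cur]

print(runs_in_downlink(conn, 1001))  # -> [77000, 77001]
```

Whoever ends up owning the table would also own the population script; the progress display then only needs this one lookup per downlink.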
The portal is the icing on the cake. It will be targeted and developed at the very end.
Scope
Provide a rich and highly customizable environment for viewing data from all the above (and below) applications.
Status
We have developed three portlets to prove that we can extract data from external applications. These portlets can provide tables (from Logging and Fast Copy) and plots (from TelemetryTrending).
To-Do List
Do the rest.
Scope
Status
To-Do List
Cross Trending/Reports
Scope
This application should give users the possibility to fetch histograms and data trends from all the above applications and to create scatter plots, overlays or tables of data.
It should also be possible to load simple user-written JSP pages as reports. Users can create a list of favorite reports to be processed at different frequencies (say, last 24 hours, last week, last month, etc.)
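At its core, cross trending means aligning two quantities on a common time axis before plotting one against the other. A minimal sketch of that alignment step, with invented trend data and names:

```python
# Align two trend series on shared timestamps to build scatter-plot pairs.
# The data and quantity roles below are made up for illustration.
trend_a = {0: 1.0, 60: 1.2, 120: 1.1, 180: 1.4}   # e.g. a temperature trend
trend_b = {60: 10.5, 120: 10.7, 240: 11.0}        # e.g. a rate trend

def scatter_pairs(a, b):
    """Return (x, y) pairs for timestamps present in both trends."""
    common = sorted(a.keys() & b.keys())
    return [(a[t], b[t]) for t in common]

pairs = scatter_pairs(trend_a, trend_b)
print(pairs)  # -> [(1.2, 10.5), (1.1, 10.7)]
```

Overlays and tables reduce to the same alignment; only the rendering differs. Trends sampled at different cadences would additionally need rebinning to a common interval before pairing.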
Status
A toy version is available for the cross trending part. The Reports are still in the discussion phase.
To-Do List