
The following is a list of the web applications that will be used in the October ISOC Operations Simulation, each with a short description, its current status, and a to-do list.

The purpose of this document is to collect comments to help us prioritize the work to get them ready for the October deadline.

Data Quality Monitoring

Scope

This application will provide output histograms and data trends resulting from the Fast Monitoring, Digi and Recon processing. The driving process is L1Proc; ROOT files from each of these processing steps will be registered with the data catalog and picked up by the application in order to display quality histograms. Separate processes will load the Digi and Recon tuple files and ingest them into a database for the data trending.
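
As an illustration of the histogram-display step, here is a minimal sketch in Python, assuming ROOT's PyROOT bindings are available; the data-catalog lookup (find_root_files) and the histogram name are hypothetical placeholders, since the catalog API is not spelled out here.

```python
# Minimal sketch of rendering a quality histogram from a registered ROOT file.
# Assumes PyROOT is installed; find_root_files() stands in for the (unspecified)
# data-catalog query and is purely hypothetical.
import ROOT

def find_root_files(process):
    """Hypothetical data-catalog lookup; returns paths of registered ROOT files."""
    raise NotImplementedError("replace with the real data-catalog query")

def render_histogram(root_path, hist_name, out_png):
    f = ROOT.TFile.Open(root_path)
    if not f or f.IsZombie():
        raise IOError("cannot open %s" % root_path)
    hist = f.Get(hist_name)          # e.g. a TH1 produced by Fast Monitoring
    if not hist:
        raise KeyError("histogram %s not found" % hist_name)
    canvas = ROOT.TCanvas("c", "", 800, 600)
    hist.Draw()
    canvas.SaveAs(out_png)           # image served by the web application
    f.Close()
```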

Status

It is currently possible to display histograms from Fast Monitoring, Digi and Recon.

We have a first implementation of the code that ingests trending data into general-purpose database tables, and we managed to produce plots from the web application. The main lesson from this effort is that we need to create new database tables specific to this application.

To-Do List

  • Alarm Handling
    • The Fast Monitoring process should output an XML file listing the alarms/warnings/errors detected on the produced monitoring histograms
    • This file should be registered with the data catalog (is it there already?)
    • A notification should be added to the Logging application pointing to the Data Quality Monitoring application (is it possible to add the desired target link as part of the notification's metadata?)
    • The file needs to be ingested, producing summary and detailed information on the alarms/warnings/errors (see the alarm-ingestion sketch after this list)
  • Data Trending
    • Design tables to ingest the trending data: some 20K quantities at a frequency between 10 seconds and 5 minutes
    • We might keep several copies of the same tables to accumulate data at different frequencies
    • Given the volume of data, a database-table-only approach might be insufficient; we might have to consider a hybrid solution that reads data straight from the tuple files (less efficient than reading from a db)
    • Write the code to ingest the data (see the trending-ingestion sketch after this list)
    • Find somebody responsible for the ingestion code. It should be somebody who understands the data! (in other words, not me)
    • Write the code to produce trends
  • Improve the application's UI
    • User preferences
    • Improvements based on users' feedback
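
For the alarm-file ingestion, the sketch below parses a hypothetical alarm XML layout and produces the summary and detailed views mentioned above; the element and attribute names (alarm, severity, hist) are assumptions, since the actual file format still has to be agreed with the Fast Monitoring process.

```python
# Sketch of ingesting the Fast Monitoring alarm file.
# The XML layout used here is an assumption, e.g.:
#   <alarms run="123"><alarm severity="error" hist="SomeHist">text</alarm>...</alarms>
import xml.etree.ElementTree as ET
from collections import Counter

def ingest_alarm_file(path):
    root = ET.parse(path).getroot()
    details = []
    summary = Counter()
    for alarm in root.findall("alarm"):
        severity = alarm.get("severity", "warning")
        details.append({
            "severity": severity,
            "histogram": alarm.get("hist"),
            "message": (alarm.text or "").strip(),
        })
        summary[severity] += 1
    # summary feeds the notification in the Logging application;
    # details back the drill-down page in Data Quality Monitoring
    return dict(summary), details
```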
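
For the trending ingestion, the following is a minimal sketch of the "generic narrow table, one copy per accumulation frequency" idea; sqlite3 is used only to keep the example self-contained, and all table and column names are hypothetical, since the schema is exactly what remains to be designed.

```python
# Sketch of a generic trending schema: one narrow (quantity, time, value) table
# per accumulation frequency. Table/column names are hypothetical; sqlite3 is
# used only to make the example runnable as-is.
import sqlite3

FREQUENCIES = ("10s", "5min")  # one table copy per accumulation frequency

def create_tables(conn):
    for freq in FREQUENCIES:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS trend_%s ("
            " quantity TEXT NOT NULL,"      # one of the ~20K monitored quantities
            " t        INTEGER NOT NULL,"   # sample time (unix seconds)
            " value    REAL,"
            " PRIMARY KEY (quantity, t))" % freq
        )

def ingest(conn, freq, rows):
    """rows: iterable of (quantity, unix_time, value) tuples from the tuple files."""
    conn.executemany(
        "INSERT OR REPLACE INTO trend_%s (quantity, t, value) VALUES (?, ?, ?)" % freq,
        rows,
    )
    conn.commit()
```

A hybrid variant, as noted in the list above, would serve long-range or full-resolution queries straight from the tuple files and keep only the coarser aggregates in database tables.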

Source Monitoring

Scope

Status

To-Do List

Data Processing

Scope

Status

To-Do List

Portal

Scope

Status

To-Do List

Ops Log

Scope

Status

To-Do List

Cross Trending/Reports

Scope

Status

To-Do List
