
General

  • Stop using SLACDEV database (in progress)
    • Rationalize use of Dev/prod etc, decide if we need other configuration options
      • For ISOC the databases are: Flight, Integration, Test, Nightly
  • Look into tomcat clustering
  • Integration between monitoring tools and Ops Log
    • Ability to easily copy plots to ops log and comment on them
    • Ability to find all recent comments on a plot
  • Figure out why automatic generation of tomcat configuration on glast-win01,02 did not work

Infrastructure

  • implement links between applications (in progress)

Data Quality Monitoring

Scope

This application will provide output histograms and data trends resulting from the Fast Monitoring, Digi and Recon processing. The driving process is L1Proc; ROOT files from each of these processing steps will be registered with the data catalog and picked up by the application in order to display quality histograms. Separate processes will load the Digi and Recon tuple files into a database for data trending.

Status

  • It is currently possible to display histograms from Fast Monitoring, Digi and Recon.
  • We have a first implementation of the code to ingest trending data into general-purpose database tables, and we managed to produce plots from the web application. The main outcome of this effort is that we need to create new database tables specific to this application.
  • Alarm Handling
    • The Fast Monitoring process outputs an xml file with a list of the alarms/warnings/errors detected on the produced monitoring histograms
    • This file is registered with the data catalog
    • A notification is added to the Logging application which points to the Data Quality Monitoring application
    • The xml file is ingested by the application, which produces summary and detailed information on the alarms/warnings/errors
  • Data Trending
    • Tables to ingest the trending data are available: some 50K quantities at frequencies between 10 seconds and 5 minutes
    • Several copies of the same tables are used to accumulate data at different frequencies
    • Code to ingest the data is available and used as part of the L1Proc.
    • Trending plots are produced from the database
  • Plot Description
    • Description can be added to each plot
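The alarm-ingestion step above amounts to parsing the xml file and summarizing its entries. A minimal sketch, assuming a hypothetical schema with `<alarm histogram="…" level="…"/>` elements (the real Fast Monitoring format may differ):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class AlarmIngest {

    /** Parse alarm entries of the (assumed) form <alarm histogram="..." level="..."/>
     *  and return one "level: histogram" summary string per entry. */
    public static List<String> parseAlarms(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            NodeList nodes = doc.getElementsByTagName("alarm");
            List<String> summaries = new ArrayList<>();
            for (int i = 0; i < nodes.getLength(); i++) {
                Element e = (Element) nodes.item(i);
                summaries.add(e.getAttribute("level") + ": " + e.getAttribute("histogram"));
            }
            return summaries;
        } catch (Exception ex) {
            throw new RuntimeException("failed to parse alarm xml", ex);
        }
    }

    public static void main(String[] args) {
        String xml = "<alarms>"
                + "<alarm histogram=\"AcdHitMap\" level=\"warning\"/>"
                + "<alarm histogram=\"TkrOccupancy\" level=\"error\"/>"
                + "</alarms>";
        for (String s : parseAlarms(xml)) {
            System.out.println(s);
        }
    }
}
```

The detailed per-bin information mentioned in the to-do list would hang off the same elements as extra attributes or child nodes.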
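The several copies of the trending tables at different frequencies can be pictured as successive roll-ups of the base-frequency samples. A sketch under assumed names, averaging consecutive samples into one coarser bin (the actual ingest code may aggregate differently, e.g. keeping min/max as well):

```java
import java.util.ArrayList;
import java.util.List;

public class TrendRollup {

    /** Average consecutive groups of `factor` base-frequency samples into one
     *  coarser-frequency sample (e.g. 10 s samples rolled up into 5 min bins).
     *  A trailing partial group is left for the next roll-up pass. */
    public static List<Double> rollUp(List<Double> samples, int factor) {
        List<Double> out = new ArrayList<>();
        for (int i = 0; i + factor <= samples.size(); i += factor) {
            double sum = 0;
            for (int j = 0; j < factor; j++) {
                sum += samples.get(i + j);
            }
            out.add(sum / factor);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Double> tenSecondSamples = List.of(1.0, 2.0, 3.0, 4.0, 5.0, 6.0);
        System.out.println(rollUp(tenSecondSamples, 3)); // [2.0, 5.0]
    }
}
```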

To-Do List

  • Alarm Handling
    • Ingest detailed information from the xml file, e.g. which bins produced an alarm
    • Display alarm information on the plots, like warning/alarm limits or arrows on responsible bins
  • Data Trending
    • Given the volume of data, a database-table-only approach might be insufficient. We might have to consider a hybrid solution that involves reading data straight from tuple files (less efficient than reading from a db)
  • Improve the application's UI
    • User preferences
    • Improvements based on user feedback

Source Monitoring Jira

Scope

Display ASP data products for a pool of sources.

Status

This is the second implementation of this application. We have developed a set of databases to keep a list of sources and to ingest data to be trended at different frequencies. The sources database can be loaded from xml files used by ASP. Scripts are available to ingest the data from the pipeline at the end of ASP processing.

This is still a preliminary version of the application and needs feedback to better define its scope and use.

To-Do List

  • Improve the application's UI
    • User preferences

Pipeline Jira

To-Do List

  • (in progress Dan) Process rollback
    • This is something which is really needed by the L1Proc folks. It would be good to look at the existing code and see if it is possible to implement this before rewriting all the stored procedures in Java.

Data Catalog Jira FE Jira BE

To-Do List

  • (in progress Dan) Finish migration from dataportal-model to datacat-client
  • (in progress Dan) Add the ability to make files from a data catalog search result available in the pipeline
  • (in progress Dan) Finish proposed changes to datacat line mode client
    • easier registration of files
    • ability to remove existing files
  • (in progress Karen) Add meta-data from the web interface
  • Improve data catalog interface especially for real data
    • L1 data products arranged by groups rather than folders
    • Look into WebDav/GUI for data catalog

Data Processing

Scope

This application should provide a quick and intuitive look at the status of data processing, from the moment data is Fast Copied through the various processing steps that lead to the final data products.

Status

A preliminary version of the application has been used during the October Test and it worked pretty well.

  • Downlink-RunId database tables have been created
    • with these tables it is possible to extract which run numbers are contained in each downlink
  • The progress mechanism has been defined as the number of completed sub tasks out of the total for a given process.
    • When new sub tasks are forked off the overall progress might go backwards
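The progress mechanism above can be sketched as a completed-over-total fraction: forking new sub tasks grows the denominator, which is why the overall progress can go backwards. Class and method names here are illustrative, not the actual implementation:

```java
public class ProcessProgress {

    private int total;      // all sub tasks known so far
    private int completed;  // sub tasks finished so far

    /** Forking new sub tasks grows the denominator. */
    public void addSubTasks(int n) {
        total += n;
    }

    public void completeSubTask() {
        completed++;
    }

    /** Fraction complete in [0, 1]; 0 when no sub tasks exist yet. */
    public double fraction() {
        return total == 0 ? 0.0 : (double) completed / total;
    }

    public static void main(String[] args) {
        ProcessProgress p = new ProcessProgress();
        p.addSubTasks(4);
        p.completeSubTask();
        p.completeSubTask();
        System.out.println(p.fraction()); // 0.5
        p.addSubTasks(4);                 // new sub tasks forked off
        System.out.println(p.fraction()); // 0.25 -- progress went backwards
    }
}
```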

To-Do List

  • Remove duplication between data processing page and other apps
    • One possibility is to hide queries in functions

Ops Log

Scope

Status

To-Do List

  • Make Ops Log use same login system as everything else

GCN/GRB Web front end

Scope

Tabular overview of the most recent Notices and the possibility to browse GRBs/Notices.

Status

  • Databases have been designed.

Portal

The portal is the icing on the cake. It will be targeted and developed at the very end.

Scope

Provide a rich and highly customizable environment for viewing data from all the above (and below) applications.

Status

We have developed three portlets to prove that we can extract data from external applications. These portlets can provide tables (from Logging and Fast Copy) and plots (from TelemetryTrending).

To-Do List

Do the rest.

Cross Trending/Reports

Scope

This application should give users the possibility to fetch histograms and data trends from all the above applications and to create scatter plots, overlays or tables of data.

It should also be possible to load simple user-written jsp pages as reports. Users can create a list of favorite reports to be processed at different frequencies (say last 24 hours, last week, last month etc.)

A user would write reports in the form of jsp pages with some special tags to embed plots, text, links, etc.
These pages will be registered with the applications.
The application would provide a tabulated list of all the available reports and a way to generate them over some time period.
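As an illustration of the tag idea, the special tags in a report page could be expanded into plain HTML before rendering. The tag syntax and plot URL below are pure assumptions; a production version would more likely be a JSP tag library than string substitution:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ReportTags {

    // Hypothetical tag syntax: <report:plot name="..."/> expands to an
    // image link into an assumed plot servlet at /plots.
    private static final Pattern PLOT =
            Pattern.compile("<report:plot name=\"([^\"]+)\"/>");

    /** Replace every plot tag in a report page with an <img> element. */
    public static String expand(String page) {
        Matcher m = PLOT.matcher(page);
        return m.replaceAll("<img src=\"/plots?name=$1\"/>");
    }

    public static void main(String[] args) {
        String page = "Trigger rates: <report:plot name=\"TriggerRate\"/>";
        System.out.println(expand(page));
    }
}
```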

Status

A toy version of the cross trending part is available. The Reports part is still in the discussion phase.

To-Do List

  • Write database tables to store report jsp pages
  • Define a tags API to embed plots in jsp pages

Data Server

  • Get LAT Data Server tied into L1Proc

Data Portal Jira

  • Get portal working, at least for items like data processing page, grb summary etc
    • Generate RSS feeds from LogWatcher, Ops Log, JIRA, Confluence etc to display on portal page

Web Apps / Servers Monitor

  • Ability to monitor all tomcat servers/applications from one page (and maybe restart them)