IRMIS Overview

"IRMIS is a collaborative effort between several EPICS sites to build a common Relational DataBase schema and a set of tools to populate and search an RDB that contains information about the operational EPICS IOCs installed at that site." IRMIS (the schema, crawler programs and UI) was developed by Don Dohan and Claude Saunders at APS. For general information and distributions see the IRMIS home page.


IRMIS @ SLAC

IRMIS is used at SLAC for several purposes, from several different interfaces.

Elements of the IRMIS database that the controls software group at SLAC has adopted and modified:

  • The Oracle PV schema
  • The PV and ALH crawlers
  • The IRMIS GUI
  • Other elements of the collaboration IRMIS installation include the cabling, device, and application schemas. We are not populating these now, but may in the future.

Elements modified or created at SLAC:

  • IOC boot syntax adaptations in the PV crawler and IOC Info crawler (adopted from SNS), and in the schema
  • Special application-oriented tables in the IRMIS schema, and their population scripts (for example, for EPICS camdmp and the archiver PV viewer app)
  • Addition of config file tree structure crawling in the ALH crawler
  • New crawlers for Channel Watcher and Channel Archiver
  • New PV Client viewer addition to the IRMIS Desktop
  • New GUIs

UI and DATABASE QUERYING

IRMIS GUI
This Java UI for the IRMIS Oracle database, developed by Claude Saunders of the EPICS collaboration, can be invoked in the following ways:

  1. from the lclshome edm display:
    click the “IRMIS…” button
  2. from a Solaris or Linux workstation:
    run this script:
    irmisUI

The GUI paradigm is a set of “document types”; click the File/New Document menu for the list. Currently only two are available for use at SLAC:

  1. idt::pv – Search for lists of PVs and IOCs. This is the most useful interface, and it comes up upon application startup.
  2. idt::pvClient – Search for PV Client lists (alarm handler, archiver, channel watcher)

Query results can be saved to an ASCII file for further processing.

IOC Parameters APEX application
This APEX application shows IOC configuration data and various operational parameter snapshots (obtained live each night using caget).
https://seal.slac.stanford.edu/apex/mccqa/f?p=104:8

IOC Info query application
This JSP web application contains data about IOCs and applications, and their configurations.
https://seal.slac.stanford.edu/IRMISQueries/

EPICS camdmp APEX application
Various reports listing PVs and their module and channel connections.
https://seal.slac.stanford.edu/apex/mccqa/f?p=103:4

Archiver PV query APEX application
Lists of archived PVs by IOC application and by IOC.
https://seal.slac.stanford.edu/apex/mccqa/f?p=259:8

IOC list report (run nightly)
This HTML report is created nightly by the LCLS PV crawler:
http://mccas0.slac.stanford.edu/crawler/ioc_report.html

PV Crawler logs (which include the duplicate PV lists)
http://www.slac.stanford.edu/grp/lcls/controls/sysGroup/report/
A subset is available here: http://www.slac.stanford.edu/cgi-bin/lwgate/CONTROLS-SOFTWARE-REPORTS/archives

SQL querying
A view has been created to ease SQL querying for PV lists. This view combines data from the IOC_BOOT, IOC, REC, and REC_TYPE tables. It selects the currently loaded PVs (rows where IOC_BOOT.CURRENT_LOAD = 1, i.e. the latest captured IOC boot).

  • The CURR_PVS view contains all currently loaded PVs (see the sketch below).
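
A minimal sketch of the view logic, based only on the description above. The column and key names (REC_NM, IOC_NM, BOOT_DATE, and the join keys) are assumptions, not the deployed definition; check the actual CURR_PVS view in the IRMISDB schema.

    -- Hypothetical sketch only: approximate logic of the CURR_PVS view
    -- (column and key names are assumptions, not the deployed definition)
    CREATE OR REPLACE VIEW curr_pvs AS
    SELECT r.rec_nm,                    -- PV (record) name
           rt.rec_type,                 -- record type
           i.ioc_nm,                    -- IOC name
           ib.boot_date                 -- date of the boot that loaded the PV
    FROM   rec r
           JOIN ioc_boot ib ON ib.ioc_boot_id = r.ioc_boot_id
           JOIN ioc      i  ON i.ioc_id       = ib.ioc_id
           JOIN rec_type rt ON rt.rec_type_id = r.rec_type_id
    WHERE  ib.current_load = 1;         -- only the currently loaded boot

With such a view in place, a PV list for one IOC is a single query, e.g. SELECT rec_nm FROM curr_pvs WHERE ioc_nm = '<ioc name>'.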

OPERATIONAL and SUPPORT DETAILS

At SLAC, in a nutshell, the PV and PV client crawlers update:

  • all IRMIS tables in MCCQA. All IRMIS UIs query data on MCCQA: PV and IOC data, and PV Client data.
  • 3 production tables on MCCO, which are used by other systems (AIDA, BSA applications).

Oracle schemas and accounts
The IRMIS database schema is installed in 4 SLAC Oracle instances:

  • MCCQA IRMISDB schema: production for all GUIs and applications. Contains data populated nightly by Perl crawler scripts from production IOC configuration files. Data validation is done following each load.
  • MCCO IRMISDB schema: contains production data, currently for 3 tables only: bsa_root_names, devices_and_attributes, curr_pvs. MCCQA data is copied to MCCO once it has been validated, so MCCO is as close as possible to pristine data at all times.
  • SLACDEV: contains the 3 schemas used for developing, testing, and staging new features before release to production on MCCQA (see accounts below):
    • IRMISDB – sandbox for all kinds of development, data sifting, etc. Not refreshed from production, or not very often.
    • IRMISDB_TEST – testing for application implementation. Refreshed from production at the start of a development project.
    • IRMISDB_STAGE – staging for testing completed applications against recently refreshed production data.
  • SLACPROD IRMISDB schema: obsolete, but will be kept around for a while (6 months?) as a starting point in case the DB migration goes awry.
Other Oracle accounts:

  • IRMIS_RO – read-only account (not used much yet, but available)
  • IOC_MGMT – created for an earlier IOC info project with a member of the EPICS group; it is not active at the moment. New IOC info work is being done using IRMISDB.

For passwords, see Judy Rock, Poonam Pandey, or Elie Grunhaus.

As of September 15, all crawler-related shell and Perl scripts use the getPwd script (Greg White) to get the latest Oracle password. Oracle passwords must be changed every 6 months; new passwords will be given to Ken Brobeck to update the secure master password files at each password change.

Note: the IRMIS GUI and the JSP application still use hardcoded passwords. These must be changed manually at every password change cycle.

Database structure: see the schema diagram below. (This diagram excludes the EPICS camdmp structure, which is documented separately here: <url will be supplied>.)

Crawler scripts
The PV crawler is run once for each IOC boot directory structure. The LCLS PV crawler (runLCLSPVCrawlerLx.bash) runs the crawler only once. It is kept separate so that it can run on a different schedule and on a different host that can see the LCLS IOC directories. Its crawler code has also been modified to be LCLS-specific, so it is a different version from the SLAC PV crawler.

The SLAC PV crawler (runSLACPVCrawler.csh) runs the crawler a couple of times to accommodate the various CD IOC directory structures.

cron jobs

  • LCLS side: laci on lcls-daemon2: runLCLSPVcrawlerLx.bash: crawls LCLS PVs, creates the LCLS-specific tables (bsa_root_names, devices_and_attributes), and copies LCLS client config files to a directory where the CD client crawlers can see them.
  • LCLS side: laci on lcls-daemon2: caget4curr_ioc_device.bash: does cagets to populate curr_ioc_devices for the IOC Info APEX app. It is run separately from the crawlers because cagets can hang unexpectedly – they are best done in an isolated script.
  • CD side: cddev on slcs2: runAllCDCrawlers.csh: runs the CD PV crawler and all client crawlers, data validation, and the sync to MCCO.
  • FACET side: flaci on facet-daemon2: runFACETPVcrawlerLx.bash: crawls FACET PVs and copies FACET client config files to a directory where the CD client crawlers can see them.

PV crawler operation summary

For the location of the crawler scripts, see Source code directories below.

Basic steps as called by the cron scripts are:

  1. Run the FACET PV crawler to populate MCCQA tables.
  2. Run the LCLS PV crawlers to populate MCCQA tables.
  3. Run the CD PV and PV client crawlers to populate MCCQA tables.
  4. Run data validation for PV data in MCCQA.
  5. If data validation returns SUCCESS, run synchronization of MCCQA data to the selected MCCO tables (3 only at the moment).
  6. Run caget4curr_ioc_device to populate the caget columns of curr_ioc_device.

  • For PV crawlers: the crawler group for any given IOC is determined by its row in the IOC table; the system column gives the boot group for the IOC, as shown in the sketch below.
  • The PV client crawlers load all client directories listed in their config files; currently this includes both CD and LCLS.
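
As a rough illustration (the IOC_NM and SYSTEM column names and the 'LCLS' group value are assumptions here, not confirmed schema details), the boot-group assignment can be inspected directly from the IOC table:

    -- Hypothetical sketch only: list the IOCs assigned to one crawler/boot group
    -- (ioc_nm and system column names, and the 'LCLS' value, are assumptions)
    SELECT ioc_nm, system
    FROM   ioc
    WHERE  system = 'LCLS'
    ORDER  BY ioc_nm;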

LOGFILES, Oracle audit table

Log filenames are created by appending a timestamp to the root name shown in the tables below.

The major steps in the crawler jobs write entries into the Oracle CONTROLS_GLOBAL.DATA_VALIDATION_AUDIT table. Each entry has these attributes:

  • Instance
  • Schema
  • Process
  • Stage
  • Status
  • Message
  • TOD (time of day)

(see below for details on querying this table)
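
A minimal query sketch, assuming the column names match the attribute names above and that TOD is a DATE/timestamp column (both are assumptions):

    -- Hypothetical sketch only: show audit entries from the last day of crawler runs
    -- (column names and the TOD data type are assumptions based on the list above)
    SELECT process, stage, status, message, tod
    FROM   controls_global.data_validation_audit
    WHERE  tod > SYSDATE - 1
    ORDER  BY tod;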

Descriptions of the MAIN scripts (there are other subsidiary scripts as well):

These are all ultimately invoked from the cron jobs shown above; the cron scripts call the other scripts.

BLUE script names are on the CD side

GREEN script names are on the LCLS side

PURPLE script names are on the FACET side
