ILC VO

First, you need to obtain membership in the ilc Virtual Organization (VO), which is sponsored by DESY. This is a VO on the LCG grid. Alternatively, you can become a member of the calice VO, but this will only work if your institution is a member of that collaboration.

Follow the registration workflow at https://grid-voms.desy.de:8443/voms/ilc/register/start.action, which will require several rounds of confirmations and emails.

Setup

The rest of the tutorial assumes that you are running from an lxplus node at CERN and using the bash shell.

The following command sets up the LCG environment using tools on the DESY AFS file system.

source /afs/desy.de/group/grid/UI/GLITE-pro/etc/profile.d/grid_env.sh

There is also a setup script maintained at CERN.

source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh

The DESY AFS script seems to work best; I have had trouble running EDG commands with the CERN setup script.
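
As a quick sanity check (my own addition, not part of any official setup), you can confirm that the grid tools are on your PATH after sourcing one of these scripts:

which voms-proxy-init lcg-cp lfc-ls edg-job-submit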

VOMS Server

At the beginning of each session, a proxy certificate needs to be obtained.

voms-proxy-init -verify -debug -voms ilc

The proxy is only valid for a limited time (12 hours by default), so this command should be executed at the start of every session in which jobs are going to be submitted.
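
To check the remaining lifetime and VO attributes of the current proxy:

voms-proxy-info -all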

If you get the message "VOMS Server for ilc not known!" when trying to run this command, or similar error messages from other grid commands, then the information about the ILC VOMS server is missing and needs to be installed at your site.

The ILC VO information can be found on the VOMS at DESY page.

The ilc VOMS information can be downloaded from http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de. The contents of this file should be added to the list of VOMS configuration data kept in ~/.glite/vomses. Alternatively, your system administrator can install the ilc VO information into a central config file.
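
For reference, each line of a vomses file has the form "alias" "host" "port" "server certificate DN" "VO name". The entry below is illustrative only; take the actual port and DN from the downloaded file:

"ilc" "grid-voms.desy.de" "15001" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "ilc"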

The following commands create the user configuration directory and download the VOMS file into it.

mkdir -p ~/.glite/vomses
cd ~/.glite/vomses
wget http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de

Now, the voms-proxy-init command should work correctly.

voms-proxy-init -verify -debug -voms ilc

LCG File System

In order to use the LCG file catalog and storage tools, the LFC_HOST variable must be set.

export LFC_HOST=`lcg-infosites --vo ilc lfc`

List the files registered in the file catalog, which points to mass storage at DESY.

lfc-ls -l /grid
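
If the /grid/ilc/test directory used in the examples below does not already exist in the catalog, it can be created with lfc-mkdir (assuming your VO membership grants write permission there):

lfc-mkdir -p /grid/ilc/test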

Put a file into DESY mass storage.

echo "test" > /tmp/test_file
lcg-cr -v --vo ilc -l lfn:/grid/ilc/test/test_file file:/tmp/test_file -d srm-dcache.desy.de
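
lcg-cr prints the GUID of the newly registered file; the physical replicas behind the logical file name can be listed with lcg-lr:

lcg-lr --vo ilc lfn:/grid/ilc/test/test_file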

Copy a file from mass storage to local disk.

lcg-cp -v --vo ilc lfn:/grid/ilc/test/test_file file:/tmp/test_file_copy
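
When finished testing, the file and its catalog entry can be removed with lcg-del (-a removes all replicas):

lcg-del -a --vo ilc lfn:/grid/ilc/test/test_file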

The following copies one of the Standard Model background files from a local NFS disk to the DESY dCache system and registers it in the grid catalog.

source /afs/desy.de/group/grid/UI/GLITE/etc/profile.d/grid-env.sh
export LFC_HOST=grid-lfc.desy.de
voms-proxy-init -debug -verify -voms ilc
lcg-cr -v --vo ilc -n 10 file:/nfs/slac/g/lcd/ilc_data/ILC500/StandardModel/250fb-1_-80e-_+30e+_polarization-003.stdhep \
  -l /grid/ilc/mc/ILC500/SM250fb-1/generated/250fb-1_-80e-_+30e+_polarization-003.stdhep \
  -d srm://srm-dcache.desy.de/pnfs/desy.de/ilc/mc/ILC500/SM250fb-1/generated/250fb-1_-80e-_+30e+_polarization-003.stdhep

List the available Standard Model files.

lfc-ls -l /grid/ilc/mc/ILC500/SM250fb-1/generated/

Fetching a Grid LCIO File to SLAC from DESY mass storage

Here is a more advanced script showing how to fetch a file to SLAC, assuming that you have an lxplus account at CERN.

# ssh to CERN
ssh lxplus.cern.ch

# setup grid tools
source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh

# start a grid session
voms-proxy-init -verify -debug -voms ilc

# copy the file to a temp location on lxplus (it's fast)
lcg-cp -v --vo ilc lfn:/grid/ilc/mc-2008/simulated/LDC01_05Sc/ZPole/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio file:/tmp/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio

# copy the file from /tmp to SLAC (slow...go get some coffee)
scp /tmp/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio jeremym@iris01.slac.stanford.edu:/afs/slac/g/lcd/public_data/ILC/test/mokka/

Replace the user name jeremym with your own SLAC account name.
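
To fetch several files in one go, the lcg-cp call can be wrapped in a shell loop. The file names below are placeholders; substitute the catalog entries you actually need:

# hypothetical file names; substitute real LFNs from the catalog
for f in ZPole_0001.slcio ZPole_0002.slcio; do
  lcg-cp -v --vo ilc lfn:/grid/ilc/mc-2008/simulated/LDC01_05Sc/ZPole/$f file:/tmp/$f
done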

Test Commands

The gLite UI will be used to run test jobs. It provides commands starting with "edg-" for job submission, monitoring, and output retrieval.

If you are not running from DESY, many of the edg commands will require a config file for the ilc VO, which can be copied from a DESY AFS location.

In another shell on lxplus, execute the following commands.

source /afs/desy.de/group/grid/UI/GLITE-pro/etc/profile.d/grid_env.sh
cp $EDG_LOCATION/etc/ilc/edg_wl_ui.conf ~

The file ~/edg_wl_ui.conf needs to be used in place of "--vo ilc" in the test commands at http://grid.desy.de/ilc/.

Now follow the instructions for submitting a test job at http://grid.desy.de/ilc/ under "Submitting jobs to the Grid".
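
The commands below assume a JDL file named test.jdl. If you do not have the one from the DESY page, a minimal sketch in the same style as the slic.jdl used later on this page will do for a smoke test:

Executable    = "/bin/hostname";
StdOutput     = "out";
StdError      = "err";
OutputSandbox = {"out","err"};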

Look for resources to run a job.

edg-job-list-match --config-vo ./edg_wl_ui.conf test.jdl

Run a test job.

edg-job-submit --nogui --config-vo ./edg_wl_ui.conf -o test.jid test.jdl

Check the job status.

edg-job-status -i test.jid

Get the output.

edg-job-get-output -i test.jid
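
A job that hangs or was submitted by mistake can be cancelled:

edg-job-cancel -i test.jid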

Running SLIC

Simple Shell Script

Create a shell script slic.sh.

#!/bin/sh
wget http://www.lcsim.org/dist/slic/slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
tar xzf ./slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
wget http://www.lcsim.org/detectors/acme0605/acme0605.lcdd
printenv
./SimDist/scripts/slic.sh -g ./acme0605.lcdd -r 1
ls *.slcio

The above script does the following.

  1. Downloads a tarball with the slic binary and untars it.
  2. Downloads a detector XML file.
  3. Prints the environment.
  4. Runs one event (single muon) on the detector file.
  5. Prints a list of LCIO files created.

Shell Script Using DESY Mass Storage

Here is another version of the script that uses grid commands to fetch a stdhep file and upload the output LCIO file to DESY mass storage. (This example will only work if you have been granted access to DESY mass storage.)

#!/bin/sh
printenv
# download the slic binary tarball and a detector description
wget http://www.lcsim.org/dist/slic/slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
tar xzf ./slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
wget http://www.lcsim.org/detectors/acme0605/acme0605.lcdd
# fetch the input stdhep file from DESY mass storage
lcg-cp -v --vo ilc lfn:/grid/ilc/test/test.stdhep file:`pwd`/test.stdhep
# run one event
./SimDist/scripts/slic.sh -g ./acme0605.lcdd -i ./test.stdhep -r 1
ls -la
# upload the output LCIO file and register it in the catalog
lcg-cr -v --vo ilc -l lfn:/grid/ilc/test/outfile.slcio file:`pwd`/outfile.slcio -d srm-dcache.desy.de
lfc-ls -l /grid/ilc/test/outfile.slcio
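
One caveat, not from the original script: lcg-cr will fail if the target LFN already exists, so when submitting several of these jobs the output name should be made unique. A sketch using the worker node's hostname and process ID (any unique token would do):

# make the registered name unique per job (hostname + PID is just one choice)
OUTNAME=outfile_`hostname`_$$.slcio
mv outfile.slcio $OUTNAME
lcg-cr -v --vo ilc -l lfn:/grid/ilc/test/$OUTNAME file:`pwd`/$OUTNAME -d srm-dcache.desy.de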

JDL File

Now make a file slic.jdl.

Executable         = "slic.sh";
StdOutput          = "out";
StdError           = "err";
InputSandbox       = {"slic.sh"};
OutputSandbox      = {"out","err","outfile.slcio"};

The OutputSandbox contains a list of files that will be cached for retrieval later, including the output LCIO file.
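
The VO can also be declared directly in the JDL if desired; this is optional here since it is supplied via --config-vo at submission time:

VirtualOrganisation = "ilc";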

Job Submission, Monitoring, and Output Retrieval

Submit the SLIC test job.

edg-job-submit --nogui --config-vo ./edg_wl_ui.conf -o slic.jid slic.jdl

Check the job status.

edg-job-status -i slic.jid

Retrieve the output. By default, the job output goes into a directory under /tmp, so we specify the current directory instead.

edg-job-get-output --dir `pwd` -i slic.jid

References

gLite User Guide
Virtual Data Toolkit (VDT)
CERN AFS UI Setup - setup instructions on lxplus@cern
