
ILC VO

First, you need to obtain membership in the ilc Virtual Organization (VO), which is sponsored by DESY. This is a VO on the LCG grid. Alternatively, you can become a member of the calice VO, but this will only work if your institution is actually a member of that collaboration.

Follow the registration workflow at https://grid-voms.desy.de:8443/vo/ilc/vomrs, which will require several rounds of confirmations and emails.

Somewhat confusingly, there is another ILC VO run by Fermilab, located at https://voms.fnal.gov:8443/vomrs/ilc/vomrs, which is on the OSG grid. The two Virtual Organizations are not interchangeable, so make sure you register with the DESY one in order to follow the instructions in this tutorial!

Setup

The rest of the tutorial assumes that you are running from an lxplus node at CERN and using the bash shell.

This script will setup the LCG grid tools at CERN.

source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh

There is also a script on DESY afs.

source /afs/desy.de/group/grid/UI/GLITE-pro/etc/profile.d/grid_env.sh

The DESY afs script seems to work best. (I have had trouble running the edg- commands with the CERN setup script.)
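A quick way to check that the UI environment is set up is to verify that the edg- tools are on the PATH and that $EDG_LOCATION (used later in this tutorial) is defined. This is only a sanity check, not part of the official setup.

which edg-job-submit
echo $EDG_LOCATION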

VOMS Server

At the beginning of each session, a proxy certificate needs to be obtained.

voms-proxy-init -verify -debug -voms ilc

This command should be executed at the start of every session in which jobs are going to be submitted.
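To confirm that the proxy was created and carries the ilc VO attributes, it can be inspected with voms-proxy-info, which is part of the same VOMS client tools.

voms-proxy-info -all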

If you get the message "VOMS Server for ilc not known!" when trying to run this command, or if similar error messages occur when running other grid commands, then the information about the ILC VOMS server is missing and needs to be installed at your site.

The ILC VO information can be found on the VOMS at DESY page.

The ilc VOMS information can be downloaded from http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de. The contents of this file should be added to the list of VOMS configuration data kept in ~/.glite/vomses. Alternatively, the site administrator can install the ilc VO information into a central configuration file.
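Each line of a vomses file describes one VOMS server in the format "alias" "host" "port" "server certificate DN" "VO name". The line below is for illustration only; the port number and DN shown are placeholders, and the authoritative values are the ones in the downloaded file.

"ilc" "grid-voms.desy.de" "15001" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "ilc"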

The following commands will download the VOMS file and add the information to the user config file in the home directory.

cd ~
mkdir .glite # only needed if .glite does not exist already
cd .glite
wget http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de # fetch the VOMS file
touch vomses # make sure this file exists
cat ilc-grid-voms.desy.de >> vomses # add the ILC VOMS information

Now, the voms-proxy-init command should work correctly.

voms-proxy-init -verify -debug -voms ilc

Test Commands

The gLite UI will be used to run test jobs. It uses commands starting with "edg-" for job submission, monitoring, etc.

If not running from DESY, many of the edg commands will require a config file for the ilc VO, which can be copied from a DESY afs location.

In another shell on lxplus, execute the following commands.

source /afs/desy.de/group/grid/UI/GLITE-pro/etc/profile.d/grid_env.sh
cp $EDG_LOCATION/etc/ilc/edg_wl_ui.conf ~

The file ~/edg_wl_ui.conf then needs to be passed via --config-vo in place of the "--vo ilc" option in the test commands at http://grid.desy.de/ilc/.

Now follow the instructions for submitting a test job at http://grid.desy.de/ilc/ under "Submitting jobs to the Grid".
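The DESY page provides a test.jdl to go with these instructions. If you want a self-contained example, a minimal JDL along the following lines will also work; this is a generic hello-world job and is not necessarily identical to the one from the DESY page.

Executable     = "/bin/hostname";
StdOutput      = "out";
StdError       = "err";
OutputSandbox  = {"out","err"};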

Look for resources to run a job.

edg-job-list-match --config-vo ./edg_wl_ui.conf test.jdl

Run a test job.

edg-job-submit --nogui --config-vo ./edg_wl_ui.conf -o test.jid test.jdl

Check the job status.

edg-job-status -i test.jid

Get the output.

edg-job-get-output -i test.jid

Running SLIC

This is a trivial example of running the SLIC simulator on the LCG grid.

Create a shell script slic.sh.

#!/bin/sh
wget http://www.lcsim.org/dist/slic/slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
tar xzf ./slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
wget http://www.lcsim.org/detectors/acme0605/acme0605.lcdd
printenv
./SimDist/scripts/slic.sh -g ./acme0605.lcdd -r 1
ls *.slcio

The above script does the following.

  1. Downloads a tarball with the slic binary and untars it.
  2. Downloads a detector XML file.
  3. Prints the environment.
  4. Runs one event (single muon) on the detector file.
  5. Prints a list of LCIO files created.
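If you want to check the script before sending it to the grid, it can also be run interactively, for example on an lxplus node. This assumes the 32-bit SLIC binary runs on that machine and that it has outbound network access; the grid job itself does not require this step.

sh ./slic.sh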

Now make a file slic.jdl.

Executable         = "slic.sh";
StdOutput          = "out";
StdError           = "err";
InputSandbox       = {"slic.sh"};
OutputSandbox      = {"out","err","outfile.slcio"};

The OutputSandbox contains a list of files that will be cached for retrieval later, including the output LCIO file.
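Depending on how the UI is configured, the VO can also be declared directly in the JDL via the standard VirtualOrganisation attribute, instead of (or in addition to) passing --config-vo on the command line. Whether this alone is sufficient depends on the site setup, so treat it as an optional extra rather than a replacement for the instructions above.

VirtualOrganisation = "ilc";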

Submit the SLIC job.

edg-job-submit --nogui --config-vo ./edg_wl_ui.conf -o slic.jid slic.jdl

Check the job status.

edg-job-status -i slic.jid

Retrieve the output. By default, the job output will go into a directory under /tmp, so we specify the current directory instead.

edg-job-get-output --dir `pwd` -i slic.jid

Other Commands

List files on the mass storage system at DESY.

export LFC_HOST=`lcg-infosites --vo ilc lfc`
lfc-ls -l  /grid
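If the directory used in the next step does not already exist in the file catalogue, it can be created first with lfc-mkdir. This assumes you have write permission under /grid/ilc; the test subdirectory is just an example location.

lfc-mkdir /grid/ilc/test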

Put a file into DESY mass storage.

echo "test" > /tmp/test_file
lcg-cr -v --vo ilc -l lfn:/grid/ilc/test/test_file file:/tmp/test_file -d srm-dcache.desy.de
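To verify the registration and copy the file back out of storage, lcg-lr lists the replicas behind a logical file name and lcg-cp copies a grid file to a local destination (the local file name below is arbitrary).

lcg-lr --vo ilc lfn:/grid/ilc/test/test_file
lcg-cp --vo ilc lfn:/grid/ilc/test/test_file file:/tmp/test_file_copy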

gLite User Guide
