...

First, you need to obtain membership in the ilc Virtual Organization (VO), which is sponsored by DESY. This is a VO on the LCG grid. Alternatively, you can become a member of the calice VO, but this will only work if your institution is actually a member of that collaboration.

Follow the registration workflow at https://grid-voms.desy.de:8443/voms/ilc/register/start.action, which will require several rounds of confirmations and emails.

...

Somewhat confusingly, there is another ILC VO run by Fermilab at https://voms.fnal.gov:8443/vomrs/ilc/vomrs, which is on the OSG grid. The two Virtual Organizations are not interchangeable, so make sure you get a membership in the DESY one to follow the instructions in this tutorial!

Setup

Info

The rest of the tutorial assumes that you are running from an lxplus node at CERN and using the bash shell.

This command can be used to set up the grid environment using tools on the DESY AFS file system.

No Format
source /afs/desy.de/group/grid/UI/GLITE-pro/current/etc/profile.d/grid_env.sh

There also exists a setup script maintained at CERN.

No Format
source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh

The DESY AFS script seems to work best. (I am having trouble running EDG commands with the CERN setup script.)

VOMS Server

At the beginning of each session, a proxy certificate needs to be obtained.
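
In short, this looks like the following (the full command, with debugging options, appears below after the VOMS configuration):

No Format

voms-proxy-init -voms ilc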

...

The ilc VOMS information can be downloaded from http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de. The contents of this file should be added to the VOMS configuration data kept in ~/.glite/vomses. Alternatively, your system administrator can install the ilc VO information into a central configuration file.

The following commands will download the VOMS file and add the information to the user config file in the home directory.

No Format

mkdir -p ~/.glite/vomses # create the VOMS config directory if it does not already exist
cd ~/.glite/vomses
wget http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de # fetch the ilc VOMS file
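
To check that the download worked, inspect the file; each line of a vomses file holds the quoted VO alias, server host, port, server certificate DN, and VO name:

No Format

cat ~/.glite/vomses/ilc-grid-voms.desy.de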

Now, the voms-proxy-init command should work correctly.

No Format

voms-proxy-init -verify -debug -voms ilc
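
To confirm that the proxy was created and carries the ilc attributes, the standard voms-proxy-info tool can be used (not part of the original instructions):

No Format

voms-proxy-info --all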

LCG File System

In order to use the LCG file catalog (LFC), the LFC_HOST variable must be set.

No Format

export LFC_HOST=`lcg-infosites --vo ilc lfc`
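
If the lcg-infosites lookup fails, the host can also be set explicitly to the DESY catalog used later on this page:

No Format

echo $LFC_HOST                   # verify the lookup worked
export LFC_HOST=grid-lfc.desy.de # or set the host directly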

List files on the mass storage system at DESY.

No Format

lfc-ls -l  /grid

Put a file into DESY mass storage.

No Format

echo "test" > /tmp/test_file
lcg-cr -v --vo ilc -l lfn:/grid/ilc/test/test_file file:/tmp/test_file -d srm-dcache.desy.de
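
To verify the upload, list the new catalog entry and its physical replicas (lcg-lr is a standard LCG data management command, not shown in the original):

No Format

lfc-ls -l /grid/ilc/test/
lcg-lr --vo ilc lfn:/grid/ilc/test/test_file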

Copy a file from mass storage to local disk.

No Format

lcg-cp -v --vo ilc lfn:/grid/ilc/test/test_file file:/tmp/test_file
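
A quick sanity check on the copied file:

No Format

cat /tmp/test_file   # should print "test"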

The following copies one of the Standard Model background files from a local NFS disk to the DESY dCache system and registers it in the grid catalog.

No Format

> . /afs/desy.de/group/grid/UI/GLITE/etc/profile.d/grid-env.sh
> export LFC_HOST=grid-lfc.desy.de
> voms-proxy-init -debug -verify -voms ilc
> lcg-cr -v --vo ilc -n 10 file:/nfs/slac/g/lcd/ilc_data/ILC500/StandardModel/250fb-1_-80e-_+30e+_polarization-003.stdhep \
    -l lfn:/grid/ilc/mc/ILC500/SM250fb-1/generated/250fb-1_-80e-_+30e+_polarization-003.stdhep \
    -d srm://srm-dcache.desy.de/pnfs/desy.de/ilc/mc/ILC500/SM250fb-1/generated/250fb-1_-80e-_+30e+_polarization-003.stdhep

List the available SM files.

No Format

lfc-ls -l /grid/ilc/mc/ILC500/SM250fb-1/generated/

Fetching a Grid LCIO File to SLAC from DESY mass storage

Here is a more advanced script showing how to fetch a file to SLAC, assuming that you have an lxplus account at CERN.

No Format

# ssh to cern
ssh lxplus

# setup grid tools
source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh

# start a grid session
voms-proxy-init -verify -debug -voms ilc

# copy the file to a temp location on lxplus (it's fast)
lcg-cp -v --vo ilc lfn:/grid/ilc/mc-2008/simulated/LDC01_05Sc/ZPole/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio file:/tmp/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio

# copy the file to SLAC (slow...go get some coffee)
scp /tmp/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio jeremym@iris01.slac.stanford.edu:/afs/slac/g/lcd/public_data/ILC/test/mokka/

The user name jeremym above should be replaced with your SLAC account name.
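
After the transfer finishes, it is worth comparing checksums on both ends. A sketch, assuming md5sum is available on lxplus and on the SLAC machine:

No Format

md5sum /tmp/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio
ssh jeremym@iris01.slac.stanford.edu md5sum /afs/slac/g/lcd/public_data/ILC/test/mokka/M-6-5p2_ZPoleuds_LDC01_05Sc_LCP_Test_ZPole_0001.slcio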

Test Commands

The gLite UI will be used to run test jobs. It uses commands starting with "edg-" for job submission, monitoring, and output retrieval.

...
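
A typical round trip looks like this (a sketch; test.jdl stands for any JDL file, and -o appends the new job ID to test.jid so the later commands can find it):

No Format

edg-job-submit --vo ilc -o test.jid test.jdl   # submit the job
edg-job-status -i test.jid                     # check its status

When the job has finished, its output can be retrieved.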

No Format
edg-job-get-output -i test.jid

List files on the mass storage system at DESY.

No Format

export LFC_HOST=`lcg-infosites --vo ilc lfc`
lfc-ls -l  /grid

...

Running SLIC

Simple Shell Script

Create a shell script slic.sh that does the following; a sketch of such a script appears after the list.

...

  1. Downloads a tarball with the slic binary and untars it.
  2. Downloads a detector XML file.
  3. Prints the environment.
  4. Runs one event (single muon) on the detector file.
  5. Prints a list of LCIO files created.
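
Based on the mass-storage version below, a minimal slic.sh along these lines might look like the following (a sketch; how the original script configured the single-muon event is an assumption, since slic's generator options are not shown on this page):

No Format

#!/bin/sh
# 1. download a tarball with the slic binary and untar it
wget http://www.lcsim.org/dist/slic/slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
tar xzf ./slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
# 2. download a detector XML file
wget http://www.lcsim.org/detectors/acme0605/acme0605.lcdd
# 3. print the environment
printenv
# 4. run one event on the detector file (the single-muon source is assumed
#    to come from slic's defaults)
./SimDist/scripts/slic.sh -g ./acme0605.lcdd -r 1
# 5. print a list of LCIO files created
ls -la *.slcio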

Shell Script Using DESY Mass Storage

Here is another version of the script that uses grid commands to fetch a stdhep file and upload the output LCIO file using DESY mass storage. (This example would only work if you have been granted access to DESY mass storage.)

No Format

#!/bin/sh
printenv
wget http://www.lcsim.org/dist/slic/slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
tar xzf ./slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
wget http://www.lcsim.org/detectors/acme0605/acme0605.lcdd
lcg-cp -v --vo ilc lfn:/grid/ilc/test/test.stdhep file:`pwd`/test.stdhep
./SimDist/scripts/slic.sh -g ./acme0605.lcdd -i ./test.stdhep -r 1
ls -la
lcg-cr -v --vo ilc -l lfn:/grid/ilc/test/outfile.slcio file:`pwd`/outfile.slcio -d srm-dcache.desy.de
lfc-ls -l /grid/ilc/test/outfile.slcio
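
Once the job has run, the uploaded LCIO file can be pulled back to a local disk with the same lcg-cp pattern used earlier:

No Format

lcg-cp -v --vo ilc lfn:/grid/ilc/test/outfile.slcio file:/tmp/outfile.slcio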

JDL File

Now make a file slic.jdl.

No Format
Executable         = "slic.sh";
StdOutput          = "out";
StdError           = "err";
InputSandbox       = {"slic.sh"};
OutputSandbox      = {"out","err","outfile.slcio"};

The OutputSandbox contains a list of files that will be cached for retrieval later, including the output LCIO file.

Job Submission, Monitoring, and Output Retrieval

To run the SLIC test job:

...
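
The submission step (elided above) follows the same pattern as the earlier test job. A sketch, assuming the job ID is written to slic.jid for the status and retrieval commands below:

No Format

edg-job-submit --vo ilc -o slic.jid slic.jdl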

No Format
edg-job-status -i slic.jid

Retrieve the output. By default, the job output will go into a directory under /tmp, so we specify the current directory instead.

No Format
edg-job-get-output --dir `pwd` -i slic.jid

gLite User Guide
Virtual Data Toolkit (VDT)
CERN AFS UI Setup - setup instructions on lxplus@cern