ILC VO
First, you need to obtain membership in the ilc Virtual Organization (VO), which is sponsored by DESY and lives on the LCG grid.
Follow the workflow at https://grid-voms.desy.de:8443/vo/ilc/vomrs which will require several rounds of confirmations and emails.
Somewhat confusingly, there is another ILC VO, run by Fermilab and located at https://voms.fnal.gov:8443/vomrs/ilc/vomrs, which is on the OSG grid. The two Virtual Organizations are not interchangeable, so make sure you get a membership in the DESY one to follow the instructions in this tutorial!
Setup
The rest of the tutorial assumes that you are running from an lxplus node at CERN and using the bash shell.
This script will set up the LCG grid tools at CERN.
source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh
There is also a script on DESY AFS.
source /afs/desy.de/group/grid/UI/GLITE-pro/etc/profile.d/grid_env.sh
The DESY AFS script seems to work best; I have had trouble running the edg-* commands with the CERN setup script.
VOMS Server
It is my understanding that before you do anything else, you need to authenticate with the VOMS (Virtual Organization Membership Service) server for the ilc VO, which creates a grid proxy.
voms-proxy-init -verify -debug -voms ilc
This needs to be done at the start of every session.
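The proxy is only valid for a limited time (12 hours by default), so a session script may want to renew it only when it is close to expiring. A minimal sketch, assuming the standard voms-proxy-info -timeleft option, which prints the remaining lifetime in seconds; the one-hour threshold is an arbitrary choice of mine:

```shell
# Re-create the VOMS proxy only when little lifetime remains.
needs_renewal() {
    # $1 = seconds of proxy lifetime remaining
    [ "${1:-0}" -lt 3600 ]
}

# If no proxy exists (or the tools are unavailable), treat the lifetime as 0.
left=$(voms-proxy-info -timeleft 2>/dev/null || echo 0)
if needs_renewal "$left"; then
    voms-proxy-init -voms ilc || echo "proxy renewal failed" >&2
fi
```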
If you get the message "VOMS Server for ilc not known!" when trying to run this command, or any other command, then the information about the ILC VOMS server is missing and needs to be installed at your site.
The ilc VOMS file can be found at http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de and should be added to the list of VOMS configuration data kept in ~/.glite/vomses, or the site's grid administrator can add this information to a central config file.
These commands will download the VOMS file and add the information to the user's config file.
cd ~
mkdir .glite                                              # only needed if .glite does not exist already
cd .glite
wget http://grid.desy.de/etc/vomses/ilc-grid-voms.desy.de # fetch the VOMS file
touch vomses
cat ilc-grid-voms.desy.de >> vomses                       # add the ILC VOMS entry
Now the voms-proxy-init command should work correctly.
voms-proxy-init -verify -debug -voms ilc
Test Commands
We will use the gLite UI, which provides the edg-* commands for job submission, monitoring, and output retrieval.
If you are not running from DESY, gLite needs to be told about the ILC VO servers.
In another shell on lxplus, execute the following commands.
source /afs/desy.de/group/grid/UI/GLITE-pro/etc/profile.d/grid_env.sh
cp $EDG_LOCATION/etc/ilc/edg_wl_ui.conf ~
The file ~/edg_wl_ui.conf is then passed via --config-vo in place of the "--vo ilc" option used in the test commands at http://grid.desy.de/ilc/.
Now follow the instructions for submitting a test job at http://grid.desy.de/ilc/ under "Submitting jobs to the Grid".
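The DESY page supplies the JDL for the test job; if you need to write test.jdl yourself, a minimal file along these lines should work (the /bin/hostname executable here is an illustrative assumption on my part, not taken from the DESY instructions):

```
Executable = "/bin/hostname";
StdOutput = "out";
StdError = "err";
OutputSandbox = {"out","err"};
```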
edg-job-list-match --config-vo ./edg_wl_ui.conf test.jdl
Run a test job.
edg-job-submit --nogui --config-vo ./edg_wl_ui.conf -o test.jid test.jdl
Test job status.
edg-job-status -i test.jid
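For batch-style workflows you may want to wait until the job is finished before fetching output. A sketch that polls the status periodically, assuming the edg-job-status output contains the word "Done" once the job completes (the wait_for_done function and its polling parameters are my own, not part of gLite):

```shell
# Poll edg-job-status until the job reports Done, or give up.
wait_for_done() {
    # $1 = job-id file, $2 = max polls, $3 = seconds between polls
    n=0
    while [ "$n" -lt "$2" ]; do
        if edg-job-status -i "$1" 2>/dev/null | grep -q "Done"; then
            return 0    # job finished
        fi
        n=$((n + 1))
        sleep "$3"
    done
    return 1            # timed out
}

# usage: wait_for_done test.jid 20 60   # up to 20 polls, one per minute
```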
Get the output.
edg-job-get-output -i test.jid
List files on the mass storage system at DESY.
export LFC_HOST=`lcg-infosites --vo ilc lfc`
lfc-ls -l /grid
Running SLIC
Create a shell script slic.sh.
#!/bin/sh
wget http://www.lcsim.org/dist/slic/slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
tar xzf ./slic_v2r3p0_geant4_v9r0_i686_linux-gnu.tar.gz
wget http://www.lcsim.org/detectors/acme0605/acme0605.lcdd
printenv
./SimDist/scripts/slic.sh -g ./acme0605.lcdd -r 1
ls *.slcio
Now make a file slic.jdl.
Executable = "slic.sh";
StdOutput = "out";
StdError = "err";
InputSandbox = {"slic.sh"};
OutputSandbox = {"out","err","outfile.slcio"};
Submit the SLIC test job.
edg-job-submit --nogui --config-vo ./edg_wl_ui.conf -o slic.jid slic.jdl
Check the job status.
edg-job-status -i slic.jid
Retrieve the output.
edg-job-get-output -i slic.jid