First, the event builder, event recorder and ET ring can be started by issuing the following commands from a terminal:
Open a new terminal and run:

$ cd /u1/coda_rth
$ ./start_coda_xterm.csh
This will open a dedicated (blue) xterm for each of the processes above. It will also open a terminal with a connection to the TI. This will require a password which can be retrieved from one of the SVT DAQ experts.
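For orientation, launching one dedicated xterm per process can be sketched as below. The process names (et_start, coda_eb, coda_er) and the blue background are illustrative assumptions, not the actual contents of start_coda_xterm.csh; echo is used so the commands are printed rather than executed.

```shell
# Hypothetical sketch of a per-process xterm launcher; the real
# start_coda_xterm.csh may differ. 'echo' prints each command
# instead of running it.
for proc in et_start coda_eb coda_er; do
    echo xterm -bg blue -title "$proc" -e "$proc"
done
```

Dropping the echo would actually spawn the xterms (backgrounded with `&`).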
Info: Running the script start_coda_xterm.csh is equivalent to running the following commands with some additional options:

...
Once the event builder, event recorder and ET ring have started up successfully, the run control can be started as follows:
$ sudo tcsh
$ su -l **** (DAQ expert user name)
$ cd /u1/coda_rth/
$ ./start_coda_xterm_rc.csh
Info: Running the script start_coda_xterm_rc.csh is equivalent to running the following command with some additional options:

...
Warning: When running with Sergey's binaries, the symbolic link /usr/local needs to point to /usr/new_local. This symbolic link is destroyed every night by Taylor, so it needs to be set again once daily.
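Repointing the symlink is a one-liner with `ln -sfn`; on the DAQ machine this is `sudo ln -sfn /usr/new_local /usr/local`. The sketch below demonstrates the same idempotent pattern on throwaway temp paths so it can be tried safely:

```shell
# Demonstrate idempotent symlink repointing on temp paths.
# On the DAQ host this would be: sudo ln -sfn /usr/new_local /usr/local
tmp=$(mktemp -d)
mkdir "$tmp/new_local"
# -s symbolic, -f replace an existing link, -n don't descend into it
ln -sfn "$tmp/new_local" "$tmp/local"
readlink "$tmp/local"   # prints the link target
```

Because of `-fn`, rerunning the same command daily is safe whether or not the link already exists.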
Connect to DTM/DPM and start ROCs.
There is a script that pops up all windows in xterms:
Open new terminal
If running Sergey's binaries, first make sure that /usr/local is a symbolic link pointing to /usr/local_new. This may need to be updated every day (it gets wiped by Taylor).
$ ./start_eb
Open new terminal
$ cd /u1/coda_rth
$ ./start_rce.csh
$ cd /u1/daq
$ source setup_env.csh
$ ./rceScripts/connect_host <slot> <rce> <id> <?> <?> (FIX NAMES OF ID's)
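The per-board connects can be scripted in a loop. The sketch below only prints the commands (drop the echo to run them) and assumes connect_host takes slot, rce, and id in that order, with the slot/rce/id values taken from the host table at the bottom of this page:

```shell
# Print one connect_host command per DPM entry (slot rce id).
# Assumption: argument order matches the table at the end of this page.
while read -r slot rce id; do
    echo ./rceScripts/connect_host "$slot" "$rce" "$id"
done <<'EOF'
1 0 0
1 0 2
1 1 0
1 1 2
1 2 0
1 2 2
1 3 0
1 3 2
EOF
```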
Start control server on DPM7 if not already running
$ /mnt/host/coda/run_roc
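Before starting another copy, one can check whether a run_roc process is already up. The pgrep pattern below assumes the server's process name matches the binary name, which is a guess; run it on dpm7 itself, or prefix it with `ssh dpm7` from the workstation:

```shell
# Check for an existing run_roc process (process name is an assumption).
if pgrep -x run_roc >/dev/null; then
    echo "run_roc already running"
else
    echo "run_roc not running"
fi
```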
Start rce GUI if not running:
$ cd /u1/daq
$ source setup_env.csh
$ ./rceScripts/connect_host 1 3 2
$ ./rceScripts/start_gui.csh

If it fails to connect, check that the server is running on dpm7 and restart it if needed:
$ ssh dpm7
$ cd /mnt/host/daq
$ ./rceScripts/start_server.sh
Click 'Connect' on the RC GUI. Click 'Configure', then select 'rces' as the run type. Click 'Config' and select '/mnt/host/coda/svtrce0.cnf'. Click 'OK'.
At the moment, during the 'Configure' stage, the configuration file svtrce0.cnf is being overwritten. In order for things to work properly, the overwritten file needs to be replaced with a backup after the configuration has taken place. To do this, issue the following command in a terminal:
$ sudo cp /u1/cob_nfs/host/coda/svtrce0.cnf.save /u1/cob_nfs/host/coda/svtrce0.cnf
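The restore can be made conditional so the backup is only copied back when Configure actually clobbered the file. restore_cfg below is a hypothetical helper, demonstrated on temp files; on the DAQ machine the cp would need sudo and the /u1/cob_nfs/host/coda paths above:

```shell
# restore_cfg FILE: copy FILE.save over FILE only if the two differ.
restore_cfg() {
    cmp -s "$1.save" "$1" || cp "$1.save" "$1"
}

# Safe demonstration on temp files standing in for svtrce0.cnf:
tmp=$(mktemp -d)
printf 'good config\n' > "$tmp/svtrce0.cnf.save"
printf 'overwritten\n' > "$tmp/svtrce0.cnf"
restore_cfg "$tmp/svtrce0.cnf"
cat "$tmp/svtrce0.cnf"   # now matches the backup
```

A second call is a no-op, so the helper can be run after every Configure without harm.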
Click 'Download'. 'Prestart' button appears if things worked.
Go to TI and Reset (but don't start sending triggers)
Click 'Prestart'.
Click 'Go'.
Go to TI and start sending triggers.
$ cd /u1/daq && source setup_env.csh
$ ./rceScripts/connect_ti.csh (pw ask Pelle/Ryan/Ben)
$ cd ti/test
$ ./tiLibTest
TBI
Host | slot | rce? | id?
---|---|---|---
dtm | 4 | 1 | 0
dpm0 | 1 | 0 | 0
dpm1 | 1 | 0 | 2
dpm2 | 1 | 1 | 0
dpm3 | 1 | 1 | 2
dpm4 | 1 | 2 | 0
dpm5 | 1 | 2 | 2
dpm6 | 1 | 3 | 0
dpm7 | 1 | 3 | 2

Fragment of the connect command (incomplete):

`$ATCA_IP $ATCA_SHELF/1/4/0 --ifname $ATCA_IFNAME` $port \
...
`$ATCA_IP $ATCA_SHELF/1/1/0 --ifname $ATCA_IFNAME` \
`$ATCA_IP $ATCA_SHELF/1/1/2 --ifname $ATCA_IFNAME` \
`$ATCA_IP $ATCA_SHELF/1/2/0 --ifname $ATCA_IFNAME` \
...
Send software-generated triggers:
$ cd /u1/daq && source setup_env.csh
$ ./rceScripts/connect_ti.csh (pw ask Pelle/Ryan/Ben)
$ cd /home/daq/linuxvme/tid/src
$ ./tidInt_test
Change the trigger rate by modifying the tidSoftTrig(0xffff,0xFFFF,1); call in tidInt_test.c.
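The edit can be scripted with sed instead of an editor. Which tidSoftTrig argument controls the rate is not stated here, so the replacement value 2 is purely illustrative, and the sketch works on a throwaway copy of the line rather than the real source file:

```shell
# Demonstrate the sed edit on a temp copy of the line in question.
tmp=$(mktemp -d)
printf 'tidSoftTrig(0xffff,0xFFFF,1);\n' > "$tmp/tidInt_test.c"
# Replace the last argument (illustrative value only):
sed -i 's/tidSoftTrig(0xffff,0xFFFF,1)/tidSoftTrig(0xffff,0xFFFF,2)/' "$tmp/tidInt_test.c"
cat "$tmp/tidInt_test.c"
```

Remember to rebuild tidInt_test after editing the real source.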