...

Check the TI crate

No Format
tcpClient slac1 tiStatus

Installing Rogue

Instructions on how to install Rogue with conda can be found here:

...

No Format
bravo@clonfarm2:/usr/clas12/release/1.4.0/slac_svt$ cat copy_libraries.sh
#!/bin/bash
rm -rf /usr/clas12/release/1.4.0/coda/Linux_x86_64/lib/librogue*
rm -rf /usr/clas12/release/1.4.0/coda/Linux_x86_64/lib/libhps*
\cp -a -v /usr/clas12/release/1.4.0/slac_svt/rogue_lite/lib/* /usr/clas12/release/1.4.0/coda/Linux_x86_64/lib/
\cp -a -v /usr/clas12/release/1.4.0/slac_svt/heavy-photon-daq/software/rogue_coda/install/x86_64-linux/lib/* /usr/clas12/release/1.4.0/coda/Linux_x86_64/lib/
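
This script is presumably rerun whenever the Rogue or HPS DAQ libraries are rebuilt (that purpose is inferred from its contents rather than stated anywhere). Assuming suitable write permissions on the release area, it can be run in place:

No Format
cd /usr/clas12/release/1.4.0/slac_svt
bash copy_libraries.sh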

Setting up a VNC for SVT DAQ work - DEPRECATED

The SVT DAQ system requires running a GUI for configuration and data taking, so setting up a VNC can significantly speed up the work.
It is suggested to set up a VNC server on rdsrv305, as that machine is directly accessible over the internet.
General information on how to set up a VNC is given here:
VNC on Unix

...

You should be able to connect to the VNC server via an SSH tunnel at localhost:9999.
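
A minimal sketch of such a tunnel, assuming the VNC desktop on rdsrv305 is :1 (port 5901) and that the full hostname follows the same pattern as rdsrv309 below; adjust both as needed:

No Format
ssh -L 9999:localhost:5901 -C -N -l <user> rdsrv305.slac.stanford.edu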


Setting up a VNC for CODA Running and sharing work

The server machine rdsrv309 has a VNC server running as a service on display :2 (port 5902).
It runs under the clasrun user and is bound to localhost. In order to access it remotely, an SSH tunnel needs to be set up on the client machine:

No Format
ssh -L <localClientPort>:localhost:5902 -C -N -l clasrun rdsrv309.slac.stanford.edu

where <localClientPort> can be any port not already in use on the client side. To reach the VNC server, connect to localhost:<localClientPort> after opening the SSH tunnel; a concrete example is given below.
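
For example, picking 5910 as the (arbitrary) local port:

No Format
ssh -L 5910:localhost:5902 -C -N -l clasrun rdsrv309.slac.stanford.edu

Then point your VNC viewer at localhost:5910.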

Basic SVT DAQ instructions for the test bed


Before starting work, make sure that the heatsink of the FEB is facing upward and that the fan is pointing at it. If the FEB overheats it will reset itself (you will see all the LEDs halfway up).

Start working/developing:

  • Connect to the rdusr219 machine
  • Set up the general environment:

    No Format
    source /u1/hps/setup_env.sh
  • Start the conda environment:

    No Format
    conda activate rogue_5_8_0
  • Run the software from the scripts folder:

    No Format
    cd /u1/hps/server/heavy-photon-daq/software/scripts
    python SvtDaqGui.py --local --env SLAC21BOT [--epicsEn]

    The --env option tells the GUI which network configuration to load. The available network configurations are stored in python/hps/constants.py.
    Check the FebLinkStatus. In the RCE test you should see FebFpga[0]; check its Link[0] state. If it is False, read the variables again (click Read). If it then reads True, we are talking to the FEB. Once the link is established, the configuration can be loaded.


  • Load the system configuration (back end: the ATCA (Advanced Telecommunications Computing Architecture) crate; front end: FEB + hybrids + APVs). Click on the HpsSvtDaqRoot tab and load the settings from

    No Format
    /u1/hps/daq/heavy-photon-daq/software/config/rce-test.yml

    There is a copy of this configuration file in /u1/hps/server/heavy-photon-daq/software/config/
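
For convenience, here is the full startup sequence from the list above collected in one place (the ssh step assumes your usual access to rdusr219; the optional --epicsEn flag is omitted):

No Format
ssh <user>@rdusr219
source /u1/hps/setup_env.sh
conda activate rogue_5_8_0
cd /u1/hps/server/heavy-photon-daq/software/scripts
python SvtDaqGui.py --local --env SLAC21BOT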


It might be unclear which link channel goes from the RTM to the FEB(s). In the test area, where a single FEB is attached to the RTM, one can put the full list of 12 RTM channels into the link map by giving the full list of tuples.
Each index of the list is the FEB address in the FEB array, and in each tuple (X,Y), X is the COB number (0 or 1) and Y is the link number (0-11).
Example: provide

No Format
[(0,1),(0,2),...]

to activate all the links, then check which one reads back True when reading the variables (an illustrative full map is sketched below). The channel mapping between the flange boards and the RTM was performed on 8 April 2021 and is summarised here:
Flange Board to RTM Slow Control Channel Mapping
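
As an illustration of the convention above (and only as an illustration: check the actual numbering against the mapping document), a link map that assigns one dummy FEB slot to each of the 12 links of COB 0 would look like:

No Format
[(0,0),(0,1),(0,2),(0,3),(0,4),(0,5),(0,6),(0,7),(0,8),(0,9),(0,10),(0,11)]

Whichever FebFpga index then reports its link as True tells you which RTM channel the FEB is actually cabled to.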


Turning on the Hybrids

In the Variables tab, navigate to FebArray→FebCore→FebConfig and enable HybridPwrEn for hybrids 0 and 1 (the power supply will not be able to sustain 3 hybrids, but should be OK for 2).
After turning on the hybrids, load the configuration again to be sure you are sending the right config to them.

Sending reset commands to the Hybrids

Navigate to the "Commands" tab and, in PcieTiDtmArray, click the following sequence:

1) ApvClkAlign   (sets all the APVs to the same phase)
2) ApvReset101   (resets the APVs and starts waiting for a trigger signal)


Start local SVT data taking

With the hybrids configured and synced:

1) Go to the HpsSvtDaqRoot tab→Browse and set the name of the output file (the output name will be set with the date and time automatically).
2) Click on Open. You will see File Open → True when ready.
3) Set the Run Rate to something reasonable (10 Hz should be OK).
4) Click on Run State and select "Running". When running successfully you will see Run Count going up.
5) When done with local data taking, first set Run State to "Stopped", then close the file.


Configuration files used to configure Rogue
The 2019 CODA (the central DAQ software at JLab) configuration files for the HPS DAQ are stored on the JLab machines at

No Format
/usr/clas12/release/1.4.0/parms/trigger/HPS/Run2019/

The general DAQ configuration files usually end in .trg; the relevant block for the SVT is the #SVT Config block in one of those files.
That block should point to something like ..../svt/svt_config.cnf, which is where the SVT configuration is specified.

To access this information, one needs to ssh to the clonfarm machines. In particular, clonfarm2 and clonfarm3 are the SVT DAQ machines.
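
For example, to inspect these files from one of the SVT DAQ machines (the hostname suffix and the grep pattern are assumptions for illustration):

No Format
ssh <user>@clonfarm2.jlab.org
grep -n -A 5 "SVT Config" /usr/clas12/release/1.4.0/parms/trigger/HPS/Run2019/*.trg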


Running the SVT DAQ using CODA

A setup is available to run the SVT DAQ using CODA. As of 25 June, the following configuration has been tested and is working:

  • Running using rdsrv309 only
  • 1 FEB ON but Hybrids OFF
  • Thresholds are not loaded, as the DAQ mapping still needs to be taken into account
  • Random trigger rate at 5 kHz

Here are the instructions on how to set up CODA:

  • ssh as clasrun to rdsrv309
  • The default shell for clasrun is tcsh. In order to start Rogue one has to change to /bin/bash, then set up the conda environment as shown in "Basic SVT DAQ instructions for the test bed":

    No Format
    /bin/bash
    source /u1/hps/setup_env.sh
    conda activate rogue_5_8_0
    cd /u1/hps/server/heavy-photon-daq/software/scripts
    
  • Run the Rogue instance for data taking. The EPICS connection has not been tested yet, so it is kept off for the moment. In order for the CODA system to run, both COBs need to be enabled, so use the full configuration even if only a single FEB is attached. If the second COB is not populated, the system is smart enough to figure that out.
  • No Format
    python SvtCodaRun.py --local --env SLAC21
    
  • This command brings up the Rogue GUI and starts the Rogue server.
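
Putting the Rogue-side steps together (all commands exactly as given above):

No Format
/bin/bash
source /u1/hps/setup_env.sh
conda activate rogue_5_8_0
cd /u1/hps/server/heavy-photon-daq/software/scripts
python SvtCodaRun.py --local --env SLAC21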


Open a new terminal as clasrun. To start CODA it is sufficient to type

No Format
runcontrol -rocs


This will bring up the run-control GUI, which will guide you through the various run transitions.
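
For reference, the standard CODA run-control sequence is roughly the following (the exact button labels can differ between runcontrol versions):

No Format
Configure -> Download -> Prestart -> Go -> (data taking) -> End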


DEBUG:

Connect via minicom to the crate controller:

...