Please note: this page is a work in progress. Viewing has been restricted to staff with editing privileges; the restriction should be removed when the page is ready for public viewing.
The LCLS provides a number of tools that allow users and staff to participate remotely in experiments while access to the site is restricted due to COVID-19 safety protocols. These were demonstrated in the LCLS User Town Hall held on July 23, 2020, and are available at this link. This page gives an overview of tools for external (including users) and internal (staff only) use, with links to pages covering installation and usage guides as well as minimum system requirements.
The LCLS provides a number of internally developed and third-party tools for sharing information during an experiment.
- Description, Stanford site license, passwords, distribution (via eLog or email), live plots of AMI (camera images, histograms, etc.)
- Link to download and install
- Experiment Survey
Internally, we are using or developing additional remote access tools. They are described here for information purposes. If you believe your experiment would benefit from the use of these tools, contact your LCLS experiment Point of Contact.
Stanford University maintains an enterprise license for the Slack chat application. Guests from other facilities can be added to specific channels.
To support remote operations, a dedicated NoMachine Terminal Server has been deployed: psnxopr.slac.stanford.edu.
In addition, NoMachine Enterprise Desktop will be installed on all DAQ workstations.
The following link provides guidelines to configure NoMachine Client: Remote Visualization.
To access the DAQ workstation, use the following steps:
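As a rough sketch, the access path runs through the terminal server named above. The commands below are an illustration only, not the official procedure; the DAQ workstation hostname is a hypothetical placeholder, so confirm the actual steps and hostname with your LCLS Point of Contact.

```shell
# Sketch: reach a DAQ workstation via the NoMachine terminal server.
# "daq-hutch-console" is a hypothetical example hostname.

# 1. In the NoMachine client, create a connection to the terminal server:
#       Host: psnxopr.slac.stanford.edu   (NX protocol, default port 4000)
# 2. Log in with your SLAC UNIX credentials when prompted.
# 3. From a terminal inside that desktop session, hop to the DAQ
#    workstation for your hutch, with X11 forwarding for GUI tools:
ssh -Y daq-hutch-console   # hypothetical hostname
```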
LCLS has acquired a number of augmented reality headsets to allow staff technicians and engineers to co-view a workspace while respecting social-distancing protocols.
Hardware: https://www.realwear.com/products/hmt-1/
Software: https://www.amaxperteye.com/
We are using Space1 as the software platform for communicating with the AR headset.
Log in here to call the headset and use the communication tools:
- Operator login: Help1@slac.stanford.edu / Help1!
- Headset login (Virtual collaboration – Space1): User3@slac.stanford.edu / User3!
Within the call you can:
We were able to establish calls from psconsole to the headset over eduroam Wi-Fi in the FEH.
Demo video recorded from psconsole screen (no sound):
We have purchased two remote presence robots (https://www.doublerobotics.com/double3.html) to assist with remote viewing and debugging of instrumentation at beam height, where fixed overhead web cameras may have difficulty seeing while an area is locked during beam delivery. At present these
One of the units is enabled in developer mode, which allows development work in the base Unix environment the robot runs on, as well as full access to the programming of the robot's drive and detectors.
Potential development features include: