
This page describes the processes and scripts that transfer image and metadata from the TEMs.

Everything is orchestrated by an Apache Airflow instance; its DAGs provide both task management and monitoring of the workflows.

The Airflow instance currently runs on cryoem-airflow.slac.stanford.edu. Specifically, it runs as a Docker Compose stack (mainly because I couldn't get the Airflow web server working under Docker Swarm). Local modifications to the Airflow Dockerfile enable the file permissions coming from the TEMs to be maintained (although it's hacky due to the security issues with Docker).

In particular, the CIFS file shares from the TEM cameras are mounted on cryoem-airflow.slac.stanford.edu and then volume-bound onto the Airflow workers. The host also mounts an NFS share for the long-term storage (GPFS).
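For illustration, the relevant part of the compose file looks roughly like the sketch below. The service name and mount points are assumptions, not the actual stack configuration; the real compose file is in the GitHub repository mentioned below.

```yaml
# Illustrative docker-compose excerpt; service names and paths are assumptions.
services:
  airflow-worker:
    image: cryoem-airflow-worker:latest   # built from the locally modified Airflow Dockerfile
    volumes:
      - /srv/cifs/tem1:/srv/cifs/tem1     # CIFS share from a TEM camera, mounted on the host
      - /gpfs/cryoem:/gpfs/cryoem         # long-term storage (GPFS), mounted on the host via NFS
```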

The Airflow stack is kept under revision control on GitHub.

 

Airflow defines workflows in DAGs. These are coded in Python and provide dependency graphs between tasks. The following table describes the function of each DAG.

DAG            | Purpose
*_file-drop.py | Reads in a tem?-experiment.yaml file to determine where to copy files to (NFS) and from where (the CIFS share). It will then (currently) rsync the files and finally delete all MRC files older than 9 hours and larger than 100 MB.
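The layout of the experiment file is defined by the DAG code in the GitHub repository; the snippet below is purely a hypothetical illustration of the kind of information it carries, and the field names are assumptions.

```yaml
# Hypothetical tem?-experiment.yaml; all field names and paths are illustrative only.
experiment: 20180101-ab123
source: /srv/cifs/tem1/          # CIFS share exported by the TEM camera
destination: /gpfs/cryoem/exp/   # long-term storage on GPFS (via NFS)
```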
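For orientation, a minimal sketch of how such a file-drop workflow could be expressed as an Airflow DAG is shown below. This is not the production DAG (that lives in the GitHub repository); the DAG id, task names, paths, and schedule are assumptions.

```python
# Minimal sketch of a *_file-drop style DAG; all names, paths and intervals are assumptions.
from datetime import datetime, timedelta

import yaml
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator


def read_experiment_config():
    """Parse the (hypothetical) tem1-experiment.yaml to find the copy source and destination."""
    with open('/srv/cifs/tem1/tem1-experiment.yaml') as f:
        return yaml.safe_load(f)


with DAG('tem1_file-drop',
         start_date=datetime(2018, 1, 1),
         schedule_interval=timedelta(minutes=10),
         catchup=False) as dag:

    # read the experiment file dropped onto the CIFS share
    parse_config = PythonOperator(
        task_id='parse_experiment_config',
        python_callable=read_experiment_config,
    )

    # rsync the camera output from the CIFS mount to long-term storage (GPFS via NFS)
    rsync_files = BashOperator(
        task_id='rsync_files',
        bash_command='rsync -av /srv/cifs/tem1/ /gpfs/cryoem/exp/',
    )

    # free space on the camera share: remove MRC files older than 9 hours and larger than 100 MB
    cleanup = BashOperator(
        task_id='delete_old_mrc',
        bash_command='find /srv/cifs/tem1/ -name "*.mrc" -mmin +540 -size +100M -delete',
    )

    # dependency graph: parse the experiment file, copy the data, then clean up
    parse_config >> rsync_files >> cleanup
```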