The Airflow instance currently runs on cryoem-airflow.slac.stanford.edu. Specifically, it runs as a docker compose stack (mainly because I couldn't get the Airflow web server working under docker swarm). Local modifications to the airflow Dockerfile allow file permissions from the TEMs to be maintained (although this is hacky due to the security issues with docker).

In particular, the CIFS file shares from the TEM cameras are mounted on cryoem-airflow.slac.stanford.edu and then volume-bound onto the airflow workers. The host also mounts an NFS share for long-term storage (GPFS).
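As a rough illustration only (the real definitions live in the docker-compose.yml under revision control; the service name and paths here are assumptions, not the actual values), the volume binds look something like:

```yaml
# Hypothetical docker-compose.yml excerpt: bind the host-mounted CIFS and
# NFS/GPFS shares into the worker containers.
services:
  airflow-worker:
    build: .                               # local Dockerfile with the permission tweaks
    volumes:
      - /mnt/cifs/tem1:/srv/cryoem/tem1    # CIFS share from a TEM camera
      - /gpfs/slac/cryoem:/srv/cryoem/gpfs # NFS-mounted GPFS long-term storage
```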

The airflow stack is kept under revision control on GitHub.

 

Airflow defines workflows as DAGs (directed acyclic graphs). These are coded in Python and express the dependency graph between tasks, as in the sketch below.
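For readers new to Airflow, here is a minimal, self-contained DAG sketch; the dag_id, task names, and commands are illustrative only (not taken from the actual repository), and the imports assume Airflow 2.x:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_dag",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,  # run on demand rather than on a schedule
    catchup=False,
) as dag:
    copy_files = BashOperator(
        task_id="copy_files",
        bash_command="echo 'rsync the data'",
    )
    clean_up = BashOperator(
        task_id="clean_up",
        bash_command="echo 'delete stale files'",
    )
    # the >> operator declares the dependency: clean_up runs after copy_files
    copy_files >> clean_up
```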

The following table describes the function of each DAG.

DAG            | Purpose
*_file-drop.py | Reads in a {{tem?-experiment.yaml}} file to determine where to copy files to (NFS) and from where (the CIFS share). It then (currently) rsyncs the files and finally deletes all mrc files older than 9 hours and larger than 100MB (see the sketch below).
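A rough Python sketch of that file-drop step, assuming a hypothetical experiment-yaml layout with top-level source and destination keys (the real DAG code in the repository may differ):

```python
# Hypothetical file-drop logic: copy from the CIFS share to NFS, then
# purge large, stale mrc files. Requires PyYAML (pip install pyyaml).
import subprocess
import time
from pathlib import Path

import yaml

MAX_AGE_SECS = 9 * 3600          # "older than 9 hours"
MIN_SIZE_BYTES = 100 * 1024**2   # "greater than 100MB"


def file_drop(experiment_yaml: str) -> None:
    # the tem?-experiment.yaml names the CIFS source and NFS destination
    config = yaml.safe_load(Path(experiment_yaml).read_text())
    src, dest = config["source"], config["destination"]

    # copy new data from the camera share to long-term storage
    subprocess.run(["rsync", "-a", f"{src}/", f"{dest}/"], check=True)

    # delete mrc files on the share that are both old and large
    cutoff = time.time() - MAX_AGE_SECS
    for mrc in Path(src).rglob("*.mrc"):
        info = mrc.stat()
        if info.st_mtime < cutoff and info.st_size > MIN_SIZE_BYTES:
            mrc.unlink()
```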