LCLS1

Information from Valerio on Jan. 27, 2021.

...

Update build numbers for all of the above for every psana1 release.

Checklist for building psana1 environments from feedstocks

As psreldev on psbuild-rhel7:

...

  • source /cds/sw/ds/ana/conda1/inst/etc/profile.d/conda.sh
  • cd /cds/sw/ds/ana/conda1/manage/jenkins/
  • conda env create -n ana-4.0.11 --file ana-env-py2.yaml
  • conda env create -n ana-4.0.11-py3 --file ana-env-py3.yaml

New Pinned-Approach

Valerio writes: It's not different from the psana2 approach at all.  First you need to package the source in a tar.gz like we always did, as shown in the "Checklist for building psana1 environments from feedstocks" section above.  The py2 and py3 psana recipes now contain the pinning file (conda_build_config.yaml). You just need to update the meta.yaml file, run "conda build recipe" (this was done as user "cpo", but could also be done as psreldev, although psreldev might need to run "anaconda login" as another user so that "anaconda upload" will work), upload the package, and build the environment from the "pinned" YAML file.
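
A minimal sketch of that sequence (the package filename, version numbers, and pinned YAML file name below are illustrative placeholders):

Code Block
# build the package from the recipe; pinnings come from conda_build_config.yaml
conda build recipe
# log in if needed, then upload the built package to the lcls-ii channel
anaconda login
anaconda upload -u lcls-ii psana-4.0.44-py37_0.tar.bz2
# build the environment from the pinned YAML file
conda env create -n ana-4.0.44 --file pinned-env.yaml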

...

Code Block
source /reg/g/psdm/etc/psconda.sh -v2  # -v2 selects the experimental mamba solver
conda env create --name ana-4.0.44-py3 --experimental-solver=libmamba --file=/cds/sw/ds/ana/conda1/manage/jenkins/ana-env-py3.yaml 
conda env create --name ana-4.0.44 --experimental-solver=libmamba --file=/cds/sw/ds/ana/conda1/manage/jenkins/ana-env-py2.yaml

LCLS2 Github Actions Approach (Deprecated)

Currently deprecated, since conda-forge changes versions more rapidly than we would like; we use the "Pinned Approach" described below instead.

...

conda env create --name ps-N.Neven.N --file prod_create.yaml (prod env)

Automated feedstock environment building

  1. GitHub Access Token


    A new set of python scripts has been developed to automate the creation of LCLS-II conda environments. They can be found here:
    /cds/sw/ds/ana/conda2/manage/buildenv

    The scripts need a valid personal GitHub token (unfortunately, GitHub caps the number of API requests that can be made without a token). The token can be created on GitHub, after logging in:

    https://github.com/settings/tokens

    The token must be exported and made available as an environment variable called GITHUB_ACCESS_TOKEN.  Technically it is only needed by step (4) below, but the way the scripts are written it should be set for all steps.

    For an example, please see ~valmar's bashrc file (also in psrel bashrc).
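
    For instance, the bashrc entry might look like this (the token value is a placeholder):

    Code Block
    export GITHUB_ACCESS_TOKEN="ghp_xxxxxxxxxxxxxxxxxxxx"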

  2. Package Version File


    (this file editing should be done as user "psrel")

    In order to generate packages, a YAML file listing the required version of each package must be created. For a file with the package versions in the latest environments, see:
    /cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    A few lines as an example:

    Code Block
    ami: 2.4.7
    amityping: 1.1.7
    cameralink-gateway: 7.6.2
    epix: 0.0.3
    lcls2-pgp-pcie-apps: 2.2.0
    lcls2_timetool: 3.3.0
    epix-hr-single-10k: 3.1.1
    lcls2-epix-hr-pcie: 1.2.0
    ....
  3. Preparing source tarballs


    (this step should be run as a normal user like valmar, cpo,...)

    Before building the feedstocks, source tarballs must be created for the rogue-related packages (which live in private repos, so source .tgz files need to be generated) by running the prepare_source.py script.

    The script must be run using the conda_build environment (conda activate conda_build). It must also be run as a normal user, because psrel cannot write to /reg/g/psdm/web/swdoc/tutorials/.
    The script takes the package version file as an argument:
    python /cds/sw/ds/ana/conda2/manage/buildenv/prepare_source.py --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    The source tarballs are automatically generated and copied to /reg/g/psdm/web/swdoc/tutorials/

    PS: The script will clone the required repositories into the current working directory! Working in a temporary directory that can later be deleted is strongly advised, as in the example below.
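
    For example (the scratch path here is an arbitrary placeholder):

    Code Block
    # work in a throwaway directory, since the script clones repos into the CWD
    mkdir -p /tmp/prepare-source-scratch && cd /tmp/prepare-source-scratch
    conda activate conda_build
    python /cds/sw/ds/ana/conda2/manage/buildenv/prepare_source.py \
        --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml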

  4. Generating packages


    (this step should be run as a normal user like valmar, cpo,...)

    The packages can now be built using the build environment script, again in the conda_build environment (conda activate conda_build):
    python /cds/sw/ds/ana/conda2/manage/buildenv/build_environment.py --generate-packages --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    The script will build the packages wave by wave. For each wave, the script will clone the feedstocks one by one, automatically make the necessary changes to the recipes, run "conda smithy", and push the changes to the git repositories, triggering the building of the packages.  The individual builds can be seen at URLs like https://github.com/slac-lcls/libnl3-feedstock/actions

    The script will then check with GitHub every thirty seconds and report the status of the build process ("Not started yet", "running", "success", or "failed"). It will wait until all builds have finished and are either in a "success" or "failed" state. If no build has failed, the script will then proceed to the next package "wave". Otherwise it will exit.

    Instead of going through all the waves, one can start from a certain wave (using the --start-from-wave option) and/or stop at a certain wave (using the --stop-at-wave option), as in the example below.

    PS: The script will clone the required repositories into the current working directory! Working in a temporary directory that can later be deleted is strongly advised.
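
    For example, to rerun only a range of waves (the wave numbers and exact option syntax here are illustrative):

    Code Block
    conda activate conda_build
    python /cds/sw/ds/ana/conda2/manage/buildenv/build_environment.py \
        --generate-packages \
        --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml \
        --start-from-wave=2 --stop-at-wave=3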

  5. Building environments


    (this step should be run as user psrel)

    The production and development environments can be created using the normal "conda env create" commands (see above), or using the build_environment script:
    (from /cds/sw/ds/ana/conda2/manage directory)
    (NOTE: this command produced the env in a non-standard location; cpo used "conda env create --name ps-4.4.1 -f prod_create.yaml" instead)
    python buildenv/build_environment.py --build-environment --environment-name=ps-4.4.0 --environment-file=prod_create.yaml

    The environments must be created as the psrel user, because only the psrel user can write to the environment directories.

    This step and the previous one can be combined in a single command for convenience:
    python build_environment.py --generate-packages --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml --build-environment --environment-name=ps-4.4.0 --environment-file=prod_create.yaml

LCLS2 Pinned Packages Approach (Current as of 10/9/23)

Building A Package

A feedstock needs to be created for every new package.  LCLS2 feedstock packages live in (for example) https://github.com/slac-lcls/epix-feedstock.

...

After creating the package, upload it, e.g. with "anaconda upload -u lcls-ii yaml-cpp-0.5.3-h1d15853_72.tar.bz2".  You may need to run "anaconda login" first (once per year?).

Devel Env Creation

Export the current environment to yml (which includes pinnings).  Then we can make "small tweaks" to versions, but big version changes will break the solver.  If we have a big change, we have to start from scratch: use /cds/sw/ds/ana/conda2/manage/env_create.yaml in the usual fashion (but use libmamba), e.g.:

...

Sometimes our precise pinned package versions disappear from conda-forge.  In the past when this happened, Valerio was able to manually modify the above pinned yaml files to choose a nearby version.  So far this has not broken the fast conda-solver behavior.
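
As a hypothetical illustration (the package name, versions, and file name are all placeholders), such a manual fix can be a one-line edit:

Code Block
# relax a pin whose exact version vanished from conda-forge
sed -i 's/- hdf5=1.10.5/- hdf5=1.10.6/' ps-4.5.16-pinned.yaml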

New Pinnings

From time to time, an important package with big ramifications in the environment needs to be updated (e.g. mypy), or a general refresh of the packages in the environment is needed. In these cases, new pinnings and new pinning files need to be created. This requires performing the following steps:

  • In all feedstocks, replace conda_build_config.yaml with the version from: conda-forge-pinning-feedstock/conda_build_config.yaml at main · conda-forge/conda-forge-pinning-feedstock (github.com)
  • Rebuild all feedstocks in the following order (a sketch of the per-feedstock commands follows this list):
    • libnl
    • libnl3
    • rdma-core
    • libfabric
    • roentdek
    • amityping
    • cameralink-gateway
    • epix-100a-gen2
    • epix
    • lcls2-epix-hr-pcie
    • lcls2-pgp-pcie-apps
    • lcls2_timetool
    • networkfox
    • prometheus-cpp
    • xtcdata
    • psalg
    • psana
    • psdaq
    • ami
    • psmon 
  • Create new production and dev environment (see above)
  • Create a new conda_build_config.yaml using the script in the section above and copy it into every feedstock, so that future packages are built with pinned dependencies
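
A minimal sketch of the per-feedstock rebuild, assuming the new pinning file is at $PINNING and each feedstock keeps conda_build_config.yaml in its recipe directory (only the first few packages are shown; exact commands may differ):

Code Block
for fs in libnl libnl3 rdma-core libfabric; do
    git clone git@github.com:slac-lcls/${fs}-feedstock.git
    cp "$PINNING" ${fs}-feedstock/recipe/conda_build_config.yaml
    # rerender and push; the push triggers the GitHub Actions package build
    (cd ${fs}-feedstock && conda smithy rerender && git commit -am "new pinnings" && git push)
done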


LCLS I + AMI2 Environment

This environment is based on the LCLS I environment but contains all the dependencies needed to run AMI 2. To create it, we follow the strategy of keeping all the main packages of the LCLS-I environment at the right version, and all the AMI2 dependencies at the version we have in the LCLS-II environment. We let the second-level dependencies fluctuate freely, especially the compilers. Currently we follow this procedure (using the ana-4.0.44-py3 environment as an example):

  • Export the base lcls-i environment to a YAML file:

    Code Block
    conda env export > ana-4.0.44-py3.yaml
  • Run a script similar to the following to pin the main lcls-i packages at the right version:

    Code Block
    import yaml

    # Load the exported (fully pinned) environment
    with open("ana-4.0.44-py3.yaml", "r") as fh:
        curr_env = yaml.safe_load(fh)

    # Map package name -> version, skipping non-string entries
    # (e.g. a nested "pip:" section, which "conda env export" can emit)
    dep_dictionary = {}
    for entry in curr_env["dependencies"]:
        if not isinstance(entry, str):
            continue
        parts = entry.split("=")
        dep_dictionary[parts[0]] = parts[1]

    # Load the base lcls-i environment definition
    with open("/cds/sw/ds/ana/conda1/manage/jenkins/ana-env-py3.yaml", "r") as fh:
        base_env = yaml.safe_load(fh)

    # Pin each base package at the exported version,
    # keeping the original spec if the package is absent from the export
    new_dependencies = []
    for entry in base_env["dependencies"]:
        name = entry.split("=")[0]
        if name in dep_dictionary:
            new_dependencies.append(f"{name}={dep_dictionary[name]}")
        else:
            new_dependencies.append(entry)

    base_env["dependencies"] = new_dependencies

    with open("ana-4.0.44-py3-ami2.yaml", "w") as fh:
        yaml.dump(base_env, fh)
  • Add to the ana-4.0.44-py3-ami2.yaml file the following dependencies, at the versions they have in the current LCLS-II environment:
    • networkfox
    • asyncqt
    • amityping
    • mypy
    • setproctitle
    • jedi
    • sympy
    • p4p
    • pyqode.python
    • pint
    • pyfftw

      For example, in the case of 4.0.44, the following lines were added to the file (the version numbers come from the current lcls-ii development environment ps-4.5.16):

      Code Block
      - networkfox=0.0.8
      - asyncqt=0.8.0
      - amityping=1.1.11
      - mypy=0.961
      - setproctitle=1.2.3
      - jedi=0.17.2
      - sympy=1.10.1
      - p4p=3.5.5
      - pyqode.python=2.12.1
      - pint=0.18
      - pyfftw=0.13.0
  • Remove the version from the compiler dependency, so that it looks just like the following, without any version number:

    Code Block
    - compilers
  • Use the resulting yaml file to generate the environment, using the libmamba solver:

    Code Block
    conda env create --experimental-solver=libmamba -n ana-4.0.44-py3-ami2 --file ana-4.0.44-py3-ami2.yaml

Environments on S3DF

When creating an LCLS-II release on S3DF, the following lines

...

NOTE / TODO: DIR_PSDM should point to the directory containing the "detector" directory. We should make sure that the contents of the detector directories on psana (/cds/group/psdm/detector) and S3DF (/sdf/group/lcls/ana/detector) match. Currently, their contents seem to differ: the directory is 1.3 TB on psana and 2.3 TB on S3DF.
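
A hypothetical way to compare the two trees without copying anything, assuming one host can reach the other over ssh (the hostname is a placeholder):

Code Block
# dry-run rsync reports files that differ between psana and S3DF
rsync -avn psana:/cds/group/psdm/detector/ /sdf/group/lcls/ana/detector/ | head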

Utility Scripts

The following two python scripts can be used to explore the content of a repodata.json file, which can be downloaded from a conda online package repository and lists the contents of the repository in detail.

...
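
Independent of those scripts, a quick look at a downloaded repodata.json can also be taken from the shell (a minimal sketch, assuming jq is installed):

Code Block
# list the unique package name/version pairs recorded in repodata.json
jq -r '.packages | to_entries[] | "\(.value.name) \(.value.version)"' repodata.json | sort -u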