Table of Contents

LCLS1

New Pinned-Approach

List of commands to build psana1 release on s3df

Login and set environment

s3dflogin

ssh psana -l psreldev

NOTE: (Oct. 10, 2023) we are renaming the .condarc files to .condarc_dontuse in home directories and in the conda install directories (e.g. /cds/sw/ds/ana/conda2-v2/inst/.condarc) so that builds do not implicitly depend on them.  Channels must therefore be specified explicitly in the conda commands.


source /sdf/group/lcls/ds/ana/sw/conda1-v3/manage/bin/psconda.sh   # conda1-v3 is the latest version of conda (23.10.0)
conda deactivate
conda activate conda_build

Generate .tar.gz file with source code

cd /sdf/group/lcls/ds/ana/sw/conda1-v3/manage/  # if needed git clone git@github.com:slaclab/anarel-manage.git manage

bin/ana-rel-admin --force --cmd psana-conda-src --name 4.0.58 --basedir `pwd`

Build release

cd ~psreldev/git/psana1-feedstock/   # if needed git clone git@github.com:slac-lcls/psana1-feedstock.git

Update the version and sha256 fields in recipe/meta.yaml.
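The sha256 comes from the new source .tar.gz.  A minimal sketch, assuming the tarball produced by ana-rel-admin lands under the manage downloads area as in the older instructions (adjust the path and version to your release):

Code Block
# compute the checksum of the new source tarball and paste it into the sha256
# field of recipe/meta.yaml (the path and version below are assumptions)
sha256sum /sdf/group/lcls/ds/ana/sw/conda1-v3/manage/downloads/anarel/psana-conda-4.0.58.tar.gz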

conda build -c lcls-i -c conda-forge recipe

Debugging

If a test fails, look at a log file such as

/sdf/group/lcls/ds/ana/sw/conda1/inst/envs/conda_build2/conda-bld/psana_<13-digit-build number>/test_tmp/work/<log-file-name>
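A quick way to find the newest test logs, assuming the conda-bld layout above (the environment name and path may differ on your build):

Code Block
# list the most recent psana test-build work directories; the log files live inside
ls -ldt /sdf/group/lcls/ds/ana/sw/conda1/inst/envs/conda_build2/conda-bld/psana_*/test_tmp/work | head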

Upload .tar.bz2 file with release to anaconda lcls-i channel

The name of the .tar.bz2 file is printed at the end of the conda build output. Then use a command like

anaconda upload -u lcls-i /sdf/group/lcls/ds/ana/sw/conda1/inst/envs/conda_build2/conda-bld/linux-64/psana-4.0.57-py39hed0727e_1.tar.bz2

Code Block: response on anaconda upload
(conda_build2) [psreldev@sdfiana001 psana1-feedstock]$ pwd
/sdf/home/p/psreldev/git/psana1-feedstock
(conda_build2) [psreldev@sdfiana001 psana1-feedstock]$ anaconda upload -u lcls-i /sdf/group/lcls/ds/ana/sw/conda1/inst/envs/conda_build2/conda-bld/linux-64/psana-4.0.57-py39hed0727e_1.tar.bz2
Using Anaconda API: https://api.anaconda.org
Using "lcls-i" as upload username
Processing "/sdf/group/lcls/ds/ana/sw/conda1/inst/envs/conda_build2/conda-bld/linux-64/psana-4.0.57-py39hed0727e_1.tar.bz2"
Detecting file type...
File type is "Conda"
Extracting conda attributes for upload
Creating package "psana"
Creating release "4.0.57"
The action you are performing requires authentication, please sign in:
Using Anaconda API: https://api.anaconda.org
Username: dubrovin
dubrovin's Password: 
login successful
Using Anaconda API: https://api.anaconda.org
Using "lcls-i" as upload username
Processing "/sdf/group/lcls/ds/ana/sw/conda1/inst/envs/conda_build2/conda-bld/linux-64/psana-4.0.57-py39hed0727e_1.tar.bz2"
Detecting file type...
File type is "Conda"
Extracting conda attributes for upload
Creating package "psana"
Creating release "4.0.57"
Uploading file "lcls-i/psana/4.0.57/linux-64/psana-4.0.57-py39hed0727e_1.tar.bz2"
15.4MB [00:01, 12.6MB/s]                                                                                                                                                                                  
Upload complete

conda located at:
  https://anaconda.org/lcls-i/psana

(conda_build2) [psreldev@sdfiana001 psana1-feedstock]$

Create new environment

conda create --name ana-4.0.58-py3 --clone ana-4.0.57-py3

Code Block: response on conda create
(base) [psreldev@sdfiana002 psana1-feedstock]$ conda create --name ana-4.0.58-py3 --clone ana-4.0.57-py3
Retrieving notices: ...working... done
Source:      /sdf/group/lcls/ds/ana/sw/conda1/inst/envs/ana-4.0.57-py3
Destination: /sdf/group/lcls/ds/ana/sw/conda1/inst/envs/ana-4.0.58-py3
Packages: 488
Files: 11

Downloading and Extracting Packages:

Downloading and Extracting Packages:

Preparing transaction: done
Verifying transaction: done
Executing transaction: /  
For Linux 64, Open MPI is built with CUDA awareness but this support is disabled by default.
To enable it, please set the environment variable OMPI_MCA_opal_cuda_support=true before
launching your MPI processes. Equivalently, you can set the MCA parameter in the command line:
mpiexec --mca opal_cuda_support 1 ...
 
In addition, the UCX support is also built but disabled by default.
To enable it, first install UCX (conda install -c conda-forge ucx). Then, set the environment
variables OMPI_MCA_pml="ucx" OMPI_MCA_osc="ucx" before launching your MPI processes.
Equivalently, you can set the MCA parameters in the command line:
mpiexec --mca pml ucx --mca osc ucx ...
Note that you might also need to set UCX_MEMTYPE_CACHE=n for CUDA awareness via UCX.
Please consult UCX's documentation for detail.
 
done
#
# To activate this environment, use
#
#     $ conda activate ana-4.0.58-py3
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) [psreldev@sdfiana002 psana1-feedstock]$

conda deactivate
conda activate ana-4.0.58-py3
conda install -c lcls-i -c conda-forge psana=4.0.58

Code Block: response on conda install
(ana-4.0.58-py3) [psreldev@sdfiana002 psana1-feedstock]$ conda install -c lcls-i -c conda-forge psana=4.0.58
Channels:
 - lcls-i
 - conda-forge
 - defaults
 - lcls-ii
 - cogsci
Platform: linux-64
Collecting package metadata (repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: /sdf/group/lcls/ds/ana/sw/conda1/inst/envs/ana-4.0.58-py3

  added / updated specs:
    - psana=4.0.58


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    psana-4.0.58               |   py39hed0727e_1        15.4 MB  lcls-i
    ------------------------------------------------------------
                                           Total:        15.4 MB

The following packages will be UPDATED:

  psana                               4.0.57-py39hed0727e_1 --> 4.0.58-py39hed0727e_1 


Proceed ([y]/n)? y


Downloading and Extracting Packages:
                                                                                                                                                                                                    
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(ana-4.0.58-py3) [psreldev@sdfiana002 psana1-feedstock]$

Deprecated New Pinned-Approach

NOTE: (Oct. 10, 2023) we are renaming the .condarc files to .condarc_dontuse in home directories and in directories like /cds/sw/ds/ana/conda2-v2/inst/.condarc, so that builds do not implicitly depend on them.  Channels must therefore be specified explicitly in the conda commands.

To create a source-code .tar.gz file:

ssh psbuild-rhel7-01 -l psreldev

Current as of Oct. 9, 2023

Valerio writes: It's not different from the psana2 approach at all.  First you need to package the source in a tar.gz like we always did, as shown in the "Checklist for building psana1 environments from feedstocks" section below:

  • source /cds/sw/ds/ana/conda1/inst/etc/profile.d/conda.sh
  • conda activate conda_build
  • cd /cds/sw/ds/ana/conda1/manage/
  • bin/ana-rel-admin --force --cmd psana-conda-src --name 4.0.53 --basedir `pwd`   (this command assembles all the source code from the tags into a .tar.gz file)

(no longer necessary since we don't build with GitHub-actions at the moment): cp .tar.gz (from manage/downloads/anarel/) to where GitHub can see it: /reg/g/psdm/web/swdoc/tutorials/.   Instead, put this sort of line in recipe/meta.yaml: "url: file:///cds/sw/ds/ana/conda1/manage/downloads/anarel/psana-conda-4.0.56.tar.gz"

All of the next steps to build the binary .tar.bz2 conda package should be done as user "cpo", "valmar", or "dubrovin" (make sure you have enough disk space: I believe the build will go in ~/.conda and ~/conda-bld).

Check out the GitHub feedstock repository. There are many feedstocks (all in slac-lcls), e.g.
https://github.com/slac-lcls/psana1-feedstock

Update the version and checksum in

https://github.com/slac-lcls/psana1-feedstock/blob/main/recipe/meta.yaml

Generate the new checksum by running sha256sum on the .tar.gz, like this:

sha256sum /reg/g/psdm/web/swdoc/tutorials/psana-conda-4.0.10.tar.gz

The 3 important files in the psana1-feedstock git repo are: conda_build_config.yaml (specifies the versions of all package dependencies as recommended by conda-forge), recipe/meta.yaml (specifies the location of the source code and the psana build- and run-time dependencies), and recipe/build.sh (the actual scons build instructions).

NOTE: cpo thinks the psana1 recipe needs to be built with the python3 conda_build env from LCLS2 (source /cds/sw/ds/ana/conda2/manage/bin/psconda.sh).  If you don't do this, I think "conda build recipe/" fails with the TypeError shown below, since the conda_build env in LCLS1 is py2.  Build the recipes under your own username.  Use these commands to get the build environment (it is LCLS2!):

source /cds/sw/ds/ana/conda2/manage/bin/psconda.sh
conda deactivate
conda activate conda_build

The py2 and py3 psana recipes now have the pinning file (conda_build_config.yaml) in their top-level directories. You just need to update the meta.yaml file and run "conda build recipe/".  NOTE: run this from above the recipe/ directory so that the conda_build_config.yaml pinnings are picked up. (This was done as user "cpo", but it could also be done as psreldev, although psreldev might need to "anaconda login" as another user so that "anaconda upload" will work.)  Then upload the package and build the environment from the "pinned" YAML file.

The py3 recipe is in https://github.com/slac-lcls/psana1-feedstock while the py2 recipe is in https://github.com/slac-lcls/psana1-py2-feedstock.

Build the source code in the psana1-feedstock directory with the command "conda build recipe/" (effectively executes recipe/build.sh which has scons commands in it)

Code Block
TypeError: apply_pin_expressions() argument after ** must be a mapping, not unicode

Upload the file to anaconda lcls-i channel with:

anaconda upload -u lcls-i /cds/home/c/cpo/conda-bld/linux-64/psana-4.0.53-py39hb869b97_2.tar.bz2

The yaml files for creating the environment are in /cds/sw/ds/ana/conda1/manage/jenkins/.

To create the env's without changing the pinnings, clone the previous environment and install the newly uploaded psana version like this:

Code Block
conda install --experimental-solver=libmamba -c lcls-i -c conda-forge psana=4.0.53

To create new pinnings (which may require also changing conda_build_config.yaml in the py2/py3 recipe repos) you can create two env's like this:

Code Block
source /reg/g/psdm/etc/psconda.sh -v2 (v2 to get experimental mamba solver)
conda env create --name ana-4.0.44-py3 --experimental-solver=libmamba --file=/cds/sw/ds/ana/conda1/manage/jenkins/ana-env-py3.yaml 
conda env create --name ana-4.0.44 --experimental-solver=libmamba --file=/cds/sw/ds/ana/conda1/manage/jenkins/ana-env-py2.yaml


Deprecated Instructions

Information from Valerio on Jan. 27, 2021.

...

            github.com/slac-lcls/sz-feedstock
            github.com/slac-lcls/stdcompat-feedstock
            github.com/slac-lcls/ndarray-psana-feedstock
            github.com/slac-lcls/ndarray-psana-py2-feedstock
            github.com/slac-lcls/libpressio-feedstock
            github.com/slac-lcls/xtcav2-feedstock
            github.com/slac-lcls/psgeom-feedstock
            github.com/slac-lcls/psocake-feedstock
            github.com/slac-lcls/psana1-py2-feedstock
            github.com/slac-lcls/psana1-feedstock

As normal user:

  • Modify recipe/meta.yaml
    • Check if new version of packages has been released 
    • If yes, bump up version, update sha256 checksum and reset build number
    • Otherwise, bump up build number
  • Commit
  • conda smithy rerender -c auto
  • Push

After all packages have finished building on GitHub (Unfortunately, GitHub Actions do not provide a "dashboard" to check if the packages have been built or when they are finished. There are only two ways to check this: 1) Manually check every feedstock repository (Under the "Actions" tab) 2) Wait for either the package to appear on anaconda.org/lcls-i or for an email notifying a failure to arrive):

As psreldev:

  • source /cds/sw/ds/ana/conda1/inst/etc/profile.d/conda.sh
  • cd /cds/sw/ds/ana/conda1/manage/jenkins/
  • conda env create -n ana-4.0.11 --file ana-env-py2.yaml
  • conda env create -n ana-4.0.11-py3 --file ana-env-py3.yaml

LCLS2 Github Actions Approach (Deprecated)

Currently deprecated since conda forge changes versions more rapidly than we would like, so we use the "Pinned Approach" described below.

First, tag lcls2 and ami repos.

  


checkout the 20 feedstock packages (in github.com/slac-lcls/): 

Code Block
python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --clone

After these repos are cloned, you can use this to do commands in all repos:

Code Block
python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --cmd "pwd; git pull --rebase"

Either update version (with new git tag) or update the build number in each meta.yaml.  conda-forge enforces manual maintenance of version numbers and sha256 hashes (no automatic determination from latest GitHub tag).  If version has changed, compute new sha256 from .tar.gz from GitHub with:

sha256sum 1.1.7.tar.gz


For a new version, remember to reset build number to zero.  Commit the changes.  This command allows you to modify build numbers.  It will prompt you for each package, and you can enter "y" (increment build number) "n" (don't increment, default) or "0" (set to 0):

Code Block
python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --incbuildnum

The next step requires an environment where "conda-smithy" is installed (we have put this in psrel's conda_build env).  Then rerender with "conda smithy rerender -c auto" (again using feedstock.py --cmd option).  This last command does its own git-commit for you with the changes.  When must this be run?  Valerio writes: "In general any change in conda forge config (for example, when we want to upload to a different channel, or to switch to a different CI). However, it is also needed to pick up the newest version of the conda-forge pinning file, so I run it every time. In the worst case, it tells me everything is up to date and does not create a commit".

The final "git push" of the above changes must be done carefully because it triggers the GitHub build, and order of the GitHub builds matters because of conda's "build:, host:" section dependencies (run-time dependencies do not affect this order).  We believe pure-python packages can go in the first wave, since they have no complex dependencies in "build:, host:" sections of meta.yaml. These are the waves of builds that can be launched in parallel:

  • libnl, libnl3, roentdek, amityping, prometheus-cpp, libusdusb4, psmon, networkfox, rogue, epix, lcls2_timetool, cameralink-gateway, lcls2-pgp-pcie-apps, xtcdata, ami
  • rdma-core (depends on libnl), psalg (depends on xtcdata)
  • fabric (depends on rdma-core), psana (depends on psalg)
  • psdaq (depends on psalg and fabric)

The 4 build waves above can be launched with a command like this.

Code Block
python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --cmd "git push" --wave 1

If a package is built for python, the build matrix tells it to build for the officially supported conda-forge versions (3.6, 3.7, 3.8).

If we were in conda-forge officially, they would take care of these build-order dependencies (but not the versions/sha256/build numbers).  They may have a bot that tells you that the version was updated, but we would still have to update the version number by hand. We haven't gone the conda-forge route because (1) we don't know how much work it is, and (2) you cannot upload a source .tar.gz to conda-forge; the source is usually fetched from GitHub, but the rogue packages are not public.

Unfortunately, GitHub Actions do not provide a "dashboard" to check if the packages have been built or when they are finished. There are only two ways to check this: 1) Manually check every feedstock repository (Under the "Actions" tab) 2) Wait for either the package to appear on anaconda.org/lcls-ii or for an email notifying a failure to arrive.

To create the environment as psrel, use these commands:

conda env create --name ps-N.Nodd.N --file env_create.yaml (devel env)

conda env create --name ps-N.Neven.N --file prod_create.yaml (prod env)


Automated feedstock environment building

  1. GitHub Access Token


    A new set of python scripts has been developed to automate the creation of LCLS-II conda environment. They can be found here:
    /cds/sw/ds/ana/conda2/manage/buildenv

    The scripts need a valid personal GitHub token to be used (unfortunately, GitHub caps the number of API requests that can be done without a token). The token can be created on GitHub, after logging in:

    https://github.com/settings/tokens

    The token must be exported and made available as an environment variable called GITHUB_ACCESS_TOKEN.  Technically it is only needed by step (4) below, but the way the scripts are written it should be set for all steps.

    For an example, please see ~valmar's bashrc file (also in psrel bashrc).
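    For instance, a line like the following in your bashrc makes the token available (the token string here is a placeholder, not a real token):

    # in ~/.bashrc or equivalent
    export GITHUB_ACCESS_TOKEN="<your-personal-github-token>"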

  2. Package Version File


    (this file-editing should be run as user "psrel")

    In order to generate packages, a YAML file listing the required version of each package must be created. For a file with the package versions in the latest environments, see:
    /cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    A few lines as an example:

    ami: 2.4.7
    amityping: 1.1.7
    cameralink-gateway: 7.6.2
    epix: 0.0.3
    lcls2-pgp-pcie-apps: 2.2.0
    lcls2_timetool: 3.3.0
    epix-hr-single-10k: 3.1.1
    lcls2-epix-hr-pcie: 1.2.0
    ....
  3. Preparing source tarballs


    (this step should be run as a normal user like valmar, cpo,...)

    Before building the feedstocks, source tarballs must be created for the rogue-related packages (which are private repos, so need source .tgz files generated) by running the prepare_source.py script.

    The script must be run using the conda_build environment (conda activate conda_build).  It must also be run as a normal user, because psrel cannot write to /reg/g/psdm/web/swdoc/tutorials/.
    The script takes the package version file as an argument:
    python /cds/sw/ds/ana/conda2/manage/buildenv/prepare_source.py --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    The source tarballs are automatically generated and copied to /reg/g/psdm/web/swdoc/tutorials/

    PS: The script will clone the required repositories in the current working directory!  Working in a temporary directory that can later be deleted is strongly advised.

  4. Generating packages


    (this step should be run as a normal user like valmar, cpo,...)
    The packages can now be built using the build environment script, again in the conda_build environment (conda activate conda_build):
    python /cds/sw/ds/ana/conda2/manage/buildenv/build_environment.py --generate-packages --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml
    The script will build the packages wave by wave. For each wave, the script will clone the feedstocks one by one, automatically make the necessary changes to the recipes, run "conda smithy" and push the changes to the git repositories, triggering the building of the packages.  The individual builds can be seen at URLs like https://github.com/slac-lcls/libnl3-feedstock/actions
    The script will then check with GitHub every thirty seconds and report the status of the build process ("Not started yet", "running", "success" or "failed"). It will wait until all builds have finished and are either in a "success" or "failed" state. If no build has failed, the script will then proceed to the next package "wave"; otherwise it will exit.
    Instead of going through all the waves, one can start from a certain wave (using the --start-from-wave option) and/or stop at a certain wave (using the --stop-at-wave option).
    PS: The script will clone the required repositories in the current working directory!  Working in a temporary directory that can later be deleted is strongly advised.

  5. Building environments


    (this step should be run as user psrel)

    The production and development environments can be created using the normal "conda env create" commands (see above), or using the build_environment script:
    (from /cds/sw/ds/ana/conda2/manage directory)
    (NOTE: this command produced the env in a non-standard location, cpo used "conda env create --name ps-4.4.1 -f prod_create.yaml" instead)
    python buildenv/build_environment.py --build-environment --environment-name=ps-4.4.0 --environment-file=prod_create.yaml

    The environments must be created using the psrel user, because only the psrel user can write to the environment directories

    This step and the previous one can be combined in a single command for convenience:
    python build_environment.py --generate-packages --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml
    PS: The script will clone the required repositories in the current working directory!  Working in a temporary directory that can later be deleted is strongly advised.


LCLS2 Pinned Packages Approach (Current as of 10/9/23)

Building A Package

NOTE: (Oct. 10, 2023) we are renaming the .condarc files to .condarc_dontuse in home directories and in directories like /cds/sw/ds/ana/conda2-v2/inst/.condarc, so that builds do not implicitly depend on them.  Channels must therefore be specified explicitly in the conda commands.  Addition on Jan. 5, 2024 by cpo: I found that when I was building cameralink-gateway (as user cpo) it was trying to write into /cds/sw/ds/ana/conda2/inst/envs/conda_build/pkgs/python-3.9.18-h0755675_1_cpython, so I restored my .condarc to get it to work.  I guess I could have tried to run as "psrel" instead.  Valerio writes: "I think this is another of the bugs in conda. I think it is tied to the environment being group writable.  I don't think that having a condarc file is bad per se. I also have one with the paths pointing to non-home directories with more space. I think what we should avoid is having channels in the condarc, because when we run a conda build, conda also takes those channels into consideration. To have better control over what channels I am using, I removed all the channels from my condarc file and I specify the channels manually, but I still use the condarc for everything else!"

Need to create a feedstock for every new package.  LCLS2 feedstock packages are in (for example) https://github.com/slac-lcls/epix-feedstock.

...

  • Checkout the package (remove the ".git" subdirectory because it is large).
  • Create .tar.gz file with "tar cvfz epix-quad-1.2.0.tar.gz epix-quad/"
  • Copy the .tar.gz to the directory where it can be seen: "cp epix-quad-1.2.0.tar.gz /reg/g/psdm/web/swdoc/tutorials/"
  • Compute the sha256sum with "sha256sum epix-quad-1.2.0.tar.gz"
  • Put this sha256 in epix-quad-feedstock/recipe/meta.yaml

NOTE: To generate a small .tar.gz file:

  • use the following command to ignore the large git-lfs files when cloning:  "GIT_LFS_SKIP_SMUDGE=1 git clone git@github.com:slaclab/cameralink-gateway cameralink-gateway-8.2.2 --recursive"
  • remove the ".git" subdirectory because it is large
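Putting the bullets above together, a minimal sketch of the whole source-tarball sequence, using the cameralink-gateway clone command shown above (the version and file names are illustrative; substitute the package you are actually building):

Code Block
# sketch of the source-tarball preparation steps (version/names are illustrative)
GIT_LFS_SKIP_SMUDGE=1 git clone git@github.com:slaclab/cameralink-gateway cameralink-gateway-8.2.2 --recursive
rm -rf cameralink-gateway-8.2.2/.git                  # the .git subdirectory is large
tar cvfz cameralink-gateway-8.2.2.tar.gz cameralink-gateway-8.2.2/
cp cameralink-gateway-8.2.2.tar.gz /reg/g/psdm/web/swdoc/tutorials/
sha256sum cameralink-gateway-8.2.2.tar.gz             # goes into the sha256 field of recipe/meta.yaml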

The pinning is in conda_build_config.yaml (our packages are towards the end of the file; the conda-forge pinnings come earlier).  The conda-forge pinnings are obtained from files like https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml.  Valerio wrote a small throwaway python script to take the packages from our environment and add them to conda_build_config.yaml.  This script also pins the low-level underlying packages that conda-forge does not explicitly pin and adds them to conda_build_config.yaml.  This conda_build_config.yaml cannot trivially be used by GitHub actions (because GitHub would pull the latest version from conda-forge), so we use this file and build locally on psbuild-rhel7 (with infinite time we could write a custom action).
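That throwaway script is not in the repo; a rough, hand-rolled substitute for collecting the current environment's package versions in a pinning-style layout (to be merged into conda_build_config.yaml by hand) might look like this:

Code Block
# NOT Valerio's actual script: dump "name:\n  - version" pairs for every package
# in the currently activated environment, for manual merging into conda_build_config.yaml
conda list --export | grep -v '^#' | awk -F= '{printf "%s:\n  - %s\n", $1, $2}'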

...

After creating the package, upload it, e.g. with "anaconda upload -u lcls-ii yaml-cpp-0.5.3-h1d15853_72.tar.bz2".  You may need to do "anaconda login" first (roughly once per year).

Rogue Package Recipe Creation

Since we have moved the rogue package creation to be done with pip it is necessary to explicitly list all the packages in setup.py similar to this: https://github.com/slac-lcls/epix-hr-m-320k-feedstock/blob/main/recipe/setup.patch.  This patch file gets run when the conda builds the recipe.  All the relevant packages to be included in this patch file can be found by looking for directories containing __init__.py.
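A quick way to enumerate those directories from a checked-out rogue-related repository (a sketch, not part of the official procedure):

Code Block
# list the directories that contain an __init__.py, i.e. the python packages
# that should be listed in the setup.py patch
find . -name __init__.py | xargs -n1 dirname | sed 's|^\./||' | sort -u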

Devel Env Creation

Export the current environment to yml (which includes the pinnings).  Then we can make "small tweaks" to versions.  But big version changes will break the solver; if we have a big change, we have to start from scratch: use /cds/sw/ds/ana/conda2/manage/env_create.yaml in the usual fashion (but use libmamba), e.g.:
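A minimal sketch of both paths (environment names are illustrative; the libmamba flag follows the usage shown earlier on this page):

Code Block
# small tweak: export the current env (pins included), hand-edit versions, then recreate from it
conda env export --name ps-4.4.1 > ps-4.4.2-devel.yaml
conda env create --name ps-4.4.2 --experimental-solver=libmamba --file ps-4.4.2-devel.yaml
# big change: start from scratch from env_create.yaml with the libmamba solver
conda env create --name ps-4.5.0 --experimental-solver=libmamba --file /cds/sw/ds/ana/conda2/manage/env_create.yaml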

...