
LCLS1

Information from Valerio on Jan. 27, 2021.

(see also checklist below for which env to activate and how to access ana-rel-admin)

cd /cds/sw/ds/ana/conda1/manage/
ana-rel-admin --force --cmd psana-conda-src --name 2.0.6 --basedir `pwd`

Copy the .tar.gz to a location where GitHub can see it:
/reg/g/psdm/web/swdoc/tutorials/

psreldev can't write to the above directory, so do this as user "cpo" or "valmar".

Check out the GitHub feedstock repository. There are many feedstocks (all in the slac-lcls organization), e.g.
https://github.com/slac-lcls/psana1-feedstock

Update the version and checksum in

https://github.com/slac-lcls/psana1-feedstock/blob/main/recipe/meta.yaml

Generate the new checksum by running sha256sum on the .tar.gz:

sha256sum /reg/g/psdm/web/swdoc/tutorials/psana-conda-4.0.10.tar.gz
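When sha256sum is unavailable, the same checksum can be computed with a few lines of Python (the function name below is illustrative):

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Return the hex SHA-256 digest of a file, reading it in chunks
    so that large release tarballs do not have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

The result is the same hex string that sha256sum prints in its first column.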

A git push launches the build on GitHub (using GitHub Actions); when the build finishes, it uploads the package to anaconda.org/lcls-i.

This file is our real control point for the build process (e.g. the GitHub Actions workflow YAML file gets generated automatically from it):

https://github.com/slac-lcls/psana1-feedstock/blob/main/conda-forge.yml

If the above file is modified, you have to run "conda smithy rerender -c auto". The "-c auto" option automatically makes a git commit, but you still have to push to git.

There are a few other cases in which the feedstock needs to be rerendered.
See the conda-forge documentation:

https://conda-forge.org/docs/maintainer/updating_pkgs.html#when-to-rerender

If no relevant changes have taken place, the command simply does not create a commit, so it is good practice to run it every time a feedstock is updated.

"targets" first entry: the conda channel to upload to, second entry is
called a "label" (e.g. a devel and a stable). conda forge always looks
for "main" label.

then "conda create" as usual (as seen in the jenkins CI).

For testing the feedstock, you can run the run_docker_build.sh script locally, but this requires Docker to be installed. The build_locally.py script included in the feedstock takes care of running run_docker_build.sh.

Valerio uses this to test the feedstock on his laptop.

If any dependency is updated (e.g. boost), feedstocks are rebuilt automatically, but ONLY if the packages are in conda-forge, which ours are not. To emulate this, Valerio bumps all the feedstock build numbers for each release (even ndarray, because it depends on boost) so they get built against the latest versions in conda-forge.
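Bumping the build number in a recipe/meta.yaml can be scripted; a minimal sketch, assuming the conventional "number:" key under the "build:" section of the recipe:

```python
import re

def bump_build_number(meta_yaml_text):
    """Increment the first 'number: N' entry in a meta.yaml body,
    as done when forcing a rebuild against newer dependencies."""
    def repl(match):
        return f"{match.group(1)}{int(match.group(2)) + 1}"
    return re.sub(r"(^\s*number:\s*)(\d+)", repl, meta_yaml_text,
                  count=1, flags=re.MULTILINE)
```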

There is a separate feedstock for py27, since everything is pinned in the recipe:
https://github.com/slac-lcls/psana1-py2-feedstock
Its recipe/conda_build_config.yaml overrides the equivalent entries in the conda-forge pinning file (the most recent version of which is pulled automatically from conda-forge).

In recipe/conda_build_config.yaml (ONLY for psana1-py2-feedstock), zip_keys overrides the specification for the three special packages python, python_impl and numpy. The conda-forge people, in particular Billy Poon, helped specify these.

feedstocks:

  • libpressio-feedstock
  • psana1-py2-feedstock
  • sz-feedstock
  • ndarray-psana-feedstock
  • psgeom-feedstock
  • ztcav-py2-feedstock
  • xtcav2-feedstock
  • ndarray-psana-py2-feedstock
  • psocake-py2-feedstock
  • psocake-feedstock
  • psana1-feedstock
  • stdcompat-feedstock

libpressio, sz and stdcompat are all needed for Chuck's sz work.

Update the build numbers for all of the above for every psana1 release.

Checklist for building psana1 environments from feedstocks

As psreldev on psbuild-rhel7:

  • source /cds/sw/ds/ana/conda1/inst/etc/profile.d/conda.sh
  • conda activate conda_build
  • cd /cds/sw/ds/ana/conda1/manage/
  • bin/ana-rel-admin --force --cmd psana-conda-src --name 4.0.11 --basedir `pwd`

As normal user:

  • cd /cds/sw/ds/ana/conda1/manage/downloads/anarel
  • cp psana-conda-4.0.11.tar.gz /reg/g/psdm/web/swdoc/tutorials
  • sha256sum psana-conda-4.0.11.tar.gz (and copy the checksum)

Activate an environment that contains the 'conda smithy' package. For example:

conda activate /cds/home/v/valmar/.conda/envs/conda_forge_build3.9

Then, for each of these feedstocks (more or less in this order):

            github.com/slac-lcls/sz-feedstock
            github.com/slac-lcls/stdcompat-feedstock
            github.com/slac-lcls/ndarray-psana-feedstock
            github.com/slac-lcls/ndarray-psana-py2-feedstock
            github.com/slac-lcls/libpressio-feedstock
            github.com/slac-lcls/xtcav2-feedstock
            github.com/slac-lcls/psgeom-feedstock
            github.com/slac-lcls/psocake-feedstock
            github.com/slac-lcls/psana1-py2-feedstock           
            github.com/slac-lcls/psana1-feedstock            

As normal user:

  • Modify recipe/meta.yaml
    • Check if new version of packages has been released 
    • If yes, bump up version, update sha256 checksum and reset build number
    • Otherwise, bump up build number
  • Commit
  • conda smithy rerender -c auto
  • Push

After all packages have finished building on GitHub, proceed as below. (Unfortunately, GitHub Actions does not provide a "dashboard" to check whether the packages have been built or when they finished. There are only two ways to check: 1) manually check every feedstock repository under the "Actions" tab, or 2) wait for either the package to appear on anaconda.org/lcls-i or for an email notifying you of a failure.)

As psreldev:

  • source /cds/sw/ds/ana/conda1/inst/etc/profile.d/conda.sh
  • cd /cds/sw/ds/ana/conda1/manage/jenkins/
  • conda env create -n ana-4.0.11 --file ana-env-py2.yaml
  • conda env create -n ana-4.0.11-py3 --file ana-env-py3.yaml

Addition For New Pinned-Approach

Valerio writes: It's not different from the psana2 approach at all. All the feedstocks now have the pinning file in them. You just need to update the meta.yaml file, run "conda build recipe", upload, and build the environment from the "pinned" YAML file. (First you need to package the source in a tar.gz as we always did.)

The yaml files for creating the environment are in /cds/sw/ds/ana/conda1/manage/jenkins/.

LCLS2 Github Actions Approach (Deprecated)

Currently deprecated since conda forge changes versions more rapidly than we would like, so we use the "Pinned Approach" described below.

First, tag lcls2 and ami repos.

Check out the 20 feedstock packages (in github.com/slac-lcls/):

python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --clone

After these repos are cloned, you can use this to do commands in all repos:

python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --cmd "pwd; git pull --rebase"

Either update the version (with a new git tag) or update the build number in each meta.yaml. conda-forge enforces manual maintenance of version numbers and sha256 hashes (there is no automatic determination from the latest GitHub tag). If the version has changed, compute the new sha256 from the .tar.gz from GitHub with:

sha256sum 1.1.7.tar.gz

For a new version, remember to reset the build number to zero. Commit the changes. This command allows you to modify build numbers; it will prompt you for each package, and you can enter "y" (increment build number), "n" (don't increment, the default) or "0" (set to 0):

python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --incbuildnum

The next step requires an environment where "conda-smithy" is installed (we have put this in psrel's conda_build env).  Then rerender with "conda smithy rerender -c auto" (again using feedstock.py --cmd option).  This last command does its own git-commit for you with the changes.  When must this be run?  Valerio writes: "In general any change in conda forge config (for example, when we want to upload to a different channel, or to switch to a different CI). However, it is also needed to pick up the newest version of the conda-forge pinning file, so I run it every time. In the worst case, it tells me everything is up to date and does not create a commit".

The final "git push" of the above changes must be done carefully because it triggers the GitHub build, and order of the GitHub builds matters because of conda's "build:, host:" section dependencies (run-time dependencies do not affect this order).  We believe pure-python packages can go in the first wave, since they have no complex dependencies in "build:, host:" sections of meta.yaml. These are the waves of builds that can be launched in parallel:

  • libnl, libnl3, roentdek, amityping, prometheus-cpp, libusdusb4, psmon, networkfox, rogue, epix, lcls2_timetool, cameralink-gateway, lcls2-pgp-pcie-apps, xtcdata, ami
  • rdma-core (depends on libnl), psalg (depends on xtcdata)
  • fabric (depends on rdma-core), psana (depends on psalg)
  • psdaq (depends on psalg and fabric)

The 4 build waves above can be launched with a command like this:

python /cds/sw/ds/ana/conda2/manage/bin/feedstock.py --cmd "git push" --wave 1
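The wave partitioning above is just topological leveling over the build:/host: dependency graph. A sketch that reproduces the four waves from the dependency edges listed in this document (everything without build/host dependencies lands in wave 1):

```python
def build_waves(packages, deps):
    """Group packages into build waves: a package's wave is one more
    than the deepest wave among its build/host dependencies."""
    waves = {}

    def wave_of(pkg):
        if pkg not in waves:
            waves[pkg] = 1 + max((wave_of(d) for d in deps.get(pkg, [])),
                                 default=0)
        return waves[pkg]

    for pkg in packages:
        wave_of(pkg)
    grouped = {}
    for pkg, wave in waves.items():
        grouped.setdefault(wave, []).append(pkg)
    return grouped

# Only the dependency edges named in this document.
deps = {
    "rdma-core": ["libnl"],
    "psalg": ["xtcdata"],
    "fabric": ["rdma-core"],
    "psana": ["psalg"],
    "psdaq": ["psalg", "fabric"],
}
packages = ["libnl", "xtcdata", "rdma-core", "psalg", "fabric",
            "psana", "psdaq"]
```

With these inputs, psdaq ends up alone in the fourth wave, matching the list above.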

If a package is built for Python, the build matrix tells it to build for the officially supported conda-forge versions (3.6, 3.7, 3.8).

If we were in conda-forge officially, they would take care of these build-order dependencies (but not the versions/sha256/build numbers). They may have a bot that tells you when a version has been updated, but we would still have to update the version number by hand. We haven't gone the conda-forge route because (1) we don't know how much work it is, and (2) we cannot upload source .tar.gz files to conda-forge: it usually gets sources from GitHub, but the rogue packages are not public.

Unfortunately, GitHub Actions does not provide a "dashboard" to check whether the packages have been built or when they finished. There are only two ways to check: 1) manually check every feedstock repository under the "Actions" tab, or 2) wait for either the package to appear on anaconda.org/lcls-ii or for an email notifying you of a failure.

To create the environment as psrel, use these commands:

conda env create --name ps-N.Nodd.N --file env_create.yaml (devel env)

conda env create --name ps-N.Neven.N --file prod_create.yaml (prod env)

Automated feedstock environment building

  1. GitHub Access Token


    A new set of Python scripts has been developed to automate the creation of LCLS-II conda environments. They can be found here:
    /cds/sw/ds/ana/conda2/manage/buildenv

    The scripts need a valid personal GitHub token to be used (unfortunately, GitHub caps the number of API requests that can be done without a token). The token can be created on GitHub, after logging in:

    https://github.com/settings/tokens

    The token must be exported and made available as an environment variable called GITHUB_ACCESS_TOKEN.  Technically it is only needed by step (4) below, but the way the scripts are written it should be set for all steps.

    For an example, please see ~valmar's bashrc file (also in psrel bashrc).

  2. Package Version File


    (this file-editing should be run as user "psrel")

    In order to generate packages, a YAML file listing the required version of each package must be created. For a file with the package versions in the latest environments, see:
    /cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    A few lines as an example:

    ami: 2.4.7
    amityping: 1.1.7
    cameralink-gateway: 7.6.2
    epix: 0.0.3
    lcls2-pgp-pcie-apps: 2.2.0
    lcls2_timetool: 3.3.0
    epix-hr-single-10k: 3.1.1
    lcls2-epix-hr-pcie: 1.2.0
    ....
  3. Preparing source tarballs


    (this step should be run as a normal user like valmar, cpo,...)

    Before building the feedstocks, source tarballs must be created for the rogue-related packages (which are private repos, so source .tgz files need to be generated) by running the prepare_source.py script.

    The script must be run using the conda_build environment (conda activate conda_build). It must also be run as a normal user, because psrel cannot write to /reg/g/psdm/web/swdoc/tutorials/.
    The script takes the package version file as an argument:
    python /cds/sw/ds/ana/conda2/manage/buildenv/prepare_source.py --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    The source tarballs are automatically generated and copied to /reg/g/psdm/web/swdoc/tutorials/

    PS: The script will clone the required repositories in the current working directory! Working in a temporary directory that can later be deleted is strongly advised.

  4. Generating packages


    (this step should be run as a normal user like valmar, cpo,...)

    The packages can now be built using the build environment script, again in the conda_build environment (conda activate conda_build):
    python /cds/sw/ds/ana/conda2/manage/buildenv/build_environment.py --generate-packages --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml

    The script will build the packages wave by wave. For each wave, the script will clone the feedstocks one by one, automatically make the necessary changes to the recipes, run "conda smithy", and push the changes to the git repositories, triggering the building of the packages. The individual builds can be seen at URLs like https://github.com/slac-lcls/libnl3-feedstock/actions

    The script will then check with GitHub every thirty seconds and report the status of the build process ("Not started yet", "running", "success" or "failed"). It will wait until all builds have finished and are either in a "success" or "failed" state. If no build has failed, the script will proceed to the next package "wave"; otherwise it will exit.

    Instead of going through all the waves, one can start from a certain wave (using the --start-from-wave option) and/or stop at a certain wave (using the --stop-at-wave option)

    PS: The script will clone the required repositories in the current working directory! Working in a temporary directory that can later be deleted is strongly advised.
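The per-build status reporting described above amounts to classifying the workflow-run records returned by the GitHub API. A sketch of that classification, using the "status" and "conclusion" fields of the GitHub Actions workflow-runs API (this is not the actual script's code):

```python
def summarize_run(run):
    """Map one GitHub Actions workflow-run record to the four states
    the build script reports. 'status' is one of queued / in_progress /
    completed; 'conclusion' is only set once the run has completed."""
    if run.get("status") != "completed":
        return "running" if run.get("status") == "in_progress" else "not started yet"
    return "success" if run.get("conclusion") == "success" else "failed"
```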

  5. Building environments


    (this step should be run as user psrel)

    The production and development environments can be created using the normal "conda env create" commands (see above), or using the build_environment script:
    (from /cds/sw/ds/ana/conda2/manage directory)
    (NOTE: this command produced the env in a non-standard location, cpo used "conda env create --name ps-4.4.1 -f prod_create.yaml" instead)
    python buildenv/build_environment.py --build-environment --environment-name=ps-4.4.0 --environment-file=prod_create.yaml

    The environments must be created as the psrel user, because only the psrel user can write to the environment directories.

    This step and the previous one can be combined in a single command for convenience:
    python build_environment.py --generate-packages --package-version-file=/cds/sw/ds/ana/conda2/manage/buildenv/table.yaml --build-environment --environment-name=ps-4.4.0 --environment-file=prod_create.yaml

LCLS2 Pinned Packages Approach

Building A Package

Need to create a feedstock for every new package.  LCLS2 feedstock packages are in (for example) https://github.com/slac-lcls/epix-feedstock.

For rogue packages only, need to create a .tar.gz since their git repos are not public:

  • Checkout the package (remove the ".git" subdirectory because it is large).
  • Create .tar.gz file with "tar cvfz epix-quad-1.2.0.tar.gz epix-quad/"
  • Copy the .tar.gz to the directory where it can be seen: "cp epix-quad-1.2.0.tar.gz /reg/g/psdm/web/swdoc/tutorials/"
  • Compute the sha256sum with "sha256sum epix-quad-1.2.0.tar.gz"
  • Put this sha256 in the epix-quad-feedstock/recipe/meta.yaml
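The tarball steps above (drop .git, then tar cvfz) can also be done in one go from Python; a sketch using the standard tarfile module:

```python
import os
import tarfile

def make_source_tarball(src_dir, out_path):
    """Create a .tar.gz of src_dir while skipping the large .git
    subtree (equivalent to deleting .git and running 'tar cvfz')."""
    def skip_git(tarinfo):
        # Returning None excludes the entry (and, for a directory,
        # everything underneath it).
        return None if ".git" in tarinfo.name.split("/") else tarinfo

    arcname = os.path.basename(os.path.normpath(src_dir))
    with tarfile.open(out_path, "w:gz") as tar:
        tar.add(src_dir, arcname=arcname, filter=skip_git)
```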

The pinning is in conda_build_config.yaml (our packages are towards the end of the file; the conda-forge pinnings are earlier). The conda-forge pinnings are obtained from files like https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml. Valerio wrote a small throwaway python script to take the packages from our environment and add them to conda_build_config.yaml. This script also pins the low-level underlying packages that conda-forge does not explicitly pin and adds them to conda_build_config.yaml. This conda_build_config.yaml cannot trivially be used by GitHub Actions (because they would pull the latest version from conda-forge), so we are using this file and building locally on psbuild-rhel7 (with infinite time we could write a custom action).
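The throwaway script mentioned above is not shown here, but the core transformation is simple: turn "name=version=build" entries from an exported environment into pinning entries (conda_build_config.yaml keys replace dashes and dots in package names with underscores, e.g. c-ares becomes c_ares). A hypothetical sketch:

```python
def env_deps_to_pins(dependencies):
    """Convert 'name=version=build' entries from an exported conda
    environment into conda_build_config.yaml-style pin entries.
    Keys replace dashes and dots with underscores, following the
    conda-forge pinning-file convention."""
    lines = []
    for dep in dependencies:
        name, version = dep.split("=")[:2]
        key = name.replace("-", "_").replace(".", "_")
        lines.append(f"{key}:\n  - '{version}'")
    return "\n".join(lines)
```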

Then: "conda deactivate", "conda activate conda_build".  Need a .condarc that looks like this (otherwise goes to conda "defaults" instead of conda-forge for packages). Need the "pkgs_dirs" variable in .condarc, otherwise conda-build tries to write to /cds/sw/ds/ana/conda2/inst and has permissions issues.

(conda_build) psbuild-rhel7-01:epix-quad-feedstock$ more ~/.condarc
channels:
  - lcls-ii
  - lcls-i
  - conda-forge
  - defaults
  - tidair-tag
  - tidair-packages
pkgs_dirs:
  - ~/.conda/pkgs
(conda_build) psbuild-rhel7-01:epix-quad-feedstock$ 

Then build the package with "conda build recipe/". It is important to launch this from the feedstock directory, since that is where the conda_build_config.yaml lives.

For the setup.py (e.g. epix-quad/setup.py), the list of python modules can be determined by finding all the directories with an __init__.py in them, e.g. with "find . -name __init__.py". Then the "package_dir" variable in setup.py is filled in with the directories where all the modules are.
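The find-based procedure above can be sketched in Python:

```python
import os

def find_python_packages(root):
    """List the dotted names of every package directory under root,
    i.e. every directory containing an __init__.py (the Python
    equivalent of 'find . -name __init__.py')."""
    packages = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if "__init__.py" in filenames:
            rel = os.path.relpath(dirpath, root)
            packages.append(rel.replace(os.sep, "."))
    return sorted(packages)
```

setuptools.find_packages() does essentially the same discovery and is the more idiomatic choice inside setup.py itself.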

After creating the package then upload, e.g. with "anaconda upload -u lcls-ii yaml-cpp-0.5.3-h1d15853_72.tar.bz2".  May need to do "anaconda login" (once per year?).

Devel Env Creation

Export the current environment to yml (which includes pinnings). Then we can do "small tweaks" to versions. But big version changes will break the solver. If we have a big change, we have to start from scratch: use /cds/sw/ds/ana/conda2/manage/env_create.yaml in the usual fashion (but use libmamba).

Prod Env Creation

As for the devel env, but with a file like "/cds/sw/ds/ana/conda2/manage/prod_create.yml"; when we do "conda create", it will pick up the versions that were specified in the feedstock package conda_build_config.yml.

Disappearing Packages

Sometimes our precise pinned package versions disappear from conda-forge.  In the past when this happened Valerio was able to manually modify the above pinned yaml files to choose a nearby version.  This so far has not broken the fast conda-solver behavior.
