
Todo

Deployment involves the following:

  1. Deployment database to keep track of all host machines, and what applications are hosted where
    1. Using the database schema from Component & Deployment Database - LCLSControls - SLAC Confluence (stanford.edu).
  2. Ansible for deployment automation, system configuration, rollback to previous versions
  3. Github code reviews for pull requests / official tags
  4. CLI to initiate deployment 
  5. End goal: minimize the manual steps a user performs during deployment. Ideally, once a user has added deployment information to the database, they just run 'bs run deployment' for their component and the rest is done automatically. We will predefine common playbooks (with arguments), with the intention that 90%+ of users can use them as-is; a user can define their own deployment playbook/script as a last resort.
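    A rough sketch of that end state (illustrative only; 'bs run deployment' is the CLI shown later on this page, and registering deployment information uses whatever mechanism the deployment database ends up providing):

      # one-time: add the component's deployment information to the deployment database
      # per release: one command drives the predefined playbook for that component
      $ bs run deployment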

Types of Deployment

Developer/Test Deployment

Note - We may need three deployment targets: dev, test, and prod. If possible, a lightweight override would be helpful for a temporary test location, possibly using a local screeniocs component.

How deployment works currently (described for a brand-new app; for an existing app it is the same steps minus a few): 

  1. Resources:
    1. CRAM migration guide (stanford.edu)
    2. Deploying Software with cram (and eco) - Ng, Alexander - SLAC Confluence (stanford.edu)
    3. https://confluence.slac.stanford.edu/display/~ktkim/How+to+use+cram+and+eco+for+the+software+release+across+mutiple+sites+or+facilities

  2. Depending on the type of software, you will deploy it to a certain location.
  3. Note - There are different conventions for the different types of software, such as the name of the softlink and what it points to.
    1. SIOC/HIOC
      1. Initial deploy readiness procedure
        1. (This step wouldn't be part of deployment) Create your repo, initialize with git
        2. Add your project to eco

          $ git clone /afs/slac/g/cd/swe/git/repos/slac/buildtools/eco_modulelist.git
          $ cd eco_modulelist
          $ vi modulelist.txt
          # Add your line
          <project-name>        <path-to-project-repo-.git>
          # Push to eco repo
          $ git add modulelist.txt
          $ git commit -m "Added <project-name> to eco."
          $ git push
          # Pull from eco repo remote
          $ cd $TOOLS/eco_modulelist
          $ git pull
        3. Add your project to cram

          $ cd <project-name>/<project-name>-git
          $ cram describe                             # This should open a prompt to enter the project name and type. Then will create .cram information
          # Commit the .cram
          $ git add .cram
          $ git commit -m "Add cram functionality"
          $ git push
          # Can test with
          $ cram ls
        4. Add your component to $EPICS_IOC_TOP (/afs/slac/g/lcls/epics/iocTop)

          $ cd $EPICS_IOC_TOP
          # Make a directory for your app if it doesn't exist
          $ mkdir <app> && cd <app>
          # Create a symlink named 'current' pointing to the tagged release (in this case the tag directory doesn't exist yet)
          # TODO: not sure if this part works - can't find docs for a brand-new app
          $ ln -s <app>-<tag> current   # Make a softlink to the folder
        5. Add your IOC to $IOC (/afs/slac/g/lcls/epics/iocCommon)

          # Create the directory
          $ cd $IOC && mkdir <ioc>
          # Create symlink to point to the symlink you made before
          $ cd <ioc> && ln -s ../../iocTop/<app>/current iocSpecificRelease
          # Add in the startup.cmd
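          # (Sketch only - startup.cmd contents are site- and app-specific; follow your
          #  facility's existing conventions. Typically it changes into the release's
          #  iocBoot/<ioc> directory and runs st.cmd.)
          $ vi startup.cmd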
        6. Update screeniocs with your ioc entry
      2. Deployment procedure
        1. After finishing changes to the component, tag it (e.g., R1.0.1) and push the tag

          $ git tag <tag name>                                    # Create a tag. Your tag should follow the RX.Y.Z convention.
          $ git push origin <tag name>                            # Push a tag to a shared server.
          $ git push origin --tags                                # If you have a lot of tags that you want to push up at once. This will push all of your tags to the remote server that are not already there.
        2. Then clone the tag using eco, cd into that tagged version, and run make
        3. Now it is ready to push to the deployment locations in all facilities using cram push

          [mshankar@lcls-dev2 MatlabSupport-R3-2-0-0]$ cram push
          Pushed release MatlabSupport-R3-2-0-0 to LCLS
          ...
        4. Then update the symbolic link(s) using cram upgrade -f <facility> <release-name>; current will now point to your new tag, and iocSpecificRelease always points to current (on prod, at least).
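          For example, continuing the cram push output above (the release and facility names are illustrative):

          $ cram upgrade -f LCLS MatlabSupport-R3-2-0-0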
    2. HLA/Matlab/Tools
    3. PYDM
  4. Plan to automate these deployment steps
    1. TODO for IOC deployment (add to Jira once done):
      1. done - Switch this development over to S3DF and use /scratch, so we don't interfere with real directories. Also make a bashrc with the necessary environment variables, ideally with variables that depend on others, e.g. if on S3DF $OS-ENV=/scratch, then $IOC=$OS-ENV/slac/g/lcls/epics/iocCommon.
      2. In the CLI for deployment, have it call a script that sets up environment variables (for deployment locations, etc.) depending on the OS/filesystem, e.g. S3DF vs. AFS. The logic: only set the variables if they don't already exist (see the sketch below).
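        A minimal sketch of the idea in items 1 and 2 (the variable name ADBS_PREFIX is a placeholder, and the paths are taken from examples elsewhere on this page; the real script would cover more variables and environments):

          # Set deployment-location variables only if they are not already defined,
          # so an existing environment is not clobbered.
          if [ -d /sdf ]; then
            export ADBS_PREFIX="${ADBS_PREFIX:-/sdf/scratch/ad/build}"   # S3DF scratch area
          else
            export ADBS_PREFIX="${ADBS_PREFIX:-/afs/slac/g}"             # AFS hosts
          fi
          export EPICS_IOC_TOP="${EPICS_IOC_TOP:-$ADBS_PREFIX/lcls/epics/iocTop}"
          export IOC="${IOC:-$ADBS_PREFIX/lcls/epics/iocCommon}"
          export IOC_DATA="${IOC_DATA:-$ADBS_PREFIX/lcls/epics/ioc/data}"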
      3. Replace eco with 'bs checkout'; it'll git clone and prompt the user for the same environment variables as eco.
      4. Emulate the entirety of cram in playbook(s).
      5. For regular deployment: start a build, backend logic copies the build results to a known location, then deploys the build results into the correct locations for each facility.
        1. For deployment, 'bs deploy' (ad-build) will do the manual steps; the arguments will just be the <tag> and <facility>.
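          An illustrative invocation (the tag and facility values come from examples earlier on this page; the final argument format may differ):

          $ bs deploy R1.0.1 LCLS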
      6. Current example of it running (with adbs_playbooks_dir hardcoded in run_commands.py):
        1. (adbs-env) [pnispero@sdfiana018 test-ioc]$ bs run deployment
          Checking current directory if a component...
          INFO-root:[run_commands.py:36 - parse_manifest() ] {'format': 1, 'repo': 'test-ioc', 'organization': 'ad-build-test', 'build': 'build.sh', 'deploy': 'test-deployment.yaml', 'environments': ['rocky9', 'rhel7'], 'dependencies': [{'epics-base': 'R7.0.3.1-1.0'}, {'asyn': 'R4.39-1.0.1'}], 'python': 'requirements.txt'}
          == ADBS == At the moment, deployment only for IOCs is supported
          == ADBS ==
          
           ****** if testing please source BuildSystem/other/test-env.bash ******
          
          [?] Specify type of deployment:
           > DEV
             PROD
          
          [?] Specify type of ioc:
           > SIOC
             HIOC
             VIOC
          
          [?] Initial deployment?:
           > True
             False
          
          [?] Specify name of ioc to deploy: sioc-test-bs
          [?] Specify host user account used to run screen
          (ex: laci@lcls-dev1): adbuild
          [?] Specify executable path
          (ex:/afs/slac/g/lcls/epics/iocCommon/sioc-sys0-al02/iocSpecificRelease/bin/rhel7-x86_64/alhPV): /sdf/home/p/pnispero/tes
          t-ioc
          {'initial': 'True', 'component_name': 'test-ioc', 'deploy_type': 'DEV', 'user': 'pnispero', 'iocTop': '/sdf/scratch/ad/build/lcls/epics/iocTop', 'iocCommon': '/sdf/scratch/ad/build/lcls/epics/iocCommon', 'iocData': '/sdf/scratch/ad/build/lcls/epics/ioc/data', 'ioc_type': 'SIOC', 'ioc_name': 'sioc-test-bs', 'host_user': 'adbuild', 'server_user_node_port': 'None', 'executable_path': '/sdf/home/p/pnispero/test-ioc', 'output_path': '/sdf/home/p/pnispero/test-ioc/ADBS_TMP'}
          [WARNING]: No inventory was parsed, only implicit localhost is available
          [WARNING]: provided hosts list is empty, only localhost is available. Note that
          the implicit localhost does not match 'all'
          
          PLAY [localhost] ***************************************************************
          
          TASK [Gathering Facts] *********************************************************
          ok: [localhost]
          
          PLAY [Initial IOC Deployment] **************************************************
          
          TASK [Gathering Facts] *********************************************************
          ok: [localhost]
          
          TASK [Add component to $EPICS_IOC_TOP] *****************************************
          ok: [localhost]
          
          TASK [Create sioc directory in $IOC if it doesn't exist] ***********************
          changed: [localhost]
          
          TASK [Create symbolic link for iocSpecificRelease for DEV] *********************
          changed: [localhost]
          
          TASK [Create symbolic link for iocSpecificRelease for PROD] ********************
          skipping: [localhost]
          
          TASK [Create sioc directory in $IOC_DATA if it doesn't exist] ******************
          changed: [localhost]
          
          TASK [Create multiple directories in $IOC_DATA/sioc-test-bs if they doesn't exist] ***
          changed: [localhost] => (item=/sdf/scratch/ad/build/lcls/epics/ioc/data/sioc-test-bs/archive)
          changed: [localhost] => (item=/sdf/scratch/ad/build/lcls/epics/ioc/data/sioc-test-bs/autosave)
          changed: [localhost] => (item=/sdf/scratch/ad/build/lcls/epics/ioc/data/sioc-test-bs/autosave-req)
          changed: [localhost] => (item=/sdf/scratch/ad/build/lcls/epics/ioc/data/sioc-test-bs/iocInfo)
          changed: [localhost] => (item=/sdf/scratch/ad/build/lcls/epics/ioc/data/sioc-test-bs/restore)
          changed: [localhost] => (item=/sdf/scratch/ad/build/lcls/epics/ioc/data/sioc-test-bs/yaml)
          
          PLAY RECAP *********************************************************************
          localhost                  : ok=7    changed=4    unreachable=0    failed=0    skipped=1    rescued=0    ignored=0
          successful: 0
          (adbs-env) [pnispero@sdfiana018 test-ioc]$

          Update:

          Wanted to highlight the differences between deploying with CRAM vs. the new bs (build system) deployment:

          1. bs only pushes the app's build results (what's needed for the app to run), whereas CRAM pushes the entire app, saving some space.
          2. bs uses tarballs/RPMs to install build results, whereas CRAM rsyncs the entire directory.
          3. bs combines cram push and cram upgrade into one command.
            • But it doesn't lose the ability to individually choose releases for an IOC, since Ansible is idempotent.
          4. bs has additional logic for new IOCs to automatically create the required directories, symlinks, and st.cmd (tentative) needed for an IOC to run. These things are traditionally done manually or with scripts from Ken Brobeck.
          5. bs will ensure a tag is code reviewed before deploying to production. This is new.
          6. bs will be more user-friendly and provide more useful output to the user deploying (like what exactly is deployed and where).

          I have a working prototype of the deployment (Build and Deploy build results for IOC app · Issue #7 · ad-build-test/BuildSystem (github.com)), and I've almost finished points 1-4. Among other things, the deployment aspect of this new system consolidates all cram and manual steps into just one command.
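          To make point 2 above concrete, the difference in mechanism is roughly the following (paths, directory names, and the tar-based packaging shown are illustrative, not the actual implementation):

            # CRAM-style: rsync the entire application directory into the release area
            $ rsync -a MatlabSupport-R3-2-0-0/ $EPICS_IOC_TOP/MatlabSupport/MatlabSupport-R3-2-0-0/

            # bs-style: package only the build results, then unpack them at the release area
            $ tar -czf MatlabSupport-R3-2-0-0.tar.gz bin/ db/ dbd/ iocBoot/
            $ mkdir -p $EPICS_IOC_TOP/MatlabSupport/MatlabSupport-R3-2-0-0
            $ tar -xzf MatlabSupport-R3-2-0-0.tar.gz -C $EPICS_IOC_TOP/MatlabSupport/MatlabSupport-R3-2-0-0/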

What - When a developer needs to host their app on a certain test stand(s) so they can test functionality. 
When - Any time a developer wants to test, or automatically on pushes to the main branch / pull requests.
How - Options:

  1. Through the CLI, using 'bs run deployment'; once the app is confirmed to have deployed successfully, any automated tests are automatically run.

Production/Official Deployment

What - When an app has been updated for a bug fix / feature, code reviewed, tested through the build system, and scheduled in PAMM, it is ready to deploy to production.
When - When an app has been code reviewed, tested, and scheduled for release. 
How - Options:

  1. Through the CLI, using 'bs run deployment prod'. The new version is deployed without replacing the current app, because some tests will be run while the app is in production. If they pass, the latest tag becomes the official tag and is officially deployed and recorded in the deployment database. 
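    Illustrative invocation only (the exact subcommand and arguments are not final):

      $ bs run deployment prod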


Three-way meeting with Jerry, Claudio, and Patrick about app approval logic

Notes:
9-6-2024 discussion of functionality wanted

  1. w
  2. Yes, that sounds good; these are some sample use cases we can cover:
    Use cases:
    1. If a pull request is created, the backend triggers a build/test and reports back in the comments if successful.
    2. If a reviewer requests changes, the developer updates the pull request, and the backend triggers the build/test again and reports back in the comments.
    3. If a pull request is approved, merge onto main and the backend triggers the build/test on main.
    What do you think?
  3. Type of logic, and where we do it.
  4. For PRs, whether we want the backend to be monitoring the repo.
  5. What is the workflow? Backend to CLI user, or to the repo?
    1. Typical case: the user requests a build from the backend (remote).
    2. But have the option for a local build.
    3. Nightly builds, where a robot will request a build on its own.
    4. Should the CLI or the backend watch the pull request?
    5. As a user, if I use the CLI to request a new branch, the CLI creates the branch and calls the backend to add that branch to the database; then the backend can monitor the pull request.
    6. If the user wants to fix something, they create a branch with the CLI.
    7. Get the workflow going from user, to repo pull request, to merge.
    8. We want some extra variables for approval behavior, like who approves and what's required.
    9. Can put the approval test rules in the backend, where the backend can read the test output in e.g. JSON format.
    10. Can use backend or GitHub review.
    11. Example: when a user starts a remote build with the CLI, they can also pass in the approval rule, like 80% of tests pass, etc.
  6. But we want to have a small menu of possible rules, and each component will choose among them.
  7. Another workflow: trusted committers - have access to a backdoor to pass their own code review.
  8. In the database or manifest, record who/what the approval rules are.
  9. TODO: send Claudio the documentation we create for the exact workflow of how this is going to work.
  10. Can the backend support GitHub issues and Jira issues, to add comments to GitHub/Jira? It is possible.
  11. Unlike traditional CI/CD, we don't want to deploy until the PAMM starts; deployment to production should always be started by hand.
  12. Have CATER and GitHub issues cross-reference each other.
  13. Want a feature to put in a gate: deployment is only allowed during the time specified in the CATER for the issue.
  14. New CATER is not a problem, but won't be rolled out until January/February.
  15. We also want a workflow where, if someone in ACR creates a CATER, ...