...

  • Overview of SAS-related activities
    • scope of support
    • already known areas of shortfall
  • Mission Planning tagup (Date/Time: Monday Feb 6, 10AM - 12noon. Location: TBD).
    • optimization of ongoing transition
    • walkthrough/demo of weekly planning
    • review of documentation
  • ISOC software
  • Code development infrastructure
    • python distributions
    • repository
    • release management 
    • code distribution
  • Science Tools
    • external packages
    • fermiPy support
  • L1 processing & halfpipe 
    • operational support
    • system tests, new version validation and approval
    • reprocessing 
  • VMs, containers
    • RHEL6 is end of road for GR
    • automatic creation of VMs (do we need containers too?)
  • Data Issues
    • change datasets delivered to FSSC?
    • are data servers at SLAC still needed?
    • change of storage model at SLAC?
  • GlastRelease futures
    • GR code development
    • any issues on long term support of ROOT files etc?
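Since RHEL6 is the end of the road for GlastRelease, one option worth discussing is freezing the current build environment in a container image. A minimal sketch only: the base image, package list, and install path below are assumptions for illustration, not the actual GlastRelease setup.

```dockerfile
# Illustrative only: pin a RHEL6-era userland so legacy GlastRelease
# binaries keep running after the host OS moves on.
FROM centos:6

# Placeholder toolchain packages; the real build dependencies would
# come from the GlastRelease build documentation.
RUN yum install -y gcc gcc-c++ make tar

# Copy (or bind-mount at run time) the existing installation.
COPY GlastRelease/ /opt/GlastRelease/
ENV PATH=/opt/GlastRelease/bin:$PATH

CMD ["/bin/bash"]
```

This also bears on the "do we need containers too?" question for automatic VM creation: a container image is much cheaper to build and distribute automatically than a full VM, at the cost of sharing the host kernel.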

...

  • ISOC/Ops/Planning software
    • Robin, Jerry and Elizabeth will be at SLAC and can hopefully resolve some of the connectivity issues we’re having
    • Face-to-face training for planning process
    • Decide on a timeline to perform shadow operations
    • Discuss leap second process since no shadowing occurred this time
  • SAS development
    • Incorporating more python
    • Removing external packages
    • Repositories, issue tracking, release managers, etc.
  • Data retention
    • Discuss sending critical data sets (like electron data) to GSFC for long-term retention
    • Future reprocessing
    • Potential change of storage model at SLAC (more tape reliant)
  • L1 Pipeline
    • Monitoring & issue resolution
    • Test process as updates are required
  • Misc items
    • All systems - update for new OSs? Or maintain current on VMs?
    • Data server - only at GSFC?
    • GLEAM (aka reconstruction chain)
  • System Tests
    • Investigating conda and conda-forge as a distribution channel for the STs
    • Using Docker for software distribution
    • Universal Linux binaries for Science Tools (i.e. distribution agnostic)
    • Cloud-based CI services (Travis, CircleCI) for compiling and/or testing
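For the conda/conda-forge item, distribution could come down to a single environment file that users install from. A hedged sketch only: the environment name, channel, and package names below are assumptions and would need to be confirmed against the actual recipes.

```yaml
# environment.yml - illustrative; package and channel names are assumed
name: fermi-st
channels:
  - conda-forge
dependencies:
  - python=2.7
  - fermipy     # assumes a conda recipe for fermiPy exists/is maintained
  - root        # assumes ROOT is available from the chosen channel
```

A user would then run `conda env create -f environment.yml` to get a working setup. This approach also dovetails with the "universal Linux binaries" goal, since conda packages carry their own dependencies and are largely distribution-agnostic.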

Meeting process:

- Should have lots of time for breakouts. Spread topics across multiple days. Working meeting. Not lots of presentations.
- Use Slack channel for discussion. Make a new channel. Name?
- List people being lost and their tasks, people still available. Map tasks to names as appropriate.
- Coordinate the meeting through the existing DRSC list. Seems to hold most of the important members. Add as needed. Or software mailing list.

...