GlueX Singularity Container Notes

OSG experience

  • Ran with Richard’s osg-container.sh
  • Invoked using Thomas’s MCwrapper
  • Jobs fail for lack of RCDB access

Singularity work

  • singularity ext3 gluex image
    • start from Docker, centos:latest
    • initial build into “sandbox” (standard directory tree)
    • do package additions via build_scripts
    • convert from sandbox to ext3 (see the command sketch after this list)
    • container size: 1.1 GB
  • complete build of the GlueX stack done with the container, but with the software installed outside the container
    • built with version_2.26.xml (most recent is 2.27)
    • starts at 28 GB
    • after trimming: 8.4 GB with everything (below, sizes in kB)
      • 4914204 sim-recon
      • 1613104 geant4
      • 1146088 root
      • 383184 hdgeant4
      • 189660 jana
      • 152736 cernlib
      • 105552 lapack
      • 65732 ccdb
      • 58660 xerces-c
      • 58376 rcdb
      • 10268 sqlitecpp
      • 6988 hdds
      • 4344 evio
      • 3460 amptools
      • 2924 gluex_root_analysis
      • 1728 hd_utilities
      • 428 build_scripts-latest
      • 48 latest.tar.gz
      • 4 version.xml
      • 4 setup.sh
      • 4 setup.csh
      • 0 build_scripts
  • questions:
    • how to put on oasis?
      • proposal: use build_scripts directory structure
    • will it run in existing container?
      • likely yes
    • what to do about CCDB, RCDB, resources?
      • proposal: reproduce /group/halld directory structure
        • can update in an rsync-like manner (see the rsync sketch after this list)
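
A minimal sketch of the sandbox-to-ext3 sequence and the external stack build described above, assuming Singularity 2.4-style commands; the bind path and the build_scripts invocation are illustrative assumptions, not commands recorded in these notes:

# 1) initial build from the Docker centos:latest image into a sandbox directory
sudo singularity build --sandbox centos7 docker://centos:latest
# 2) package additions in the writable sandbox (a concrete yum example is given
#    under Singularity Notes below)
# 3) build the GlueX stack on the host filesystem, using the toolchain inside the
#    container; gluex_top lives outside the container, and the exact build_scripts
#    command here is an assumption
singularity exec --bind /path/to/gluex_top:/gluex_top centos7 \
    bash -c "cd /gluex_top && make -f build_scripts/Makefile_all"
# 4) convert the finished sandbox to an ext3 image (run as root)
sudo singularity build gluex_centos7.img centos7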
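
For the CCDB/RCDB/resources question, a sketch of the rsync-like update proposed above; the host name and both paths are placeholders, not an agreed recipe:

# mirror a /group/halld-style tree (CCDB/RCDB SQLite files, resources, etc.) to
# storage visible from the jobs; host and paths are placeholders
rsync -av --delete some.host.jlab.org:/group/halld/ /path/to/local/group/halld/
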
Advertisements

Singularity Notes

  • need to add the which and sqlite-devel packages on CentOS 7, in particular so that bash has a which command (see the one-line sketch after this list)
  • converted from sandbox to ext3:
    • sudo /usr/local/bin/singularity build gluex_centos7.img centos7
    • centos7 is the directory with the sandbox image
    • gluex_centos7.img is the output ext3 image
    • must be run as root
    • tested the ext3 image with the b1pi test, 10 events, with both MySQL and SQLite for RCDB
  • convention seems to be
    • ext3: *.img
    • sandbox: * (directory name, no extension)
    • squashfs: *.simg
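
The package additions mentioned at the top of this list can be made in the writable sandbox before conversion; a one-line sketch:

sudo singularity exec --writable centos7 yum install -y which sqlite-devel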

Commands used to trim the sandbox (the before/after sizes are given below):

# remove per-platform build trees, sources, include directories, and documentation
find gluex_centos7 -type d \( -name ".Linux*" -o -name include -o -name src \
    -o -name build -o -name build_dir -o -name tmp -o -name TESTING \
    -o -name SRC -o -name doc -o -name documentation \) -exec rm -rfv {} \;
# remove static libraries and object files
find . -name \*.a -exec rm -fv {} \;
find . -name \*.o -exec rm -fv {} \;

Needed changes:

  • the ROOT include files are needed and should not be trimmed away
  • *.tgz, *.tar.gz, and *.d files can go (see the sketch below)
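
One way to do the additional cleanup suggested above, in the same find/rm style as before; a sketch restricted to the gluex_top tree, not yet reflected in the sizes quoted below:

find gluex_centos7/gluex_top -type f \
    \( -name '*.tgz' -o -name '*.tar.gz' -o -name '*.d' \) -exec rm -fv {} \;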

Original sandbox: 27 GB
After deletion as above: 8.5 GB

Here is what is left (cumulative sizes in kB):

971192 gluex_centos7_no_includes/gluex_top/root/root-6.08.06
1123660 gluex_centos7_no_includes/gluex_top/root
1506160 gluex_centos7_no_includes/gluex_top/geant4/geant4.10.02.p02
1537644 gluex_centos7_no_includes/gluex_top/geant4
4344068 gluex_centos7_no_includes/gluex_top/sim-recon/sim-recon-2.22.0/Linux_CentOS7-x86_64-gcc4.8.5/bin
4903456 gluex_centos7_no_includes/gluex_top/sim-recon/sim-recon-2.22.0/Linux_CentOS7-x86_64-gcc4.8.5
4903488 gluex_centos7_no_includes/gluex_top/sim-recon/sim-recon-2.22.0
4909180 gluex_centos7_no_includes/gluex_top/sim-recon
8503780 gluex_centos7_no_includes/
8503780 gluex_centos7_no_includes/gluex_top
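
For reference, a listing like the one above can be produced with a command of this form (an assumption; the command actually used is not recorded in these notes):

du -k gluex_centos7_no_includes/ | sort -n | tail -10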

GlueX Meeting Report

    1. Improved parameters for track matching to the FCAL. See New sim-recon release: version 2.23.0
    2. Disk space under /work increased from 66 TB to 110 TB
    3. RCDB reads of SQLite files restored; a new library, SQLiteCpp, was introduced. See New releases: build_scripts 1.26, rcdb 0.03, sqlitecpp 2.2.0, sim-recon 2.26.0, hdgeant4 1.6.0
    4. GlueX reconstruction benchmarked on Cori I & II. See GlueX + NERSC
    5. The Not-the-TOF Anomaly in Monitoring Histograms is being worked on; it may force a re-do of the reconstruction of the Spring 2017 data.
    6. Several of us have started meeting to discuss use of containers (Docker, Singularity) in various computing contexts (NERSC, OSG, JLab farm, personal laptops). All welcome.