Design, build, test, and maintain robust software tools for high-throughput analysis pipelines, initially focused on ALS and ALS-U beamlines for Scanning Transmission X-ray Microscopy (STXM) and Resonant Inelastic X-ray Scattering (RIXS).
Work closely with scientists and technical staff to capture requirements for new software tools and computing infrastructure. Engage in regular testing and feedback sessions to refine and validate tools in operational settings.
Ensure the reliable performance of computational workflows through modern unit and integration testing (a minimal example test appears after this list of responsibilities).
Collaborate with other Scientific User Facilities by contributing to shared open-source software projects, including code development, peer review, and ongoing communication to synchronize efforts across facilities.
Design workflows for deployment across diverse computational environments, including local setups, high-performance computing (HPC) clusters, and cloud platforms.
Document technical development comprehensively, including writing detailed code comments, tracking issues, reviewing code, and drafting design and architecture specifications.
Communicate the impact and results of technical projects both internally and externally, including publishing results in peer-reviewed journals, presenting findings at workshops and conferences, and creating end-user documentation and tutorials.
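As a hedged illustration of the testing practice mentioned above, here is a minimal pytest-style unit test for a single pipeline step. The `normalize` function and the choice of pytest and NumPy are assumptions for illustration only, not a prescribed implementation:

```python
import numpy as np
import pytest

def normalize(frame: np.ndarray) -> np.ndarray:
    """Hypothetical pipeline step: rescale a detector frame to [0, 1]."""
    span = frame.max() - frame.min()
    return (frame - frame.min()) / span

def test_normalize_scales_to_unit_range():
    frame = np.array([0.0, 5.0, 10.0])
    result = normalize(frame)
    # the rescaled frame should span exactly [0, 1]
    assert result.min() == pytest.approx(0.0)
    assert result.max() == pytest.approx(1.0)
```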
Requirements & Skills:
Bachelor’s degree and a minimum of 5 years of related experience; Master’s degree and a minimum of 3 years of related work experience; an advanced degree without experience; or equivalent work experience
Ability to work collaboratively with a diverse team of scientists and engineers
At least 3 years of development experience with Python
Experience using the open-source scientific Python software stack
Experience contributing to a collaborative software project, including co-developing an internal project or contributing to community-based open-source software
Experience creating data analysis methods and procedures
Experience in data acquisition and analysis at a synchrotron light source, neutron source, or other major scientific user facility
Experience creating data analysis methods and procedures that scale to high data volumes
Experience creating user interfaces or interactive dashboards
Familiarity with widely used AI/ML libraries such as scikit-learn, PyTorch, and TensorFlow (a brief scikit-learn sketch appears after this list)
Experience with one or more container deployment systems such as Podman, Docker, or Kubernetes
Experience with workflow orchestration systems such as Prefect, Airflow, or Globus Flows (a brief Prefect sketch appears after this list)
Experience with source control and related team software tools and processes
Experience with configuring and maintaining GitHub Actions
Experience with profiling tools and scaling analysis to identify bottlenecks and ideal compute configurations (a brief profiling sketch appears below)
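As a hedged illustration of the AI/ML library familiarity listed above, a minimal scikit-learn sketch; the clustering task and the stand-in spectra are assumptions for illustration only:

```python
import numpy as np
from sklearn.cluster import KMeans

# stand-in data: 100 hypothetical 16-channel spectra
rng = np.random.default_rng(0)
spectra = rng.random((100, 16))

# group the spectra into three clusters
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(spectra)
print(labels[:10])
```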
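Similarly, a minimal sketch of the workflow orchestration item, assuming Prefect 2; `reduce_frame` is a hypothetical stand-in for a real reduction step:

```python
from prefect import flow, task

@task(retries=2)
def reduce_frame(frame_id: int) -> float:
    # stand-in for a real step that reads and processes detector data
    return float(frame_id) ** 0.5

@flow
def reduction_flow(n_frames: int = 4) -> list[float]:
    # run one task per frame; Prefect tracks state and retries failures
    return [reduce_frame(i) for i in range(n_frames)]

if __name__ == "__main__":
    reduction_flow()
```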
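Finally, a minimal profiling sketch using only the standard-library cProfile and pstats modules; `analysis_step` is a stand-in workload, not a real pipeline stage:

```python
import cProfile
import pstats

def analysis_step() -> int:
    # stand-in workload; a real pipeline stage would go here
    return sum(i * i for i in range(1_000_000))

cProfile.run("analysis_step()", "profile.out")
stats = pstats.Stats("profile.out")
# show the ten call sites with the largest cumulative time
stats.sort_stats("cumulative").print_stats(10)
```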