This event is part of the Laser-Plasma Accelerator Seminars. Click here for more information, including data protection.
Note: Registration for remote participation is now open. Click here to register. To register for hands-on tutorials, click here (limited to on-site participants).
We are proud to announce that the 2nd LPA Special Workshop, with its focus on "Intelligent Systems", will be held at the Department of Physics of the University of Oxford from January 13 to 16, 2025. The event takes place in conjunction with the THRILL workshop on adaptive optics integration.
The workshop aims to cover the full spectrum of automation and intelligent control in high-power laser and laser plasma acceleration (LPA) facilities.
Our scope encompasses both hierarchical viewpoints - including facility-level, experiment, and laser automation - and conceptual aspects such as advanced diagnostics, storage and extended data access, control system interfaces, as well as feedback and tuning. The workshop builds on the 2022 event.
Participants can look forward to a diverse program featuring plenary talks, invited contributions, interactive discussion sessions, and informative tutorials. Additionally, attendees will have the opportunity to present their work through poster contributions (click here to submit your abstract). We welcome submissions related to all of the above topics, including but not limited to:
- Bayesian optimization for LPA parameter tuning
- Machine learning approaches for real-time diagnostics
- Adaptive feedback control systems for laser stability
- Automated alignment and beam steering techniques
- Data-driven modeling of plasma acceleration processes
- AI-assisted simulations and diagnostics
- Intelligent fault detection and predictive maintenance in LPA facilities
A limited number of submissions may be upgraded to contributed talks, depending on availability. You can actively contribute to the conference organization via early registration (before October 1). By sharing your interests at registration, you will help us tailor the workshop program to better suit the community's needs. For those interested, an optional visit to the Extreme Photonics Applications Centre at STFC’s Harwell Campus will be arranged on Friday, January 17.
We are pleased to offer free admission to the workshop. We extend our gratitude to the Department of Physics at Oxford, the Central Laser Facility at STFC, AI for Realistic Science, and the THRILL project for their support.
For questions, please contact Ms. Ivy Nandongwa (ivy.nandongwa@stfc.ac.uk).
...
The EU-funded THRILL project (Technology for High-Repetition-Rate Intense Laser Laboratories) [1] gathers the forces of several institutions within a consortium to develop technologies that will enable the operation of high-energy lasers at increased repetition rates. The overall goal of the project is to identify the most appropriate architecture of next-generation high-energy-laser systems to be used in combination with the large-scale European research facilities Eu-XFEL and FAIR.
A central aspect of THRILL is the mitigation of beam quality deterioration arising from thermal loading, from increased system complexity due to the cooling strategy, or both. For this aspect, active beam control techniques will need to be employed in future systems. Within THRILL, a particular emphasis is put on the active control architecture and its management, where some level of intelligent automation will help improve the overall laser system's resilience against perturbations.
Here we will give an introduction and an overview of the THRILL project, the consortium and the current status.
[1] https://www.thrill-project.eu/
The field structure of ultra-intense laser pulses plays a critical role in their applications. Spatiotemporal couplings (STCs), though often undetectable by traditional diagnostics, can significantly impact such pulses. Undesirable STCs may reduce the peak focused intensity, while deliberate STCs, such as those in the "flying focus," underpin advancements in structured light. Furthermore, as evidenced by a growing body of simulation-based literature, the polarisation state of the pulse can also add a valuable dimension of control and flexibility. A key barrier to the experimental realisation of these techniques for structuring ultra-intense light is the fact that no method currently exists to characterise the spatiotemporal vector field of individual ultra-intense pulses.
This challenge stems from the mismatch between the high dimensionality of the vector field and the two-dimensional nature of standard measurement devices such as CMOS sensors. Consequently, existing techniques rely on taking multiple laser shots, making them time-consuming and blind to shot-to-shot fluctuations. Here we present the development and realisation of a robust method for the single-shot characterisation of the spatiotemporal vector field, which also provides uncertainty estimates. Its efficacy is demonstrated by characterising the ATLAS 3000 PW laser and vectorial pulses such as circularly polarised optical vortices. This technique holds promise for advancing structured light applications in ultra-intense laser physics.
In this talk, we discuss a deep learning approach that uses messy data to learn the mapping from experimental parameters to electron spectra. Many laser facilities, e.g. ZEUS at the University of Michigan, have pre-existing operational procedures that produce "real-world" datasets in which data are recorded manually, with assumptions and omissions. These do not necessarily provide clean, structured data suitable for machine learning, and often the first step is to "clean" these data. However, data cleaning is often a bespoke procedure that can be cumbersome.
We circumvent (a significant part of the) data cleaning step by using open-source pre-trained Large Language Models in our deep learning model pipeline. The LLM can provide embeddings which effectively translate natural language to a semantically informed numerical representation. To map the numerical representation of the experimental parameters to observed electron spectra, we train convolutional neural networks. This results in a deep learning model that is a combination of a pre-trained LLM that feeds into an untrained convolutional neural network. Early training results are promising and suggest that this approach can be an effective method by which to work with experimental parameters that are challenging to represent numerically in a structured fashion. Because we put the pre-trained LLM directly into the model pipeline rather than calling a web-API, we can also utilize gradients acquired from automatic differentiation to perform sensitivity analysis as well as gradient-based optimization.
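For illustration, a minimal sketch of this kind of pipeline is given below, assuming a small frozen sentence-embedding model feeding a 1D convolutional network; the model name, layer sizes, spectrum binning, and shot descriptions are placeholders and do not reflect the actual architecture trained on the ZEUS data.

```python
# Hypothetical sketch: frozen pre-trained LLM embeddings of free-text shot metadata
# feeding a small 1D CNN that predicts an electron spectrum (illustrative only).
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class SpectrumFromText(nn.Module):
    def __init__(self, llm_name="sentence-transformers/all-MiniLM-L6-v2",
                 n_spectrum_bins=256):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(llm_name)
        self.llm = AutoModel.from_pretrained(llm_name)
        for p in self.llm.parameters():      # keep the pre-trained LLM frozen
            p.requires_grad = False
        d = self.llm.config.hidden_size
        # project the text embedding to a coarse spectrum, then refine with conv layers
        self.proj = nn.Linear(d, n_spectrum_bins)
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 1, kernel_size=1),
        )

    def forward(self, shot_descriptions):
        tok = self.tokenizer(shot_descriptions, padding=True, return_tensors="pt")
        emb = self.llm(**tok).last_hidden_state.mean(dim=1)  # one vector per shot
        x = self.proj(emb).unsqueeze(1)                       # (batch, 1, bins)
        return self.cnn(x).squeeze(1)                         # predicted spectrum

model = SpectrumFromText()
spectra = model(["gas jet 30 bar, nozzle 5 mm, no notes",
                 "backing pressure lowered, beam slightly clipped"])
# Only the projection and CNN layers are trained; the LLM supplies semantically
# informed embeddings of the free-text experimental parameters.
```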
The Bivoj laser system is located at the HiLASE centre near Prague, Czechia. While built around DiPOLE technology (a cryogenically cooled Yb:YAG multi-slab laser system), it has been upgraded to reach higher output energies, up to 1.5 kW of average power (150 J / 10 Hz), and the problem of so-called depolarization has been successfully addressed, reducing depolarization losses from ~30% to less than 4%. This improvement enabled efficient use of its output beam as a source for harmonic conversion stages, which demonstrated stable operation at close to 1 kW (95 J / 10 Hz) at 515 nm and 0.5 kW (50 J / 10 Hz) at 343 nm. Such a laser system could serve as a suitable pump source for the ultra-fast lasers envisioned for laser particle accelerators. Possibilities for future development of such pump lasers will be discussed.
The Centre for Advanced Laser Applications (CALA) in Munich is home to the ATLAS-3000 high power laser dedicated to research on laser-driven electron as well as ion acceleration and applications thereof.
The “Advanced Ti:Sapphire Laser” (ATLAS) is designed to deliver around 70 J of pulse energy after compression in a ~27 fs short pulse at a repetition rate of 1 Hz. At this repetition rate, the system is well suited for parameter studies and optimizations with good statistics at state-of-the-art peak intensities. In addition to a DAZZLER and three deformable mirrors along the laser chain for spatial and temporal beam control, a growing number of online diagnostics are implemented for better monitoring. Efforts are currently ongoing to install motorized mirrors for automated drift stabilization as well as to prospectively stabilize the wavefront at high frequency. Most of the mentioned hardware capabilities, including the DAZZLER, are integrated into a central control system based on the open-source Tango Controls framework [1]. This makes operations easier for the experimenter, facilitates unified data acquisition for later offline evaluation, and also enables online feedback and automated optimization, which has been demonstrated for electron performance [2] and, in first studies, for ion acceleration.
In the laser ion acceleration experiment area (LION), these laser pulses are focused to intensities exceeding $10^{21}\,\mathrm{W/cm^2}$ for ion acceleration, mainly of protons. The large target chamber allows for versatile setups, but the current workhorse is a liquid leaf target consisting of two colliding water jets positioned in the laser focus in vacuum. This makes for a highly stable, repetitive solid-density target of few-micrometer to submicron thickness. The work group also focuses on online detection methods such as a magnetic ion spectrometer with a large-area CMOS detector for online readout, as well as the I-BEAT detector (“ion-bunch energy acoustic tracing”), which measures the acoustic waves generated by ion bunches stopping in a water volume to characterize the bunch properties [3]. Again, many diagnostics are integrated into the central control system for unified data acquisition, and first studies on online optimization of both laser and experiment parameters have been conducted.
This contribution will showcase the current capabilities of the laser, the LION experiment area, and the control system with emphasis on current and future machine automation and optimization projects. This work was funded by BMBF (projects 01IS17048, 01IS24028), DFG (grants 416702141, 491853809), GSI (project GSI-LMSCH2025) and China Scholarship Council.
[1] https://www.tango-controls.org/
[2] F Irshad, et al. Pareto Optimization and Tuning of a Laser Wakefield Accelerator. Phys. Rev. Lett. 133, 085001. doi:10.1103/PhysRevLett.133.085001
[3] S Gerlach, et al. Three-dimensional acoustic monitoring of laser-accelerated protons in the focus of a pulsed-power solenoid lens. High Power Laser Science and Engineering. 2023;11:e38. doi:10.1017/hpl.2023.16
Laser-plasma acceleration (LPA) aims to accelerate particles by exploiting the large electric field that can be achieved in a plasma. This field exceeds that of RF linacs and thus promises compact alternatives to conventional accelerators.
The LPA process is highly non-linear and depends on a large number of laser and plasma parameters, which makes its optimization challenging. To exploit the full potential of these accelerators, we need to use machine learning. In this field, Bayesian optimization is well suited to finding the optimum of Particle-In-Cell (PIC) simulations, the main modeling technique for LPA.
In this presentation I will describe the properties of an LPA system that we seek to tune, for which we currently consider the density profile of the plasma rather than the properties of the laser. We used the Multi-Objective Bayesian Optimization (MOBO) approach, in which we look for the balance between energy spread and mean energy. This allows us to draw the Pareto front, which characterizes the best compromise achievable by our system.
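As a toy illustration of the multi-objective idea (not the actual setup used with the PIC simulations), the sketch below runs a ParEGO-style random-scalarization loop with a Gaussian-process surrogate on an invented two-parameter density-profile objective and extracts the non-dominated points at the end.

```python
# Toy multi-objective Bayesian optimization (random scalarization over a GP surrogate).
# The "simulation" and parameter ranges are invented stand-ins for a PIC objective.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def run_simulation(x):
    """Hypothetical stand-in for a PIC run: returns (mean energy, energy spread)."""
    ramp, plateau = x
    mean_energy = 300.0 * plateau * np.exp(-(ramp - 0.5) ** 2)          # MeV, made up
    energy_spread = 5.0 + 20.0 * plateau ** 2 + 2.0 * abs(ramp - 0.3)   # %, made up
    return mean_energy, energy_spread

bounds = np.array([[0.0, 1.0], [0.0, 1.0]])                 # normalized profile parameters
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))    # initial random design
Y = np.array([run_simulation(x) for x in X])

for it in range(25):
    # random weights scalarize the two objectives (maximize energy, minimize spread)
    w = rng.dirichlet([1.0, 1.0])
    y_scal = w[0] * (Y[:, 0] / Y[:, 0].max()) - w[1] * (Y[:, 1] / Y[:, 1].max())
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y_scal)
    # choose the next point by an upper confidence bound over random candidates
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(mu + 2.0 * sigma)]
    Y = np.vstack([Y, run_simulation(x_next)])
    X = np.vstack([X, x_next])

def pareto_mask(Y):
    """Keep points not weakly dominated by any other (higher energy, lower spread)."""
    better_energy = Y[:, 0][None, :] >= Y[:, 0][:, None]
    better_spread = Y[:, 1][None, :] <= Y[:, 1][:, None]
    return ~((better_energy & better_spread).sum(axis=1) > 1)

print("Pareto-optimal settings:\n", X[pareto_mask(Y)])
```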
The identification of prospective scenarios for observing quantum vacuum signals in high-intensity laser experiments requires both accurate theoretical predictions and the exploration of high-dimensional parameter spaces. Numerical simulations address the first requirement, while optimization provides an efficient solution for the second one. In the present work, we put forward Bayesian optimization as a new and powerful means to optimize photonic quantum vacuum signals. We demonstrate its great potential using the example of the well-studied case of two-beam collisions. Apart from providing an ideal benchmark case, this immediately gives new physics results. Namely, Bayesian optimization allows us to find the optimal waist sizes for beams with elliptic cross sections, and to identify the specific physical process leading to a discernible signal in a coherent harmonic focusing configuration.
Based on "M. Valialshchikov, F. Karbstein, D. Seipt, and M. Zepf. Numerical optimization of quantum vacuum signals. Phys. Rev. D 110, 076009".
Peking University is developing a proton radiotherapy system based on a petawatt-class laser accelerator (Compact Laser Plasma Accelerator-II, CLAPA-II). Given the ultrashort pulse duration of laser-accelerated proton beams, the dose rate per pulse can reach up to billions of grays per second. This gives laser proton radiotherapy systems unique advantages in FLASH radiotherapy for malignant tumors, which requires dose rates exceeding 40 Gy/s. However, for a medical accelerator, the requirements on the stability and controllability of the proton beam are very strict. Acting as the nervous system of the entire apparatus, the control system is crucial for ensuring reliability, stability, and overall efficiency. It must also meet both technical and medical standards, making the development process quite challenging. To this end, we have applied artificial intelligence algorithms to optimize the performance of CLAPA-II, including automatic correction of the target position, automatic optimization of beam envelopes along the beamline system, enhancement of the safety interlock functions, and accurate dose delivery in the case of unstable beam flux. Results have validated that the relevant AI algorithms are effective tools for enhancing the operational efficiency and stability of the laser accelerator.
An essential part of the high-energy laser system PHELIX is its front-end, which achieves contrast levels of about twelve orders of magnitude with respect to the amplified spontaneous emission. Similar to other high-contrast systems, its technology relies heavily on the exploitation of nonlinear effects, specifically ultrafast optical parametric amplification (uOPA), rendering the output highly susceptible to fluctuations of its input. By seeding the output of the uOPA into a subsequent high-gain regenerative amplifier, a total gain of approximately seven orders of magnitude is achieved. Therefore, precise control and stabilization of the front-end amplifiers is necessary to avoid transporting these fluctuations into the PHELIX high-energy amplifiers.
Here, I report on current measures for automated front-end stabilization and control as well as potential automation upgrades at PHELIX.
High-speed and high-resolution image capture is a fundamental component of the analysis of interactions between lasers and plasmas in laser plasma accelerator experiments. However, traditional methods of image transmission are subject to significant bandwidth and latency restrictions. In this paper, we propose an Ethernet-based Remote Direct Memory Access (RDMA) transmission system that enables the direct transfer of raw image data from the camera to a specialized network card. By offloading the packing and processing of data packets to the network card, the workload on the CPU is significantly reduced, enhancing the efficiency of data transmission. The system is thus capable of processing and transmitting large volumes of image data with enhanced efficiency. Furthermore, the paper will illustrate the enhanced performance of RDMA in the transmission of images captured by on-site experimental cameras. It will also examine the potential benefits of RDMA for data processing in laser acceleration experiments, providing new insights and technical support for future high-frequency data acquisition and processing.
Constructing an Inertial Fusion Energy (IFE) power plant faces major challenges, especially in managing high-power drive lasers subject to intense thermal loads and dynamic beam distortions. These issues degrade performance and exceed the capacity of current localized control methods, underscoring the need for integrated, systemic solutions. The Adaptive Laser Architecture Development and INtegration (ALADIN) project addresses this by shifting from localized to global beam control via Adaptive Laser Architecture (ALA). This presentation will introduce the ALADIN project, its foundational concepts, and its path toward achieving globalized beam control for IFE applications.
The complexity of modern scientific facilities, particularly cleanrooms, requires precise control over environmental parameters such as temperature and humidity to ensure experimental accuracy. These facilities, often energy-intensive, face additional challenges due to climate change and the growing demand for energy efficiency. This drives the need for a simulation framework capable of dynamic regulation and sustainable building management solutions.
This work explores the potential for developing a digital twin framework aimed at improving building management across three key scenarios: initial device configuration, real-time reconfiguration, and anomaly detection. The digital twin would serve as a virtual model that collects real-time data from physical systems to predict behavior and optimize control. In alignment with the EU FlexRICAN project, the goal is to support energy flexibility, reduce consumption, and lower CO2 emissions.
To assess the viability of this framework, we will evaluate several methods for their applicability in the targeted scenarios. Mixed-Integer Linear Programming (MILP), using OMEGAlpes software, will be explored for discrete energy modeling, while greedy algorithms will be considered for real-time system adjustments. Additionally, machine learning, including reinforcement learning, will be assessed for its adaptability in dynamic environments and effectiveness in anomaly detection. This evaluation will provide insights into the strengths and limitations of each approach in improving energy efficiency and environmental management in scientific facilities.
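To make the discrete energy-modelling idea concrete, the sketch below formulates a tiny MILP for a hypothetical cleanroom chiller schedule; it uses PuLP for brevity rather than OMEGAlpes, and all tariffs, thermal coefficients, and temperature bounds are invented for illustration.

```python
# Minimal MILP sketch of discrete energy scheduling for a cleanroom chiller.
# Uses PuLP for brevity; OMEGAlpes (mentioned above) provides a richer energy-model layer.
import pulp

hours = range(6)
price = [0.30, 0.28, 0.22, 0.20, 0.25, 0.35]   # EUR/kWh, invented tariff
heat_gain = [1.5, 1.5, 2.0, 2.5, 2.0, 1.5]     # °C/h temperature drift without cooling
cool_rate = 3.0                                 # °C/h removed when the chiller runs
power_kw = 20.0                                 # chiller electrical power

prob = pulp.LpProblem("cleanroom_hvac", pulp.LpMinimize)
on = pulp.LpVariable.dicts("chiller_on", hours, cat="Binary")
temp = pulp.LpVariable.dicts("temp", range(len(hours) + 1), lowBound=20.0, upBound=23.0)

prob += pulp.lpSum(price[t] * power_kw * on[t] for t in hours)   # minimize energy cost
prob += temp[0] == 21.0                                           # initial temperature
for t in hours:
    # simple linear thermal balance: drift up, cool down when the chiller is on
    prob += temp[t + 1] == temp[t] + heat_gain[t] - cool_rate * on[t]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("schedule:", [int(pulp.value(on[t])) for t in hours])
print("temperatures:", [round(pulp.value(temp[t]), 2) for t in range(len(hours) + 1)])
```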
One of the goals of the laser ion acceleration research at Helmholtz-Zentrum Dresden-Rossendorf, Germany, is to develop a reliable accelerator system based on a cryogenic jet target. To ensure high reliability, these jets must have high position stability. Therefore, the position fluctuations of such a cryogenic cylindrical hydrogen target are investigated. For this purpose, an evaluation algorithm for position determination is developed. It is capable of reliably and quickly evaluating the position of the jet from images from different diagnostics. With the help of this automated evaluation, the dependence of the position stability on the distance to the source was investigated. In addition, the impact of a mechanical separation system on the position stability was investigated for the first time. The presented work marks the first step towards automated and stable operation of a cryogenic hydrogen target at high repetition rates.
In past years, novel methods for generating ultralow emittance electron beams have been developed, offering compact particle sources with exceptional beam quality ideal for future high-energy physics experiments and free-electron lasers. Recent theoretical work has proposed a laser-based technique capable of resolving emittances below the 0.1 mm mrad regime by modulating the electron phase space ponderomotively [1]. This work presents the first experimental demonstration of this scheme using a laser wakefield accelerator [2]. The observed emittance and source size are consistent with published values. Additionally, it is shown through calculations that tight bounds on the upper limit for emittance and source size can be derived from the “laser-grating” method even in the presence of low signal to noise and uncertainty in laser-grating parameters.
[1] A. Seidel et al., PRAB 24, 02803 (2021)
[2] F. Salgado et al., PRAB 27, 052803 (2024)
Laser-plasma interactions generate x-ray radiation via numerous mechanisms. The x-ray spectral distribution is typically broadband and can span from keV to tens or hundreds of MeV energies with minor changes to the interaction conditions. Characterising this emission in detail provides a greater understanding of the underlying physics and paves the way to optimising these novel sources for secondary applications. To date, the most prevalent method of characterising this emission is absorption spectroscopy, wherein the x-ray radiation is passed through different metal filters to attenuate the spectrum and then into a sensitive medium to record any energy deposited. We can model the response matrix, $\Gamma$, using established interaction cross-sections or Monte-Carlo simulations to understand how single photons contribute energy to the individual detectors, leaving us with a system of linear equations of the form: $$\Gamma x = B$$ where $x$ and $B$ are column vectors of the incident spectrum and the measurements, respectively. Numerically inverting this in the case where $\Gamma$ is non-square is challenging; instead, solutions are found by assuming a spectral distribution for $x$ and performing a least-squares optimisation of a critical variable of $x$.
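As a small numerical illustration of this conventional retrieval, the sketch below forward-models a filter-stack measurement with an invented response matrix, assumes a single-temperature exponential as the parametric spectral form, and fits the one critical variable by least squares; none of the numbers correspond to a real detector.

```python
# Illustrative least-squares retrieval with an assumed spectral shape:
# forward model  B = Gamma @ x(T)  and fit the single parameter T.
import numpy as np
from scipy.optimize import least_squares

E = np.linspace(0.05, 10.0, 200)          # photon energy grid, MeV (invented)
n_channels = 12
rng = np.random.default_rng(1)

# invented, smooth response matrix standing in for filter/detector cross-sections
centers = np.linspace(0.2, 8.0, n_channels)
Gamma = np.exp(-((E[None, :] - centers[:, None]) / 1.5) ** 2)

def spectrum(T):
    """Assumed spectral shape: single-temperature exponential dN/dE ~ exp(-E/T)."""
    return np.exp(-E / T)

# synthetic "measurement" from a true temperature of 1.8 MeV, with 3% channel noise
B_meas = Gamma @ spectrum(1.8)
B_meas *= 1.0 + 0.03 * rng.standard_normal(n_channels)

# fit the critical variable T so that the modelled channel signals match the measurement
fit = least_squares(lambda p: Gamma @ spectrum(p[0]) - B_meas, x0=[1.0], bounds=(0.1, 20.0))
print(f"retrieved temperature: {fit.x[0]:.2f} MeV")
```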
In this work, we present an alternative machine learning-based approach, specifically a deep learning-based method, to perform spectral deconvolution without a priori information about the spectral distribution. The proposed approach is non-parametric, unconstrained, and, to the best of our knowledge, is the first such approach, differing from existing literature where a priori information, such as spectral shapes or their parameter-space distributions, is required. Our results show that the proposed model is suitable for deployment at the edge to handle real-time processing needs, being able to reconstruct spectra at rates exceeding 1 Hz, making it directly applicable to high-repetition-rate experimental environments. With some further work, we are hopeful it will be suitable for a range of applications in plasma physics, high-energy-density physics, and laser-driven experiments.
Applications in the field of laser ion acceleration demand stable and reproducible ion bunches with well-controlled properties. To facilitate this, most high-power laser systems allow manipulation of the laser’s physical properties, such as its temporal or focal shape, by adjusting the control parameters of specific hardware devices. However, the complex physical effects involved in the laser target interaction make it challenging to identify control parameter settings that support desired acceleration properties.
A method that can efficiently search for the optimum operation setting in large parameter spaces is Bayesian optimization (BO). This machine learning technique is specifically designed to find the optimum of unknown, multi-dimensional parameter spaces with a limited number of evaluations.
For laser-ion acceleration, BO has been proven to optimize the laser wavefront of a 12 TW laser system [1]. We will present our plans to establish such methods at our multi-PW-capable ATLAS 3000 laser system. A preliminary study will showcase the first fully automated BO of our laser-driven ion source. This optimization aims at maximizing the number of accelerated protons with energies around 14 ± 2 MeV by finding the optimal target position along the optical axis. A specifically developed software framework manages the automated operation and performs user-specific scanning and optimization routines. As a prerequisite, we also study the influence of machine control parameters on the temporal and focal shape of the laser pulse and on the accelerated ions' properties. The conducted work paves the way to a simultaneous and automated optimization of the laser's temporal and spatial properties to identify promising acceleration regimes and to optimize the ion bunch properties towards an application's specifications.
Our presentation will contain the results of this study, as well as our plans for the simultaneous optimization of the laser’s temporal and focal shape. Our work was supported by the Bundesministerium für Bildung und Forschung (BMBF) within project 01IS17048 and the Centre for Advanced Laser Applications (CALA).
[1] Loughran, B., et al. “Automated Control and Optimization of Laser-Driven Ion Acceleration.” High Power Laser Science and Engineering 11 (2023): e35. https://doi.org/10.1017/hpl.2023.23.
HELPMI is a 2-year project within the framework of the Helmholtz Metadata Collaboration, conducted by GSI, HI Jena and HZDR. The aim is to start the development of a F.A.I.R. data standard for experimental data of the entire laser-plasma (LPA) community. Such a standard does not yet exist, but it would facilitate the management and analysis of usually quite heterogeneous experimental data and logs through rich and machine-actionable metadata, allowing automated processing of broad and long data sets.
To date, the LPA community widely uses openPMD, an open meta-standard, for simulation data. NeXus is a similarly hierarchical and extensible standard for various experimental methods of the photon and neutron science community.
Within HELPMI, we are adopting the NeXus standard for LPA experimental data while simultaneously extending openPMD and its API to support custom hierarchies like NeXus. Thereby we can achieve interoperability of the standards, circumventing the need for yet another standard. Alongside, we have started developing a glossary of LPA experimental terms in order to achieve re-usability.
We will present the current status with demonstration datasets in order to showcase the possibilities. Further information can be found at https://laser-plasma-metadata.org/ .
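For readers unfamiliar with NeXus-style files, the sketch below shows what a hierarchical, metadata-rich layout for a single LPA shot could look like when written with h5py; the group names, attributes, and values are purely illustrative and do not represent the HELPMI glossary or the final standard.

```python
# Illustrative NeXus-style hierarchy for one LPA shot, written with h5py.
# Group/field names and NX_class assignments are placeholders, not the HELPMI standard.
import numpy as np
import h5py

with h5py.File("shot_000123.nxs", "w") as f:
    entry = f.create_group("entry")
    entry.attrs["NX_class"] = "NXentry"
    entry["title"] = "LPA electron acceleration, demo shot"

    laser = entry.create_group("instrument/laser")
    laser.attrs["NX_class"] = "NXsource"              # assumed mapping for the drive laser
    laser["pulse_energy"] = 2.5
    laser["pulse_energy"].attrs["units"] = "J"
    laser["pulse_duration"] = 28e-15
    laser["pulse_duration"].attrs["units"] = "s"

    spectrum = entry.create_group("data/electron_spectrum")
    spectrum.attrs["NX_class"] = "NXdata"
    spectrum["energy"] = np.linspace(50, 500, 256)     # MeV axis
    spectrum["energy"].attrs["units"] = "MeV"
    spectrum["counts"] = np.random.poisson(100, 256)   # placeholder detector signal
    spectrum.attrs["signal"] = "counts"
    spectrum.attrs["axes"] = "energy"
```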
Laser-driven plasma accelerators (LPA) are compact sources of ultra-short, intense proton pulses in the multi-10-MeV energy range. These unique parameters predestine LPAs as powerful tools for ultra-high dose rate radiobiology research. To promote further sophisticated radiobiological studies at LPAs, automated setups for proton acceleration and beamline operation are required.
Key ingredients here are diagnostic tools for the real-time characterization of laser-driven proton pulses at the LPA source as well as in different application scenarios.
In our contribution, we present the solutions for source-to-sample characterization implemented at the ALBUS-2S beamline at the Draco Petawatt laser system at Helmholtz-Zentrum Dresden-Rossendorf. With the OCTOPOD and the miniSCIDOM, two devices for online, single-pulse characterization of volumetric dose distributions are presented, applicable to the primary LPA source and to mm-scale dose distributions at the sample site, respectively. Both devices are based on volumetric scintillators as active detector material and rely on tomographic reconstruction for signal retrieval. The contribution will outline the detector principle and focus on the optimization of the detector signal evaluation program to achieve near-real-time performance despite the required complex deconvolution steps.
Plasma-based accelerators hold the potential to achieve multi-gigavolt-per-metre accelerating gradients, offering a promising route to more compact and cost-effective accelerators for future light sources and colliders. However, plasma wakefield acceleration (PWFA) is often a nonlinear, high-dimensional process that is sensitive to jitters in multiple input parameters, making the setup, operation and diagnosis of a PWFA stage a challenging task. To tackle some of these issues, machine learning techniques have gained popularity in the field of plasma acceleration. Specifically, advanced algorithms such as Bayesian optimisation have proved useful for the setup and tuning of plasma accelerators. Moreover, neural networks trained on experimental data have enabled the shot-to-shot prediction of beam parameters based on noninvasive measurements, simultaneously providing valuable insights into the different dependencies of the acceleration process. We present progress in deploying such methods at FLASHForward, a beam-driven plasma wakefield accelerator test-bed based at DESY, Hamburg, and explore future directions for further integration of these techniques at the facility.
We introduce a method that significantly enhances the resolution of spatiospectral sensors through a modified version of Imaging Fourier Transform Spectroscopy (FTS), utilizing a minimal number of laser shots.
The relevance of spatiospectral characterization for ultrahigh intensity laser beams lies in its ability to measure the spatially-varying temporal properties of a laser pulse, which are crucial for assessing the impact of chromatism on the beam’s focus and intensity, and for controlling laser-matter interaction processes.
Many existing techniques require an undesirable number of measurements. Approaches that use fewer or even a single shot are either complex to implement experimentally or make use of a constraining amount of prior information on the spectral properties of the pulse. Our approach leverages only a minimal amount of prior knowledge to optimize the number of measurements in a simple experimental implementation.
We use tunable liquid crystal retarders to encode spectral information into the field autocorrelation, akin to traditional FTS. However, our method diverges by utilizing Bayesian Experiment Design to determine the optimal sampling delays, contrasting the fixed intervals of conventional FTS.
This measurement modality is especially effective in combination with existing spatiospectral sensors. The additional prior information obtained from these sensors allows for an effective and efficient retrieval of spectral information within any of their channels, resulting in a resolution enhancement of the device. In this way the usability and application range of existing devices can be greatly expanded, spanning not only laser diagnostics but also hyperspectral imaging.
Most characteristics of ultra-short laser pulses can be obtained by measuring the integrated spectrum or only slices in space. However, Spatio-Temporal Couplings (STCs) may influence the laser performance. Common measurement techniques usually either rely on scanning over hundreds of shots, are only sensitive to low order effects like the Pulse-Front Tilt (PFT) or have the trade-off of significantly reduced spatial and spectral resolution.
With the introduction of FALCON [1] we already presented a simple device based on narrow bandpass filters in front of a Shack-Hartmann (SH) sensor. With its robust modal Zernike-Taylor coefficient reconstruction, it allows STCs to be reconstructed in only a few shots. With Single-Shot FALCON, we present a complementary follow-up that can be used in the same setup (Fig. 1) and consists of nine bandpass filters stacked in a mosaic-like manner. This allows for single-shot reconstructions although there is only incomplete spatial data for each filter, because the modal reconstruction also works on sub-apertures of the beam.
We overcome the disadvantage of having fewer data points for each reconstruction by implementing a Bayesian inference scheme that includes prior knowledge acquired from previous FALCON or Single-Shot FALCON measurements. We use the prior mean and uncertainty to inform the result of each single-shot measurement. For a steady-state laser system, this allows us to gain more confidence in the results of subsequent single-shot measurements. For a moving system, this prior knowledge constrains the parameter space within which STC coefficients may change, as laser pulses typically do not vary drastically in all parameters at the same time. For the case of drastic changes in single parameters or accumulated strong drifts, we included competing Bayesian models with different uncertainties, where we evaluate via the Bayes factor which model describes the current system best for each shot.
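The model-comparison step can be illustrated with a small numerical sketch: two Gaussian priors of different width (a "steady" and a "drifting" model) are scored against a single-shot coefficient estimate via their marginal likelihoods, and the preferred model is then used for the posterior update. All numbers below are invented and do not correspond to the actual coefficients or uncertainties.

```python
# Sketch of the competing-prior Bayes-factor idea for one STC coefficient on a single shot.
# Prior widths and measurement noise are invented for illustration.
import numpy as np
from scipy.stats import norm

measurement = 0.32          # single-shot estimate of one Zernike-Taylor coefficient
meas_sigma = 0.05           # single-shot measurement uncertainty

prior_mean = 0.10           # mean from previous (multi-shot) FALCON measurements
models = {
    "steady": 0.03,         # tight prior width: laser assumed stable
    "drifting": 0.30,       # wide prior width: allows larger changes
}

evidence = {}
for name, prior_sigma in models.items():
    # marginal likelihood of the measurement under a Gaussian prior and Gaussian noise
    evidence[name] = norm.pdf(measurement, loc=prior_mean,
                              scale=np.hypot(prior_sigma, meas_sigma))

bayes_factor = evidence["drifting"] / evidence["steady"]
print(f"Bayes factor (drifting vs steady): {bayes_factor:.1f}")

# posterior update under the preferred model (standard Gaussian conjugate formulas)
best = max(evidence, key=evidence.get)
ps = models[best]
post_var = 1.0 / (1.0 / ps**2 + 1.0 / meas_sigma**2)
post_mean = post_var * (prior_mean / ps**2 + measurement / meas_sigma**2)
print(f"preferred model: {best}, posterior = {post_mean:.3f} ± {np.sqrt(post_var):.3f}")
```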
We show the functionality of this monitoring scheme for experiments with the ATLAS-3000 Laser at the Centre for Advanced Laser Applications in Garching.
In order to solve the problem of the large beam divergence in fast ignition schemes and to increase the laser energy deposition in the target core, we systematically studied the relationship between the guiding and acceleration of target surface electrons (TSE) and the laser parameters in previous works [1]. The beam quality is found to depend critically on the intensity ratio of the laser prepulse to the incident laser. By optimizing the system, highly collimated, mono-energetic, MeV-level TSE beams with low divergence and high charge per shot have been observed along the target surface following the interaction of a bulk target irradiated by femtosecond laser pulses at relativistic intensity [2]. Subsequently, the energy of the TSE beam was significantly enhanced by employing an ultra-intense sub-picosecond laser system, and we obtained high-quality, tens-of-MeV-class TSE beams with a large total charge [3]. Recently, we studied the terahertz transition radiation driven by TSE experimentally and theoretically and obtained preliminary results.
Related References:
[1] J. Y. Mao, L. M. Chen, X. L. Ge et al., Phys. Rev. E. 85, 025401 (R) (2012)
[2] J. Y. Mao, L. M. Chen, K. Huang et al., Appl. Phys. Lett. 106, 1311058 (2015)
[3] J. Y. Mao, O. Rosmej, Y. Ma et al., Opt. Lett. 43, 3909 (2018)
Laser electron accelerators are emerging as compact sources of high-quality relativistic electron beams, following the increasing demand from areas such as material science, health, particle physics, and astrophysics. Each application, such as driving a free-electron laser, requires specific electron beam properties and their stability, both shot-to-shot and on long time scales. As laser wakefield acceleration is driven by high-intensity lasers, it relies on a complex, non-linear interplay between drive laser and plasma parameters, which gives rise to an ample parameter space. In addition, because of the system's nonlinearity, statistical fluctuations in the laser or the plasma significantly impact the stability of the accelerated electron beam properties.
For finding a suitable experiment configuration in the ample parameter space, machine learning tools such as Bayesian optimization can be valuable. However, their implementation is often challenging because of hidden underlying parameters as well as statistical noise, making correlations needed for optimization less pronounced.
To quantify and mitigate the influence of these statistical fluctuations on the system stability, we have gathered a large dataset with systematic studies of the magnitude and nature of the statistical fluctuations and their impact on each future optimization parameter. This work will contribute to a better understanding of the underlying sources of instability in laser-plasma acceleration experiments and pave the road toward a successful automated implementation of Bayesian optimization.
In this contribution, we present results of an extensive study of LWFA driven by ATLAS-3000 at CALA in Garching, yielding GeV-scale electrons. By monitoring numerous laser and electron diagnostics in parallel during LWFA experiments at a 0.25 Hz repetition rate, we find that, among all parameters monitored, the laser wavefront exhibits the highest correlation with electron energy. The largest contribution to this correlation stems from fluctuations in the defocus coefficient, which display dynamics very similar to those of the electron energy fluctuations on a time scale of minutes.
We additionally report on efforts to search for the origin of the defocus variations in the behavior of the temperature distribution of the main amplifier crystal, and on the effects of lowering the overall crystal temperature by enhancing its cooling system. Our observations raise the question of to what extent the laser wavefront in petawatt laser systems can be passively stabilized, or whether active wavefront correction becomes practically inevitable when aiming for stable LWFA operation driven by laser pulses exceeding the 10 J level.
We present an overview of the ebeam4therapy project at the Weizmann Institute of Science aimed at developing the technology for very-high-energy electron (VHEE) radiotherapy based on a laser-plasma accelerator and focus on the optimization of simulations for the project. A core objective of the project is to reduce the cost of potential medical devices by reducing the required laser energy. By utilizing the genetic algorithm in conjunction with particle-in-cell simulations of the laser-plasma interaction with the FBPIC code, we investigate methods to achieve a VHEE beam suitable for radiotherapy at the lowest possible laser pulse energy. The optimization process is performed with respect to the gas target parameters such as the gas density profile and the gas mixture.
The work is supported by the European Innovation Council (ebeam4therapy grant) and the Schwartz Reisman Center for Intense Laser Physics.
In this talk, we will explore the current development of PIConGPU in machine learning-based simulations for plasma acceleration and highlight three key applications. These projects mark significant advances in the integration of AI and advanced data workflows into plasma physics research with PIConGPU. They illustrate not only our current methods, but also our vision for future in-transit AI-assisted simulation analysis.
First, we present automated optimizations of large three-dimensional particle-in-cell (PIC) simulations to find parameters consistent with self-truncated ionization injection experiments during laser wakefield acceleration. By combining Bayesian optimization with the PIConGPU framework and automating the entire parameter tuning workflow with Snakemake, we efficiently achieve excellent convergence with experimental measurements.
Second, we present radINN, an invertible neural network designed to use emitted radiation spectra to quantify injected charge. Trained with synthetic data from PIConGPU simulations, radINN excels at identifying injection processes and provides valuable insights for improving future experimental diagnostics.
Finally, we present a new heterogeneous streaming workflow that streams plasma simulation data directly into a machine learning application, without writing any data to disk. This approach leverages openPMD and ADIOS2 to facilitate real-time model training during data transfer, overcoming the challenges of petabyte-scale data storage and I/O bottlenecks. We demonstrate this workflow by tackling the inverse problem of predicting particle dynamics from radiation in a PIConGPU simulation of the Kelvin-Helmholtz instability.
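A minimal sketch of what the consumer side of such a stream could look like with the openPMD-api Python bindings is shown below; the stream name, record names, and the train_step placeholder are assumptions for illustration rather than the actual PIConGPU workflow.

```python
# Sketch: reading an openPMD stream (ADIOS2 SST engine) iteration by iteration and
# handing each snapshot to a training step, without any data touching disk.
# Stream name, record names, and train_step() are placeholders.
import openpmd_api as io

def train_step(momentum_chunk):
    """Placeholder for a machine-learning update using one simulation snapshot."""
    pass

series = io.Series("simulation_stream.sst", io.Access.read_only)
for it in series.read_iterations():        # blocks until the producer publishes a step
    momentum_x = it.particles["e"]["momentum"]["x"]
    chunk = momentum_x.load_chunk()        # request the full record for this iteration
    it.close()                             # triggers the actual data transfer
    train_step(chunk)                      # online training on the streamed snapshot
```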
Accelerator physics relies on numerical algorithms to solve optimization problems in online accelerator control and tasks such as experimental design and model calibration in simulations. The effectiveness of optimization algorithms in discovering ideal solutions for complex challenges with limited resources often determines the problem complexity these methods can address. The accelerator physics community has recognized the advantages of Bayesian optimization algorithms, which leverage statistical surrogate models of objective functions to effectively address complex optimization challenges, especially in the presence of noise during accelerator operation and in resource-intensive physics simulations. In this presentation, we offer a conceptual overview of applying Bayesian optimization techniques toward solving complex optimization problems in accelerator physics. We begin by providing a straightforward explanation of the essential components that make up Bayesian optimization techniques. We then give an overview of current and previous work applying and modifying these techniques to solve accelerator physics challenges and discuss practical implementation strategies for Bayesian optimization algorithms to maximize their performance.
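As a minimal illustration of these essential components (a Gaussian-process surrogate plus an acquisition function), the sketch below optimizes an invented one-dimensional objective with expected improvement; it shows the loop structure only and is not tied to any specific accelerator application.

```python
# Minimal Bayesian-optimization loop: GP surrogate + expected-improvement acquisition.
# The objective is an invented stand-in for a noisy accelerator figure of merit.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(42)
objective = lambda x: -np.sin(3 * x) - x**2 + 0.7 * x + 0.05 * rng.standard_normal()

X = rng.uniform(-2, 2, size=(4, 1))                  # initial random evaluations
y = np.array([objective(x[0]) for x in X])

for step in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-3,
                                  normalize_y=True).fit(X, y)
    cand = np.linspace(-2, 2, 500).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.max()
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    x_next = cand[np.argmax(ei)]                           # next setting to evaluate
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print(f"best setting x = {X[np.argmax(y)][0]:.3f}, objective = {y.max():.3f}")
```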
This talk reviews intelligent automation systems being developed for laser plasma accelerators (LPAs) at the Berkeley Lab Laser Accelerator (BELLA) Center in collaboration with the Berkeley Accelerator Controls and Instrumentation (BACI) program. For the next generation of high-average-power fiber lasers, we showed efficient and stable coherent laser combining in space and time with accelerated machine learning (ML) controllers based on Field Programmable Gate Arrays. For the BELLA petawatt system, we have shown that we can halve laser pointing fluctuations by using ML to predict and correct pointing errors. For our FEL, we have improved electron beam stability with both transverse and longitudinal focus stabilization systems and demonstrated 1000x FEL gain. With Bayesian optimization using Xopt, we are looking to further increase electron beam brightness and gain. These developments underscore the transformative potential of ML and advanced control systems in enhancing LPA performance and its practical application in high-power laser systems.
Plasma acceleration has seen tremendous progress over the past years, demonstrating competitive beam quality from compact setups. However, plasma accelerators live in a very complex, non-linear parameter space, which makes it very challenging, first, to identify an optimum working point and, second, to operate the plasma accelerator reliably at this point with reproducible beams.
The advent of high-repetition-rate plasma accelerators, powered by a new generation of drive lasers or high-repetition-rate electron beams, addresses this challenge by enabling advanced control techniques, active stabilization and feedback, as deployed in any modern accelerator facility.
Here, we will discuss our recent activities at DESY, developing machine-learning-assisted software tools to design and optimize plasma accelerator setups, and discuss the deployment of machine-learning techniques to enhance the performance of beam- and laser-driven plasma accelerator experiments.
Laser-driven plasma accelerators (LPA) are compact sources of ultra-short, intense proton pulses in the multi-10-MeV energy range. These unique parameters predestine LPAs as powerful tools for ultra-high dose rate radiobiology research. To promote further sophisticated radiobiological studies at LPAs, automated setups for proton acceleration and beamline operation are required.
In this contribution, we will present our concept for an automated, repetition-rate-capable LPA setup for radiobiological applications, extrapolating from radiobiological studies performed at the ALBUS-2S beamline at Draco PW (HZDR). Our concept combines a cryogenic jet target as the proton source, a 1-Hz pulsed solenoid beamline for spectral shaping of the proton bunches, and dedicated real-time source-to-sample diagnostics for the proton bunch. These tools will be combined to enable the irradiation of radiobiological samples in scanning mode, i.e. spot-wise irradiation, with an automated feedback loop between LPA source, beamline and irradiation pattern.
Laser-plasma-based particle accelerators have attracted great interest in fields where conventional accelerators reach limits in terms of size, cost or beam parameters. However, laser accelerators have not yet reached their full potential in producing simultaneously high radiation doses at high particle energies. To overcome these limitations, a high degree of control of the plasma conditions is needed, making methods for optimizing and stabilizing the performance of the accelerator essential.
In this talk, we will present techniques for the optimization of laser-driven proton beams using the DRACO laser system of the Helmholtz-Zentrum Dresden-Rossendorf. With its ultra-short laser pulses of up to 23 J energy on target, yielding intensities on the order of $5\times10^{21}\,\mathrm{W/cm^2}$, DRACO enables exciting research on ion acceleration, as has been shown in recent publications [Ziegler2024, Rehwald2023, Kroll2022]. This talk will provide an overview of our workflows to optimize and stabilize the acceleration process, including manipulating the laser intensity contrast (e.g. by changing the spectral laser phase) or tailoring the target evolution during the interaction (e.g. by varying the initial target thickness or by introducing an intentional pre-expansion). Our view on experiment automation and automated laser-plasma systems in this context will conclude the presentation.
[Ziegler2024] T. Ziegler et al. "Laser-driven high-energy proton beams from cascaded acceleration regime" Nature Physics, 20, pages 1211–1216 (2024)
[Rehwald2023] M. Rehwald et al. ”Ultra-short pulse laser-driven acceleration of protons to 80 MeV from density tailored cryogenic hydrogen jets” Nature Communications 14, 4009 (2023).
[Kroll2022] F. Kroll et al. ”Tumor irradiation in mice with a laser-accelerated proton beam” Nature Physics 18, 316-322 (2022)
Advances in laser technology are pushing the repetition rate of petawatt-scale lasers into the multi-Hz regime. As repetition rates increase, novel target and diagnostic solutions are required to fully exploit the latest laser facilities. Nanometer-scale solid-density foils currently exhibit the most promising ion acceleration mechanisms for applications, reaching > 100 MeV; however, slow manual alignment limits shot rates to one shot every several minutes.
In this presentation, we will report progress towards developing a platform for high-repetition-rate (HRR, > 0.1 Hz) laser-plasma interactions utilising nanometer-scale thin, solid-density foil targets. Multiple ion acceleration mechanisms compete in this regime, and measurements are typically highly variable and sensitive to laser and target conditions, motivating the need for a larger volume of data and, as a result, faster target alignment.
We present an automated alignment and positioning system for nanometer-thickness foils which enables laser-driven ion acceleration to reach the highest possible energies whilst operating in an HRR mode with minimal human operation. This system delivers automated alignment at a repetition rate of a few minutes per shot without human operation, and further development will progress towards shot rates under a minute, advancing studies probing fundamental plasma physics, radiobiology, radiation damage and ultrafast imaging.
...
...
Laser-Plasma Acceleration (LPA) is a highly non-linear process that depends sensitively on gas flow and laser parameters which are hard to control or measure simultaneously in experiments. Understanding such parameter dependencies can be driven by simulations, which offer control and observability but become more expensive the more physical details are included. In the case of LPA, full 3D particle-in-cell (PIC) simulations are far too expensive to allow detailed scans of the available parameter space. Deep-learning-based surrogate models are promising for guiding parameter optimizations, enabling fast result estimation or inversion, and compiling information about vast parameter spaces more effectively. In this talk we are going to review state-of-the-art approaches to surrogate modeling in plasma physics to show perspectives and challenges in leveraging approaches like neural operators and physics-informed neural networks (PINNs) to enhance simulation-driven progress in LPA.
The ISIS Neutron and Muon Source, located at Harwell Campus, is a pulsed neutron source used to study the structure and dynamics of materials. This talk will explore ongoing efforts to leverage machine learning to improve efficiency and reliability of the accelerator sections used to deliver high energy protons to the targets. Examples will demonstrate applications of machine learning techniques such as anomaly detection, surrogate modelling and optimisation that are relevant to all complex accelerator and laser facilities.
Thermal effects in optical elements as well as subtle changes in the experimental environment (e.g. airflow, humidity, vibrations) are well-known challenges affecting laser alignment. These effects tend to scale with the size and complexity of the laser system. For high-power lasers comprising a multitude of amplification stages, the resulting long-term drifts (occurring over minutes to hours) affect both beam and laser parameters. For instance, a change in the beam pointing not only shifts the position of the laser focus in the experimental chamber but also translates into a variation of the laser pulse energy, as the amplification dynamics change as well.
Monitoring these drifts at the petawatt-class ATLAS laser at CALA prompted us to develop a modular solution for long-term beam stabilization. The stabilization system consists of separate diagnostic and control modules between the individual amplification stages of the laser chain. Each module measures the laser near- and far-field and is capable of stabilizing both the position and the angle of the beam using motorized mirror mounts. The feedback loop activates automatically when a significant deviation is detected and comprises a variety of safety features. Since the stabilization algorithm is self-calibrating, each module adapts to the individual geometric settings resulting from the relative positioning of the diagnostic and control modules. Currently, a total of four stabilization units are installed in the ATLAS front-end, and (supervised) stabilization on the minute timescale has been successfully implemented. The modular software is integrated into our Tango Controls [1] based control system and is available open-source under [2]. In a next step, the modules will be used for a fully automated startup of the ATLAS front-end without manual alignment. Overall, the system improves the stability, precision, and reproducibility of the laser alignment and is therefore essential for high-class laser-plasma accelerators. This work was supported by the BMBF within project 01IS24028 and the Centre for Advanced Laser Applications.
[1] https://www.tango-controls.org/
[2] https://gitlab.lrz.de/cala-public/tangodeviceservers/pyds_pointingstabilization
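A schematic of a single feedback iteration of this kind, written with PyTango, is sketched below; the device names, attribute names, calibration matrix, threshold, and gain are hypothetical and do not correspond to the open-source module referenced in [2].

```python
# Schematic pointing-stabilization step with PyTango. Device and attribute names
# are hypothetical placeholders; the actual implementation is in the repository above.
import numpy as np
import tango

camera = tango.DeviceProxy("laser/diag/farfield_cam1")     # hypothetical device names
mirror = tango.DeviceProxy("laser/ctrl/picomotor_mount1")

target = np.array([512.0, 512.0])          # reference centroid on the far-field camera (px)
response = np.array([[0.020, 0.0],         # calibration: mirror steps -> centroid shift (px/step)
                     [0.0, 0.018]])
threshold_px = 2.0                          # act only on significant deviations
gain = 0.7                                  # fractional correction per iteration

centroid = np.array([camera.read_attribute("centroid_x").value,
                     camera.read_attribute("centroid_y").value])
error = centroid - target
if np.linalg.norm(error) > threshold_px:
    steps = -gain * np.linalg.solve(response, error)        # invert the calibration
    mirror.write_attribute("axis1_relative_move", float(steps[0]))
    mirror.write_attribute("axis2_relative_move", float(steps[1]))
```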
Controlling the delivery of kHz-class pulsed lasers is of interest in a variety of industrial and scientific applications, from next-generation laser-plasma acceleration to laser-based x-ray emission and high-precision manufacturing. The transverse position of the laser pulse train on the application target is often subject to fluctuations by external drivers (e.g., room cooling and heating systems, motorized optics stages and mounts, vacuum systems, chillers, and/or ground vibrations). For typical situations where the disturbance spectrum exhibits discrete peaks on top of a broad-bandwidth lower-frequency background, traditional PID (proportional-integral-derivative) controllers may struggle, since as a general rule PID controllers can be used to suppress vibrations up to only about 5%–10% of the sampling frequency. Here, a predictive feed-forward algorithm is presented that significantly enhances the stabilization bandwidth in such laser systems (up to the Nyquist limit at half the sampling frequency) by online identification and filtering of one or a few discrete frequencies using optimized Fourier filters. Furthermore, the system architecture demonstrated here uses off-the-shelf CMOS cameras and piezo-electric actuated mirrors connected to a standard PC to process the alignment images and implement the algorithm. To avoid high-end, high-cost components, a machine-learning-based model of the piezo mirror’s dynamics was integrated into the system, which enables high-precision positioning by compensating for hysteresis and other hardware-induced effects. A successful demonstration of the method was performed on a 1 kHz laser pulse train, where externally-induced vibrations of up to 400 Hz were attenuated by a factor of five, far exceeding what could be done with a standard PID scheme.
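A compact numerical sketch of the core idea, identifying one discrete disturbance line from recent pointing samples and extrapolating it one sample ahead, is given below; the sampling rate, frequencies, and noise levels are invented, and the real system additionally includes the learned piezo-mirror model and multi-line filtering described above.

```python
# Sketch of predictive feed-forward: identify a discrete vibration line in the pointing
# signal and extrapolate it one sample ahead of time. All numbers are invented.
import numpy as np

fs = 1000.0                                     # sampling (laser repetition) rate, Hz
t = np.arange(2048) / fs
rng = np.random.default_rng(3)
# synthetic pointing signal: 400 Hz line + slow drift + broadband noise
signal = (3.0 * np.sin(2 * np.pi * 400.0 * t + 0.8)
          + 0.5 * np.sin(2 * np.pi * 2.0 * t)
          + 0.3 * rng.standard_normal(t.size))

# identify the dominant discrete frequency from the spectrum (skipping the DC/drift bins)
spec = np.fft.rfft(signal * np.hanning(t.size))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = np.argmax(np.abs(spec[10:])) + 10
f_peak = freqs[peak]

# least-squares fit of amplitude and phase of that single line
design = np.column_stack([np.sin(2 * np.pi * f_peak * t), np.cos(2 * np.pi * f_peak * t)])
a, b = np.linalg.lstsq(design, signal, rcond=None)[0]

# predict the disturbance at the next sample and use it as a feed-forward correction
t_next = t[-1] + 1 / fs
prediction = a * np.sin(2 * np.pi * f_peak * t_next) + b * np.cos(2 * np.pi * f_peak * t_next)
print(f"identified line at {f_peak:.1f} Hz, feed-forward correction = {-prediction:.2f} px")
```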
We have demonstrated real-time automatic compensation of measured angular dispersion, achieving a stability within 1.4 nrad/nm. The applied algorithm uses the dispersion calculated on shot from a new diagnostic and adjusts the positions of a set of two glass prisms to achieve a zero, or non-zero, constant angular dispersion magnitude and direction. The project has been extended to compensate over a bandwidth beyond 500 nm by using an additional two prisms to control the linear angular dispersion. The system allows flexibility in the experimental outcome, or minimisation of the pulse-front tilt for the pulse on target. The method is intended for use on both the EPAC and Vulcan 20-20 lasers and can be implemented in any CPA laser system.
As part of the THRILL project, we are working on different architectures for high-energy, high-repetition-rate laser amplification. We are particularly interested in coherent beam combination, an original scaling approach that is already proving its worth for fiber and small-diameter laser systems. The main challenges when trying to adapt this technology to high-energy laser chains include large beam-aperture inhomogeneities and wavefront instabilities, potentially strong on-shot phase modulations in the amplifiers, and difficulties in active phase stabilization due to the extremely low repetition rate of such laser chains.
In this workshop, we would like to share our motivations for a coherent combination of high-energy laser chains and present our needs for automation, feedback, and prediction of results on such chains.
The PHELIX laser system utilizes multiple wavefront control systems, which require seamless integration into the primary laser control system for daily operation. Over the years, significant effort has gone into optimizing this integration. We detail the initial development stages with our standalone software, WOMBAT, and the gradual incorporation of these capabilities into the PHELIX Control System (PCS).
The ATLAS-3000 is a petawatt-class Ti:Sa laser located at the Centre for Advanced Laser Applications (CALA) in Munich, Germany. The laser operates at a 1 Hz repetition rate and the beam can be guided into several different experiment areas for, e.g., laser particle acceleration experiments.
The laser chain contains a total of three deformable mirrors: before the main amplification stage, before the grating compressor, and in the beamline towards the experiment areas, the latter used to optimize the focus on target. The first two mirrors are paired with commercial Phasics wavefront sensors for closed-loop operation. Since the final mirror is used for focus optimization in different experiment areas, homemade Shack-Hartmann wavefront sensors based on commercially available microlens arrays were developed, both for cost reasons and to allow integration into the focus diagnostics operating in vacuum.
A large number of diagnostics both in the laser as well as the experiments are integrated into a central control system based on the open-source Tango controls framework [1]. This allows easier operation by the experimenter and unified data acquisition for offline as well as online evaluation. Currently, the wavefront data measured by the Phasics sensors is available in the control system via the manufacturer supplied Tango module. The homemade Shack-Hartmann sensors are partially integrated but some data is only available within the evaluation software directly. The evaluation routines are available as an open-source Python package on the institute public repository [2].
In this contribution I will review the technical details of the current setup with an emphasis on the integration with the Tango control system and the capabilities of the published Shack-Hartmann evaluation routines. This work was funded by the Bundesministerium für Bildung und Forschung (BMBF) within project 01IS17048.
[1] https://www.tango-controls.org/
[2] https://gitlab.lrz.de/cala-public/packages/physicsbox
The Bivoj laser system is located at the HiLASE centre near Prague, Czechia. It incorporates two adaptive optics systems (AOS), one for each of its two cryogenically cooled main amplifiers, both of which work with a square flat-top beam profile. The first AOS deals with aberrations of the 10 J amplifier with a beam size of 21.5 x 21.5 mm² and is composed of a bimorph deformable mirror (DM) and a quadri-wave lateral shearing interferometer (QLSI). The second AOS is incorporated in the 100 J amplifier and deals with a beam size of 75 x 75 mm². While the performance of the 10 J loop is stable, that of the 100 J loop is not. The standing problem of the latter amplifier is related to helium gas fluctuations in its cooling system, which dynamically change the output beam wavefront and the corresponding far field. To address this problem, a detailed analysis of these fluctuations was performed using a CW probe beam in order to obtain a higher-framerate signal. Results of this analysis as well as relevant simulations will be discussed.
Many applications require an ultrafast laser source with excellent long-term stability. One example is laser plasma acceleration (LPA), where the properties of the drive laser pulse directly determine those of the accelerated electron beam. Therefore, the highest possible laser stability is required for reliable long-term operation of the LPA. In the high-energy drive lasers of LPA, even the low-energy seed plays a role in defining the final pulse and the performance of the accelerator. While optical parametric chirped pulse amplifiers (OPCPA) are an excellent choice for seeding such high-intensity lasers due to their inherent spectral flexibility, excellent temporal contrast and compactness, achieving high stability has traditionally been a challenge in OPCPA. The main reason for this is that the pulse characteristics are intricately coupled, making precise control of these pulses difficult. For example, one amplifier stage can influence subsequent stages, making it often unintuitive to tune an OPCPA system for specific output characteristics, and simple feedback loops that stabilise individual laser properties - such as pulse energy - can interfere and introduce unwanted fluctuations in other properties - such as spectral characteristics.
As an alternative approach, we present results from the optimisation of a multi-stage OPCPA system using evolutionary strategies, and demonstrate centralised feedback control of the laser system that simultaneously stabilises multiple output parameters. Enabled by this approach, the laser exhibits exceptional long-term spectral and energy stability, with <50 pm wavelength stability and <0.2% rms energy jitter over several days of operation.
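To illustrate the general idea (not the authors' actual implementation), the following sketch shows a simple (mu, lambda) evolution strategy optimising a small set of actuator settings against a combined cost function; the objective and the number of parameters are hypothetical stand-ins for measured energy and spectral deviations.

    # Sketch of a simple (mu, lambda) evolution strategy for multi-parameter tuning.
    # The objective is a toy stand-in; a real system would evaluate measured
    # pulse energy and spectrum for each candidate set of actuator values.
    import numpy as np

    rng = np.random.default_rng(0)

    def objective(x):
        # Hypothetical combined cost: deviation of several output properties
        # from their targets, encoded here as a simple quadratic.
        target = np.array([1.0, 0.5, -0.2, 0.8])
        return float(np.sum((x - target) ** 2))

    mu, lam, sigma = 4, 16, 0.3           # parents, offspring, mutation step
    parents = rng.normal(size=(mu, 4))    # 4 actuator settings per candidate

    for generation in range(50):
        # Generate offspring by mutating randomly chosen parents
        offspring = parents[rng.integers(mu, size=lam)] + sigma * rng.normal(size=(lam, 4))
        costs = np.array([objective(x) for x in offspring])
        parents = offspring[np.argsort(costs)[:mu]]   # keep the best mu candidates
        sigma *= 0.97                                 # slowly reduce the step size

    print("Best actuator settings found:", parents[0])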
In this presentation we will first give an overview of the Apollon multi-PW laser facility, discussing in some detail the architecture and the current performance of the system. In the second part of this talk we will focus on the focal-spot quality and stability requirements of Apollon and present our first results on what is, to our knowledge, the first demonstration of active wavefront stabilization of a PW laser system based on a fast adaptive optics loop.
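A fast adaptive optics loop of this kind is commonly implemented as a leaky integrator acting on reconstructed wavefront slopes. The sketch below shows only this generic control law, with invented matrix sizes and a placeholder reconstructor; it is not the Apollon implementation.

    # Generic leaky-integrator AO control law (illustrative, not the Apollon code).
    # slopes -> reconstructor -> DM command update, repeated every loop cycle.
    import numpy as np

    rng = np.random.default_rng(1)
    n_slopes, n_actuators = 128, 64                       # placeholder dimensions
    reconstructor = 0.01 * rng.normal(size=(n_actuators, n_slopes))  # normally from calibration
    gain, leak = 0.3, 0.99

    def loop_iteration(command, measured_slopes):
        """One cycle of the closed loop: return the updated DM command."""
        correction = reconstructor @ measured_slopes
        return leak * command - gain * correction

    # Example cycle with dummy slope data
    command = np.zeros(n_actuators)
    command = loop_iteration(command, rng.normal(size=n_slopes))
    print(command[:5])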
Adaptive Optics (AO) have revolutionized astronomy and enabled optical imaging down to ~15 mas resolution on today's largest telescopes. The resulting image stability and contrast have, in turn, allowed us to probe the close-in environment of neighboring stars, peeking at dust and debris disks and blossoming exoplanets. In this talk, I will present recent results and technical capabilities (in wavefront sensing and real-time wavefront control) that both enable and are enabled by AO at the 8.2-meter Subaru Telescope, located atop the Maunakea volcano on the island of Hawai'i -- the best astronomical site in the world.
The Subaru Coronagraphic Extreme Adaptive Optics platform (SCExAO) leverages the power of two cascaded high-order AO systems, for a total of more than 5000 DM actuators, driven by two high-order Pyramid wavefront sensors at up to 3.5 kHz.
SCExAO is able to provide astronomical images with up to 94% Strehl ratio to supported astronomers, performing spectroscopy, imaging, and polarimetry across its collection of scientific modules operating in the visible and near infrared (600-2400 nm). With a 3x3 arcsecond field of view, SCExAO focuses on close-in circumstellar features, an exciting environment as we find ourselves discovering so many stellar systems that may be just like ours; with planet-to-star contrast ratios currently of order 10^6, we are able to image young, hot exo-Jupiters. As the next decades bring about 40-m class telescopes, we will be within range of detecting a nearby exo-Earth.
To get there, SCExAO is always on the lookout for ways to improve wavefront control and stability, real-time control for AO, and novel ways to do astronomy. We hope to foster cross-disciplinary collaborations about and around AO, to teach, and to learn.
Laser beam alignment is a non-trivial and time-consuming problem native to a multitude of present-day experiments. We introduce a reinforcement-learning-based laser beam alignment system that learns to align a Mach-Zehnder interferometer and an off-axis parabolic mirror, with live optimization correcting for beam drift or externally introduced mirror misalignment. The algorithm manages to find a technique for recovering a beam lost from its field of view. This technique allows for the use of open-loop motors, as the agent is agnostic to the orientation of the component it controls.
We will explore different algorithms such as DQNs, Actor-Critic, and SARSA, show their respective advantages, and introduce a new way of simulating beam propagation to train focal spot optimisation.
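As a rough, self-contained illustration of such a training setup (not the actual system described above), the sketch below defines a toy two-mirror alignment environment and trains a tabular Q-learning agent to centre the beam. The geometry, rewards, and discretisation are invented for the example.

    # Toy RL sketch: align a beam onto a target by stepping two mirror tilts.
    # Environment, rewards, and discretisation are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 11                                          # discrete tilt positions per mirror
    ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]    # step mirror 1 or mirror 2 up/down
    TARGET = (N // 2, N // 2)

    def step(state, action):
        m1 = int(np.clip(state[0] + ACTIONS[action][0], 0, N - 1))
        m2 = int(np.clip(state[1] + ACTIONS[action][1], 0, N - 1))
        dist = abs(m1 - TARGET[0]) + abs(m2 - TARGET[1])
        reward = 10.0 if dist == 0 else -float(dist)
        return (m1, m2), reward, dist == 0

    Q = np.zeros((N, N, len(ACTIONS)))
    alpha, gamma, eps = 0.1, 0.95, 0.2

    for episode in range(2000):
        state = (int(rng.integers(N)), int(rng.integers(N)))
        for _ in range(50):
            a = int(rng.integers(len(ACTIONS))) if rng.random() < eps else int(np.argmax(Q[state]))
            nxt, r, done = step(state, a)
            Q[state][a] += alpha * (r + gamma * np.max(Q[nxt]) - Q[state][a])
            state = nxt
            if done:
                break

    print("Greedy action from a misaligned state:", ACTIONS[int(np.argmax(Q[(0, 0)]))])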
For an autonomous system such as a high-frequency adaptive optics loop, interfacing with the facility SCADA/Tango system can be difficult. There are two main options. The first is to restrict the interaction to setting and monitoring the correction parameters only. The other option demands more resources, as it requires managing a separate physical machine on a dedicated network to ensure the 10 ms computation and communication time between local attributes and the other Tango databases in the facility.
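The first option, exposing only the correction parameters to the facility control system, can be sketched as a small Tango device server. The device class, attribute names, and values below are hypothetical examples rather than an existing facility device.

    # Sketch: minimal Tango device server exposing AO correction parameters
    # for setting/monitoring only (option one above). Names are hypothetical.
    import numpy as np
    from tango import AttrWriteType
    from tango.server import Device, attribute, run

    class AOLoopInterface(Device):
        """Exposes the loop gain and the latest correction vector to the facility."""

        def init_device(self):
            super().init_device()
            self._gain = 0.3
            self._correction = np.zeros(32)   # filled by the fast local loop

        gain = attribute(dtype=float, access=AttrWriteType.READ_WRITE)

        def read_gain(self):
            return self._gain

        def write_gain(self, value):
            self._gain = value

        correction = attribute(dtype=(float,), max_dim_x=32)

        def read_correction(self):
            return self._correction

    if __name__ == "__main__":
        run((AOLoopInterface,))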
The PLANET project aims at developing the first laser-driven neutron source optimized for commercial applications. The system exploits the unique properties of a laser-driven source by simultaneously utilizing both fast neutrons and MeV x-rays to image through heavy shielding material. The goal of the project is to demonstrate imaging and material discrimination of the contents of a medium-level radioactive waste container. For commercial viability, we aim at repetition rates of up to 100 Hz during several hours of operation, posing high requirements on the performance, reliability, and stability of the laser-driven ion source. Therefore, the system requires a high level of integration and automation. The machine development is accompanied by the development of a dedicated facility on the site of the former nuclear power plant in Biblis, Germany. We will present the status of the project and how we plan to meet the control and automation challenges for this machine.
The DRACO Ti:Sa laser system at HZDR employs various automated procedures. We will report on daily operation experience with
- pump and shutter control
- beam stabilization
- spectral shaping
- trigger and delay management
We will also discuss current developments in data acquisition and live processing, as well as the transition to AI-based beam stabilization, spectral shaping, and wavefront control.
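As an illustration of one such automated procedure, beam stabilization is often implemented as a slow feedback from a camera centroid to a motorised or piezo steering mirror. The sketch below shows this generic scheme with invented calibration values and a synthetic image; it does not represent the DRACO implementation.

    # Generic beam-pointing stabilisation sketch (not the DRACO implementation).
    # Camera centroid error -> proportional correction on a two-axis steering mirror.
    import numpy as np

    def centroid(image):
        """Intensity-weighted centroid of a camera image in pixel coordinates."""
        total = image.sum()
        y, x = np.indices(image.shape)
        return np.array([(x * image).sum() / total, (y * image).sum() / total])

    target = np.array([320.0, 240.0])   # desired beam position on the camera [px]
    gain = 0.4                          # proportional gain, kept below 1 for stability
    calib = np.array([[0.02, 0.0],      # assumed px -> mirror-step calibration matrix
                      [0.0, 0.02]])

    def correction_step(image, mirror_position):
        """Return the updated mirror position for one feedback iteration."""
        error = centroid(image) - target
        return mirror_position - gain * (calib @ error)

    # Example with a synthetic Gaussian spot displaced from the target
    yy, xx = np.indices((480, 640))
    spot = np.exp(-(((xx - 350) ** 2) + ((yy - 250) ** 2)) / (2 * 15 ** 2))
    print(correction_step(spot, np.zeros(2)))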