Front Big Data. 2024 Apr 11;7:1366469. doi: 10.3389/fdata.2024.1366469. eCollection 2024.

Graph learning for particle accelerator operations

Song Wang et al.

Abstract

Particle accelerators play a crucial role in scientific research, enabling the study of fundamental physics and materials science, as well as having important medical applications. This study proposes a novel graph learning approach to classify operational beamline configurations as good or bad. By considering the relationships among beamline elements, we transform data from components into a heterogeneous graph. We propose to learn from historical, unlabeled data via our self-supervised training strategy along with fine-tuning on a smaller, labeled dataset. Additionally, we extract a low-dimensional representation from each configuration that can be visualized in two dimensions. Leveraging our ability for classification, we map out regions of the low-dimensional latent space characterized by good and bad configurations, which in turn can provide valuable feedback to operators. This research demonstrates a paradigm shift in how complex, many-dimensional data from beamlines can be analyzed and leveraged for accelerator operations.

Keywords: Graph Neural Network; graph learning algorithm; particle accelerator; self-supervised learning (SSL); supervised training.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1
Schematic of the CEBAF accelerator. Electrons are generated in the injector. Multiple passes through the north and south linacs accelerate beam to multi-GeV energies. Beam is then sent to the four nuclear physics experimental halls (A, B, C, and D).
Figure 2
Illustrations that showcase an arbitrary accelerator beamline (top) and our approach for constructing a corresponding graph (bottom). Here, each node represents an individual element, while the node features correspond to the relevant parameters of the respective element. The edges between nodes are determined by a user-defined window size of 2. These edges are directed to reflect the fact that an element cannot impact upstream elements in the beamline.
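The windowed, directed edge construction described in this caption can be sketched in a few lines. This is an illustrative reimplementation, not the authors' code; the function name and the default window size of 2 are taken from the caption, and elements are identified by their index along the beamline.

```python
def build_beamline_edges(n_elements, window=2):
    """Connect each beamline element to the next `window` elements downstream.

    Edges are directed (i -> j with i < j), reflecting that an element
    cannot impact upstream elements in the beamline.
    """
    edges = []
    for i in range(n_elements):
        # Only look downstream, at most `window` elements ahead.
        for j in range(i + 1, min(i + window + 1, n_elements)):
            edges.append((i, j))
    return edges

# Example: a 5-element beamline with window size 2
print(build_beamline_edges(5))
# -> [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]
```

The resulting edge list can be loaded directly into a graph library such as PyTorch Geometric as an `edge_index` tensor; node features would then carry the per-element parameters mentioned in the caption.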
Figure 3
The overall workflow of our framework. We first conduct self-supervised training (SST) on unlabeled data via contrastive learning (top) and then perform supervised fine-tuning (SFT) on a smaller set of labeled data (bottom).
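The self-supervised stage of this workflow rests on a contrastive objective computed between two augmented views of each graph. As a minimal NumPy sketch of such an objective, the NT-Xent-style loss below treats matched rows of two embedding batches as positive pairs and all other rows as negatives; the specific loss form and names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """Simplified contrastive (NT-Xent-style) loss between two views.

    z1, z2: (batch, dim) embeddings of two augmented views of the same
    graphs; row i of z1 and row i of z2 form a positive pair.
    """
    # L2-normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature  # pairwise similarities
    # Positives sit on the diagonal; the rest of each row are negatives.
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))
```

Minimizing this loss pulls the two views of each graph together in latent space while pushing apart different graphs; supervised fine-tuning would then attach a small classification head to the pretrained encoder and train on the labeled good/bad configurations.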
Figure 4
Results under the semi-supervised setting.
Figure 5
The visualization of 353 unlabeled beamline graphs, representing operations in January 2022 (black markers), using their learned latent embeddings. Green and red contours denote regions of good and bad configurations, respectively.
