Enhancing X-ray light source capabilities through augmented and virtual reality: a perspective from the Advanced Photon Source
Driven by dramatic changes in (i) work, home and travel because of SARS-CoV-2; (ii) the increasingly complex nature of experiments that can or will be performed at next-generation X-ray light sources such as X-ray free-electron lasers (XFELs) and multi-bend achromat (MBA) synchrotrons; and (iii) the need to upgrade and operate such facilities with a changed workforce, we see striking opportunities for augmented and virtual reality tools and technologies at X-ray facilities like the Advanced Photon Source (APS) at Argonne National Laboratory. In the remainder of this introduction, we summarize these changes and introduce these technologies at a high level. We also describe emerging and anticipated uses for such technology at the APS and then conclude with our perspective on future opportunities. This contribution focuses on recent work and thoughts at the APS, but we emphasize that many facilities are exploring and implementing similar technologies for the same reasons listed above.
The power and abundance of augmented or mixed reality (AR or MR) and virtual reality (VR) devices and capabilities are expected to grow dramatically over the next few years, driven by a confluence of increasingly powerful cloud-computing technologies, advanced wireless capabilities such as 5G and Wi-Fi 6, and increasingly sophisticated edge-computing technologies, especially those powered by artificial intelligence and machine learning (AI and ML). In fact, Statista documents approximately USD 6 billion of consumer spending in the US in 2020 on such technology and an additional USD 2 billion of industrial spending in the same period. By 2024, however, total spending in the US is expected to increase nearly tenfold, with the current consumer–industry split of 75% and 25% expected to flip by that time (The Economist).
What are augmented and virtual reality devices?
For those not familiar with the terminology, AR or MR devices project computer-generated imagery or sensory information onto a real-world environment as seen through, for example, a cell-phone screen, automobile windshield or dedicated headset. VR replaces the physical world with entirely, or largely, computer-generated sensory information, often realized through special-purpose immersive headsets but also found in, for example, flight simulators for commercial or military pilots. Recent examples of AR and VR range from the Pokémon Go game (AR) that was incredibly popular just a few years ago [Fig. 1(a)] and the IKEA Place app (AR), which allows consumers to examine virtual furniture in their homes, to head-up displays for cars and purpose-built devices such as the Microsoft HoloLens 2 (AR) [Fig. 1(b)], Vuzix Blade smart glasses (AR) and the Oculus Quest (VR) developed by Facebook [Fig. 1(c)]. Facebook has recently rebranded itself as Meta, signaling its commitment to developing capabilities and tools for virtual reality, or the metaverse. Apple and Samsung have also announced plans to release VR devices of their own.
Implications for X-ray user facilities
Concurrent with the emergence of AR and VR technology, both the world at large and the world of state-of-the-art X-ray user facilities are undergoing large changes. The past two years or so have brought unprecedented change as the SARS-CoV-2 virus (IUCr Newsletter) has continued to play havoc with people's lives and upended our previous notions of work, education and family. The situation at X-ray user facilities like the APS is no different: traditionally, most work in the physical sciences was performed by on-site users who traveled to a user facility for their beam time and worked closely with resident facility staff to perform and interpret their measurements. [Compare this with macromolecular crystallography (MX), which has been highly automated, with steadily increasing remote operations over recent decades (Abola et al., 2000).] Travel and proximity restrictions have created ongoing challenges, with user facilities switching to a continuum of mail-in and remote-access operations augmented by an increasing but still variable and limited on-site presence of users (DOE, 2021). Of course, there are notable exceptions in the physical sciences, like mail-in programs for powder diffraction at many synchrotron facilities (Toby et al., 2009), but by and large, prior to SARS-CoV-2, on-site visits by physical science users were the norm. This is not the case now and is not expected to fully revert to the previous arrangement, even after the pandemic (DOE, 2021). Moreover, even as site-access restrictions are eventually eased for staff and users, we expect both greater flexibility and efficiency around work location and site presence. We are confident that AR and VR offer significant opportunities to better train and support users and fellow staff members. Below, we describe in more detail specific directions and opportunities for safety, experimental support and training at the APS.
The science capabilities of X-ray user facilities are also changing markedly, driven by large changes to the facilities themselves. For instance, the APS is undergoing a major upgrade (APS-U) that will produce synchrotron X-ray beams hundreds of times brighter than those available today. The APS is currently scheduled to shut down in April 2023 for a one-year removal, installation and storage ring commissioning project that will result in far brighter beams upon completion. The aggressive schedule will require multiple crews working simultaneously in the storage ring tunnel. We see AR, with apparatus- and location-sensitive training and documentation materials, as critical to achieving the goals of the APS-U project.
Lastly, significantly brighter X-ray beams are a common deliverable for many X-ray sources throughout the world (Weckert, 2015). We anticipate that lensless imaging techniques (Miao et al., 2015), and lensless imaging combined with other measurements such as fluorescence, will produce vast quantities of data and results that will be nearly impossible to visualize using traditional means. Because of the anticipated scale and complexity, we also think that interpreting data and results from these facilities will require deep collaborations among geographically and scientifically diverse experiment teams. We are confident that AR and especially VR will prove invaluable in this regard.
The remainder of this overview is organized as follows. First, we discuss recent work with AR to support user experiments. Second, we discuss opportunities and work in progress for AR to support installation, maintenance and design activities. Third, we discuss some of the possibilities for AR and VR to enhance data visualization, help with outreach and education, and further collaboration among geographically diverse experiment teams. Finally, we offer our perspective on longer-term opportunities. We note that, as all the authors of this article are from the APS, our examples and perspective are largely confined to the one AR device on which we have focused our investments and effort to date, namely the Microsoft HoloLens 2. We have been impressed with the device's ability to meet our needs, but this article should not serve as an endorsement, and we encourage anyone interested in the technology to evaluate for themselves which AR and VR platforms best meet their requirements: there are many on the market and more forthcoming.
Using AR to support user experiments
The onset of the SARS-CoV-2 pandemic and the subsequent switch to an increasingly remote work paradigm have upended the classical workflow at synchrotron beamlines worldwide. This is especially true for in situ and operando measurements that involve complex setups and integration of various experiment-specific equipment with beamline infrastructure. The successful execution of such experiments relies heavily on multiple aspects of preparation and operation but arguably the most important is clear communication and close interaction between the beamline personnel and users. Lockdowns and site access restrictions in response to the pandemic initially made carrying out these types of experiments impossible. As a first step, the APS, like many user facilities, developed and provided capabilities so that remote operation of a beamline and experiment was possible by both users and staff. This included the addition of equipment used to support remote operations, such as internet-enabled cameras inside and outside hutches, remote access to beamline computers, robotics and automation, and the use of various communications platforms. Nevertheless, the quality of support that beamline personnel could offer to remote users was far from ideal.
The emergence and popularization of AR devices have significantly changed the way experiments – both remote and on-site – are set up and carried out at beamlines. Implementation of the Microsoft HoloLens 2 at APS beamlines, for example, has opened new ways for beamline scientists to communicate and interact with their remote users. The first-person perspective offered by the device is valuable for both the device wearer and the remote user. The head-mounted camera offers outstanding image quality and allows for unrestricted hand movement. This makes regular beamline tasks – building experimental setups, manipulating samples, assembling and disassembling sample environments and so on – not only easier but also much safer. In addition to allowing seamless, constant two-way communication between the device operator and the user, the device offers the capability of annotating the so-called holo-field (the field of view shared by both parties) in real time. This provides an additional, non-verbal level of communication that is extremely helpful in troubleshooting a problem or simply explaining the details of tasks being undertaken. As examples, Fig. 2 shows screenshots that capture the first-person view seen by the on-site staff person providing electrochemical-cell assembly support for a remote participant.
The device has also changed the way resident staff provide out-of-hours user support. Until recently, in the event of a failure or experiment problem at the beamline, users were limited to telephone or videoconference communication with the support staff, and the beamline scientists were forced to troubleshoot the problem via the same channels. For a more complex issue, a resident staff member would have to return to the facility to resolve it. With an AR device, the complexity of issues that can be solved remotely has increased significantly. In addition to being deployed at the beamlines, the HoloLens 2 has been used by the APS Electrochemical Facility to prepare samples for electrochemical experiments at various APS beamlines. The first step of these experiments requires the assembly of a custom-made testing cell consisting of several small pieces; this work must be performed within the inert atmosphere of a glovebox. Because of the complex nature of the assembly process and the restrictions of working within the glovebox, it was virtually impossible for users to follow the preparation using traditional videoconferencing tools. The AR device provides a first-person view of the work in the glovebox [Fig. 2(a)], and the remote users greatly appreciate the ability to follow every stage of the preparation while remaining in real-time communication with the scientist assembling the cell. Another strength of AR devices like the HoloLens 2 lies in mixed-reality applications that assist device operators by providing holographic instruction in the form of documents, images, videos or 3D holographic models. Instructions are triggered by 'anchors' strategically placed in relevant beamline areas or near equipment. In this example, the scientist preparing the samples can project holographic sample-preparation instructions and display them on a virtual screen in their field of vision [Fig. 2(a), white 'paper' in the top right of the image]. This feature greatly reduces errors and simplifies the execution of the work.
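The anchor-triggered behaviour described above can be sketched in a few lines. The following is a minimal illustration, not the HoloLens software itself: the anchor names, positions, instruction documents and trigger radius are all hypothetical, and a real mixed-reality application would resolve spatial anchors through the headset's tracking system rather than a lookup table.

```python
import math

# Hypothetical sketch: each spatial anchor has a position (metres, in
# headset coordinates) and an attached instruction document; when the
# wearer moves within a trigger radius, the matching instructions are
# displayed on a virtual screen.
ANCHORS = {
    "glovebox-cell-assembly": {"pos": (2.0, 0.0, 1.5), "doc": "cell_assembly.pdf"},
    "hutch-sample-stage": {"pos": (8.5, 0.0, -3.0), "doc": "stage_alignment.pdf"},
}
TRIGGER_RADIUS = 1.0  # metres (illustrative value)

def active_instructions(wearer_pos):
    """Return the documents for all anchors within the trigger radius."""
    docs = []
    for name, anchor in ANCHORS.items():
        if math.dist(wearer_pos, anchor["pos"]) <= TRIGGER_RADIUS:
            docs.append(anchor["doc"])
    return docs
```

With the wearer standing near the (hypothetical) glovebox anchor, `active_instructions((2.3, 0.0, 1.4))` returns `['cell_assembly.pdf']`.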
Using AR to support installation, maintenance and design activities
The APS is undergoing a USD 815 million project funded by the DOE to replace its third-generation synchrotron storage ring with an MBA storage ring (many beamline replacements and upgrades are also included in the project scope). The project will remove all the existing equipment from the 1.1 km storage ring tunnel and environs and replace it with the latest technology. The APS storage ring, both before and after the upgrade, has 40 sectors, and the infrastructure in each sector comprises many sub-assemblies. Each sub-assembly consists of about a dozen magnets; a dozen vacuum chambers with their support systems; absorbers, valves and crosses, each with their own supports; water supply and return headers with branch tubing for each magnet and vacuum chamber; and the required grounding bus bars and straps.
To accomplish the storage ring rebuild in the short one-year timeline for removal, installation and initial commissioning, the APS is increasing its (temporary) technical staff. This has led to training and support needs that conventional media cannot meet. Instead of requiring Subject Matter Experts (SMEs) to walk each new hire through the technical requirements of an assembly, we have developed instruction sets that train these new employees using a workflow application available through the HoloLens 2 interface. This approach allows the SMEs to advance the training and support of new hires without becoming overburdened. Fig. 3 displays screenshots from a HoloLens 2 showing some of the real-life sub-assemblies being assembled with instructions and visual cues provided by AR overlays.
After the APS upgrade is complete and we move into operation of the new facility, we anticipate AR assisting in the future maintenance of the storage ring. For example, maintenance procedures for each magnet type require many complex tasks that can be documented and then referred to by technicians while they perform the work. This opportunity will ease the transfer of information from senior staff to the next generation of SMEs, keeping the APS a world-class machine far into the future.
In addition to the storage ring installation and maintenance activities described above, we see further opportunities for AR and VR to assist in X-ray beamline debugging and maintenance. The Component Database (CDB) is an application designed and built to aid in documenting and managing all the parts that will go into APS-U. The CDB allows design information about storage ring and beamline components to be retrieved quickly via a QR code scanned with an AR device. With this information, beamline staff and engineers can visualize the digital twin alongside its physical counterpart while simultaneously displaying a dedicated hologram panel of records, including technical parameters, procurement status and use history.
Fig. 4 shows an example of enhanced perception and record-keeping using a digital twin enabled by an AR headset. The metallic piece in the transparent plastic box in Fig. 4(a) contains a cylindrical channel that leads to an inner circular chamber. This structure resembles enclosed beamline components (e.g. monochromator, vacuum sample chamber etc.) whose inner structures cannot be easily examined, and where CAD drawings of one or all components may not be readily available or accessible. By scanning the QR code printed on the box, the HoloLens user can open the digital twin of the physical component from within the CDB, where a full record of the component is available [Fig. 4(b)]. In addition, the digital twin contains a hologram of the physical object that can be visualized next to the physical object for enhanced perception [Fig. 4(c)]. It is worth noting that with the HoloLens, the contour of all inner structures becomes visible when the hologram is viewed from inside [Fig. 4(d)].
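The QR-driven lookup can be sketched as follows. This is an illustrative mock-up, not the CDB's actual API: the component ID, record fields and payload format are invented for the example, and a real deployment would query the CDB service rather than an in-memory dictionary.

```python
import json

# Hypothetical stand-in for the CDB: a component ID maps to a record
# that a HoloLens panel might display (parameters, status, history)
# plus a digital-twin asset for holographic rendering.
COMPONENT_DB = {
    "APSU-VC-0042": {
        "name": "Vacuum chamber, sector 7",     # invented example entry
        "status": "installed",
        "cad_model": "vc0042.gltf",             # hypothetical twin asset
        "history": ["received 2021-03", "leak-checked 2021-05"],
    },
}

def scan_qr(payload: str) -> dict:
    """Decode a QR payload of the (assumed) form {"cdb_id": ...} and
    fetch the corresponding component record."""
    cdb_id = json.loads(payload)["cdb_id"]
    record = COMPONENT_DB.get(cdb_id)
    if record is None:
        raise KeyError(f"no CDB record for {cdb_id}")
    return record
```

The headset would then place `cad_model` next to the physical part and show the remaining fields on the hologram panel.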
Using AR and VR to enhance data visualization
An increasing number of instruments at light sources generate high-dimensional data, whether from various 3D imaging techniques or from the simultaneous acquisition of multimodal data. Interacting with such datasets and extracting meaningful scientific insight from them is challenging when data can only be visualized in 2D. AR devices provide a natural way of interacting with 3D data and offer more flexibility with even higher-dimensional data. In addition to enhancing human–data interaction, AR devices can increase the efficiency of scientific collaboration: two people each wearing an AR device can simultaneously view a hologram along with remote collaborators. Fig. 5 shows an example of visualizing 3D imaging data with the Microsoft HoloLens 2.
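As a rough illustration of the kind of data reduction involved, the sketch below thresholds a 3D intensity volume into a sparse point cloud, the sort of lightweight representation one might stream to an AR renderer. The function and threshold are illustrative only and are not part of any APS software.

```python
import numpy as np

def volume_to_points(volume: np.ndarray, threshold: float) -> np.ndarray:
    """Reduce a 3D intensity volume to an (N, 4) array of
    (x, y, z, intensity) rows for voxels above the threshold."""
    mask = volume > threshold
    coords = np.argwhere(mask)        # voxel indices, C order
    values = volume[mask]             # matching intensities, same order
    return np.column_stack([coords, values])
```

Only the bright voxels survive, so a mostly empty reconstruction shrinks from millions of voxels to a point list small enough to render interactively.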
Using AR and VR to help with outreach and education
The HoloLens 2 has also proven to be an essential tool for continuing educational programs at the APS during the pandemic. We have used it as a telepresence tool for remote students, the most notable examples being the National School on Neutron and X-ray Scattering (NX School) and Argonne's Exemplary Student Research Program (ESRP). In the pre-pandemic era, these programs offered graduate-, college- and high-school-level students unique opportunities to design, work on and conduct hands-on experiments with scientists at the APS. The pandemic has interrupted the decades-long tradition of hosting these students on-site, but the HoloLens 2 provided an immersive experience for the participants. In both cases, scientists from Argonne hosted virtual experiments and used the AR devices during the experiments to augment online lectures.
Longer-term opportunities
While still a nascent technology, AR devices have already made a mark on the user experience at the APS, and with significant investment from industry we can expect the capability and versatility of these devices to grow. Future AR devices that provide more processing power and an expanded software ecosystem can further enhance the beamline experience and productivity. One such avenue could be making human–machine interaction more efficient. Modern beamlines at synchrotron and XFEL sources are extremely complex instruments with a great many control variables. A typical beamline desktop has windows for the dozens of motor controls required for standard operation, which can be overwhelming to new users. Rather than a non-intuitive 2D array of motor controls, users could use 3D holograms of the beamline to target and move motors, run scans etc. through hand gestures. We are working on an in-house application that can be used to view the current state of the equipment inside the experimental hutch. This is the first step in further facilitating setup changes by both staff and users. The goal is to keep the framework flexible and low maintenance, so that staff can accommodate modifications to the existing equipment without having to completely redo the programming. An early mock-up of such a display is shown in Fig. 6, with positioner names listed on the left of the display and their current values to the right.
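A text rendering of such a positioner panel might look like the sketch below. The positioner names and values are hypothetical, and a real implementation would read live values from the beamline control system (at the APS, typically EPICS) rather than from a static dictionary.

```python
def render_panel(positioners: dict) -> str:
    """Format a positioner panel: names on the left, current values
    on the right, one line per positioner (cf. the Fig. 6 mock-up)."""
    width = max(len(name) for name in positioners)
    lines = [
        f"{name:<{width}}  {value:10.4f}"
        for name, value in positioners.items()
    ]
    return "\n".join(lines)

# Hypothetical positioners; live values would come from the control system.
panel = render_panel({"sample_x": 1.25, "sample_y": -0.5})
```

In a holographic version, each line would become a grabbable element, so selecting a positioner and dragging it could issue the corresponding motor move.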
Looking further ahead, by combining visualization with gesture-controlled scanning, users could simply identify regions of interest in high-dimensional data, select the appropriate scan ranges from the display and launch follow-on scans much faster and more intuitively than possible with a conventional computer.
This research used resources of the Advanced Photon Source, a U.S. Department of Energy (DOE) Office of Science User Facility at Argonne National Laboratory and is based on research supported by the U.S. DOE Office of Science-Basic Energy Sciences, under Contract No. DE-AC02-06CH11357. We acknowledge helpful conversations with and contributions and effort from Anthony Avarca, Ross Harder, Arthur Glowacki, Byeongdu Lee, Don Jensen Jr, Dariusz Jarosz and Uta Ruett.
Abola, E., Kuhn, P., Earnest, T. & Stevens, R. C. (2000). Nat. Struct. Biol. 7, 973–977.
DOE (2021). Office of Science User Facilities: Lessons from the COVID Era and Visions for the Future. Report from the December 2020 Roundtable. DOI: 10.2172/1785683.
Toby, B. H., Huang, Y., Dohan, D., Carroll, D., Jiao, X., Ribaud, L., Doebbler, J. A., Suchomel, M. R., Wang, J., Preissner, C., Kline, D. & Mooney, T. M. (2009). J. Appl. Cryst. 42, 990–993.
Weckert, E. (2015). IUCrJ, 2, 230–245.
Miao, J., Ishikawa, T., Robinson, I. K. & Murnane, M. M. (2015). Science, 348, 530–535.
The authors are at the Advanced Photon Source, Argonne National Laboratory, Lemont, IL, USA.
Copyright © International Union of Crystallography. All rights reserved.