The extended reality welding helmet and torch system developed by CMU researchers.

Machine Learning and Extended Reality Used To Train Welders

Media Inquiries
Peter Kerwin, University Communications & Marketing
Lynn Shea, College of Engineering

Ever since the ancient Egyptians hammered two pieces of gold together until they fused, the art of welding has continuously progressed. Iron Age blacksmiths used heat to forge and weld iron. The discovery of acetylene in the early days of the industrial revolution added a versatile new fuel to welding. At the end of the 19th century, two engineers invented metal arc welding, and more recently, the rise of robotic welding systems and advances in high-strength alloys expanded welding applications.

A welder uses the gun to place coordinate locations for the start and end of the weld, linking the real-world weld line to a graphic representation in the XR display. Annotations in the photo highlight the system’s sound sensor, AR feedback, breath sensing and gun movement tracking.

Despite a long history of technological improvements, welding remains a challenging skill to learn. It requires a combination of technical knowledge and manual dexterity. And given its prevalence throughout numerous industries, including manufacturing, construction, aerospace and automotive, the need for skilled welders remains strong. According to the American Welding Society, U.S. employers are now facing a deficit of 375,000 welders.

At Carnegie Mellon University, researchers are addressing the problem by developing a new way to train welders that once again applies an emerging technology. With financial support from the Manufacturing Futures Institute (MFI) seed funding program, Dina El-Zanfaly, an assistant professor in the School of Design, and Daragh Byrne, an associate teaching professor in the School of Architecture, worked with a team of researchers to develop an extended reality (XR) welding helmet and torch system to help welders acquire the embodied knowledge they need to master the challenging skill.

“Not only is this a really cool project, but it also incorporates the key objectives of the MFI mission,” said Sandra DeVincent Wolf, the executive director of MFI. “It is groundbreaking research that advances manufacturing technology, contributes to workforce development and engages partners from the local community.”

Extended reality combines virtual reality (VR), which is a computer-generated environment that simulates a realistic or imaginary experience; augmented reality (AR), which combines computer-generated information with a user’s real-world environment; and mixed reality (MR), where real-world and digital objects coexist and interact in real time. Together, these technologies create an immersive experience that allows users to interact with information, environments and digital content in real time.

Training welders requires the development of hand-eye coordination and a keen perception of the position and movement of the body in space. This embodied knowledge is acquired through hands-on interactions with tools and materials, and it can be difficult to replicate in training scenarios.

The Carnegie Mellon researchers worked to better understand the training challenges by organizing a series of co-design workshops. They worked with eight instructors and four students at the Industrial Arts Workshop (IAW), a nonprofit youth welding training program in the Hazelwood neighborhood of Pittsburgh, to develop a system that integrates a welding helmet and torch with a Meta Quest Pro and a machine learning model that enhances the embodied learning of welding in three key ways.

An instructor from the Industrial Arts Workshop demonstrates proper welding technique to Carnegie Mellon students.

Visual XR guides and integrated motion sensing

The highly immersive and embodied nature of welding practice makes it exceptionally difficult for an instructor to visually monitor the process and provide feedback to the student in a timely, safe and audible manner. Neither written instructions nor verbal feedback can convey the nuanced, hands-on skill in real time.

The researchers overcame these obstacles by modifying a welding helmet with a Meta Quest headset that displays a series of visual feedback mechanisms, which guide the student during training sessions and provide a record of their performance that instructors can assess during or after the session.

Two separate XR indicators within the welding helmet show the slight adjustments the welding student should make to maintain the correct angle of the welding gun, which is connected to the Quest Touch controller. The status icons, placed near the top of the headset viewport, allow users to see the feedback without taking their focus away from the active weld. They also give instructors and users a much clearer overview of performance when viewing live or recorded playback.
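
As a rough illustration of how such an indicator might work, here is a minimal sketch in Python, assuming the controller attached to the gun reports its orientation as angles in degrees. The target angles and tolerance are hypothetical values chosen for illustration, not those used by the CMU system.

```python
# Map measured torch angles to a status-icon state shown in the headset.
# Target angles and tolerance are illustrative, not the system's values.

TARGET_WORK_ANGLE = 90.0    # gun perpendicular to the joint (hypothetical)
TARGET_TRAVEL_ANGLE = 15.0  # drag/push angle along the weld (hypothetical)
TOLERANCE = 5.0             # degrees of acceptable deviation (hypothetical)

def indicator_state(measured: float, target: float, tol: float = TOLERANCE) -> str:
    """Return the status-icon state for one angle reading."""
    error = measured - target
    if abs(error) <= tol:
        return "ok"
    return "tilt_down" if error > 0 else "tilt_up"

# Each frame, the two indicators are updated independently:
work_state = indicator_state(94.2, TARGET_WORK_ANGLE)      # -> "ok"
travel_state = indicator_state(27.5, TARGET_TRAVEL_ANGLE)  # -> "tilt_down"
print(work_state, travel_state)
```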

The researchers leveraged feedback from the workshop instructors to determine how to calibrate the XR representation of the weld to a real workpiece: users mark the start and end points of the weld line with the welding gun, which positions the scrolling guideline they need to follow.
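
A sketch of how a scrolling guideline could be anchored to a real weld line follows, assuming the system records the two controller positions when the welder marks the joint. The coordinates and target travel speed are hypothetical.

```python
# Anchor a scrolling guide marker to a weld line defined by two recorded
# controller positions. Values are assumptions for illustration.
import numpy as np

start = np.array([0.00, 0.95, 0.40])  # metres, recorded at the first mark
end = np.array([0.30, 0.95, 0.40])    # metres, recorded at the second mark

TARGET_SPEED = 0.005  # target travel speed in m/s (hypothetical)

def guide_position(t: float) -> np.ndarray:
    """Where the scrolling guide marker should be t seconds into the weld."""
    length = np.linalg.norm(end - start)
    frac = min(TARGET_SPEED * t / length, 1.0)  # clamp at the end point
    return start + frac * (end - start)

print(guide_position(30.0))  # marker position half a minute into the weld
```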

"Not only is this a really cool project, it also incorporates the key objectives of the MFI mission." — Sandra DeVincent Wolf

Sensing sonic cues during welding practice

The Carnegie Mellon researchers learned from workshop instructors that experienced welders are able to assess welds through active listening. So instead of evaluating welds visually after they are completed, their system can use an auditory-based method to diagnose the weld in real time.

“For example, a good welding speed should sound like sizzling bacon, not popcorn, according to the instructors,” explained El-Zanfaly.

Metal inert gas welding involves extruding a metal wire through the tip of the welding gun, shielding the wire with inert gas, and using the heat generated by short-circuit current between the wire and the workpiece to fuse the two metals together. Incorrect settings of this system result in poor-quality welds. For example, if the amperage of the welder is set too low, it will result in an excessively thin weld bead and lead to inconsistent penetration of the working plate.

According to prior research and feedback from the IAW instructors, different settings result in changes in the welding sound, which offers potentially important training feedback. But the extreme heat, light and sound in the welding space, together with the bulky welding helmet and other personal protective gear needed to guard against sparks, ultraviolet radiation and metal spatter, limit the welder’s ability to perceive this auditory stimulus.

A diagram of the modified welding helmet fixed to the Meta Quest Pro XR headset

The researchers modified the welding helmet by fixing it to the Meta Quest Pro XR headset and interfaced the Seeed ESP32S3 board to a Unity software program running the XR display, using the Quest Pro’s USB-C port and the Serial Port Utility Pro plugin. The adjustable Quest head strap and connected battery replace the traditional helmet insert, and additional components were 3D-printed with PLA filament.
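
The actual system reads the board from Unity through the Serial Port Utility Pro plugin; as a desktop-prototype stand-in, the link could be exercised with pyserial in Python. The port name and the message format below are assumptions for illustration, not the team’s protocol.

```python
# Desktop stand-in for the Unity-side serial link, assuming the ESP32S3
# emits one newline-terminated JSON message per classification window.
import json
import serial  # pip install pyserial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1.0) as port:
    while True:
        line = port.readline().decode("utf-8", errors="ignore").strip()
        if not line:
            continue  # read timed out with no data
        msg = json.loads(line)  # e.g. {"label": "tip_too_far", "conf": 0.91}
        if msg.get("conf", 0.0) > 0.8:
            print("feedback:", msg["label"])  # would drive an XR status icon
```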

By employing tiny machine learning (TinyML)-enabled sound detection to recognize key factors such as machine settings and tip distance, the researchers trained and deployed a model that provides visual feedback indicating errors detected by sound, such as those made when the tip of the gun is held too far from the welding plate.

The researchers asked experienced welders to repeatedly perform the same welding movement, only changing one setting per weld. They collected more than 20 minutes of audio data, evenly distributed across the category settings to use in training and testing a TinyML classification model.
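
To make the pipeline concrete, here is a sketch of how such an audio classifier might be trained and tested, not the team’s exact model: short windows of welding audio are summarized as MFCC features and fed to a small classifier. The file names, labels and window length are hypothetical; a deployed TinyML version would be converted to run on the microcontroller.

```python
# Train/test a small audio classifier on windows of welding sound.
# Files, labels and parameters are assumptions for illustration.
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

WINDOW_S = 1.0  # one-second analysis windows (assumed)

def mfcc_windows(path: str, label: int):
    """Slice a recording into windows and summarize each with mean MFCCs."""
    audio, sr = librosa.load(path, sr=16_000)
    hop = int(WINDOW_S * sr)
    for i in range(0, len(audio) - hop, hop):
        mfcc = librosa.feature.mfcc(y=audio[i:i + hop], sr=sr, n_mfcc=13)
        yield mfcc.mean(axis=1), label

# Hypothetical recordings, one setting changed per weld as in the data collection.
files = [("good_settings.wav", 0), ("low_amperage.wav", 1), ("tip_too_far.wav", 2)]
X, y = zip(*(w for path, lab in files for w in mfcc_windows(path, lab)))

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.3)
clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```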

TinyML focuses on deploying and running machine learning models on resource-constrained devices, such as a microcontroller, which in this case was connected to the augmented helmet to provide feedback. The researchers trained the TinyML model to alert trainees to common errors, such as incorrect settings and gun tip distance.

Sound was also used to detect the beginning and end of welds. Researchers collected 19 minutes of welding sound using recordings from five different devices simultaneously — two microcontrollers, two Samsung smartphones and a USB microphone — to train a classification system that could detect welding with 97% accuracy. This classifier replaced the need to electromechanically detect presses of the physical button on the welding gun, which had been used to initiate welding tracking, and made the system more portable.
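
One way such per-window predictions could be turned into clean start/stop events is sketched below; the debouncing counts are assumptions, and the 97%-accurate classifier itself is represented by a placeholder function.

```python
# Turn noisy per-window weld/no-weld predictions into start/stop events,
# replacing the gun's physical button. Thresholds are assumed values.

def is_welding(audio_window) -> bool:
    """Placeholder for the trained weld/no-weld audio classifier."""
    raise NotImplementedError

class WeldTracker:
    ON_FRAMES = 3   # consecutive "welding" windows before starting (assumed)
    OFF_FRAMES = 5  # consecutive quiet windows before stopping (assumed)

    def __init__(self):
        self.active = False
        self.streak = 0

    def update(self, prediction: bool):
        expected = not self.active  # watch for the opposite state
        self.streak = self.streak + 1 if prediction == expected else 0
        threshold = self.OFF_FRAMES if self.active else self.ON_FRAMES
        if self.streak >= threshold:
            self.active = not self.active
            self.streak = 0
            return "weld_started" if self.active else "weld_ended"
        return None
```

The hysteresis keeps a single misclassified audio window from toggling weld tracking on or off mid-weld.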

A real-time instructor view allows each student’s live performance to be cast to the instructor’s tablet. An instructor or student can review real-time or recorded point-of-view footage of the experience to monitor behaviors or analyze difficult scenarios.

Pre-welding meditation

During the workshops, researchers saw that instructors encouraged students to use meditation and breathing exercises before starting to weld as a way to induce relaxation and foster a sense of focus to offset the effects of the welding environment, which can be overwhelming due to loud noises, sparks, heat and the smell of burning.

To enhance these mindfulness practices, the researchers programmed the platform to begin each welding session by encouraging trainees to engage in breathing exercises. They also placed an anemometer near the mouth and nose inside the welding helmet to measure the airspeed of exhaled breath and track the breathing pattern over time, with the goal of developing system prompts that help welding students regulate their breathing for improved task performance.
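
As an illustration of how a breathing pattern might be extracted from such a sensor, here is a sketch assuming the anemometer is sampled as a stream of wind-speed readings; the sampling rate and peak thresholds are illustrative, not the system’s actual values.

```python
# Estimate breathing rate from in-helmet anemometer readings.
# Sampling rate and thresholds are assumptions for illustration.
import numpy as np
from scipy.signal import find_peaks

FS = 20  # anemometer samples per second (assumed)

def breaths_per_minute(wind_speed: np.ndarray) -> float:
    """Count exhalation peaks and convert to a breathing rate."""
    # Each exhalation shows up as a burst of airflow past the sensor.
    peaks, _ = find_peaks(wind_speed, height=0.5, distance=FS * 2)
    duration_min = len(wind_speed) / FS / 60
    return len(peaks) / duration_min

# Synthetic 60 s trace: ~12 slow breaths plus sensor noise.
t = np.arange(60 * FS) / FS
trace = np.clip(np.sin(2 * np.pi * 0.2 * t), 0, None) + 0.05 * np.random.randn(t.size)
print(round(breaths_per_minute(trace), 1))  # ~12 breaths/min
```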

The system’s ability to sense motion, detect sound and enhance users’ focus through meditation and breathing exercises can help students transfer the skills they acquire in the virtual training to actual welding practice. Its ability to provide guidance in real time offers numerous advantages for both students and instructors, who otherwise must rely on information derived after a weld is complete to assess the performance. The overall approach could also inform crafts and skill training in XR systems beyond welding training.

“A really exciting aspect of our work is the ability of our system to enable in-situ welding experiences using a lightly modified off-the-shelf XR and welding setup,” explained El-Zanfaly.

Their work has already received recognition, earning awards at the 2023 Association for Computing Machinery (ACM) Conference on Interactive Surfaces and Spaces and at the 2024 ACM Conference on Tangible, Embedded, and Embodied Interaction.

Moving forward, the team plans to pursue numerous opportunities to improve both the technical dimensions of the work and the embodied experience. They plan to deploy the platform in A/B lab studies and over multiple weeks at IAW to assess how extended use of this device contributes to the formation of skills, habits and the novice experience.
