
How wearable, trackable tech makes driving safer for teens with ADHD

Technology developed by researchers in the UNC STAR Heel Performance Laboratory provided real-time feedback for teens during a study aimed at decreasing the risk of traffic crashes.

A study participant wearing the eye-tracking technology developed by Kiefer’s team. (Courtesy Cincinnati Children’s Hospital)

There’s a reason the “please be patient, student driver” bumper sticker exists. Teenage drivers are four times as likely as adult drivers to be involved in a car accident. Teens diagnosed with attention-deficit/hyperactivity disorder (ADHD) are twice as likely as neurotypical teens to be in a crash.

“Drivers with ADHD have a very high risk of crashes in their early years of driving,” says Adam Kiefer, co-director of the UNC STAR Heel Performance Laboratory. “The teen years are especially dangerous, and the only interventions that currently exist to help students with ADHD drive more safely are pharmaceutical.”

Before starting the STAR Heel Lab at UNC-Chapel Hill, Kiefer, an expert in portable technologies and artificial intelligence, collaborated with colleagues at Cincinnati Children’s Hospital to begin a simulated driver training study. The results, published in the New England Journal of Medicine, were so promising that Cincinnati Children’s is now offering the training to teens as a five-session program for $250, and is exploring other options to make it more accessible. So, how did they create the first real-time intervention for driving training in teens with ADHD?

“My experience with eye-tracking is what drove our team to create this approach,” Kiefer says. “We were able to modify an eye-tracking device to create a behavioral feedback system that would alert the drivers in this training when they were looking away from their target for longer than two seconds.”

Two seconds is the longest that studies have shown it is safe to look away from the road while driving. By reducing the number of long, distracted glances, researchers theorized that teens with ADHD could reduce their risk of traffic accidents. And they were right.
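In rough terms, the feedback system works like a stopwatch: time spent looking away from the road accumulates, and an alert fires once it crosses the two-second mark. Here is a minimal illustrative sketch of that logic in Python; the 30 Hz sampling rate, the boolean gaze samples, and the sound_alarm hook are assumptions for the example, not details of the device Kiefer’s team built.

```python
LONG_GLANCE_THRESHOLD_S = 2.0  # the safe off-road glance limit cited above

def monitor_glances(gaze_on_road_samples, sample_period_s=1.0 / 30, sound_alarm=print):
    """Alert whenever the driver's gaze stays off the road for 2+ seconds.

    gaze_on_road_samples: iterable of booleans, True when gaze is on the roadway.
    """
    off_road_time = 0.0
    alarm_active = False
    for on_road in gaze_on_road_samples:
        if on_road:
            # Gaze returned to the road: reset the timer and re-arm the alarm.
            off_road_time = 0.0
            alarm_active = False
        else:
            off_road_time += sample_period_s
            if off_road_time >= LONG_GLANCE_THRESHOLD_S and not alarm_active:
                sound_alarm("Eyes off road for 2+ seconds")
                alarm_active = True  # avoid repeating the alert during one long glance

# Example: a 3-second off-road glance (90 samples at 30 Hz) triggers one alert.
monitor_glances([True] * 30 + [False] * 90 + [True] * 10)
```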

In the study, 152 teen drivers with ADHD were divided into two groups. Both groups used a desktop-based learning software platform called Focused Concentration and Attention Learning, or FOCAL, that simulated a driving experience. Throughout multiple sessions, the teens were asked to perform secondary tasks, like searching for a specific symbol on a monitor next to the steering wheel.

Throughout these sessions, all participants wore the eye-tracking technology developed by Kiefer’s team. The control group’s eye movements were tracked, and data was recorded on how often and how long they looked away from the road. The other group’s simulation, deemed FOCAL+ by researchers, included the addition of an alarm that would sound when drivers stopped looking at the road for two or more seconds, alerting them to steer their vision back to the street.

“After the simulations, we tracked the teens’ driving in real life for six months using dashboard-mounted cameras that recorded video of the driver as well as their view out the windshield,” Kiefer explains. “The cameras were programmed to save the couple seconds of video leading up to a high rate of deceleration to see what caused it, which allowed us to look at study outcomes in a realistic way.”
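That style of event-triggered recording generally relies on a short rolling buffer of frames that is written out when deceleration crosses a threshold. The sketch below illustrates the pattern under assumed values; the buffer length, the 0.5 g trigger, and the save_clip function are placeholders rather than specifications of the study’s cameras.

```python
from collections import deque

BUFFER_SECONDS = 5        # assumed pre-event window to keep in memory
FRAME_RATE_HZ = 30        # assumed camera frame rate
DECEL_THRESHOLD_G = 0.5   # assumed trigger for a "high rate of deceleration"

def run_event_recorder(frames_with_accel, save_clip):
    """frames_with_accel: iterable of (frame, longitudinal_accel_g) pairs."""
    buffer = deque(maxlen=BUFFER_SECONDS * FRAME_RATE_HZ)
    for frame, accel_g in frames_with_accel:
        buffer.append(frame)                # oldest frames fall off automatically
        if accel_g <= -DECEL_THRESHOLD_G:   # hard braking reads as negative acceleration
            save_clip(list(buffer))         # persist the seconds leading up to the event
            buffer.clear()

# Example: 300 frames of calm driving followed by one hard-braking frame.
run_event_recorder(
    [(f"frame{i}", 0.0) for i in range(300)] + [("frame300", -0.7)],
    save_clip=lambda clip: print(f"saved {len(clip)} frames"),
)
```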

One month into simulation training, drivers in the control group had 28.05 long glances during a session compared to 16.52 long glances for the FOCAL+ group — a 41% difference that was maintained six months into training. In the real world, FOCAL+ trained drivers had fewer long glances and near crashes compared to the control group, as well as 40% fewer crashes. None of the crashes in either group of drivers involved fatalities.
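The 41% figure is the relative difference between the two reported glance counts, which a quick calculation confirms:

```python
control, focal_plus = 28.05, 16.52            # long glances per session, as reported
reduction = (control - focal_plus) / control  # relative reduction for FOCAL+ drivers
print(f"{reduction:.0%}")                     # -> 41%
```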

“This is the first non-pharmaceutical intervention to show efficacy in this population of drivers, and it was well-received by them too,” Kiefer says. “Now we’re looking at how to adapt the technology to make it widely available via a virtual reality device or maybe even cell phones. That’s exactly the kind of research translation we hope to make with the STAR Heel Lab, conducting science in a way that’s quickly applicable and scalable.”

Kiefer and Cincinnati Children’s colleague Ryan MacPherson came to UNC-Chapel Hill together in 2019 to start the STAR Heel Lab, where they continued their work on the driving training intervention. Kiefer says MacPherson broke new ground in leveraging the eye-tracking technology to realize the team’s vision for the intervention, and he continues to be integral to technological and research innovation as the lab fosters new collaborations with a variety of Carolina researchers.

Most recently, Kiefer and MacPherson, along with a team of investigators, were selected as a finalist for the Office of the Vice Chancellor for Research’s Creativity Hubs for their proposal titled “Behavioral Digital Twins: The Digital Transformation of the Human Phenome to Empower Simulation and Training.” A digital twin is a digital copy or model of a physical entity, with both digital and physical entities interconnected via data and artificial intelligence. For the first time, this team will expand the digital twin concept to human behavior.

“In my line of research, I never thought being published in the NEJM was a possibility,” Kiefer says. “This is a testament to the impact we can make by connecting researchers who specialize in different areas and bringing different skill sets to the table. It’s a great demonstration of the power of convergent science and what we’re building here at Carolina.”