Six months ago, a dozen students from Eden Prairie High School’s (EPHS) Distraction-free Life Club (DLC) got together to confront the epidemic of distracted driving. They designed and demonstrated EyeDA-1, an artificial intelligence (AI)-based device for tackling distracted driving.
Their invention was widely reported in local media and even became the topic of a Sunday sermon by Pastor Gail Bach at Eden Prairie’s St. Andrew’s Lutheran Church.
Yash Dagade, a 17-year-old junior at EPHS and EyeDA’s lead developer, and his team are now working on a refined EyeDA-2 design.
“Drowsy and distracted driving are significant contributing factors to motor vehicle accidents,” Dagade said. “The National Highway Traffic Safety Administration (NHTSA) estimates that drowsy driving alone was responsible for over 938,000 crashes, contributing to almost 15% of the total crashes in 2018 alone.”
NHTSA reported that U.S. traffic fatalities are estimated to have hit a 16-year high in 2021, with 42,915 people killed in traffic crashes.
Research conducted by the AAA Foundation for Traffic Safety shows that it can take up to 27 seconds to refocus after being distracted. A car traveling at 55 miles per hour will travel more than six football fields in that amount of time.
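For readers who want to check that figure, here is a quick back-of-the-envelope sketch in plain Python, assuming the 360-foot length of a football field including end zones:

```python
# Quick sanity check of the distance covered in 27 seconds at 55 mph.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600
FOOTBALL_FIELD_FEET = 360  # 120 yards, including both end zones

speed_mph = 55
refocus_seconds = 27

feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # ~80.7 ft/s
distance_feet = feet_per_second * refocus_seconds               # 2,178 ft
fields = distance_feet / FOOTBALL_FIELD_FEET                    # ~6.05 fields

print(f"{distance_feet:.0f} feet, about {fields:.2f} football fields")
```

At 2,178 feet, the "more than six football fields" claim checks out.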
Minnesota’s Office of Traffic Safety has reported 10 traffic fatalities since the start of the new year, following 445 traffic deaths in 2022.
Dagade lamented the lack of a readily available, affordable solution for cars that people already own. “EyeDA can be used in any vehicle … it detects potential distractions or fatigue and alerts drivers,” he said.
In recent years, automakers have been equipping vehicles with AI-based advanced driver-assistance systems. These safety options alert drivers about blind spots, rear- and front-end collisions, and more. However, they do not address the risky behaviors frequently exhibited by drivers themselves.
The original EyeDA-1, equipped with a camera and processor, sent a visual and audio alert when the driver began to show signs of distraction, prompting immediate driver intervention. But this approach detected driver behavior only from facial features, using a camera that was affected by changes in light intensity.
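The article does not disclose EyeDA-1’s actual detection logic. One widely used camera-based technique for spotting fatigue from facial features is the eye aspect ratio (EAR), computed from eye landmarks; the ratio collapses toward zero as the eyelids close. A minimal sketch of the idea, with the threshold and frame counts purely hypothetical tuning values:

```python
from math import dist

def eye_aspect_ratio(eye):
    """Eye aspect ratio (EAR) from six 2-D eye landmarks,
    ordered p1..p6 around the eye:
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).
    Open eyes give a large ratio; closed eyes drive it toward 0."""
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

EAR_THRESHOLD = 0.2       # hypothetical "eyes closed" cutoff
CLOSED_FRAMES_ALERT = 48  # hypothetical: ~1.6 s at 30 fps

def should_alert(ear_history):
    """Fire an alert if the EAR has stayed below the threshold
    for CLOSED_FRAMES_ALERT consecutive frames."""
    recent = ear_history[-CLOSED_FRAMES_ALERT:]
    return (len(recent) == CLOSED_FRAMES_ALERT
            and all(e < EAR_THRESHOLD for e in recent))

# Toy landmark coordinates: an open eye vs. a nearly closed one.
open_eye   = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]

print(eye_aspect_ratio(open_eye))    # well above the threshold
print(eye_aspect_ratio(closed_eye))  # well below the threshold
```

In a real system the landmarks would come from a face-tracking model per video frame, which is also where the lighting sensitivity the students ran into enters the picture.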
The young developers are now refining the EyeDA-2 design with a more advanced approach: a neural network that combines a camera, microphone, GPS, and a gyroscope to improve performance, reduce physical size, and enhance the image processing used to decipher driver behavior.
With autonomous cars on the horizon, the students are also designing EyeDA-2 to differentiate between manual and autonomous driving.
Editor’s Note: Contributor Vijay Dixit is a member of the Eden Prairie Local News board of directors.