Friday, May 23, 2025

Eye-Tracking Innovation Merges the Powers of Deflectometry, AI




Eye-tracking technology is critical in virtual and augmented reality headsets, scientific research, medical and behavioral sciences, automotive driving assistance, and industrial engineering. Tracking the movements of the human eye with high accuracy, however, is a daunting challenge.

Researchers at the University of Arizona Wyant College of Optical Sciences have demonstrated an approach that integrates deflectometry with advanced computation. The method, the researchers said, has the potential to significantly improve state-of-the-art eye-tracking technology.

“Current eye-tracking methods can only capture directional information of the eyeball from a few sparse surface points, about a dozen at most,” said Florian Willomitzer, associate professor of optical sciences and principal investigator of the study. “With our deflectometry-based method, we can use the information from more than 40,000 surface points, theoretically even millions, all extracted from only one single, instantaneous camera image.”

“More data points provide more information that can be potentially used to significantly increase the accuracy of the gaze direction estimation,” said Jiazhang Wang, postdoctoral researcher in Willomitzer's lab and the study's first author. “This is critical, for instance, to enable next-generation applications in virtual reality. We have shown that our method can easily increase the number of acquired data points by a factor of more than 3000, compared to conventional approaches.”

Deflectometry is a 3D imaging technique that allows for the measurement of reflective surfaces with very high accuracy. Common applications of deflectometry include scanning large telescope mirrors or other high-performance optics for the slightest imperfections or deviations from their prescribed shape.
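To give a flavor of how deflectometry encodes surface shape, the short Python sketch below shows a generic phase-shifting scheme: the screen displays several sinusoidal fringe patterns with known phase shifts, and the phase recovered at each camera pixel identifies which screen location that pixel sees via the reflective surface. This is a minimal illustration of the general technique, not the authors' actual pipeline; the screen resolution, fringe count, and number of phase steps are illustrative assumptions.

import numpy as np

# Illustrative parameters (not taken from the study)
WIDTH, HEIGHT = 1920, 1080   # display pixels
N_PERIODS = 16               # fringe periods across the screen
N_STEPS = 4                  # classic 4-step phase shifting

def fringe_patterns(width=WIDTH, height=HEIGHT, periods=N_PERIODS, steps=N_STEPS):
    """Generate horizontally varying sinusoidal fringes with equally spaced phase shifts."""
    x = np.arange(width)
    patterns = []
    for k in range(steps):
        shift = 2 * np.pi * k / steps
        row = 0.5 + 0.5 * np.cos(2 * np.pi * periods * x / width + shift)
        patterns.append(np.tile(row, (height, 1)))
    return patterns

def decode_phase(captured):
    """Recover the wrapped phase at each camera pixel from the captured images.

    The phase indicates which screen column each camera pixel 'sees' through the
    reflection (up to a 2*pi ambiguity resolved by standard phase unwrapping).
    """
    steps = len(captured)
    shifts = 2 * np.pi * np.arange(steps) / steps
    num = sum(img * np.sin(s) for img, s in zip(captured, shifts))
    den = sum(img * np.cos(s) for img, s in zip(captured, shifts))
    return np.arctan2(-num, den)   # wrapped phase in (-pi, pi]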

The team conducted experiments with human participants and a realistic artificial eye model. For the human subjects, the method tracked gaze direction with accuracies between 0.46 and 0.97 degrees; on the artificial eye model, the error was only about 0.1 degrees.
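These accuracy figures are angular errors between the estimated and true gaze directions. As a simple illustration of what such a number means, the snippet below computes the angle between two gaze vectors; the vectors and the 0.5-degree offset are made up for the example and do not come from the study.

import numpy as np

def angular_error_deg(estimated, ground_truth):
    """Angle between estimated and true gaze direction vectors, in degrees."""
    e = np.asarray(estimated, dtype=float)
    g = np.asarray(ground_truth, dtype=float)
    cos_angle = np.dot(e, g) / (np.linalg.norm(e) * np.linalg.norm(g))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# Two unit gaze vectors differing by roughly half a degree, comparable in
# magnitude to the human-subject accuracies reported above.
true_gaze = np.array([0.0, 0.0, 1.0])
est_gaze = np.array([np.sin(np.radians(0.5)), 0.0, np.cos(np.radians(0.5))])
print(f"gaze error: {angular_error_deg(est_gaze, true_gaze):.2f} deg")  # ~0.50 deg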

Instead of depending on a few infrared point light sources to acquire information from eye surface reflections, the new method uses a screen displaying known structured light patterns as the illumination source. Each of the more than 1 million pixels on the screen can thereby act as an individual point light source.
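Once a camera pixel has been matched to the screen pixel it observes through the eye's reflection, the surface normal at the reflection point follows from the law of reflection: the incoming and outgoing rays make equal angles with the normal. The sketch below shows that step in its most generic form, assuming the 3D surface point, camera position, and screen pixel position are known; the geometry and coordinates are hypothetical and not taken from the paper.

import numpy as np

def surface_normal(surface_point, camera_pos, screen_pixel_pos):
    """Estimate the normal at a reflective surface point via the law of reflection.

    The ray arriving from the screen pixel and the ray leaving toward the camera
    make equal angles with the normal, so the normal is the normalized bisector
    of the two unit vectors pointing away from the surface point.
    """
    p = np.asarray(surface_point, dtype=float)
    to_camera = np.asarray(camera_pos, dtype=float) - p
    to_screen = np.asarray(screen_pixel_pos, dtype=float) - p
    to_camera /= np.linalg.norm(to_camera)
    to_screen /= np.linalg.norm(to_screen)
    n = to_camera + to_screen
    return n / np.linalg.norm(n)

# Illustrative geometry in millimeters (made-up coordinates):
# a corneal point near the origin, camera and screen in front of the eye.
point_on_cornea = np.array([0.0, 0.0, 0.0])
camera = np.array([0.0, 20.0, 60.0])
screen_pixel = np.array([15.0, -10.0, 55.0])
print(surface_normal(point_on_cornea, camera, screen_pixel))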



