A Comprehensive Evaluation of User Experience in Eye-Controlled Interaction

Authors

  • Hanwen Zhang, Glasgow College, University of Electronic Science and Technology of China, Department of Electronic Information Engineering, Chengdu, Sichuan 611731, China

Keywords

Eye-controlled interaction, User experience, Intent recognition, Feedback design, Visual fatigue

Abstract

Eye-controlled interfaces offer a hands-free and intuitive means of human-computer interaction, increasingly used in healthcare, gaming, and assistive technologies. Yet, challenges such as low precision, unintentional activation (e.g., the “Midas Touch” effect), and visual fatigue persist. This paper presents a comprehensive evaluation of user experience in eye-controlled systems, with an emphasis on intent recognition accuracy, system feedback, and fatigue mitigation. A novel contribution of this work is the integration of machine learning algorithms, such as recurrent and convolutional neural networks, for modeling gaze trajectories and predicting user intent. A simulated experiment is conducted to assess the performance of the proposed models in reducing unintended activation. Real-time optimization strategies, including model compression and edge deployment, are also discussed. The findings suggest that ML-enhanced gaze interaction can improve responsiveness, accuracy, and user satisfaction, providing a promising path toward robust, fatigue-aware, and personalized gaze-based interfaces.
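The abstract's proposed approach, modeling a gaze trajectory with a recurrent network and thresholding the resulting intent probability to suppress unintended activation, can be illustrated with a minimal sketch. This is not the paper's implementation: the feature layout (x, y, dwell time per gaze sample), the single-layer Elman recurrence, and the randomly initialized weights are all illustrative assumptions; a real system would train the weights on labeled gaze data.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_intent_score(trajectory, hidden_size=8):
    """Score a gaze trajectory as intentional vs. incidental.

    trajectory: array of shape (T, F), one row per gaze sample
    (here assumed to be x, y, dwell-time features).
    Weights are random placeholders standing in for trained parameters.
    """
    n_features = trajectory.shape[1]
    Wx = rng.standard_normal((hidden_size, n_features)) * 0.1
    Wh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
    w_out = rng.standard_normal(hidden_size) * 0.1

    h = np.zeros(hidden_size)
    for x_t in trajectory:              # one recurrence step per gaze sample
        h = np.tanh(Wx @ x_t + Wh @ h)  # Elman-style hidden-state update
    logit = w_out @ h
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid -> intent probability

# Example: 20 gaze samples of (x, y, dwell_ms) features
traj = rng.standard_normal((20, 3))
p = rnn_intent_score(traj)

# A dwell-free "Midas Touch" guard would only trigger an action when
# the intent probability clears a threshold, e.g. p > 0.8.
```

In this framing, mitigating the Midas Touch effect reduces to choosing a decision threshold that trades false activations against missed selections, which is tunable per user.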

Published

2025-11-30