Human Condition Detection via Eye Movement Analysis

AI-Powered Eye Movement Analysis for Human Condition Detection

This project, led by Datapipesoft, explored the use of advanced video analytics and time-series transformation to detect and analyze human physiological and psychological conditions through high-precision eye movement tracking. The primary objective was to achieve subpixel accuracy in eye tracking, enabling the detection of subtle movement patterns that serve as non-invasive indicators of cognitive and physical states.

By combining computer vision algorithms with Fourier-based time series analysis and machine learning classification, the system aimed to establish a highly accurate, real-time solution for assessing human conditions through ocular behavior.


Understanding human behavior and health through non-invasive, video-based observation presents both scientific and technical challenges. Eye movements, though small in amplitude, carry significant information about cognitive load, fatigue, neurological health, and emotional state. Extracting meaningful insight from these signals, however, meant overcoming several key challenges:

  • Subpixel Precision: Developing a system capable of tracking eye movements at a resolution finer than one pixel to capture minute ocular shifts.
  • Data Transformation: Converting high-resolution eye tracking data into a structured, analyzable format without loss of subtle temporal information.
  • Pattern Classification: Designing machine learning models capable of distinguishing between normal and abnormal movement patterns associated with various human conditions.

Traditional eye-tracking systems lack the precision, data richness, and processing capability required for this level of diagnostic insight.


Datapipesoft delivered an innovative AI-based solution built on three tightly integrated layers: computer vision, frequency-domain signal processing, and machine learning classification.

  • Computer Vision Algorithms: Implemented a hybrid approach using Haar cascades and convolutional neural networks (CNNs) for robust eye detection and continuous tracking across frames.
  • Subpixel Interpolation: Applied interpolation techniques to localize tracked features between pixel boundaries, pushing effective tracking resolution below one pixel.
  • Discrete Fourier Transform (DFT): Transformed raw eye movement data into the frequency domain to identify periodicities, anomalies, and subtle behavioral signatures.
  • Noise Filtering: Isolated significant signal components, improving the signal-to-noise ratio for downstream analysis.
  • Classification Models: Developed and trained machine learning models, including deep learning architectures, to classify eye movement patterns.
  • Human Condition Mapping: Mapped identified patterns to a range of physiological and psychological conditions, enabling accurate interpretation and diagnostic support.
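
The project's exact subpixel method is not specified above; a common choice for this kind of refinement is parabolic (quadratic) interpolation around a detected intensity or correlation peak. The sketch below, a minimal illustration rather than the project's implementation, refines a 1-D peak location to subpixel precision; in 2-D tracking the same fit would be applied independently along each axis:

```python
import numpy as np

def subpixel_peak(signal: np.ndarray) -> float:
    """Refine a 1-D peak location to subpixel precision by fitting a
    parabola through the peak sample and its two neighbours."""
    i = int(np.argmax(signal))
    if i == 0 or i == len(signal) - 1:
        return float(i)  # peak on the border: no neighbours to fit
    y0, y1, y2 = signal[i - 1], signal[i], signal[i + 1]
    # Vertex of the parabola through (-1, y0), (0, y1), (1, y2)
    offset = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    return i + offset

# Illustrative signal: a Gaussian bump centred between two sample points
x = np.arange(64)
true_center = 31.6  # not on the integer pixel grid
signal = np.exp(-0.5 * ((x - true_center) / 2.0) ** 2)
est = subpixel_peak(signal)  # recovers the centre to well under a pixel
```

Plain argmax can only ever be wrong by up to half a pixel; the parabolic fit typically reduces that error by an order of magnitude or more, which is the kind of gain the subpixel stage depends on.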
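
The DFT and noise-filtering stages can likewise be sketched with NumPy's FFT routines. The sampling rate, component frequencies, and magnitude threshold below are assumptions chosen for illustration, not values from the project:

```python
import numpy as np

fs = 100.0                        # assumed camera frame rate in Hz
t = np.arange(0, 4.0, 1.0 / fs)   # 4 s of synthetic gaze displacement
rng = np.random.default_rng(0)

# Synthetic trace: slow drift (0.5 Hz) + tremor-like component (8 Hz) + noise
trace = 1.0 * np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.sin(2 * np.pi * 8.0 * t)
trace += 0.1 * rng.standard_normal(t.size)

# Discrete Fourier transform of the real-valued movement signal
spectrum = np.fft.rfft(trace)
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

# Simple noise filter: suppress frequency bins below a magnitude threshold
threshold = 0.2 * np.abs(spectrum).max()
filtered_spectrum = np.where(np.abs(spectrum) >= threshold, spectrum, 0.0)
denoised = np.fft.irfft(filtered_spectrum, n=t.size)

# The strongest spectral peak identifies the dominant periodicity
dominant = freqs[np.argmax(np.abs(spectrum))]
```

Frequency-domain features of this kind (dominant frequencies, band energies, peak magnitudes) are what a downstream classifier would consume, rather than the raw pixel trajectories.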

The project delivered a novel, scalable framework for non-invasive human condition assessment, with the following outcomes:

  • High-Precision Tracking: Achieved reliable eye tracking with subpixel accuracy, capturing fine ocular movements previously undetectable with conventional systems.
  • Meaningful Pattern Analysis: Leveraged Fourier transformations to uncover temporal and behavioral trends, enhancing interpretability of raw movement data.
  • Reliable Classification: Enabled accurate real-time classification of multiple physiological and psychological states, offering potential applications in health tech, cognitive testing, and behavioral research.

This project demonstrated the transformative potential of AI-enhanced eye movement analysis in fields such as:

  • Cognitive state monitoring (e.g., attention, fatigue)
  • Neurological diagnostics (e.g., early detection of Parkinson’s or ADHD)
  • Emotional and behavioral research
  • Adaptive human-computer interaction

By integrating computer vision, signal processing, and machine learning, Datapipesoft established a powerful foundation for next-generation non-invasive diagnostic tools.