Bilal's innovation, EyeSignals, is an emotional-intelligence system that decodes emotional states by analyzing physiological eye data, including pupillometry and gaze patterns. Because these signals reflect internal emotional changes, they provide a reliable alternative to traditional behavioral cues.
A mapping between emotional states, defined by arousal and valence levels, and eye movements is established through experimental protocols designed to evoke genuine emotional responses, combining participants' subjective feedback with statistical analysis. The system first captures eye signals using specialized sensors or conventional cameras. Signal processing techniques then filter out noise and outlier data. Finally, neural network models that combine sequence models with a contrastive loss identify patterns associated with different emotional states, recognizing emotions along two dimensions: arousal (intensity) and valence (positivity or negativity).
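The recognition stage can be illustrated with a short sketch. The code below is a minimal, assumption-laden illustration rather than the actual EyeSignals implementation: it assumes the eye signals have already been denoised into fixed-length windows of three features (pupil diameter plus gaze x/y at 60 Hz), uses a GRU as the sequence model, an NT-Xent-style objective standing in for the unspecified contrastive loss, and two binary heads for arousal and valence. All names, dimensions, and hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EyeSequenceEncoder(nn.Module):
    """GRU encoder mapping a window of eye signals to an embedding,
    with heads for arousal (intensity) and valence (positive/negative)."""

    def __init__(self, n_features=3, hidden=64, embed_dim=32):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.project = nn.Linear(hidden, embed_dim)   # embedding used by the contrastive loss
        self.arousal_head = nn.Linear(hidden, 2)      # low vs. high arousal
        self.valence_head = nn.Linear(hidden, 2)      # negative vs. positive valence

    def forward(self, x):                             # x: (batch, time, n_features)
        _, h = self.rnn(x)                            # h: (1, batch, hidden)
        h = h.squeeze(0)
        z = F.normalize(self.project(h), dim=1)       # unit-norm embedding
        return z, self.arousal_head(h), self.valence_head(h)


def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss over two augmented views of the same windows."""
    z = torch.cat([z1, z2], dim=0)                    # (2B, D)
    sim = z @ z.t() / temperature
    sim = sim.masked_fill(torch.eye(z.size(0), dtype=torch.bool), float("-inf"))
    b = z1.size(0)
    targets = torch.cat([torch.arange(b, 2 * b), torch.arange(0, b)])  # paired views are positives
    return F.cross_entropy(sim, targets)


# Toy training step on random data standing in for denoised eye-signal windows.
model = EyeSequenceEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
windows = torch.randn(16, 120, 3)                        # 16 windows of 2 s at 60 Hz
augmented = windows + 0.01 * torch.randn_like(windows)   # simple jitter augmentation
arousal_labels = torch.randint(0, 2, (16,))
valence_labels = torch.randint(0, 2, (16,))

optimizer.zero_grad()
z1, a_logits, v_logits = model(windows)
z2, _, _ = model(augmented)
loss = (nt_xent_loss(z1, z2)
        + F.cross_entropy(a_logits, arousal_labels)
        + F.cross_entropy(v_logits, valence_labels))
loss.backward()
optimizer.step()
```

For brevity the sketch simply sums the contrastive and classification losses; in practice the contrastive objective would typically be used to pretrain or regularize the embedding across recording sessions before the arousal and valence heads are trained.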
EyeSignals can be used independently or in combination with other modalities to verify emotional sincerity and improve overall system precision.
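As a rough illustration of such a combination, the sketch below assumes each modality already produces probabilities over four arousal/valence quadrants and fuses them with a weighted average; the second modality (facial expression), the weights, and the mismatch check are illustrative assumptions, not details taken from the source.

```python
import numpy as np

QUADRANTS = ["high-arousal/positive", "high-arousal/negative",
             "low-arousal/positive", "low-arousal/negative"]


def fuse(eye_probs, face_probs, eye_weight=0.6):
    """Weighted average of the two modalities' class probabilities."""
    fused = eye_weight * np.asarray(eye_probs) + (1 - eye_weight) * np.asarray(face_probs)
    return fused / fused.sum()


def sincerity_mismatch(eye_probs, face_probs, threshold=0.5):
    """Flag a confident disagreement between displayed (face) and physiological (eye) emotion."""
    disagree = np.argmax(eye_probs) != np.argmax(face_probs)
    confident = max(eye_probs) > threshold and max(face_probs) > threshold
    return bool(disagree and confident)


eye = [0.70, 0.10, 0.15, 0.05]    # physiological cues point to a positive, high-arousal state
face = [0.20, 0.55, 0.15, 0.10]   # displayed expression points to a negative, high-arousal state
fused = fuse(eye, face)
print(QUADRANTS[int(np.argmax(fused))], "- possible mismatch:", sincerity_mismatch(eye, face))
```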
This fully automated framework represents a pioneering approach to understanding emotions through reliable physiological indicators.