Event-based Liveness Detection using Temporal Ocular Dynamics: An Exploratory Approach
arXiv cs.CV / April 30, 2026
Key Points
- The paper addresses a key limitation of RGB-based face liveness detection—reduced generalization across sensors and attack types—by exploring event cameras as an alternative sensing modality.
- It argues that replay attacks struggle to reproduce event-camera “temporal ocular dynamics” because temporal resampling and display artifacts distort the spatio-temporal event patterns.
- The authors extend an existing RGBE-Gaze dataset by adding replay-attack recordings, creating an event-based “fake” counterpart for training and evaluation.
- Using event-driven temporal features from eye regions, they demonstrate ocular motion segmentation and liveness classification with a spiking convolutional neural network, reaching up to 95.37% top-1 accuracy.
- The exploratory results suggest event-based sensing could improve robustness and enable low-latency liveness detection by leveraging microsecond-resolution eye dynamics.
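To make the idea of "event-driven temporal features from eye regions" concrete, here is a minimal sketch of one common way to featurize an event stream: an exponentially decayed time surface over an eye region of interest. The function name, ROI handling, and decay constant are illustrative assumptions, not the paper's actual pipeline; the paper feeds such spatio-temporal features to a spiking convolutional network, which is omitted here.

```python
import numpy as np

def event_time_surface(events, roi, t_now, tau=0.05, shape=(32, 32)):
    """Build a decayed time surface from events inside an eye ROI.

    events: iterable of (x, y, t, polarity) tuples, t in seconds.
    roi: (x0, y0) top-left corner of the patch in sensor coordinates.
    Recent events yield values near 1; older events decay with tau (s).
    NOTE: hypothetical illustration, not the paper's implementation.
    """
    surface = np.zeros(shape, dtype=np.float32)
    x0, y0 = roi
    for x, y, t, _pol in events:
        ix, iy = int(x - x0), int(y - y0)
        if 0 <= ix < shape[1] and 0 <= iy < shape[0] and t <= t_now:
            # Keep the strongest (most recent) decayed value per pixel.
            surface[iy, ix] = max(surface[iy, ix],
                                  np.exp(-(t_now - t) / tau))
    return surface

# Toy usage: two events on the same pixel; the later one dominates.
events = [(5, 7, 0.100, 1), (5, 7, 0.149, 1)]
surface = event_time_surface(events, roi=(0, 0), t_now=0.150)
```

A replay attack shown on a display resamples the scene at the monitor's refresh rate, so the fine-grained decay pattern of such a surface around genuine microsaccades is hard to reproduce, which is the intuition the paper builds on.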