Microsaccades are small, involuntary eye movements that are essential for visual perception and neural processing. Traditional microsaccade research typically relies on eye trackers and frame-based video analysis. While eye trackers offer high precision, they can be expensive and are limited in scalability and temporal resolution. Event-based sensing offers an efficient, precise alternative, capturing fine-grained spatial and temporal information with minimal latency. This work introduces a pioneering event-based microsaccade dataset, simulating angular displacements from 0.5° to 2.0° using Blender and v2e. We evaluate the dataset with Spiking-VGG11, Spiking-VGG13, Spiking-VGG16, and a novel Spiking-VGG16Flow architecture, all implemented in SpikingJelly. The models reach ~90% accuracy, establishing a strong benchmark for event-based eye movement research.
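
As a rough sketch of the evaluation setup, the snippet below instantiates a Spiking-VGG11 backbone from SpikingJelly and runs it over a dummy tensor of event frames. The neuron type, surrogate function, class count, input resolution, and number of time steps are illustrative assumptions, not the exact configuration used in the paper.

    import torch
    from spikingjelly.activation_based import neuron, functional, surrogate
    from spikingjelly.activation_based.model import spiking_vgg

    # Spiking-VGG11 with LIF neurons and an arctangent surrogate gradient;
    # num_classes=4 is an assumed label count (one per displacement class).
    net = spiking_vgg.spiking_vgg11(
        spiking_neuron=neuron.LIFNode,
        surrogate_function=surrogate.ATan(),
        detach_reset=True,
        num_classes=4,
    )

    T = 8                                  # assumed number of simulation time steps
    x = torch.rand(T, 2, 3, 224, 224)      # dummy event frames: [T, batch, channels, H, W]

    # Default single-step mode: feed one time step at a time and average
    # the output firing rates over time to obtain class scores.
    out_fr = torch.stack([net(x[t]) for t in range(T)]).mean(0)
    functional.reset_net(net)              # clear membrane potentials before the next sample
    print(out_fr.shape)                    # torch.Size([2, 4])

The same pattern extends to the other backbones by swapping in the corresponding SpikingJelly factory (e.g. spiking_vgg13, spiking_vgg16).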
 
@inproceedings{shariff2025microsaccade,
  title={Benchmarking Microsaccade Recognition with Event Cameras: A Novel Dataset and Evaluation},
  author={Shariff, Waseem and Hanley, Timothy and Stec, Maciej and Javidnia, Hossein and Corcoran, Peter},
  booktitle={BMVC},
  year={2025}
}