EeveeDark: A Binary Neural Framework for Low-Light Video Enhancement via Event-Guided Sensor-Level Fusion


Eker O., Erdem E., Erdem A.

IEEE Robotics and Automation Letters, 2026 (SCI-Expanded, Scopus)

  • Publication Type: Article
  • Publication Date: 2026
  • DOI Number: 10.1109/lra.2026.3666388
  • Journal Name: IEEE Robotics and Automation Letters
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Keywords: Deep learning for visual perception, Event camera, Low-light video enhancement, Sensor fusion
  • Hacettepe University Affiliated: Yes

Abstract

Enhancing videos under extreme low-light conditions remains challenging due to the difficulty of balancing restoration quality and computational efficiency in resource-constrained settings. This paper introduces EeveeDark, a low-light video enhancement framework that combines the spatial richness of sensor-level RAW data with the temporal precision of event streams. Central to our model is a Binary Neural Network (BNN) architecture that reduces computational overhead by quantizing weights and activations while preserving detail. EeveeDark incorporates (i) modality-specific binary encoders for processing RAW frames and event data, (ii) a lightweight fusion block for integrating spatial and temporal cues, and (iii) an event-guided skip gating mechanism for dynamic spatiotemporal refinement. Experiments on synthetic and real-world datasets show that EeveeDark outperforms prior BNN-based methods and offers a favorable performance-efficiency trade-off compared to full-precision models.
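The core efficiency claim above rests on quantizing weights and activations to 1 bit. As a minimal sketch of that idea (not the authors' actual implementation — the function names and the 1x1-convolution setup below are illustrative), a binary layer typically binarizes both operands with a sign function and rescales the weights by their mean magnitude, so the multiply-accumulate can reduce to XNOR/popcount operations on suitable hardware:

```python
import numpy as np

def binarize(x):
    """Quantize to {-1, +1} via the sign function (0 maps to +1),
    the standard 1-bit quantizer used in Binary Neural Networks."""
    return np.where(x >= 0, 1.0, -1.0)

def binary_conv1x1(x, w_real):
    """Illustrative 1x1 binary convolution: activations and the
    real-valued latent weights are both binarized before the product.
    A per-output-channel scale alpha (mean |w|) partially restores
    the magnitude lost to binarization, as in common BNN designs.
    x: (C_in, H, W) feature map, w_real: (C_out, C_in) weights."""
    xb = binarize(x)                                    # 1-bit activations
    wb = binarize(w_real)                               # 1-bit weights
    alpha = np.abs(w_real).mean(axis=1, keepdims=True)  # (C_out, 1) scales
    c, h, w = xb.shape
    out = (alpha * wb) @ xb.reshape(c, -1)              # binary MAC + rescale
    return out.reshape(-1, h, w)

# Toy RAW-like feature map: 4 channels, 8x8 spatial resolution.
rng = np.random.default_rng(0)
feat = rng.standard_normal((4, 8, 8))
w = rng.standard_normal((8, 4))
y = binary_conv1x1(feat, w)
print(y.shape)  # (8, 8, 8)
```

In training, the non-differentiable sign is usually handled with a straight-through estimator; the sketch above covers only the forward pass.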