Forgetting Analysis by Module Probing for Online Object Detection with Faster R-CNN
Abstract
Online Object Detection (OOD) involves learning novel object categories from a stream of images, such as the one generated by an agent exploring new environments. In this scenario, the widely used Faster R-CNN architecture suffers from catastrophic forgetting, the phenomenon whereby acquiring new knowledge causes previously learned knowledge to be forgotten. Forgetting evaluations published in the literature focus only on the evolution of performance on previously seen data, without asking where forgetting occurs in the architecture and how it propagates. In this paper, our first contribution introduces a new protocol called Module Probing that offers a detailed evaluation of forgetting. This protocol identifies the layers responsible for catastrophic forgetting within the Faster R-CNN architecture. Our results reveal that forgetting is predominantly concentrated in the final classification layer. Building on these insights, our second contribution mitigates forgetting by modifying the architecture's classification layer. We demonstrate that this modification significantly reduces forgetting on three OOD benchmarks. Our results provide a first replay-free baseline for challenging OOD scenarios, enhancing the model's long-term performance.