Publication Date:
2022
Citation:
Transfer without Forgetting / Boschini, Matteo; Bonicelli, Lorenzo; Porrello, Angelo; Bellitto, Giovanni; Pennisi, Matteo; Palazzo, Simone; Spampinato, Concetto; Calderara, Simone. - In: Lecture Notes in Computer Science, vol. 13683 (2022), pp. 692-709. (17th European Conference on Computer Vision, ECCV 2022, Tel Aviv, Israel, 23-27 October 2022) [DOI: 10.1007/978-3-031-20050-2_40].
Abstract:
This work investigates the entanglement between Continual Learning (CL) and Transfer Learning (TL). In particular, we shed light on the widespread application of network pretraining, highlighting that it is itself subject to catastrophic forgetting. Unfortunately, this issue leads to the under-exploitation of knowledge transfer during later tasks. On this ground, we propose Transfer without Forgetting (TwF), a hybrid Continual Transfer Learning approach building upon a fixed pretrained sibling network, which continuously propagates the knowledge inherent in the source domain through a layer-wise loss term. Our experiments indicate that TwF steadily outperforms other CL methods across a variety of settings, averaging a 4.81% gain in Class-Incremental accuracy over multiple datasets and different buffer sizes.
CRIS Type:
Conference Proceedings Paper
Keywords:
Continual Learning, Lifelong Learning, Experience Replay, Transfer Learning, Pretraining, Attention
Author List:
Boschini, Matteo; Bonicelli, Lorenzo; Porrello, Angelo; Bellitto, Giovanni; Pennisi, Matteo; Palazzo, Simone; Spampinato, Concetto; Calderara, Simone
Link to Full Record:
Book Title:
Proceedings of the 17th European Conference on Computer Vision, ECCV 2022
Published in: