Layer-fusion for online mutual knowledge distillation

Abstract Online knowledge distillation opens the door to distillation among parallel student networks, removing the heavy reliance on a pre-trained teacher model. Additional feature-fusion schemes further create a positive training loop among the parallel student networks. However, current feat...
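Since the abstract above is truncated, the following is only a minimal Python/PyTorch sketch of the general idea it names: online mutual distillation between two parallel students, in the style of deep mutual learning. It is not the paper's layer-fusion method, and the names (mutual_kd_loss, T, alpha) are hypothetical.

# Illustrative sketch of online mutual knowledge distillation between two
# parallel students (deep-mutual-learning style), NOT the paper's
# layer-fusion method; all names here are hypothetical.
import torch
import torch.nn.functional as F

def mutual_kd_loss(logits_a, logits_b, targets, T=3.0, alpha=0.5):
    """Cross-entropy on ground truth plus symmetric KL between peer students."""
    ce = F.cross_entropy(logits_a, targets) + F.cross_entropy(logits_b, targets)
    # Soften both students' predictions with temperature T.
    p_a = F.log_softmax(logits_a / T, dim=1)
    p_b = F.log_softmax(logits_b / T, dim=1)
    # Each student mimics the other's (detached) soft predictions,
    # so the two networks teach each other online, without a pre-trained teacher.
    kl_ab = F.kl_div(p_a, p_b.detach().exp(), reduction="batchmean")
    kl_ba = F.kl_div(p_b, p_a.detach().exp(), reduction="batchmean")
    return ce + alpha * (T * T) * (kl_ab + kl_ba)

# Example: two students' logits for a batch of 8 samples over 10 classes.
logits_a, logits_b = torch.randn(8, 10), torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = mutual_kd_loss(logits_a, logits_b, targets)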

Saved in:
Author:

Hu, Gan [author]

Ji, Yanli

Liang, Xingzhu

Han, Yuexing

Format:

E-Article

Language:

English

Published:

2022

Keywords:

Online learning

Knowledge distillation

Feature fusion

Mutual learning

Note:

© The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2022. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Parent work:

Contained in: Multimedia systems - Berlin : Springer, 1993, 29(2022), 2, 10 Nov., pages 787-796

Links:

Full text

DOI / URN:

10.1007/s00530-022-01021-6

Catalog ID:

SPR049486691
