
Efficient hyperspectral image segmentation for biosecurity scanning using knowledge distillation from multi-head teacher

Journal Article


Abstract


  • Invasive foreign species can damage a country's environment and economy. To automatically monitor biosecurity threats at country borders, this paper investigates compact deep networks for accurate, real-time object segmentation in hyperspectral images. Knowledge distillation (KD) approaches compress a model by distilling the knowledge of a large teacher network into a compact student network. However, when the student is over-compressed, the performance of standard KD methods degrades significantly because of the large capacity gap between the teacher and the student. This gap can be bridged by adding medium-sized teacher assistants, but training them incurs significant computation and is therefore impractical. To address this problem, this paper proposes a new framework, Knowledge Distillation from Multi-head Teacher (KDM), which distills the knowledge of a multi-head teacher into the student. By encapsulating multiple teachers in a single network, the proposed KDM assists the learning of a very compact student and significantly reduces training time. We also introduce Bio-HSI, a large benchmark dataset of 3,125 high-resolution hyperspectral images with dense segmentation ground truth, which can be expected to advance research on deep models for hyperspectral image segmentation. Evaluated on this dataset, the student trained via KDM has 762 times fewer parameters than the state-of-the-art segmentation model (HRNet) while achieving competitive accuracy.
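The abstract builds on soft-target knowledge distillation, where the student is trained to match the teacher's temperature-softened class distribution. Below is a minimal pure-Python sketch of that idea, extended to average the loss over several teacher heads; the function names and the simple averaging scheme are illustrative assumptions, not the paper's exact KDM formulation.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of class logits."""
    m = max(logits)  # shift for numerical stability
    exps = [math.exp((z - m) / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target distillation loss: KL(teacher || student) at
    temperature T, scaled by T^2 as is conventional (Hinton et al.)."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (T ** 2) * kl

def multi_head_kd_loss(student_logits, teacher_heads, T=2.0):
    """Average the distillation loss over the teacher's heads, so one
    network stands in for several intermediate-capacity teachers."""
    losses = [kd_loss(student_logits, head, T) for head in teacher_heads]
    return sum(losses) / len(losses)
```

For per-pixel segmentation, a loss like this would be computed on each pixel's class logits and combined with the usual cross-entropy against the ground-truth labels.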

Publication Date


  • 2022

Citation


  • Phan, M. H., Phung, S. L., Luu, K., & Bouzerdoum, A. (2022). Efficient hyperspectral image segmentation for biosecurity scanning using knowledge distillation from multi-head teacher. Neurocomputing, 504, 189-203. doi:10.1016/j.neucom.2022.06.095

Scopus Eid


  • 2-s2.0-85134627366

Start Page


  • 189

End Page


  • 203

Volume


  • 504
