
Scattering transform framework for unmixing of hyperspectral data

Journal Article


Abstract


  • © 2019 by the authors. The scattering transform, which applies multiple convolutions using known filters targeting different scales of time or frequency, is structurally similar to convolutional neural networks (CNNs) but requires no training to learn the convolution filters, and has been used for hyperspectral image classification in recent research. This paper investigates a scattering transform framework for hyperspectral unmixing (STFHU). While state-of-the-art research on unmixing hyperspectral data with scattering transforms is limited, the proposed end-to-end method applies pixel-based scattering transforms and preliminary three-dimensional (3D) scattering transforms to remote-sensing hyperspectral images to extract feature vectors, which are then used to train a k-nearest-neighbor (k-NN) regression model that estimates the abundance maps of endmembers. Experiments compare the performance of the proposed algorithm with a series of existing methods in quantitative terms on both synthetic data and real-world hyperspectral datasets. Results indicate that the proposed approach is more robust to additive noise, which is suppressed by exploiting the rich information in both the high-frequency and low-frequency components represented by the scattering transform. Furthermore, the proposed method achieves higher unmixing accuracy than all comparative approaches when given the same amount of training data, and matches the performance of the best-performing CNN method while using much less training data.
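The paper's exact pipeline is not reproduced here, but the two stages the abstract describes — per-pixel scattering features followed by k-NN regression of abundances — can be sketched as follows. This is a minimal illustration, assuming simple Morlet-style band-pass filters and first-order scattering only; the function names, filter shapes, and scales are hypothetical, not taken from the paper.

```python
import numpy as np

def scattering_features(spectrum, scales=(2, 4, 8)):
    """First-order scattering sketch for one pixel's spectrum:
    band-pass filter at several scales, take the modulus, then
    low-pass average to get translation-invariant coefficients.
    (Hypothetical filters; the paper's filter bank may differ.)"""
    n = len(spectrum)
    t = np.arange(n) - n // 2
    feats = [spectrum.mean()]  # zeroth-order coefficient: low-pass average
    for s in scales:
        # Morlet-style wavelet at scale s: Gaussian envelope x cosine carrier
        psi = np.exp(-0.5 * (t / s) ** 2) * np.cos(2 * np.pi * t / s)
        u = np.abs(np.convolve(spectrum, psi, mode="same"))  # modulus of response
        feats.append(u.mean())  # low-pass -> one scattering coefficient per scale
    return np.array(feats)

def knn_abundance(query_feat, train_feats, train_abund, k=3):
    """k-NN regression: average the abundance vectors of the k training
    pixels nearest to the query in scattering-feature space."""
    d = np.linalg.norm(train_feats - query_feat, axis=1)
    idx = np.argsort(d)[:k]
    return train_abund[idx].mean(axis=0)

# Toy usage: linear mixtures of two synthetic endmember spectra
rng = np.random.default_rng(0)
e1 = np.sin(np.linspace(0, 3, 64))
e2 = np.cos(np.linspace(0, 3, 64))
alphas = rng.uniform(0, 1, 50)
X = np.stack([a * e1 + (1 - a) * e2 for a in alphas])        # mixed pixels
F = np.stack([scattering_features(x) for x in X])            # feature vectors
A = np.stack([[a, 1 - a] for a in alphas])                   # true abundances
pred = knn_abundance(scattering_features(0.5 * e1 + 0.5 * e2), F, A, k=3)
```

Because the prediction is an average of training abundance vectors, it inherits their sum-to-one property, which is one reason k-NN regression is a natural fit for abundance estimation.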

Authors


  •   Zeng, Yiliang (external author)
  •   Ritz, Christian H.
  •   Zhao, Jiahong (external author)
  •   Lan, Jinhui (external author)

Publication Date


  • 2019

Citation


  • Y. Zeng, C. Ritz, J. Zhao & J. Lan, "Scattering transform framework for unmixing of hyperspectral data," Remote Sensing, vol. 11, no. 23, 2019.

Scopus Eid


  • 2-s2.0-85076516351

Volume


  • 11

Issue


  • 23

Place Of Publication


  • Switzerland
