
Absent Multiple Kernel Learning Algorithms

Journal Article


Abstract


  • Multiple kernel learning (MKL) has been intensively studied during the past decade. It optimally combines the multiple channels of each sample to improve classification performance. However, existing MKL algorithms cannot effectively handle the situation where some channels of the samples are missing, which is not uncommon in practical applications. This paper proposes three absent MKL (AMKL) algorithms to address this issue. Different from existing approaches, where missing channels are first imputed and then a standard MKL algorithm is deployed on the imputed data, our algorithms directly classify each sample based on its observed channels, without performing imputation. Specifically, we define a margin for each sample in its own relevant space, the space corresponding to the observed channels of that sample. The proposed AMKL algorithms then maximize the minimum of all sample-based margins, which leads to a difficult optimization problem. We first provide two two-step iterative algorithms to approximately solve this problem. After that, we show that this problem can be reformulated as a convex one by applying the representer theorem, making it readily solvable via existing convex optimization packages. In addition, we provide a generalization error bound to justify the proposed AMKL algorithms from a theoretical perspective. Extensive experiments are conducted on nine UCI and six MKL benchmark datasets to compare the proposed algorithms with existing imputation-based methods. As demonstrated, our algorithms achieve superior performance, and the improvement becomes more significant as the missing ratio increases.
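The sample-based margin described in the abstract can be illustrated with a minimal sketch: each sample is scored using only its observed channels (base kernels), so no imputation of missing channels is needed. All names below (`base_kernels`, `mask`, `alpha`, `mu`) are illustrative placeholders, not the paper's actual formulation or API, and the coefficients are random rather than learned.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 3  # number of samples, number of channels (base kernels)

# Random positive semi-definite base kernels, one per channel
base_kernels = []
for _ in range(p):
    A = rng.standard_normal((n, n))
    base_kernels.append(A @ A.T)

# Observation mask: True where a sample's channel is observed.
mask = rng.random((n, p)) > 0.3
mask[:, 0] = True  # ensure every sample has at least one observed channel

y = np.sign(rng.standard_normal(n))  # labels in {-1, +1}
alpha = rng.random(n)                # dual-style coefficients (illustrative, not trained)
mu = np.ones(p) / p                  # uniform kernel combination weights

def decision(i):
    """Score sample i using only its observed channels."""
    observed = np.where(mask[i])[0]
    # Combine only the kernel rows for channels that sample i actually has.
    k_i = sum(mu[m] * base_kernels[m][i] for m in observed)
    return float(np.dot(alpha * y, k_i))

# Sample-based margins: y_i * f(x_i), each computed in the sample's own relevant space.
margins = np.array([y[i] * decision(i) for i in range(n)])
print("minimum sample-based margin:", margins.min())
```

The paper's algorithms then maximize this minimum margin over the learnable quantities; the sketch only shows how a per-sample score can be formed from observed channels alone.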

Publication Date


  • 2020

Citation


  • Liu, X., Wang, L., Zhu, X., Li, M., Zhu, E., Liu, T., . . . Yin, J. (2020). Absent Multiple Kernel Learning Algorithms. IEEE Transactions on Pattern Analysis and Machine Intelligence, 42(6), 1303-1316. doi:10.1109/TPAMI.2019.2895608

Scopus Eid


  • 2-s2.0-85084720832

Start Page


  • 1303

End Page


  • 1316

Volume


  • 42

Issue


  • 6
