
A kernel-induced space selection approach to model selection of KLDA

Journal Article



Abstract


  • Model selection in kernel linear discriminant analysis (KLDA)
    refers to the selection of appropriate parameters for the kernel
    function and the regularizer. Following the principle of maximum
    information preservation, this paper formulates model selection
    as the problem of selecting an optimal kernel-induced space in
    which different classes are maximally separated from each other.
    A scatter-matrix-based criterion is developed to measure the
    “goodness” of a kernel-induced space, and the kernel parameters
    are tuned by maximizing this criterion. The criterion is
    computationally efficient and differentiable with respect to the
    kernel parameters. Compared with leave-one-out (LOO) or k-fold
    cross-validation (CV), the proposed approach achieves faster
    model selection, especially when the number of training samples
    is large or when many kernel parameters need to be tuned. To tune
    the regularization parameter in KLDA, this criterion is used
    together with the method proposed by Saadi et al. (2004).
    Experiments on benchmark data sets verify the effectiveness of
    the proposed model selection approach.
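
The criterion itself appears only in the full paper; the abstract does not
reproduce it. As a rough, hypothetical sketch only, the Python fragment below
assumes an RBF kernel and uses the standard trace ratio of between-class to
within-class scatter in the kernel-induced feature space as a stand-in for the
paper's scatter-matrix-based criterion, then maximizes it over the kernel
parameter (numpy and scipy assumed; the data, the function names, and the
search bounds are all illustrative, not from the paper):

    import numpy as np
    from scipy.optimize import minimize_scalar

    def rbf_kernel(X, gamma):
        # K_ij = exp(-gamma * ||x_i - x_j||^2)
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
        return np.exp(-gamma * np.maximum(d2, 0.0))

    def separability(K, y):
        # Trace ratio tr(S_b) / tr(S_w) of the feature-space scatter
        # matrices, computable from the kernel matrix K alone:
        # tr(S_t) = sum_i K_ii - (1/n) sum_ij K_ij, and the within-class
        # term is the same expression restricted to each class.
        n = K.shape[0]
        total = np.trace(K) - K.sum() / n
        within = 0.0
        for c in np.unique(y):
            idx = np.where(y == c)[0]
            Kc = K[np.ix_(idx, idx)]
            within += np.trace(Kc) - Kc.sum() / len(idx)
        between = total - within          # tr(S_b) = tr(S_t) - tr(S_w)
        return between / max(within, 1e-12)

    # Toy two-class data (illustrative only).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.repeat([0, 1], 50)

    # Maximize the criterion over log10(gamma) with a bounded search.
    res = minimize_scalar(
        lambda lg: -separability(rbf_kernel(X, 10.0 ** lg), y),
        bounds=(-4.0, 2.0), method="bounded",
    )
    print("selected gamma =", 10.0 ** res.x)

Because the paper's criterion is differentiable in the kernel parameters, a
gradient-based optimizer could replace the bounded scalar search above when
several kernel parameters must be tuned jointly; that property underlies the
speed advantage over LOO and k-fold CV noted in the abstract.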

Authors


  • Wang, L., Chan, K. L., Xue, P. & Zhou, L.

Publication Date


  • 2008

Citation


  • Wang, L., Chan, K. L., Xue, P. & Zhou, L. (2008). A kernel-induced space selection approach to model selection of KLDA. IEEE Transactions on Neural Networks, 19 (12), 2116-2131.

Scopus EID


  • 2-s2.0-57749100079

RO Full-text URL


  • http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1467&context=eispapers

RO Metadata URL


  • http://ro.uow.edu.au/eispapers/462

Number of Pages


  • 15

Start Page


  • 2116

End Page


  • 2131

Volume


  • 19

Issue


  • 12

Place of Publication


  • United States
