DOI: 10.1145/1015330.1015409
Article

A fast iterative algorithm for fisher discriminant using heterogeneous kernels

Published: 04 July 2004

Abstract

We propose a fast iterative classification algorithm for Kernel Fisher Discriminant (KFD) using heterogeneous kernel models. In contrast to the standard KFD, which requires the user to predefine a kernel function, we incorporate the task of choosing an appropriate kernel into the optimization problem to be solved. The chosen kernel is defined as a linear combination of kernels belonging to a potentially large family of different positive semidefinite kernels. The complexity of our algorithm does not increase significantly with the number of kernels in the kernel family. Experiments on several benchmark datasets demonstrate that the generalization performance of the proposed algorithm is not significantly different from that achieved by the standard KFD in which the kernel parameters have been tuned using cross validation. We also present results on a real-life colon cancer dataset that demonstrate the efficiency of the proposed method.
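
The abstract describes the approach only at a high level. As a rough illustration of the general idea, and not the paper's actual algorithm, the NumPy sketch below alternates between a regularized least-squares fit of the discriminant expansion coefficients for the current combined kernel and a multiplicative update of the nonnegative kernel weights. The Gaussian kernel family, the least-squares reading of KFD, the update rule, and all function names (rbf_kernel, fit_hetero_kfd, predict) are assumptions introduced here for illustration.

```python
# Illustrative sketch only, not the paper's exact formulation: alternating
# optimization over the discriminant expansion coefficients alpha and the
# kernel-combination weights a, with K(a) = sum_j a_j * K_j, a_j >= 0, sum_j a_j = 1.
import numpy as np


def rbf_kernel(X, Z, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Z."""
    sq = (X ** 2).sum(1)[:, None] + (Z ** 2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * sq)


def fit_hetero_kfd(X, y, gammas=(0.01, 0.1, 1.0), lam=1e-2, n_iter=20, eta=0.1):
    """Fit a KFD-style classifier with a heterogeneous (combined) kernel.

    y holds +/-1 labels.  The alpha-step uses the regularized least-squares
    reading of KFD; a bias term is omitted for brevity.
    """
    n = X.shape[0]
    Ks = np.stack([rbf_kernel(X, X, g) for g in gammas])   # (m, n, n) base kernels
    a = np.full(len(gammas), 1.0 / len(gammas))            # uniform initial weights

    for _ in range(n_iter):
        K = np.tensordot(a, Ks, axes=1)                    # combined kernel K(a)

        # alpha-step: minimize ||y - K alpha||^2 + lam * alpha' K alpha
        alpha = np.linalg.solve(K + lam * np.eye(n), y)

        # a-step: gradient of the same objective with respect to the weights a
        resid = y - K @ alpha
        grad = np.array([-2.0 * (Kj @ alpha) @ resid + lam * alpha @ Kj @ alpha
                         for Kj in Ks])
        a = a * np.exp(-eta * (grad - grad.max()))         # multiplicative update;
        a /= a.sum()                                       # weights stay on the simplex

    return alpha, a


def predict(X_train, alpha, a, gammas, X_test):
    """Sign of the learned discriminant on new points."""
    K_new = sum(w * rbf_kernel(X_test, X_train, g) for w, g in zip(a, gammas))
    return np.sign(K_new @ alpha)
```

In a mathematical-programming setting, the weight update would more naturally be a small constrained quadratic program, since the combined kernel enters the objective linearly in the weights; the multiplicative update above is only a compact stand-in that keeps the weights nonnegative and summing to one.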




Published In

ICML '04: Proceedings of the twenty-first international conference on Machine learning
July 2004
934 pages
ISBN:1581138385
DOI:10.1145/1015330
  • Conference Chair: Carla Brodley

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Binary Classification
  2. Heterogeneous Kernels
  3. Linear Fisher Discriminant
  4. Mathematical Programming

Qualifiers

  • Article

Acceptance Rates

Overall Acceptance Rate 140 of 548 submissions, 26%


Article Metrics

  • Downloads (last 12 months): 2
  • Downloads (last 6 weeks): 0
Reflects downloads up to 14 Sep 2024


Cited By

  • (2023) A Novel Cluster Matching-Based Improved Kernel Fisher Criterion for Image Classification in Unsupervised Domain Adaptation. Symmetry, 15(6), 1163. https://doi.org/10.3390/sym15061163
  • (2022) Kernel Learning Estimation: A Model-Free Approach to Tracking Randomly Moving Object. Emerging IT/ICT and AI Technologies Affecting Society, 55-69. https://doi.org/10.1007/978-981-19-2940-3_4
  • (2022) Coupled support tensor machine classification for multimodal neuroimaging data. Statistical Analysis and Data Mining: The ASA Data Science Journal, 15(6), 797-818. https://doi.org/10.1002/sam.11587
  • (2018) Feature Extraction of Electronic Nose Signals Using QPSO-Based Multiple KFDA Signal Processing. Sensors, 18(2), 388. https://doi.org/10.3390/s18020388
  • (2018) Feature Selection Using Metaheuristic Algorithms on Medical Datasets. Harmony Search and Nature Inspired Optimization Algorithms, 923-937. https://doi.org/10.1007/978-981-13-0761-4_87
  • (2016) Hybrid data mining model for the classification and prediction of medical datasets. International Journal of Knowledge Engineering and Soft Data Paradigms, 5(3-4), 262-284. https://doi.org/10.5555/3124875.3124883
  • (2016) Some Additional Topics. Twin Support Vector Machines, 153-192. https://doi.org/10.1007/978-3-319-46186-1_7
  • (2014) An ordinal kernel trick for a computationally efficient support vector machine. 2014 International Joint Conference on Neural Networks (IJCNN), 3930-3937. https://doi.org/10.1109/IJCNN.2014.6889884
  • (2014) Avoiding the Cluster Hypothesis in SV Classification of Partially Labeled Data. Recent Advances of Neural Network Models and Applications, 33-40. https://doi.org/10.1007/978-3-319-04129-2_4
  • (2013) Universal Blind Image Quality Assessment Metrics Via Natural Scene Statistics and Multiple Kernel Learning. IEEE Transactions on Neural Networks and Learning Systems, 24(12), 2013-2026. https://doi.org/10.1109/TNNLS.2013.2271356
