Global convergence analysis of decomposition methods for support vector regression

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Decomposition methods have been widely used to efficiently solve the large-scale quadratic programming (QP) problems arising in support vector regression (SVR). In a decomposition method, a large QP problem is decomposed into a series of smaller QP subproblems, each of which can be solved much faster than the original problem. In this paper, we analyze the global convergence of decomposition methods for SVR. We show that the decomposition methods for the convex programming problem formulated by Flake and Lawrence always stop within a finite number of iterations.
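The abstract describes the working-set idea only briefly. The following is a minimal sketch of such a decomposition loop for the standard SVR dual, assuming a Gaussian kernel, a random working set, and SciPy's SLSQP routine as the subproblem solver; it does not reproduce the formulation of Flake and Lawrence or the working-set selection rule analyzed in the paper.

```python
# Sketch of a working-set decomposition loop for the standard SVR dual.
# Assumptions (illustrative, not from the paper): Gaussian kernel,
# random working-set selection, SciPy's SLSQP as the subproblem solver.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise
n = 40
X = np.linspace(-3, 3, n).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(n)

C, eps, gamma = 10.0, 0.1, 0.5
K = np.exp(-gamma * (X - X.T) ** 2)          # Gaussian kernel matrix

def dual_objective(u):
    """Standard SVR dual in u = [alpha, alpha_star] (to be minimized)."""
    a, a_star = u[:n], u[n:]
    d = a - a_star
    return 0.5 * d @ K @ d + eps * np.sum(a + a_star) - y @ d

def equality(u):
    """Equality constraint: sum(alpha - alpha_star) = 0."""
    return np.sum(u[:n] - u[n:])

u = np.zeros(2 * n)                           # feasible starting point
q = 4                                         # working-set size
for it in range(300):
    B = rng.choice(2 * n, size=q, replace=False)   # random working set

    def sub_obj(v, B=B):
        # Optimize only the working-set coordinates; all others stay fixed.
        w = u.copy(); w[B] = v
        return dual_objective(w)

    def sub_eq(v, B=B):
        w = u.copy(); w[B] = v
        return equality(w)

    res = minimize(sub_obj, u[B], method="SLSQP",
                   bounds=[(0.0, C)] * q,
                   constraints=[{"type": "eq", "fun": sub_eq}])
    if res.success:
        u[B] = res.x

print("final dual objective:", dual_objective(u))
```

Practical implementations select the working set from KKT violations and solve the small subproblems analytically; the random selection above is only meant to show how the large QP is reduced to a sequence of small ones.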

Original language: English
Title of host publication: Advances in Neural Networks - ISNN 2008 - 5th International Symposium on Neural Networks, ISNN 2008, Proceedings
Publisher: Springer Verlag
Pages: 663-673
Number of pages: 11
Edition: PART 1
ISBN (Print): 3540877312, 9783540877318
DOIs
Publication status: Published - 2008
Externally published: Yes
Event: 5th International Symposium on Neural Networks, ISNN 2008 - Beijing, China
Duration: Sept 24, 2008 – Sept 28, 2008

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 5263 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 5th International Symposium on Neural Networks, ISNN 2008
Country/Territory: China
City: Beijing
Period: 9/24/08 – 9/28/08

Keywords

  • Decomposition method
  • Global convergence
  • Support vector regression

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

