Support vector machine via nonlinear rescaling method
Abstract: In this paper we construct the linear support vector machine (SVM) based on the nonlinear rescaling (NR) methodology (see [Polyak in Math Program 54:177–222, 1992; Polyak in Math Program Ser A 92:197–235, 2002; Polyak and Teboulle in Math Program 76:265–284, 1997] and references therein). The formulation of the linear SVM based on the NR method leads to an algorithm which reduces the number of support vectors without compromising the classification performance compared to the linear soft-margin SVM formulation. The NR algorithm computes both the primal and the dual approximation at each step. The dual variables associated with the given data set provide important information about each data point and play the key role in selecting the set of support vectors. Experimental results on ten benchmark classification problems show that the NR formulation is feasible. The quality of discrimination is, in most instances, comparable to the linear soft-margin SVM, while the number of support vectors was, in several instances, substantially reduced.
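For orientation, the following is a minimal mathematical sketch based on the NR references cited in the abstract, not on this record itself: the standard linear soft-margin SVM primal, followed by the schematic NR constraint rescaling and multiplier update.

```latex
% Standard linear soft-margin SVM primal over training pairs (x_i, y_i), i = 1..m:
\[
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\lVert w \rVert^2 + C\sum_{i=1}^{m}\xi_i
\quad\text{s.t.}\quad y_i\bigl(w^{\top}x_i + b\bigr) \ge 1 - \xi_i,\quad \xi_i \ge 0.
\]
% NR scheme (schematic): for a smooth, increasing, strictly concave \psi with
% \psi(0) = 0 and \psi'(0) = 1, each constraint c_i(z) \ge 0 is equivalent to
% k^{-1}\psi(k\,c_i(z)) \ge 0 for any k > 0. One alternates minimizing the
% rescaled Lagrangian in the primal variables z,
\[
\mathcal{L}_k(z,\lambda) \;=\; f(z) \;-\; \frac{1}{k}\sum_{i}\lambda_i\,\psi\bigl(k\,c_i(z)\bigr),
\]
% with the explicit multiplier update
\[
\lambda_i^{s+1} \;=\; \lambda_i^{s}\,\psi'\bigl(k\,c_i(z^{s+1})\bigr).
\]
```

These λ_i are the dual variables that, per the abstract, carry the per-point information used to select the support vectors.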
Detailed description

Author: Polyak, Roman [author]
Format: Electronic article
Language: English
Published: 2006
Keywords: Support vector machine; Convex optimization; Duality; Lagrange multipliers
Note: © Springer-Verlag 2006
Contained in: Optimization Letters - Berlin: Springer, 2007; volume 1 (2006), issue 4, 11 November, pages 367-378
DOI: 10.1007/s11590-006-0033-2
Catalog ID: SPR020954530
LEADER 01000caa a22002652 4500
001 SPR020954530
003 DE-627
005 20230330164146.0
007 cr uuu---uuuuu
008 201006s2006 xx |||||o 00| ||eng c
024 7_ |a 10.1007/s11590-006-0033-2 |2 doi
035 __ |a (DE-627)SPR020954530
035 __ |a (SPR)s11590-006-0033-2-e
040 __ |a DE-627 |b ger |c DE-627 |e rakwb
041 __ |a eng
100 1_ |a Polyak, Roman |e verfasserin |4 aut
245 10 |a Support vector machine via nonlinear rescaling method
264 _1 |c 2006
336 __ |a Text |b txt |2 rdacontent
337 __ |a Computermedien |b c |2 rdamedia
338 __ |a Online-Ressource |b cr |2 rdacarrier
500 __ |a © Springer-Verlag 2006
520 __ |a Abstract In this paper we construct the linear support vector machine (SVM) based on the nonlinear rescaling (NR) methodology (see [Polyak in Math Program 54:177–222, 1992; Polyak in Math Program Ser A 92:197–235, 2002; Polyak and Teboulle in Math Program 76:265–284, 1997] and references therein). The formulation of the linear SVM based on the NR method leads to an algorithm which reduces the number of support vectors without compromising the classification performance compared to the linear soft-margin SVM formulation. The NR algorithm computes both the primal and the dual approximation at each step. The dual variables associated with the given data-set provide important information about each data point and play the key role in selecting the set of support vectors. Experimental results on ten benchmark classification problems show that the NR formulation is feasible. The quality of discrimination, in most instances, is comparable to the linear soft-margin SVM while the number of support vectors in several instances were substantially reduced.
650 _4 |a Support vector machine |7 (dpeaa)DE-He213
650 _4 |a Convex optimization |7 (dpeaa)DE-He213
650 _4 |a Duality |7 (dpeaa)DE-He213
650 _4 |a Lagrange multipliers |7 (dpeaa)DE-He213
700 1_ |a Ho, Shen-Shyang |4 aut
700 1_ |a Griva, Igor |4 aut
773 08 |i Enthalten in |t Optimization letters |d Berlin : Springer, 2007 |g 1(2006), 4 vom: 11. Nov., Seite 367-378 |w (DE-627)534676499 |w (DE-600)2374345-1 |x 1862-4480 |7 nnns
773 18 |g volume:1 |g year:2006 |g number:4 |g day:11 |g month:11 |g pages:367-378
856 40 |u https://dx.doi.org/10.1007/s11590-006-0033-2 |z lizenzpflichtig |3 Volltext
912 __ |a GBV_USEFLAG_A
912 __ |a SYSFLAG_A
912 __ |a GBV_SPRINGER
951 __ |a AR
952 __ |d 1 |j 2006 |e 4 |b 11 |c 11 |h 367-378
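To make the algorithm the abstract describes concrete, here is a minimal, hypothetical Python sketch of the NR outer loop for the linear soft-margin SVM, using the exponential transformation ψ(t) = 1 − e^(−t) in the spirit of the cited Polyak–Teboulle work. The function name `nr_linear_svm`, the fixed scaling parameter `k`, the BFGS inner solver, and the support-vector threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the NR outer loop for a linear soft-margin SVM.
import numpy as np
from scipy.optimize import minimize

def nr_linear_svm(X, y, C=1.0, k=5.0, outer_iters=30, tol=1e-6):
    """NR sketch: z packs (w, b, xi); psi(t) = 1 - exp(-t)."""
    m, n = X.shape
    psi = lambda t: 1.0 - np.exp(-t)       # psi(0) = 0, psi'(0) = 1, concave
    dpsi = lambda t: np.exp(-t)            # psi'(t), drives the dual update

    def constraints(z):
        w, b, xi = z[:n], z[n], z[n + 1:]
        margins = y * (X @ w + b) - 1.0 + xi   # y_i(w.x_i + b) >= 1 - xi_i
        return np.concatenate([margins, xi])   # plus xi_i >= 0

    def rescaled_lagrangian(z, lam):
        w, xi = z[:n], z[n + 1:]
        obj = 0.5 * w @ w + C * xi.sum()
        return obj - lam @ psi(k * constraints(z)) / k

    z = np.zeros(n + 1 + m)
    z[n + 1:] = 1.0                        # xi = 1: all constraints hold at start
    lam = np.ones(2 * m)                   # initial dual estimate
    for _ in range(outer_iters):
        # Inner step: unconstrained minimization of the rescaled Lagrangian.
        z = minimize(rescaled_lagrangian, z, args=(lam,), method="BFGS").x
        new_lam = lam * dpsi(k * constraints(z))   # NR multiplier update
        if np.max(np.abs(new_lam - lam)) < tol:
            lam = new_lam
            break
        lam = new_lam
    w, b = z[:n], z[n]
    # Heuristic: points whose margin multipliers stay large are the
    # candidate support vectors the abstract refers to.
    support = lam[:m] > 1e-3
    return w, b, lam, support

# Tiny synthetic usage example:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.5, 0.5, (20, 2)), rng.normal(1.5, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
w, b, lam, support = nr_linear_svm(X, y)
print(f"support vectors: {support.sum()} of {len(y)}")
```

Real NR implementations per the cited references additionally adapt the scaling parameter and safeguard the inner minimization; this sketch only illustrates the primal-dual structure of one outer iteration.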