Generalization-error-bound-based discriminative dictionary learning
Abstract: Support vector guided dictionary learning, a discriminative dictionary learning method combined with the support vector machine (SVM), embodies the margin maximization principle and achieves good generalization performance in many practical applications. However, this method ignores the key fact that the generalization performance of the SVM classifier depends not only on the margin between the two classes of training samples but also on the radius of the smallest sphere covering them. In this paper, we propose a novel method called generalization-error-bound-based discriminative dictionary learning (GEBDDL). The basic insight of GEBDDL is that the coding vectors used to build the SVM classifier are not fixed during the learning process; as a result, the radius of the smallest sphere changes with the learned coding vectors. The key feature of GEBDDL is that it explicitly incorporates the radius-margin bound, which is directly related to the upper bound of the leave-one-out error of the SVM, into its objective function to guide learning the dictionary and the coding vectors and building the SVM classifier. We first elaborate our motivation and propose the optimization model, and then discuss in detail how to solve it. Further, we explore how to approximate the radius of the smallest sphere; this improves computational efficiency by bypassing the quadratic programming problem of computing the radius while yielding performance close to GEBDDL. Finally, comprehensive experiments on several benchmark datasets demonstrate the superiority of the proposed methods over the competing methods.
Detailed description

Author: Zhang, Kaifang [author]
Format: Article
Language: English
Published: 2021
Keywords: Discriminative dictionary learning; Collaborative representation; Support vector machine; Generalization error bound
Note: © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2021
Contained in: The visual computer - Springer Berlin Heidelberg, 1985, 38(2021), issue 8, 17 May, pages 2853-2869
Contained in: volume:38 ; year:2021 ; number:8 ; day:17 ; month:05 ; pages:2853-2869
DOI / URN: 10.1007/s00371-021-02160-z
Catalog ID: OLC2079186728
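The radius-margin bound referred to in the abstract is, in its classical form, Vapnik's leave-one-out bound for a hard-margin SVM. The sketch below states it from the standard SVM literature (it is not quoted from the paper); R denotes the radius of the smallest sphere enclosing the l training points (in GEBDDL, the coding vectors) and gamma = 1/||w|| the geometric margin of the separating hyperplane.

```latex
% Classical radius-margin (leave-one-out) bound for a hard-margin SVM,
% stated from the standard SVM literature, not quoted from the paper;
% the exact constant factor depends on the formulation used.
\[
  \text{LOO-error} \;\lesssim\; \frac{R^{2}\,\lVert w \rVert^{2}}{l}
  \;=\; \frac{1}{l}\left(\frac{R}{\gamma}\right)^{2},
  \qquad \gamma = \frac{1}{\lVert w \rVert}.
\]
```

Shrinking either the enclosing radius R or the weight norm ||w|| (equivalently, enlarging the margin gamma) tightens this bound, which is the quantity the abstract says GEBDDL builds into its objective function.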
LEADER  01000caa a22002652 4500
001     OLC2079186728
003     DE-627
005     20230506040751.0
007     tu
008     221220s2021 xx ||||| 00| ||eng c
024 7   |a 10.1007/s00371-021-02160-z |2 doi
035     |a (DE-627)OLC2079186728
035     |a (DE-He213)s00371-021-02160-z-p
040     |a DE-627 |b ger |c DE-627 |e rakwb
041     |a eng
082 04  |a 004 |q VZ
100 1   |a Zhang, Kaifang |e verfasserin |0 (orcid)0000-0001-5196-6532 |4 aut
245 10  |a Generalization-error-bound-based discriminative dictionary learning
264  1  |c 2021
336     |a Text |b txt |2 rdacontent
337     |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338     |a Band |b nc |2 rdacarrier
500     |a © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2021
520     |a Abstract Support vector guided dictionary learning, as a discriminative dictionary learning method combining with support vector machine (SVM), embodies the margin maximization principle and achieves good generalization performances in many practical applications. However, this method ignores the key fact that the generalization performance of the SVM classifier depends not only on the margin between two classes of training samples, but also on the radius of the smallest sphere covering them. In the paper, we propose a novel method called generalization-error-bound-based discriminative dictionary learning (GEBDDL). The basic insight of GEBDDL is that the coding vectors, which are used to build the SVM classifier, are not fixed during the learning process. As a result, the radius of the smallest sphere changes with the learned coding vectors. The key feature of GEBDDL is that it explicitly incorporates the radius-margin-bound, which is directly related to the upper bound of the leave-one-out error of SVM, into its objective function to guide learning the dictionary and the coding vectors, and building the SVM classifier. In the paper, we first elaborate our motivation and propose the optimization model and then discuss how to solve it in detail. Further, we explore how to approximate the radius of the smallest sphere in our methodology. This can enhance the computational efficiency by bypassing the quadratic programming problem of computing the radius, while yielding a close performance to GEBDDL. Finally, the comprehensive experiments are conducted on several benchmark datasets, and the results demonstrate the superiority of the proposed methods over the other competing methods.
650  4  |a Discriminative dictionary learning
650  4  |a Collaborative representation
650  4  |a Support vector machine
650  4  |a Generalization error bound
700 1   |a Wang, Xiaoming |0 (orcid)0000-0002-3297-5270 |4 aut
700 1   |a Xu, Tao |4 aut
700 1   |a Du, Yajun |4 aut
700 1   |a Huang, Zengxi |4 aut
773 08  |i Enthalten in |t The visual computer |d Springer Berlin Heidelberg, 1985 |g 38(2021), 8 vom: 17. Mai, Seite 2853-2869 |w (DE-627)12917985X |w (DE-600)52035-4 |w (DE-576)014455897 |x 0178-2789 |7 nnns
773 18  |g volume:38 |g year:2021 |g number:8 |g day:17 |g month:05 |g pages:2853-2869
856 41  |u https://doi.org/10.1007/s00371-021-02160-z |z lizenzpflichtig |3 Volltext
912     |a GBV_USEFLAG_A
912     |a SYSFLAG_A
912     |a GBV_OLC
912     |a SSG-OLC-MAT
912     |a SSG-OLC-GWK
912     |a GBV_ILN_267
912     |a GBV_ILN_2018
912     |a GBV_ILN_4277
951     |a AR
952     |d 38 |j 2021 |e 8 |b 17 |c 05 |h 2853-2869
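To make the abstract's quantities concrete, here is a minimal, hypothetical Python sketch (not the authors' GEBDDL implementation; the data and all names are illustrative assumptions). It trains a linear SVM on stand-in coding vectors, reads the margin off the weight norm, and approximates the enclosing-sphere radius by the largest distance to the centroid, the kind of cheap surrogate the abstract alludes to for bypassing the radius quadratic program.

```python
# Hypothetical illustration of the radius-margin quantity R^2 * ||w||^2
# discussed in the abstract; NOT the authors' implementation of GEBDDL.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in "coding vectors" for two classes (in GEBDDL these would be the
# codes produced by the learned dictionary, and they change as it is learned).
X_pos = rng.normal(loc=+1.0, scale=0.6, size=(60, 20))
X_neg = rng.normal(loc=-1.0, scale=0.6, size=(60, 20))
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(60), -np.ones(60)])

# Linear SVM on the coding vectors; a large C approximates a hard margin.
svm = SVC(kernel="linear", C=1e3).fit(X, y)
w = svm.coef_.ravel()
margin = 1.0 / np.linalg.norm(w)  # geometric margin gamma = 1 / ||w||

# Cheap radius surrogate: largest distance to the centroid. It never
# undershoots the exact minimum-enclosing-sphere radius and overshoots it
# by at most a factor of two, so no quadratic program is needed.
center = X.mean(axis=0)
radius_approx = np.linalg.norm(X - center, axis=1).max()

# Radius-margin quantity that a GEBDDL-style objective would penalize.
radius_margin = radius_approx**2 * np.dot(w, w)  # ~ R^2 * ||w||^2

print(f"geometric margin gamma : {margin:.4f}")
print(f"approximate radius R   : {radius_approx:.4f}")
print(f"R^2 * ||w||^2          : {radius_margin:.4f}")
```

Because the surrogate radius upper-bounds the exact one, minimizing R^2 * ||w||^2 with it still minimizes an upper bound on the same leave-one-out quantity, consistent with the abstract's observation that the approximation yields performance close to GEBDDL at lower computational cost.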