An Iterative Method for Deciding SVM and Single Layer Neural Network Structures
Abstract: We present two new classifiers for two-class classification problems using a new Beta-SVM kernel transformation and an iterative algorithm to concurrently select the support vectors for a support vector machine (SVM) and the hidden units for a single hidden layer neural network to achieve a better generalization performance. To construct the classifiers, the contributing data points are chosen on the basis of a thresholding scheme of the outputs of a single perceptron trained using all training data samples. The chosen support vectors are used to construct a new SVM classifier that we call Beta-SVN. The number of chosen support vectors is used to determine the structure of the hidden layer in a single hidden layer neural network that we call Beta-NN. The Beta-SVN and Beta-NN structures produced by our method outperformed other commonly used classifiers when tested on a 2-dimensional non-linearly separable data set.
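The selection scheme the abstract describes can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' implementation: the Beta kernel and the paper's exact threshold are not given in this record, so a plain perceptron rule, a guessed quantile threshold, and toy data stand in for them.

```python
# Illustrative sketch, assuming: a perceptron is trained on all samples,
# points whose raw outputs fall near the decision boundary are kept as
# support-vector candidates, and their count sizes the hidden layer.
# The 0.3 quantile threshold and the toy data are assumptions, not from
# the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D non-linearly separable data: label by distance from the origin.
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0, 1, -1)

# Single perceptron trained on all training samples (fixed epoch budget,
# since the data are not linearly separable and the rule cannot converge).
w, b = np.zeros(2), 0.0
for _ in range(50):
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:
            w += yi * xi
            b += yi

# Thresholding scheme: keep the points whose perceptron output is closest
# to the decision boundary; these are the support-vector candidates.
scores = X @ w + b
tau = np.quantile(np.abs(scores), 0.3)
chosen = np.abs(scores) <= tau

# The count of chosen points determines the hidden-layer size of the
# companion single hidden layer network (the Beta-NN analogue).
n_hidden = int(chosen.sum())
print("chosen support-vector candidates:", n_hidden)
```

An SVM would then be fit on `X[chosen]` only; the record does not specify the Beta kernel's form, so that step is omitted here.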
Detailed description
Author: Hamdani, Tarek M. [author]
Format: Article
Language: English
Published: 2011
Keywords: Neural and statistical pattern recognition; Support vector machines; Single layer neural network; Kernel function; Beta function
Note: © Springer Science+Business Media, LLC. 2011
Parent work: Contained in: Neural processing letters - Springer US, 1994, 33(2011), 2, 23 Jan., pages 171-186
Parent work: volume:33 ; year:2011 ; number:2 ; day:23 ; month:01 ; pages:171-186
Links:
DOI / URN: 10.1007/s11063-011-9171-3
Catalog ID: OLC2044706652
LEADER 01000caa a22002652 4500
001 OLC2044706652
003 DE-627
005 20230503210136.0
007 tu
008 200820s2011 xx ||||| 00| ||eng c
024 7 |a 10.1007/s11063-011-9171-3 |2 doi
035 |a (DE-627)OLC2044706652
035 |a (DE-He213)s11063-011-9171-3-p
040 |a DE-627 |b ger |c DE-627 |e rakwb
041 |a eng
082 0 4 |a 000 |q VZ
100 1 |a Hamdani, Tarek M. |e verfasserin |4 aut
245 1 0 |a An Iterative Method for Deciding SVM and Single Layer Neural Network Structures
264 1 |c 2011
336 |a Text |b txt |2 rdacontent
337 |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338 |a Band |b nc |2 rdacarrier
500 |a © Springer Science+Business Media, LLC. 2011
520 |a Abstract We present two new classifiers for two-class classification problems using a new Beta-SVM kernel transformation and an iterative algorithm to concurrently select the support vectors for a support vector machine (SVM) and the hidden units for a single hidden layer neural network to achieve a better generalization performance. To construct the classifiers, the contributing data points are chosen on the basis of a thresholding scheme of the outputs of a single perceptron trained using all training data samples. The chosen support vectors are used to construct a new SVM classifier that we call Beta-SVN. The number of chosen support vectors is used to determine the structure of the hidden layer in a single hidden layer neural network that we call Beta-NN. The Beta-SVN and Beta-NN structures produced by our method outperformed other commonly used classifiers when tested on a 2-dimensional non-linearly separable data set.
650 4 |a Neural and statistical pattern recognition
650 4 |a Support vector machines
650 4 |a Single layer neural network
650 4 |a Kernel function
650 4 |a Beta function
700 1 |a Alimi, Adel M. |4 aut
700 1 |a Khabou, Mohamed A. |4 aut
773 0 8 |i Enthalten in |t Neural processing letters |d Springer US, 1994 |g 33(2011), 2 vom: 23. Jan., Seite 171-186 |w (DE-627)198692617 |w (DE-600)1316823-X |w (DE-576)052842762 |x 1370-4621 |7 nnns
773 1 8 |g volume:33 |g year:2011 |g number:2 |g day:23 |g month:01 |g pages:171-186
856 4 1 |u https://doi.org/10.1007/s11063-011-9171-3 |z lizenzpflichtig |3 Volltext
912 |a GBV_USEFLAG_A
912 |a SYSFLAG_A
912 |a GBV_OLC
912 |a SSG-OLC-PSY
912 |a SSG-OLC-MAT
912 |a GBV_ILN_70
951 |a AR
952 |d 33 |j 2011 |e 2 |b 23 |c 01 |h 171-186