Minimal Structure of Self-Organizing HCMAC Neural Network Classifier
Abstract: The authors previously proposed a self-organizing Hierarchical Cerebellar Model Articulation Controller (HCMAC) neural network containing a hierarchical GCMAC neural network and a self-organizing input space module to solve high-dimensional pattern classification problems. This novel neural network exhibits fast learning, a low memory requirement, automatic memory parameter determination, and highly accurate high-dimensional pattern classification. However, the original architecture needs to be hierarchically expanded using a full binary tree topology to solve pattern classification problems according to the dimension of the input vectors. This approach creates many redundant GCMAC nodes when the dimension of the input vectors in the pattern classification problem does not exactly match that in the self-organizing HCMAC neural network. These redundant GCMAC nodes waste memory units and degrade the learning performance of the self-organizing HCMAC neural network. Therefore, this study presents a minimal structure of self-organizing HCMAC (MHCMAC) neural network with the same dimension of input vectors as the pattern classification problem. Additionally, this study compares the learning performance of this novel learning structure with those of the BP neural network, support vector machine (SVM), and original self-organizing HCMAC neural network on ten benchmark pattern classification data sets from the UCI machine learning repository. In particular, the experimental results reveal that the self-organizing MHCMAC neural network handles high-dimensional pattern classification problems better than the BP neural network, SVM, or the original self-organizing HCMAC neural network. Moreover, the proposed self-organizing MHCMAC neural network significantly reduces the memory requirement of the original self-organizing HCMAC neural network, and has a high training speed and higher pattern classification accuracy than the original self-organizing HCMAC neural network on most of the tested benchmark data sets. The experimental results also show that the MHCMAC neural network learns continuous functions well and is suitable for Web page classification.
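The redundant-node argument in the abstract can be made concrete with a short back-of-the-envelope sketch. The sketch below is not taken from the paper: it assumes each GCMAC node fuses two lower-level signals, so a full-binary-tree HCMAC has to pad a d-dimensional input up to the next power of two, whereas the minimal MHCMAC structure grows its tree over exactly d inputs; the function names and the example dimensions are hypothetical.

```python
import math

def full_tree_gcmac_nodes(d: int) -> int:
    """GCMAC nodes in a full-binary-tree HCMAC over a d-dimensional input.

    Illustrative assumption: each GCMAC node fuses two lower-level signals,
    so the tree needs 2**ceil(log2(d)) leaf slots; the slots beyond d
    correspond to the redundant nodes the abstract refers to.
    """
    leaf_slots = 2 ** math.ceil(math.log2(d))
    return leaf_slots - 1  # a full binary tree with L leaves has L - 1 internal nodes

def minimal_gcmac_nodes(d: int) -> int:
    """GCMAC nodes when the tree is grown over exactly d inputs (the MHCMAC idea)."""
    return d - 1  # any binary tree with d leaves has d - 1 internal nodes

if __name__ == "__main__":
    for d in (8, 13, 34):  # hypothetical input dimensions
        full, minimal = full_tree_gcmac_nodes(d), minimal_gcmac_nodes(d)
        print(f"d={d:2d}: full tree {full:2d} nodes, "
              f"minimal {minimal:2d} nodes, redundant {full - minimal}")
```

Under these assumptions, d = 13 gives 15 GCMAC nodes in the padded full tree against 12 in the minimal tree, and the gap widens as the input dimension moves away from a power of two, which is the kind of memory saving the abstract describes.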
Detailed description
Author: Chen, Chih-Ming [author]
Format: Article
Language: English
Published: 2006
Keywords: Cerebellar Model Articulation Controller (CMAC); minimal structure of self-organizing HCMAC (MHCMAC) neural network; self-organizing hierarchical CMAC (HCMAC) neural network
Note: © Springer 2006
Parent work: Contained in: Neural processing letters - Kluwer Academic Publishers, 1994, 23(2006), issue 2, April, pages 201-228
Parent work: volume:23 ; year:2006 ; number:2 ; month:04 ; pages:201-228
Links:
DOI / URN: 10.1007/s11063-006-6277-0
Catalog ID: OLC2044705036
LEADER 01000caa a22002652 4500
001    OLC2044705036
003    DE-627
005    20230503210112.0
007    tu
008    200820s2006 xx ||||| 00| ||eng c
024 7  |a 10.1007/s11063-006-6277-0 |2 doi
035    |a (DE-627)OLC2044705036
035    |a (DE-He213)s11063-006-6277-0-p
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
082 04 |a 000 |q VZ
100 1  |a Chen, Chih-Ming |e verfasserin |4 aut
245 10 |a Minimal Structure of Self-Organizing HCMAC Neural Network Classifier
264  1 |c 2006
336    |a Text |b txt |2 rdacontent
337    |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338    |a Band |b nc |2 rdacarrier
500    |a © Springer 2006
520    |a Abstract The authors previously proposed a self-organizing Hierarchical Cerebellar Model Articulation Controller (HCMAC) neural network containing a hierarchical GCMAC neural network and a self-organizing input space module to solve high-dimensional pattern classification problems. This novel neural network exhibits fast learning, a low memory requirement, automatic memory parameter determination and highly accurate high-dimensional pattern classification. However, the original architecture needs to be hierarchically expanded using a full binary tree topology to solve pattern classification problems according to the dimension of the input vectors. This approach creates many redundant GCMAC nodes when the dimension of the input vectors in the pattern classification problem does not exactly match that in the self-organizing HCMAC neural network. These redundant GCMAC nodes waste memory units and degrade the learning performance of a self-organizing HCMAC neural network. Therefore, this study presents a minimal structure of self-organizing HCMAC (MHCMAC) neural network with the same dimension of input vectors as the pattern classification problem. Additionally, this study compares the learning performance of this novel learning structure with those of the BP neural network, support vector machine (SVM), and original self-organizing HCMAC neural network in terms of ten benchmark pattern classification data sets from the UCI machine learning repository. In particular, the experimental results reveal that the self-organizing MHCMAC neural network handles high-dimensional pattern classification problems better than the BP, SVM or the original self-organizing HCMAC neural network. Moreover, the proposed self-organizing MHCMAC neural network significantly reduces the memory requirement of the original self-organizing HCMAC neural network, and has a high training speed and higher pattern classification accuracy than the original self-organizing HCMAC neural network in most testing benchmark data sets. The experimental results also show that the MHCMAC neural network learns continuous function well and is suitable for Web page classification.
650  4 |a Cerebellar Model Articulation Controller (CMAC)
650  4 |a minimal structure of self-organzing HCMAC (MHCMAC) neural network
650  4 |a self-organizing hierarchical CMAC (HCMAC) neural network
700 1  |a Lu, Yung-Feng |4 aut
700 1  |a Hong, Chin-Ming |4 aut
773 08 |i Enthalten in |t Neural processing letters |d Kluwer Academic Publishers, 1994 |g 23(2006), 2 vom: Apr., Seite 201-228 |w (DE-627)198692617 |w (DE-600)1316823-X |w (DE-576)052842762 |x 1370-4621 |7 nnns
773 18 |g volume:23 |g year:2006 |g number:2 |g month:04 |g pages:201-228
856 41 |u https://doi.org/10.1007/s11063-006-6277-0 |z lizenzpflichtig |3 Volltext
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_OLC
912    |a SSG-OLC-PSY
912    |a SSG-OLC-MAT
912    |a GBV_ILN_70
912    |a GBV_ILN_120
912    |a GBV_ILN_2021
951    |a AR
952    |d 23 |j 2006 |e 2 |c 04 |h 201-228
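The block above is the MARC 21 staff view of the record; the catalog also delivers the same data as MARCXML (namespace http://www.loc.gov/MARC21/slim). As an illustrative aside rather than part of the record, the sketch below shows how that XML form could be read with Python's standard library; the tag and subfield choices (245 $a for the title, 100/700 $a for the authors, 024 $a for the DOI) follow the fields listed above, and the embedded snippet is a trimmed copy of this record.

```python
import xml.etree.ElementTree as ET

NS = {"m": "http://www.loc.gov/MARC21/slim"}

def subfields(record, tag, code):
    """Return all $<code> values of the datafields with the given tag."""
    return [
        sf.text or ""
        for df in record.findall(f"m:datafield[@tag='{tag}']", NS)
        for sf in df.findall(f"m:subfield[@code='{code}']", NS)
    ]

# Trimmed MARCXML copy of the record above (only a few fields kept).
marcxml = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s11063-006-6277-0</subfield></datafield>
  <datafield tag="100" ind1="1" ind2=" "><subfield code="a">Chen, Chih-Ming</subfield></datafield>
  <datafield tag="245" ind1="1" ind2="0"><subfield code="a">Minimal Structure of Self-Organizing HCMAC Neural Network Classifier</subfield></datafield>
  <datafield tag="700" ind1="1" ind2=" "><subfield code="a">Lu, Yung-Feng</subfield></datafield>
  <datafield tag="700" ind1="1" ind2=" "><subfield code="a">Hong, Chin-Ming</subfield></datafield>
</record>"""

record = ET.fromstring(marcxml)
print("Title:  ", subfields(record, "245", "a")[0])
print("Authors:", ", ".join(subfields(record, "100", "a") + subfields(record, "700", "a")))
print("DOI:    ", subfields(record, "024", "a")[0])
```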