Novel self-adjusted particle swarm optimization algorithm for feature selection
Abstract: Due to the ever-increasing number of features in most practical application fields, e.g., expert and intelligent systems, feature selection (FS) has become a promising pre-processing step for a given task (e.g., classification or regression) over the last few decades. FS aims at selecting the optimal feature subset from the original feature set by removing redundant and irrelevant features, which improves the performance of learning models. In this paper, a novel self-adjusted particle swarm optimization algorithm (SAPSO) is proposed for selecting the optimal feature subset for classification datasets. SAPSO makes three improvements: First, a new learning model for particles, which can extract more useful knowledge from multiple information providers, is used to enhance the diversity of particles. Second, a one-flip neighborhood search strategy is adopted to strengthen the local search ability of the swarm when it enters a period of stagnation. Finally, a population replacement process, based on some of the new particles generated by the neighborhood search strategy, is conducted to enhance the diversity of the swarm. Moreover, the k-nearest neighbor method is used as a classifier to evaluate the classification accuracy of a particle. The proposed method is benchmarked on 10 well-known UCI datasets and the results are compared with 9 state-of-the-art wrapper-based FS methods. The results show that the proposed approach significantly outperforms the others on most of the 10 datasets.
Detailed description

Author: Wei, Bo [author]
Format: Article
Language: English
Published: 2021
Note: © The Author(s), under exclusive licence to Springer-Verlag GmbH, AT part of Springer Nature 2021
Contained in: Computing - Springer Vienna, 1966, 103(2021), issue 8, 21 Jan., pages 1569-1597
Parent work: volume:103 ; year:2021 ; number:8 ; day:21 ; month:01 ; pages:1569-1597
DOI / URN: 10.1007/s00607-020-00891-w
Catalog ID: OLC2126754391
LEADER 01000naa a22002652 4500
001 OLC2126754391
003 DE-627
005 20230505121147.0
007 tu
008 230505s2021 xx ||||| 00| ||eng c
024 7_ |a 10.1007/s00607-020-00891-w |2 doi
035 __ |a (DE-627)OLC2126754391
035 __ |a (DE-He213)s00607-020-00891-w-p
040 __ |a DE-627 |b ger |c DE-627 |e rakwb
041 __ |a eng
082 04 |a 004 |q VZ
084 __ |a SA 4220 |q VZ |2 rvk
100 1_ |a Wei, Bo |e verfasserin |4 aut
245 10 |a Novel self-adjusted particle swarm optimization algorithm for feature selection
264 _1 |c 2021
336 __ |a Text |b txt |2 rdacontent
337 __ |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338 __ |a Band |b nc |2 rdacarrier
500 __ |a © The Author(s), under exclusive licence to Springer-Verlag GmbH, AT part of Springer Nature 2021
520 __ |a Abstract: Due to the ever-increasing number of features in most practical application fields, e.g., expert and intelligent systems, feature selection (FS) has become a promising pre-processing step for a given task (e.g., classification or regression) over the last few decades. FS aims at selecting the optimal feature subset from the original feature set by removing redundant and irrelevant features, which improves the performance of learning models. In this paper, a novel self-adjusted particle swarm optimization algorithm (SAPSO) is proposed for selecting the optimal feature subset for classification datasets. SAPSO makes three improvements: First, a new learning model for particles, which can extract more useful knowledge from multiple information providers, is used to enhance the diversity of particles. Second, a one-flip neighborhood search strategy is adopted to strengthen the local search ability of the swarm when it enters a period of stagnation. Finally, a population replacement process, based on some of the new particles generated by the neighborhood search strategy, is conducted to enhance the diversity of the swarm. Moreover, the k-nearest neighbor method is used as a classifier to evaluate the classification accuracy of a particle. The proposed method is benchmarked on 10 well-known UCI datasets and the results are compared with 9 state-of-the-art wrapper-based FS methods. The results show that the proposed approach significantly outperforms the others on most of the 10 datasets.
650 _4 |a Feature selection
650 _4 |a Combinatorial optimization
650 _4 |a Particle swarm optimization
650 _4 |a Neighborhood search strategy
700 1_ |a Wang, Xuan |4 aut
700 1_ |a Xia, Xuewen |4 aut
700 1_ |a Jiang, Mingfeng |4 aut
700 1_ |a Ding, Zuohua |4 aut
700 1_ |a Huang, Yanrong |4 aut
773 08 |i Enthalten in |t Computing |d Springer Vienna, 1966 |g 103(2021), 8 vom: 21. Jan., Seite 1569-1597 |w (DE-627)129534927 |w (DE-600)215907-7 |w (DE-576)014963949 |x 0010-485X |7 nnns
773 18 |g volume:103 |g year:2021 |g number:8 |g day:21 |g month:01 |g pages:1569-1597
856 41 |u https://doi.org/10.1007/s00607-020-00891-w |z lizenzpflichtig |3 Volltext
912 __ |a GBV_USEFLAG_A
912 __ |a SYSFLAG_A
912 __ |a GBV_OLC
912 __ |a SSG-OLC-TEC
912 __ |a SSG-OLC-MAT
912 __ |a SSG-OPC-MAT
936 rv |a SA 4220
951 __ |a AR
952 __ |d 103 |j 2021 |e 8 |b 21 |c 01 |h 1569-1597
code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">ohne Hilfsmittel zu benutzen</subfield><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Band</subfield><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© The Author(s), under exclusive licence to Springer-Verlag GmbH, AT part of Springer Nature 2021</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract Due to the ever increasing number of features in the most practical application fields, i.e, expert and intelligent systems, feature selection (FS) has become a promising pre-processing step for a particular task (i.e., classification and regression) in the last few decades. FS aims at selecting the optimal feature subset from the original feature dataset by removing redundant and irrelevant features, which improve the performance of the learning models. In this paper, a novel self-adjusted particle swarm optimization algorithm (SAPSO) is proposed for selecting the optimal feature subset for classification datasets. In SAPSO, we make three improvements: First, a new learning model of particles, which can extract much more useful knowledge from multiple information providers, is used to enhance the diversity of particles. Second, one-flip neighborhood search strategy is adopted to strengthen the local search ability of a swarm when the swarm enters a period of stagnation. Finally, a population replacement process is conducted, which bases on the part of new particles generated by the neighborhood search strategy, to enhance the diversity of the swarm. Moreover, the k-nearest neighbor method is used as a classifier to evaluate the classification accuracy of a particle. 
The proposed method is benchmarked on 10 well-known UCI datasets and the results are compared with 9 state-of-the-art wrapper-based FS methods. From the results, it is observed that the proposed approach significantly outperforms others on most the 10 datasets.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Feature selection</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Combinatorial optimization</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Particle swarm optimization</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Neighborhood search strategy</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Wang, Xuan</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Xia, Xuewen</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Jiang, Mingfeng</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Ding, Zuohua</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Huang, Yanrong</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Computing</subfield><subfield code="d">Springer Vienna, 1966</subfield><subfield code="g">103(2021), 8 vom: 21. 
Jan., Seite 1569-1597</subfield><subfield code="w">(DE-627)129534927</subfield><subfield code="w">(DE-600)215907-7</subfield><subfield code="w">(DE-576)014963949</subfield><subfield code="x">0010-485X</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:103</subfield><subfield code="g">year:2021</subfield><subfield code="g">number:8</subfield><subfield code="g">day:21</subfield><subfield code="g">month:01</subfield><subfield code="g">pages:1569-1597</subfield></datafield><datafield tag="856" ind1="4" ind2="1"><subfield code="u">https://doi.org/10.1007/s00607-020-00891-w</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_OLC</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OLC-TEC</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OLC-MAT</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OPC-MAT</subfield></datafield><datafield tag="936" ind1="r" ind2="v"><subfield code="a">SA 4220</subfield></datafield><datafield tag="936" ind1="r" ind2="v"><subfield code="a">SA 4220</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">103</subfield><subfield code="j">2021</subfield><subfield code="e">8</subfield><subfield code="b">21</subfield><subfield code="c">01</subfield><subfield code="h">1569-1597</subfield></datafield></record></collection>
|
score |
7.4022093 |