Hybrid differential evolution and Nelder–Mead algorithm with re-optimization
Abstract: Nonlinear optimization algorithms can be divided into local exploitation methods, such as the Nelder–Mead (NM) algorithm, and global exploration methods, such as differential evolution (DE). The former searches quickly yet is easily trapped in local optima, whereas the latter offers better convergence quality. This paper proposes a hybrid of differential evolution and the NM algorithm with re-optimization, called DE-NMR. First, a modified NM, called NMR, is presented: it re-optimizes from the optimum found in its first run, and is therefore able to escape local optima and exhibits better properties than NM. NMR is then combined with DE. To handle equality constraints, DE-NMR adopts an adaptive penalty-function method that relaxes equality constraints into inequality constraints using an adaptive relaxation parameter that varies with the iteration. Benchmark optimization problems and engineering design problems are used to test the performance of DE-NMR, with the number of function evaluations as the main measure of convergence speed and objective function values as the main measure of solution quality. Non-parametric tests are used to compare the results with those of other global optimization algorithms. The results illustrate the fast convergence of DE-NMR.
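The hybrid strategy summarized in the abstract — a global DE search followed by a local Nelder–Mead polish that is restarted once from the optimum it finds — can be sketched as follows. This is a minimal illustration of the general DE + NM pattern under that description, not the paper's DE-NMR implementation; the functions `differential_evolution`, `nelder_mead`, and the Rastrigin test function are written here purely for the example.

```python
import math
import random

def rastrigin(x):
    # Multimodal benchmark; global minimum f = 0 at the origin.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def differential_evolution(f, bounds, pop_size=30, F=0.6, CR=0.9, gens=200, seed=1):
    # Classic DE/rand/1/bin: global exploration over the box `bounds`.
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = [a[k] + F * (b[k] - c[k]) if rng.random() < CR or k == j_rand
                     else pop[i][k] for k in range(dim)]
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            if f(trial) <= f(pop[i]):          # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

def nelder_mead(f, x0, step=0.5, iters=200):
    # Simplified NM simplex (reflection / expansion / inside contraction / shrink).
    # The best vertex is never discarded, so the result is no worse than x0.
    dim = len(x0)
    simplex = [list(x0)] + [[x0[k] + (step if k == j else 0.0) for k in range(dim)]
                            for j in range(dim)]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(p[k] for p in simplex[:-1]) / dim for k in range(dim)]
        refl = [2 * centroid[k] - worst[k] for k in range(dim)]
        if f(refl) < f(best):
            exp = [3 * centroid[k] - 2 * worst[k] for k in range(dim)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = [0.5 * (centroid[k] + worst[k]) for k in range(dim)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:   # shrink every vertex toward the best one
                simplex = [best] + [[0.5 * (best[k] + p[k]) for k in range(dim)]
                                    for p in simplex[1:]]
    return min(simplex, key=f)

bounds = [(-5.12, 5.12)] * 2
x_de = differential_evolution(rastrigin, bounds)   # global exploration phase
x1 = nelder_mead(rastrigin, x_de)                  # local polish
x2 = nelder_mead(rastrigin, x1)                    # one re-optimization restart from x1
```

Restarting NM with a fresh simplex from its own optimum (the `x2` line) is a sketch of the "re-optimization" idea: the new simplex can step over the wall of a shallow local basin that the first run settled into.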
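The adaptive relaxation of equality constraints described in the abstract — replacing h(x) = 0 with the inequality |h(x)| ≤ ε_t, where ε_t shrinks as iterations proceed — can be illustrated with a toy penalty function. The linear ε schedule and the penalty weight `lam` below are illustrative assumptions; the paper's actual adaptive rule is not reproduced here.

```python
def penalized(f, constraints, x, t, t_max, lam=1e3, eps0=1.0):
    # Each equality h(x) = 0 is relaxed to the inequality |h(x)| <= eps_t.
    # eps_t shrinks linearly from eps0 toward 0 as iteration t approaches
    # t_max (an illustrative schedule, not the paper's exact adaptive rule).
    eps_t = eps0 * (1.0 - t / t_max)
    violation = sum(max(0.0, abs(h(x)) - eps_t) for h in constraints)
    return f(x) + lam * violation

# Toy problem: minimize x0 + x1 subject to the equality x0^2 + x1^2 = 1.
f = lambda x: x[0] + x[1]
h = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0
x = (0.8, -0.8)                                   # |h(x)| = 0.28

early = penalized(f, [h], x, t=0, t_max=100)      # eps_t = 1.0: violation tolerated
late = penalized(f, [h], x, t=99, t_max=100)      # eps_t = 0.01: violation penalized
```

Early in the search the relaxed feasible band is wide, so mildly infeasible points pay no penalty and the population can move freely; as t grows the band tightens and the same point becomes heavily penalized, steering the search onto the equality surface.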
Detailed description

Author: Gao, Zhenxiao [author], with co-authors Xiao, Tianyuan and Fan, Wenhui
Format: Article
Language: English
Published: 2010
Keywords: Nelder–Mead algorithm; Differential evolution; Hybrid algorithm; Memetic algorithm
Note: © Springer-Verlag 2010
Contained in: Soft computing - Springer-Verlag, 1997, 15(2010), issue 3, 25 Feb., pages 581-594
Contained in: volume:15 ; year:2010 ; number:3 ; day:25 ; month:02 ; pages:581-594
DOI: 10.1007/s00500-010-0566-2
Catalog ID: OLC2034869893
LEADER  01000caa a22002652 4500
001     OLC2034869893
003     DE-627
005     20230502111556.0
007     tu
008     200820s2010 xx ||||| 00| ||eng c
024 7_  |a 10.1007/s00500-010-0566-2 |2 doi
035 __  |a (DE-627)OLC2034869893
035 __  |a (DE-He213)s00500-010-0566-2-p
040 __  |a DE-627 |b ger |c DE-627 |e rakwb
041 __  |a eng
082 04  |a 004 |q VZ
082 04  |a 004 |q VZ
084 __  |a 11 |2 ssgn
100 1_  |a Gao, Zhenxiao |e verfasserin |4 aut
245 10  |a Hybrid differential evolution and Nelder–Mead algorithm with re-optimization
264 _1  |c 2010
336 __  |a Text |b txt |2 rdacontent
337 __  |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338 __  |a Band |b nc |2 rdacarrier
500 __  |a © Springer-Verlag 2010
520 __  |a Abstract Nonlinear optimization algorithms could be divided into local exploitation methods such as Nelder–Mead (NM) algorithm and global exploration ones, such as differential evolution (DE). The former searches fast yet could be easily trapped by local optimum, whereas the latter possesses better convergence quality. This paper proposes hybrid differential evolution and NM algorithm with re-optimization, called as DE-NMR. At first a modified NM, called NMR is presented. It re-optimizes from the optimum point at the first time and thus being able to jump out of local optimum, exhibits better properties than NM. Then, NMR is combined with DE. To deal with equal constraints, adaptive penalty function method is adopted in DE-NMR, which relaxes equal constraints into unequal constrained functions with an adaptive relaxation parameter that varies with iteration. Benchmark optimization problems as well as engineering design problems are used to experiment the performance of DE-NMR, with the number of function evaluation times being employed as the main index of measuring convergence speed, and objective function values as the main index of optimum’s quality. Non-parametric tests are employed in comparing results with other global optimization algorithms. Results illustrate the fast convergence speed of DE-NMR.
650 _4  |a Nelder–Mead algorithm
650 _4  |a Differential evolution
650 _4  |a Hybrid algorithm
650 _4  |a Memetic algorithm
700 1_  |a Xiao, Tianyuan |4 aut
700 1_  |a Fan, Wenhui |4 aut
773 08  |i Enthalten in |t Soft computing |d Springer-Verlag, 1997 |g 15(2010), 3 vom: 25. Feb., Seite 581-594 |w (DE-627)231970536 |w (DE-600)1387526-7 |w (DE-576)060238259 |x 1432-7643 |7 nnns
773 18  |g volume:15 |g year:2010 |g number:3 |g day:25 |g month:02 |g pages:581-594
856 41  |u https://doi.org/10.1007/s00500-010-0566-2 |z lizenzpflichtig |3 Volltext
912 __  |a GBV_USEFLAG_A
912 __  |a SYSFLAG_A
912 __  |a GBV_OLC
912 __  |a SSG-OLC-MAT
912 __  |a GBV_ILN_40
912 __  |a GBV_ILN_70
912 __  |a GBV_ILN_267
912 __  |a GBV_ILN_2018
912 __  |a GBV_ILN_4277
951 __  |a AR
952 __  |d 15 |j 2010 |e 3 |b 25 |c 02 |h 581-594