Hybrid differential evolution and Nelder–Mead algorithm with re-optimization
Abstract: Nonlinear optimization algorithms can be divided into local exploitation methods, such as the Nelder–Mead (NM) algorithm, and global exploration methods, such as differential evolution (DE). The former converges quickly but is easily trapped in local optima, whereas the latter offers better convergence quality. This paper proposes a hybrid differential evolution and NM algorithm with re-optimization, called DE-NMR. First, a modified NM, called NMR, is presented: it re-optimizes from the first optimum it finds and can thus escape local optima, exhibiting better properties than NM. NMR is then combined with DE. To handle equality constraints, DE-NMR adopts an adaptive penalty function method that relaxes equality constraints into inequality constraints using a relaxation parameter that varies with the iteration count. Benchmark optimization problems as well as engineering design problems are used to evaluate DE-NMR, with the number of function evaluations as the main measure of convergence speed and objective function values as the main measure of solution quality. Non-parametric tests are used to compare the results with those of other global optimization algorithms. The results illustrate the fast convergence of DE-NMR.
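The DE-plus-local-search scheme the abstract describes can be illustrated with a minimal sketch. This is not the paper's algorithm: the local optimizer below is a simple shrinking-step coordinate search standing in for Nelder–Mead, and all parameter values, function names, and the one-shot restart ("re-optimization") schedule are illustrative assumptions.

```python
# Hedged sketch of the hybrid idea: DE explores globally, a local method
# polishes the best member, and "re-optimization" restarts the local search
# once from its own converged point with a fresh, larger step so it can
# escape a shallow local optimum. All names and constants are illustrative.
import math
import random

def pattern_search(f, x, step=0.5, tol=1e-8):
    """Shrinking-step coordinate search (a stand-in for Nelder-Mead)."""
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                if f(trial) < f(x):
                    x, improved = trial, True
        if not improved:
            step *= 0.5
    return x

def local_with_restart(f, x0):
    """Re-optimization idea: after converging, restart once from the found
    optimum with a large initial step, keeping the better of the two results."""
    x1 = pattern_search(f, x0)
    x2 = pattern_search(f, x1, step=1.0)   # re-optimization pass
    return x2 if f(x2) < f(x1) else x1

def de_nm_hybrid(f, bounds, pop_size=20, gens=60, F=0.7, CR=0.9, seed=1):
    """Plain DE/rand/1/bin followed by a local polish of the best member."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = [a[j] + F * (b[j] - c[j])
                     if (rng.random() < CR or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if f(trial) < f(pop[i]):           # greedy one-to-one selection
                pop[i] = trial
    best = min(pop, key=f)
    return local_with_restart(f, best)

# Usage on a multimodal test function (2-D Rastrigin); the hybrid should
# end up near the global optimum at the origin.
def rastrigin(x):
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

x_star = de_nm_hybrid(rastrigin, [(-5.12, 5.12)] * 2)
print(rastrigin(x_star))  # typically near 0
```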
Detailed description
Author(s): Gao, Zhenxiao [author]; Xiao, Tianyuan [author]; Fan, Wenhui [author]
Format: E-article
Language: English
Published: 2010
Subjects: Nelder–Mead algorithm; Differential evolution; Hybrid algorithm; Memetic algorithm
Parent work: Contained in: Soft Computing, Springer-Verlag, 2003, 15(2010), no. 3, 25 Feb., pages 581-594
Parent work: volume:15 ; year:2010 ; number:3 ; day:25 ; month:02 ; pages:581-594
Links:
DOI / URN: 10.1007/s00500-010-0566-2
Catalog ID: SPR00647876X
LEADER 01000caa a22002652 4500
001    SPR00647876X
003    DE-627
005    20201124002732.0
007    cr uuu---uuuuu
008    201005s2010 xx |||||o 00| ||eng c
024 7  |a 10.1007/s00500-010-0566-2 |2 doi
035    |a (DE-627)SPR00647876X
035    |a (SPR)s00500-010-0566-2-e
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
100 1  |a Gao, Zhenxiao |e verfasserin |4 aut
245 10 |a Hybrid differential evolution and Nelder–Mead algorithm with re-optimization
264  1 |c 2010
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
520    |a Abstract Nonlinear optimization algorithms could be divided into local exploitation methods such as Nelder–Mead (NM) algorithm and global exploration ones, such as differential evolution (DE). The former searches fast yet could be easily trapped by local optimum, whereas the latter possesses better convergence quality. This paper proposes hybrid differential evolution and NM algorithm with re-optimization, called as DE-NMR. At first a modified NM, called NMR is presented. It re-optimizes from the optimum point at the first time and thus being able to jump out of local optimum, exhibits better properties than NM. Then, NMR is combined with DE. To deal with equal constraints, adaptive penalty function method is adopted in DE-NMR, which relaxes equal constraints into unequal constrained functions with an adaptive relaxation parameter that varies with iteration. Benchmark optimization problems as well as engineering design problems are used to experiment the performance of DE-NMR, with the number of function evaluation times being employed as the main index of measuring convergence speed, and objective function values as the main index of optimum’s quality. Non-parametric tests are employed in comparing results with other global optimization algorithms. Results illustrate the fast convergence speed of DE-NMR.
650  4 |a Nelder–Mead algorithm |7 (dpeaa)DE-He213
650  4 |a Differential evolution |7 (dpeaa)DE-He213
650  4 |a Hybrid algorithm |7 (dpeaa)DE-He213
650  4 |a Memetic algorithm |7 (dpeaa)DE-He213
700 1  |a Xiao, Tianyuan |e verfasserin |4 aut
700 1  |a Fan, Wenhui |e verfasserin |4 aut
773 08 |i Enthalten in |t Soft Computing |d Springer-Verlag, 2003 |g 15(2010), 3 vom: 25. Feb., Seite 581-594 |w (DE-627)SPR006469531 |7 nnns
773 18 |g volume:15 |g year:2010 |g number:3 |g day:25 |g month:02 |g pages:581-594
856 40 |u https://dx.doi.org/10.1007/s00500-010-0566-2 |z lizenzpflichtig |3 Volltext
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_SPRINGER
951    |a AR
952    |d 15 |j 2010 |e 3 |b 25 |c 02 |h 581-594
author_variant |
z g zg t x tx w f wf |
matchkey_str |
gaozhenxiaoxiaotianyuanfanwenhui:2010----:yrdifrnilvltoadedredloih |
hierarchy_sort_str |
2010 |
publishDate |
2010 |
language |
English |
source |
Enthalten in Soft Computing 15(2010), 3 vom: 25. Feb., Seite 581-594 volume:15 year:2010 number:3 day:25 month:02 pages:581-594 |
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
Nelder–Mead algorithm Differential evolution Hybrid algorithm Memetic algorithm |
isfreeaccess_bool |
false |
container_title |
Soft Computing |
authorswithroles_txt_mv |
Gao, Zhenxiao @@aut@@ Xiao, Tianyuan @@aut@@ Fan, Wenhui @@aut@@ |
publishDateDaySort_date |
2010-02-25T00:00:00Z |
hierarchy_top_id |
SPR006469531 |
id |
SPR00647876X |
language_de |
englisch |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">SPR00647876X</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20201124002732.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">201005s2010 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s00500-010-0566-2</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)SPR00647876X</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(SPR)s00500-010-0566-2-e</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Gao, Zhenxiao</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Hybrid differential evolution and Nelder–Mead algorithm with re-optimization</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2010</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield 
code="a">Abstract Nonlinear optimization algorithms could be divided into local exploitation methods such as Nelder–Mead (NM) algorithm and global exploration ones, such as differential evolution (DE). The former searches fast yet could be easily trapped by local optimum, whereas the latter possesses better convergence quality. This paper proposes hybrid differential evolution and NM algorithm with re-optimization, called as DE-NMR. At first a modified NM, called NMR is presented. It re-optimizes from the optimum point at the first time and thus being able to jump out of local optimum, exhibits better properties than NM. Then, NMR is combined with DE. To deal with equal constraints, adaptive penalty function method is adopted in DE-NMR, which relaxes equal constraints into unequal constrained functions with an adaptive relaxation parameter that varies with iteration. Benchmark optimization problems as well as engineering design problems are used to experiment the performance of DE-NMR, with the number of function evaluation times being employed as the main index of measuring convergence speed, and objective function values as the main index of optimum’s quality. Non-parametric tests are employed in comparing results with other global optimization algorithms. 
Results illustrate the fast convergence speed of DE-NMR.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Nelder–Mead algorithm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Differential evolution</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Hybrid algorithm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Memetic algorithm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Xiao, Tianyuan</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Fan, Wenhui</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Soft Computing</subfield><subfield code="d">Springer-Verlag, 2003</subfield><subfield code="g">15(2010), 3 vom: 25. 
Feb., Seite 581-594</subfield><subfield code="w">(DE-627)SPR006469531</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:15</subfield><subfield code="g">year:2010</subfield><subfield code="g">number:3</subfield><subfield code="g">day:25</subfield><subfield code="g">month:02</subfield><subfield code="g">pages:581-594</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://dx.doi.org/10.1007/s00500-010-0566-2</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_SPRINGER</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">15</subfield><subfield code="j">2010</subfield><subfield code="e">3</subfield><subfield code="b">25</subfield><subfield code="c">02</subfield><subfield code="h">581-594</subfield></datafield></record></collection>
author |
Gao, Zhenxiao |
spellingShingle |
Gao, Zhenxiao misc Nelder–Mead algorithm misc Differential evolution misc Hybrid algorithm misc Memetic algorithm Hybrid differential evolution and Nelder–Mead algorithm with re-optimization |
authorStr |
Gao, Zhenxiao |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)SPR006469531 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut |
collection |
springer |
remote_str |
true |
illustrated |
Not Illustrated |
topic_title |
Hybrid differential evolution and Nelder–Mead algorithm with re-optimization Nelder–Mead algorithm (dpeaa)DE-He213 Differential evolution (dpeaa)DE-He213 Hybrid algorithm (dpeaa)DE-He213 Memetic algorithm (dpeaa)DE-He213 |
topic |
misc Nelder–Mead algorithm misc Differential evolution misc Hybrid algorithm misc Memetic algorithm |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
Soft Computing |
hierarchy_parent_id |
SPR006469531 |
hierarchy_top_title |
Soft Computing |
isfreeaccess_txt |
false |
familylinks_str_mv |
(DE-627)SPR006469531 |
title |
Hybrid differential evolution and Nelder–Mead algorithm with re-optimization |
ctrlnum |
(DE-627)SPR00647876X (SPR)s00500-010-0566-2-e |
title_full |
Hybrid differential evolution and Nelder–Mead algorithm with re-optimization |
author_sort |
Gao, Zhenxiao |
journal |
Soft Computing |
journalStr |
Soft Computing |
lang_code |
eng |
isOA_bool |
false |
recordtype |
marc |
publishDateSort |
2010 |
contenttype_str_mv |
txt |
container_start_page |
581 |
author_browse |
Gao, Zhenxiao Xiao, Tianyuan Fan, Wenhui |
container_volume |
15 |
format_se |
Elektronische Aufsätze |
author-letter |
Gao, Zhenxiao |
doi_str_mv |
10.1007/s00500-010-0566-2 |
author2-role |
verfasserin |
title_sort |
hybrid differential evolution and nelder–mead algorithm with re-optimization |
title_auth |
Hybrid differential evolution and Nelder–Mead algorithm with re-optimization |
abstract |
Abstract Nonlinear optimization algorithms could be divided into local exploitation methods such as Nelder–Mead (NM) algorithm and global exploration ones, such as differential evolution (DE). The former searches fast yet could be easily trapped by local optimum, whereas the latter possesses better convergence quality. This paper proposes hybrid differential evolution and NM algorithm with re-optimization, called as DE-NMR. At first a modified NM, called NMR is presented. It re-optimizes from the optimum point at the first time and thus being able to jump out of local optimum, exhibits better properties than NM. Then, NMR is combined with DE. To deal with equal constraints, adaptive penalty function method is adopted in DE-NMR, which relaxes equal constraints into unequal constrained functions with an adaptive relaxation parameter that varies with iteration. Benchmark optimization problems as well as engineering design problems are used to experiment the performance of DE-NMR, with the number of function evaluation times being employed as the main index of measuring convergence speed, and objective function values as the main index of optimum’s quality. Non-parametric tests are employed in comparing results with other global optimization algorithms. Results illustrate the fast convergence speed of DE-NMR. |
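The adaptive penalty idea from the abstract, relaxing an equality constraint h(x) = 0 into the inequality |h(x)| − ε_t ≤ 0 with ε_t shrinking over iterations, can be sketched as follows. The penalty weight and the specific ε schedule below are illustrative assumptions, not the paper's exact formulas.

```python
# Hedged sketch: each equality h(x) = 0 is relaxed into |h(x)| - eps_t <= 0,
# where the relaxation parameter eps_t tightens with the iteration count t,
# and violations are charged through a quadratic penalty. The schedule and
# weight are illustrative, not taken from the paper.

def penalized(f, h_list, g_list, eps, weight=1e3):
    """Build a penalized objective for one iteration's relaxation level eps."""
    def fp(x):
        viol = 0.0
        for h in h_list:                    # equalities, relaxed by eps
            viol += max(0.0, abs(h(x)) - eps) ** 2
        for g in g_list:                    # ordinary inequalities g(x) <= 0
            viol += max(0.0, g(x)) ** 2
        return f(x) + weight * viol
    return fp

def eps_schedule(t, t_max, eps0=1.0, eps_min=1e-6):
    """Relaxation parameter that shrinks as iteration t approaches t_max."""
    return max(eps_min, eps0 * (1.0 - t / t_max) ** 2)

# Example: minimize x0 + x1 subject to x0^2 + x1^2 = 1.
f = lambda x: x[0] + x[1]
h = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0

# Early iterations see a loose constraint band, late ones a tight one:
loose = penalized(f, [h], [], eps_schedule(0, 100))
tight = penalized(f, [h], [], eps_schedule(99, 100))
print(loose([0.0, 0.0]))   # |h| = 1 is within eps0 = 1, so no penalty
print(tight([0.0, 0.0]))   # band is now ~1e-4 wide, so heavily penalized
```

Feeding such a per-iteration penalized objective to the population search lets infeasible but promising points survive early on while forcing near-feasibility at the end.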
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER |
container_issue |
3 |
title_short |
Hybrid differential evolution and Nelder–Mead algorithm with re-optimization |
url |
https://dx.doi.org/10.1007/s00500-010-0566-2 |
remote_bool |
true |
author2 |
Xiao, Tianyuan Fan, Wenhui |
author2Str |
Xiao, Tianyuan Fan, Wenhui |
ppnlink |
SPR006469531 |
mediatype_str_mv |
c |
isOA_txt |
false |
hochschulschrift_bool |
false |
doi_str |
10.1007/s00500-010-0566-2 |
up_date |
2024-07-03T23:13:39.366Z |
_version_ |
1803601491387219968 |
score |
7.3987885 |