Variable selection for the partial linear single-index model
Abstract: In this paper, we consider the problem of variable selection in partial linear single-index models under the assumption that the vector of regression coefficients is sparse. We apply penalized splines to estimate the nonparametric function and the SCAD penalty to obtain sparse estimates of the regression parameters in both the linear and single-index parts of the model. Under mild conditions, the penalized estimators are shown to have the oracle property, in the sense that they are asymptotically normal with the same mean and covariance they would have if the zero coefficients were known in advance. The model admits a least-squares representation, so standard least-squares algorithms can be applied without extra programming effort. Moreover, parametric estimation, variable selection and nonparametric estimation are carried out in a single step, which greatly improves computational stability. The finite-sample performance of the penalized estimators is evaluated through Monte Carlo studies and illustrated with a real data set.
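For orientation, the model and penalty summarized in the abstract can be written in a standard form. The display below is a common textbook formulation (the symbols and identifiability constraints are illustrative, not quoted from the paper); a small numerical sketch under similar assumptions follows the catalog record further down.

```latex
% Partial linear single-index model (standard formulation; symbols illustrative):
%   Y depends on a nonparametric single index g(X'alpha) plus a linear part Z'beta.
\[
  Y = g(X^{\top}\alpha) + Z^{\top}\beta + \varepsilon ,
  \qquad \lVert\alpha\rVert = 1 ,\; \alpha_{1} > 0 ,
\]
% g is the unknown link function, estimated with a penalized spline.
% The SCAD penalty is defined through its derivative (Fan and Li, 2001), with a = 3.7:
\[
  p_{\lambda}'(\theta)
    = \lambda \Bigl\{ \mathbf{1}(\theta \le \lambda)
      + \frac{(a\lambda - \theta)_{+}}{(a-1)\lambda}\,\mathbf{1}(\theta > \lambda) \Bigr\} ,
  \qquad \theta > 0 ,
\]
% and is applied componentwise to the coefficients in alpha and beta, shrinking
% small coefficients exactly to zero and thereby performing variable selection.
```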
Detailed description

Author: Wang, Wu [author]
Format: Article
Language: English
Published: 2017
Keywords: nonparametric link function; SCAD penalty; semiparametric model; spline estimation; variable selection
Note: © Institute of Applied Mathematics, Academy of Mathematics and System Sciences, Chinese Academy of Sciences and Springer-Verlag Berlin Heidelberg 2017
Parent work: Contained in: Acta mathematicae applicatae sinica / English series - Springer Berlin Heidelberg, 2002, 33(2017), 2, April, pages 373-388
Parent work: volume:33 ; year:2017 ; number:2 ; month:04 ; pages:373-388
Links:
DOI / URN: 10.1007/s10255-017-0666-1
Catalog ID: OLC210996751X
LEADER  01000naa a22002652 4500
001     OLC210996751X
003     DE-627
005     20230502180559.0
007     tu
008     230502s2017 xx ||||| 00| ||eng c
024 7_  |a 10.1007/s10255-017-0666-1 |2 doi
035 __  |a (DE-627)OLC210996751X
035 __  |a (DE-He213)s10255-017-0666-1-p
040 __  |a DE-627 |b ger |c DE-627 |e rakwb
041 __  |a eng
082 04  |a 510 |q VZ
084 __  |a 31.80$jAngewandte Mathematik |2 bkl
100 1_  |a Wang, Wu |e verfasserin |4 aut
245 10  |a Variable selection for the partial linear single-index model
264 _1  |c 2017
336 __  |a Text |b txt |2 rdacontent
337 __  |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338 __  |a Band |b nc |2 rdacarrier
500 __  |a © Institute of Applied Mathematics, Academy of Mathematics and System Sciences, Chinese Academy of Sciences and Springer-Verlag Berlin Heidelberg 2017
520 __  |a Abstract: In this paper, we consider the problem of variable selection in partial linear single-index models under the assumption that the vector of regression coefficients is sparse. We apply penalized splines to estimate the nonparametric function and the SCAD penalty to obtain sparse estimates of the regression parameters in both the linear and single-index parts of the model. Under mild conditions, the penalized estimators are shown to have the oracle property, in the sense that they are asymptotically normal with the same mean and covariance they would have if the zero coefficients were known in advance. The model admits a least-squares representation, so standard least-squares algorithms can be applied without extra programming effort. Moreover, parametric estimation, variable selection and nonparametric estimation are carried out in a single step, which greatly improves computational stability. The finite-sample performance of the penalized estimators is evaluated through Monte Carlo studies and illustrated with a real data set.
650 _4  |a nonparametric link function
650 _4  |a SCAD penalty
650 _4  |a semiparametric model
650 _4  |a spline estimation
650 _4  |a variable selection
700 1_  |a Zhu, Zhong-yi |4 aut
773 08  |i Enthalten in |t Acta mathematicae applicatae sinica / English series |d Springer Berlin Heidelberg, 2002 |g 33(2017), 2 vom: Apr., Seite 373-388 |w (DE-627)363762353 |w (DE-600)2106495-7 |w (DE-576)105283274 |x 0168-9673 |7 nnns
773 18  |g volume:33 |g year:2017 |g number:2 |g month:04 |g pages:373-388
856 41  |u https://doi.org/10.1007/s10255-017-0666-1 |z lizenzpflichtig |3 Volltext
912 __  |a GBV_USEFLAG_A
912 __  |a SYSFLAG_A
912 __  |a GBV_OLC
912 __  |a SSG-OLC-MAT
912 __  |a GBV_ILN_70
912 __  |a GBV_ILN_2018
912 __  |a GBV_ILN_4277
936 bk  |a 31.80$jAngewandte Mathematik |q VZ |0 106419005 |0 (DE-625)106419005
951 __  |a AR
952 __  |d 33 |j 2017 |e 2 |c 04 |h 373-388
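To make the computational claim in the abstract concrete (a joint least-squares representation in which penalized-spline estimation of the link function and SCAD-penalized estimation of the linear coefficients happen in one step), here is a minimal numerical sketch. It is not the authors' implementation: the index direction alpha is held fixed instead of being estimated, the link function uses a simple truncated-power spline basis with a ridge penalty on the knot coefficients, the SCAD penalty is handled by local quadratic approximation (iterated ridge), and all function and variable names are illustrative.

```python
# Minimal sketch, not the authors' implementation: one SCAD-penalized
# least-squares step for the partial linear single-index model
#     y = g(x' alpha) + z' beta + eps,
# with the index direction alpha held fixed and g approximated by a
# truncated-power spline basis (ridge-penalized knot coefficients).
import numpy as np


def scad_derivative(theta, lam, a=3.7):
    """Derivative p'_lambda(theta) of the SCAD penalty for theta >= 0 (Fan and Li, 2001)."""
    theta = np.abs(theta)
    return np.where(theta <= lam, lam,
                    np.maximum(a * lam - theta, 0.0) / (a - 1.0))


def spline_basis(u, knots, degree=3):
    """Truncated-power basis: 1, u, ..., u^degree, (u - kappa_k)_+^degree."""
    cols = [u ** d for d in range(degree + 1)]
    cols += [np.maximum(u - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)


def scad_penalized_fit(y, Z, u, lam, knots, degree=3, tau=1e-2,
                       n_iter=100, tol=1e-6):
    """Joint least-squares fit of the spline coefficients (for g) and a sparse beta,
    using local quadratic approximation of the SCAD penalty (iterated ridge)."""
    B = spline_basis(u, knots, degree)            # design for the nonparametric part
    X = np.hstack([B, Z])                         # one joint least-squares design
    n, p_spline = len(y), B.shape[1]
    coef = np.linalg.lstsq(X, y, rcond=None)[0]   # unpenalized starting value
    for _ in range(n_iter):
        beta = coef[p_spline:]
        w = np.zeros(X.shape[1])
        w[degree + 1:p_spline] = tau              # ridge penalty on spline knot coefficients
        w[p_spline:] = scad_derivative(beta, lam) / np.maximum(np.abs(beta), 1e-8)
        new = np.linalg.solve(X.T @ X + n * np.diag(w), X.T @ y)
        if np.max(np.abs(new - coef)) < tol:
            coef = new
            break
        coef = new
    beta = coef[p_spline:].copy()
    beta[np.abs(beta) < 1e-4] = 0.0               # set numerically tiny coefficients to zero
    return coef[:p_spline], beta


# Toy usage: fixed (known) index direction, two of four linear coefficients truly zero.
rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 2))
alpha = np.array([0.6, 0.8])                      # unit-norm index direction (assumed known here)
Z = rng.normal(size=(n, 4))
beta_true = np.array([1.5, 0.0, -2.0, 0.0])
u = X @ alpha
y = np.sin(np.pi * u) + Z @ beta_true + 0.3 * rng.normal(size=n)

knots = np.quantile(u, np.linspace(0.1, 0.9, 8))
_, beta_hat = scad_penalized_fit(y, Z, u, lam=0.2, knots=knots)
print("estimated beta:", np.round(beta_hat, 2))
```

In this toy run the two truly zero entries of beta should be shrunk to (numerically) exact zeros while the nonzero entries stay close to 1.5 and -2.0, illustrating the sparse estimation behaviour the abstract describes; the full method would additionally update alpha and choose the SCAD tuning parameter, e.g. by BIC or cross-validation.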