Stochastic approximation and nonlinear regression
This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter.
Detailed description
Author: Albert, Arthur E. [author]; Gardner, Leland A. [contributor]
Format: E-Book
Language: English
Published: Cambridge, Mass.: MIT Press, 2003
Subjects: Time-series analysis; Regression analysis; Stochastic approximation
Notes: "The MIT Press." Includes bibliographical references and index.

This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data.

Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector.

The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains.

Part I deals with the special case of an unknown scalar parameter, discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors.

Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality. If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title.

The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit, both in the deterministic and probabilistic senses (i.e., almost sure and quadratic mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home.

Extent: Online resource
Reproduction: Online edition
Series: MIT Press Research Monographs; no. 42
Links: Full text (IEEE Xplore): https://ieeexplore.ieee.org/book/6276887
ISBN: 0-262-31092-9; 978-0-262-31092-5 (electronic); 978-0-262-51148-3 (print)
Catalog ID: 816667810
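The "differential correction" recursion described in the abstract can be sketched in code. This is only an illustrative Robbins-Monro-style update, not the authors' exact scheme: the model `f`, its derivative `df`, the decreasing gain `c/(n+1)`, and all names below are assumptions chosen for the example.

```python
import numpy as np

def differential_correction(f, df, y, theta0, c=1.0):
    """Real-time estimation of theta in y_n = f(n, theta) + noise.

    Each new observation corrects the previous estimate by an amount
    proportional to the prediction residual; the proportionality factor
    c/(n+1) is one classical choice of "gain" (smoothing) sequence.
    """
    theta = theta0
    estimates = []
    for n, y_n in enumerate(y):
        residual = y_n - f(n, theta)   # current observation minus prediction
        gain = c / (n + 1.0)           # time-varying gain, shrinking to zero
        theta = theta + gain * df(n, theta) * residual
        estimates.append(theta)
    return np.array(estimates)

# Example: f(n, theta) = theta (estimate the mean of a noisy constant signal).
# With gain 1/(n+1) this recursion reproduces the running sample mean.
rng = np.random.default_rng(0)
true_theta = 2.0
y = true_theta + 0.5 * rng.standard_normal(5000)
est = differential_correction(lambda n, t: t, lambda n, t: 1.0, y, theta0=0.0)
```

The consistency and efficiency questions studied in the book concern exactly how such a sequence `est` behaves as the number of observations grows, for different choices of the gain sequence.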
LEADER 01000cam a22002652 4500
001 816667810
003 DE-627
005 20231004172600.0
007 cr uuu---uuuuu
008 150128s2003 xxu|||||o 00| ||eng c
020 |a 0262310929 |c electronic bk. |9 0-262-31092-9
020 |a 9780262310925 |c electronic bk. |9 978-0-262-31092-5
035 |a (DE-627)816667810
035 |a (DE-576)9816667819
035 |a (DE-599)GBV816667810
035 |a (OCoLC)827697816
035 |a (MITPRESS)6276887
035 |a (EBP)055121497
040 |a DE-627 |b ger |c DE-627 |e rakwb
041 |a eng
044 |c XD-US
050 0 |a QA280
082 0 |a 519.5/5 |2 23
082 0 |a 519 |2 19
100 1 |a Albert, Arthur E. |e verfasserin |4 aut
245 1 0 |a Stochastic approximation and nonlinear regression |c Arthur E. Albert, Leland A. Gardner
264 1 |a Cambridge, Mass |b MIT-Press |c 2003
300 |a Online-Ressource
336 |a Text |b txt |2 rdacontent
337 |a Computermedien |b c |2 rdamedia
338 |a Online-Ressource |b cr |2 rdacarrier
490 0 |a MIT Press Research Monographs |v No. 42
490 0 |a MIT Press research monograph |v no. 42
500 |a "The MIT Press
500 |a Includes bibliographical references and index
500 |a [abstract; identical to the description above]
520 |a [abstract; identical to the description above]
533 |a Online-Ausg.
650 0 |a Time-series analysis
650 0 |a Regression analysis
650 0 |a Stochastic approximation
650 4 |a Time-series analysis
650 4 |a Regression analysis
700 1 |a Gardner, Leland A. |4 oth
776 1 |z 9780262511483
776 0 8 |i Erscheint auch als |n Druck-Ausgabe |z 9780262511483
856 4 0 |u https://ieeexplore.ieee.org/book/6276887 |m X:MITPRESS |x Verlag |y IEEE Xplore |z lizenzpflichtig |3 Volltext
856 4 2 |u http://www.gbv.de/dms/bowker/toc/9780262511483.pdf |m V:DE-601 |m X:Bowker |q pdf/application |v 2015-03-18 |x Verlag |y Inhaltsverzeichnis |3 Inhaltsverzeichnis
912 |a ZDB-37-IEM |b 2012
912 |a GBV_ILN_22
912 |a ISIL_DE-18
912 |a SYSFLAG_1
912 |a GBV_KXP
912 |a GBV_ILN_22_i22818
912 |a GBV_ILN_23
912 |a ISIL_DE-830
912 |a GBV_ILN_100
912 |a ISIL_DE-Ma9
912 |a GBV_ILN_370
912 |a ISIL_DE-1373
912 |a GBV_ILN_2015
912 |a ISIL_DE-93
951 |a BO
953 |2 045F |a 519.5/5
953 |2 045F |a 519
980 |2 22 |1 01 |x 0018 |b 384847073X |h olrm-h228-MITIEEE |y zi22818 |z 03-02-21
980 |2 23 |1 01 |x 0830 |b 1521013195 |h olr-MIT |u i |y z |z 31-01-15
980 |2 100 |1 01 |x 3100 |b 4472464284 |c 09 |f --%%-- |d eBook MIT Press |e --%%-- |j --%%-- |h OLR-MIT-CEC |k Vervielfältigungen (z.B. Kopien, Downloads) sind nur zum eigenen wissenschaftlichen Gebrauch erlaubt. Keine Weitergabe an Dritte. Kein systematisches Downloaden durch Robots. |y z |z 30-01-24
980 |2 370 |1 01 |x 4370 |b 401121759X |h olr-ebook mitieee |k Vervielfältigungen (z.B. Kopien, Downloads) sind nur von einzelnen Kapiteln oder Seiten und nur zum eigenen wissenschaftlichen Gebrauch erlaubt. Keine Weitergabe an Dritte. Kein systematisches Downloaden durch Robots. |u i |y z |z 01-12-21
980 |2 2015 |1 01 |x DE-93 |b 3740749407 |c 00 |f --%%-- |d --%%-- |e p |j --%%-- |k Campuslizenz |y l01 |z 18-08-20
981 |2 22 |1 01 |x 0018 |y Volltextzugang Campus |r https://ieeexplore.ieee.org/book/6276887
981 |2 22 |1 01 |x 0018 |y Nur für Angehörige der Universität Hamburg: Volltextzugang von außerhalb des Campus |r http://emedien.sub.uni-hamburg.de/han/ieee/ieeexplore.ieee.org/book/6276887
981 |2 23 |1 01 |x 0830 |y MIT Press EBook |r https://ieeexplore.ieee.org/book/6276887
981 |2 100 |1 01 |x 3100 |r https://ieeexplore.ieee.org/book/6276887
981 |2 100 |1 01 |x 3100 |y für Uniangehörige: Zugang weltweit |r http://han.med.uni-magdeburg.de/han/mitvia-ieee/ieeexplore.ieee.org/book/6276887
981 |2 370 |1 01 |x 4370 |y E-Book: Zugriff im HCU-Netz. Zugriff von außerhalb nur für HCU-Angehörige möglich |r https://ieeexplore.ieee.org/book/6276887
981 |2 2015 |1 01 |x DE-93 |r https://ieeexplore.ieee.org/book/6276887
985 |2 23 |1 01 |x 0830 |a 2018-01805, 2018-01806, 2018-01808
995 |2 22 |1 01 |x 0018 |a olrm-h228-MITIEEE
995 |2 23 |1 01 |x 0830 |a olr-MIT
995 |2 100 |1 01 |x 3100 |a OLR-MIT-CEC
995 |2 370 |1 01 |x 4370 |a olr-ebook mitieee
998 |2 23 |1 01 |x 0830 |0 2015.01.31
998 |2 370 |1 01 |x 4370 |0 2021.12.01
author_variant |
a e a ae aea |
---|---|
matchkey_str |
book:9780262310925:2003---- |
oclc_num |
827697816 |
hierarchy_sort_str |
2003 |
callnumber-subject-code |
QA |
publishDate |
2003 |
allfields |
0262310929 electronic bk. 0-262-31092-9 9780262310925 : electronic bk. 978-0-262-31092-5 (DE-627)816667810 (DE-576)9816667819 (DE-599)GBV816667810 (OCoLC)827697816 (MITPRESS)6276887 (EBP)055121497 DE-627 ger DE-627 rakwb eng XD-US QA280 519.5/5 23 519 19 Albert, Arthur E. verfasserin aut Stochastic approximation and nonlinear regression Arthur E. Albert, Leland A. Gardner Cambridge, Mass MIT-Press 2003 Online-Ressource Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier MIT Press Research Monographs No. 42 MIT Press research monograph no. 42 "The MIT Press Includes bibliographical references and index This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data.Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. 
The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector.The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains.Part I deals with the special cases of an unknown scalar parameter-discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality.If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title.The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit both in the deterministic and probabilistic senses (i.e., almost sure and quadratic mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. 
A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data.Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector.The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. 
Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains.Part I deals with the special cases of an unknown scalar parameter-discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality.If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title.The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit both in the deterministic and probabilistic senses (i.e., almost sure and quadratic mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home.MIT Press Research Monograph No. 42. Online-Ausg. 
Time-series analysis Regression analysis Stochastic approximation Time-series analysis Regression analysis Gardner, Leland A. oth 9780262511483 Erscheint auch als Druck-Ausgabe 9780262511483 https://ieeexplore.ieee.org/book/6276887 X:MITPRESS Verlag IEEE Xplore lizenzpflichtig Volltext http://www.gbv.de/dms/bowker/toc/9780262511483.pdf V:DE-601 X:Bowker pdf/application 2015-03-18 Verlag Inhaltsverzeichnis Inhaltsverzeichnis ZDB-37-IEM 2012 GBV_ILN_22 ISIL_DE-18 SYSFLAG_1 GBV_KXP GBV_ILN_22_i22818 GBV_ILN_23 ISIL_DE-830 GBV_ILN_100 ISIL_DE-Ma9 GBV_ILN_370 ISIL_DE-1373 GBV_ILN_2015 ISIL_DE-93 BO 045F 519.5/5 045F 519 22 01 0018 384847073X olrm-h228-MITIEEE zi22818 03-02-21 23 01 0830 1521013195 olr-MIT i z 31-01-15 100 01 3100 4472464284 09 --%%-- eBook MIT Press --%%-- --%%-- OLR-MIT-CEC Vervielfältigungen (z.B. Kopien, Downloads) sind nur zum eigenen wissenschaftlichen Gebrauch erlaubt. Keine Weitergabe an Dritte. Kein systematisches Downloaden durch Robots. z 30-01-24 370 01 4370 401121759X olr-ebook mitieee Vervielfältigungen (z.B. Kopien, Downloads) sind nur von einzelnen Kapiteln oder Seiten und nur zum eigenen wissenschaftlichen Gebrauch erlaubt. Keine Weitergabe an Dritte. Kein systematisches Downloaden durch Robots. i z 01-12-21 2015 01 DE-93 3740749407 00 --%%-- --%%-- p --%%-- Campuslizenz l01 18-08-20 22 01 0018 Volltextzugang Campus https://ieeexplore.ieee.org/book/6276887 22 01 0018 Nur für Angehörige der Universität Hamburg: Volltextzugang von außerhalb des Campus http://emedien.sub.uni-hamburg.de/han/ieee/ieeexplore.ieee.org/book/6276887 23 01 0830 MIT Press EBook https://ieeexplore.ieee.org/book/6276887 100 01 3100 https://ieeexplore.ieee.org/book/6276887 100 01 3100 für Uniangehörige: Zugang weltweit http://han.med.uni-magdeburg.de/han/mitvia-ieee/ieeexplore.ieee.org/book/6276887 370 01 4370 E-Book: Zugriff im HCU-Netz. 
Zugriff von außerhalb nur für HCU-Angehörige möglich https://ieeexplore.ieee.org/book/6276887 2015 01 DE-93 https://ieeexplore.ieee.org/book/6276887 23 01 0830 2018-01805, 2018-01806, 2018-01808 22 01 0018 olrm-h228-MITIEEE 23 01 0830 olr-MIT 100 01 3100 OLR-MIT-CEC 370 01 4370 olr-ebook mitieee 23 01 0830 2015.01.31 370 01 4370 2021.12.01 |
spelling |
0262310929 electronic bk. 0-262-31092-9 9780262310925 : electronic bk. 978-0-262-31092-5 (DE-627)816667810 (DE-576)9816667819 (DE-599)GBV816667810 (OCoLC)827697816 (MITPRESS)6276887 (EBP)055121497 DE-627 ger DE-627 rakwb eng XD-US QA280 519.5/5 23 519 19 Albert, Arthur E. verfasserin aut Stochastic approximation and nonlinear regression Arthur E. Albert, Leland A. Gardner Cambridge, Mass MIT-Press 2003 Online-Ressource Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier MIT Press Research Monographs No. 42 MIT Press research monograph no. 42 "The MIT Press Includes bibliographical references and index This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data.Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. 
The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector.The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains.Part I deals with the special cases of an unknown scalar parameter-discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality.If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title.The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit both in the deterministic and probabilistic senses (i.e., almost sure and quadratic mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. 
A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data.Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector.The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. 
Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains.Part I deals with the special cases of an unknown scalar parameter-discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality.If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title.The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit both in the deterministic and probabilistic senses (i.e., almost sure and quadratic mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home.MIT Press Research Monograph No. 42. Online-Ausg. 
Subjects: Time-series analysis; Regression analysis; Stochastic approximation. Co-author: Gardner, Leland A. Also published in print: ISBN 9780262511483. Full text: https://ieeexplore.ieee.org/book/6276887 (IEEE Xplore, license required). Table of contents: http://www.gbv.de/dms/bowker/toc/9780262511483.pdf
language |
English |
format_phy_str_mv |
Book |
topic_facet |
Time-series analysis Regression analysis Stochastic approximation |
dewey-raw |
519.5/5 |
isfreeaccess_bool |
false |
authorswithroles_txt_mv |
Albert, Arthur E. @@aut@@ Gardner, Leland A. @@oth@@ |
id |
816667810 |
signature_iln |
100:eBook MIT Press 3100:eBook MIT Press |
callnumber-first |
Q - Science |
series2
MIT Press Research Monographs
author |
Albert, Arthur E. |
format |
eBook |
dewey-ones |
519 - Probabilities & applied mathematics |
author_role |
aut |
collection |
KXP GVK SWB |
publishPlace |
Cambridge, Mass |
remote_str |
true |
abrufzeichen_iln_str_mv |
22@olrm-h228-MITIEEE 23@olr-MIT 100@OLR-MIT-CEC 370@olr-ebook mitieee |
abrufzeichen_iln_scis_mv |
22@olrm-h228-MITIEEE 23@olr-MIT 100@OLR-MIT-CEC 370@olr-ebook mitieee |
callnumber-label |
QA280 |
last_changed_iln_str_mv |
22@03-02-21 23@31-01-15 100@30-01-24 370@01-12-21 2015@18-08-20 |
illustrated |
Not Illustrated |
topic_title |
QA280 519.5/5 23 519 19 Stochastic approximation and nonlinear regression Arthur E. Albert, Leland A. Gardner Time-series analysis Regression analysis Stochastic approximation |
publisher |
MIT-Press |
publisherStr |
MIT-Press |
topic |
misc QA280 ddc 519.5/5 ddc 519 misc Time-series analysis misc Regression analysis misc Stochastic approximation |
topic_unstemmed |
misc QA280 ddc 519.5/5 ddc 519 misc Time-series analysis misc Regression analysis misc Stochastic approximation |
topic_browse |
misc QA280 ddc 519.5/5 ddc 519 misc Time-series analysis misc Regression analysis misc Stochastic approximation |
format_facet |
Elektronische Bücher Bücher Elektronische Ressource |
standort_txtP_mv |
--%%-- |
format_main_str_mv |
Text Buch |
carriertype_str_mv |
cr |
author2_variant |
l a g la lag |
signature |
eBook MIT Press --%%-- |
signature_str_mv |
eBook MIT Press --%%-- |
dewey-tens |
510 - Mathematics |
isbn |
0262310929 9780262310925 9780262511483 0262511487 |
isfreeaccess_txt |
false |
title |
Stochastic approximation and nonlinear regression |
ctrlnum |
(DE-627)816667810 (DE-576)9816667819 (DE-599)GBV816667810 (OCoLC)827697816 (MITPRESS)6276887 (EBP)055121497 |
exemplarkommentar_str_mv |
100@Vervielfältigungen (z.B. Kopien, Downloads) sind nur zum eigenen wissenschaftlichen Gebrauch erlaubt. Keine Weitergabe an Dritte. Kein systematisches Downloaden durch Robots. 370@Vervielfältigungen (z.B. Kopien, Downloads) sind nur von einzelnen Kapiteln oder Seiten und nur zum eigenen wissenschaftlichen Gebrauch erlaubt. Keine Weitergabe an Dritte. Kein systematisches Downloaden durch Robots. 2015@Campuslizenz |
title_full |
Stochastic approximation and nonlinear regression Arthur E. Albert, Leland A. Gardner |
author_sort |
Albert, Arthur E. |
callnumber-first-code |
Q |
lang_code |
eng |
selektneu_str_mv |
23@2015.01.31 370@2021.12.01 |
isOA_bool |
false |
dewey-hundreds |
500 - Science |
recordtype |
marc |
publishDateSort |
2003 |
contenttype_str_mv |
txt |
author_browse |
Albert, Arthur E. |
selectkey |
22:z 23:z 100:z 370:z 2015:l |
physical |
Online-Ressource |
class |
QA280 519.5/5 23 519 19 |
format_se |
Elektronische Bücher |
countryofpublication_str_mv |
XD-US |
author-letter |
Albert, Arthur E. |
normlink |
2015.01.31 2021.12.01 |
normlink_prefix_str_mv |
2015.01.31 2021.12.01 |
dewey-full |
519.5/5 519 |
title_sort |
stochastic approximation and nonlinear regression |
callnumber |
QA280 |
title_auth |
Stochastic approximation and nonlinear regression |
abstract |
This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data. Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector. The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains. Part I deals with the special case of an unknown scalar parameter, discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. 
Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality. If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title. The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit, both in the deterministic and the probabilistic senses (i.e., almost-sure and quadratic-mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home. MIT Press Research Monograph No. 42. Includes bibliographical references and index. |
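The "differential correction" recursion described in the abstract can be sketched in a few lines: the new estimate differs from the old one by a gain times the prediction residual. The sketch below is illustrative only; the function and parameter names (`differential_correction`, `predict`, `gain`) are not from the book, and the demonstration uses the simplest scalar case, a constant mean-value function f_n(theta) = theta, where the gain choice a_n = 1/n makes the recursion reproduce the running sample mean.

```python
import random
import statistics

def differential_correction(observations, predict, gain, theta0=0.0):
    """Differential-correction estimator (illustrative sketch).

    At step n, the estimate is corrected by gain(n, theta) times the
    residual: the current observation minus the value the regression
    function would predict if the previous estimate were the truth.
    """
    theta = theta0
    for n, y in enumerate(observations, start=1):
        residual = y - predict(n, theta)      # observation minus prediction
        theta = theta + gain(n, theta) * residual
    return theta

# Simplest scalar case: y_n = theta* + noise, f_n(theta) = theta.
# With gain a_n = 1/n the recursion equals the running sample mean,
# so the estimate is consistent for the true parameter.
random.seed(0)
true_theta = 2.5
ys = [true_theta + random.gauss(0.0, 1.0) for _ in range(500)]
est = differential_correction(
    ys,
    predict=lambda n, th: th,
    gain=lambda n, th: 1.0 / n,
)
```

For a nonlinear regression function, `predict` would evaluate the known mean-value function at the previous estimate, and the gain would typically be time varying and estimate dependent, which is exactly the design freedom the monograph studies.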
abstractGer |
This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data. Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector. The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains. Part I deals with the special case of an unknown scalar parameter, discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. 
Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality. If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title. The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit, both in the deterministic and the probabilistic senses (i.e., almost-sure and quadratic-mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home. MIT Press Research Monograph No. 42. Includes bibliographical references and index. |
abstract_unstemmed |
This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data. Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector. The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains. Part I deals with the special case of an unknown scalar parameter, discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. 
Examples are liberally sprinkled throughout the book. Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality. If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title. The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit, both in the deterministic and the probabilistic senses (i.e., almost-sure and quadratic-mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home. MIT Press Research Monograph No. 42. Includes bibliographical references and index. |
collection_details |
ZDB-37-IEM GBV_ILN_22 ISIL_DE-18 SYSFLAG_1 GBV_KXP GBV_ILN_22_i22818 GBV_ILN_23 ISIL_DE-830 GBV_ILN_100 ISIL_DE-Ma9 GBV_ILN_370 ISIL_DE-1373 GBV_ILN_2015 ISIL_DE-93 |
title_short |
Stochastic approximation and nonlinear regression |
url |
https://ieeexplore.ieee.org/book/6276887 http://www.gbv.de/dms/bowker/toc/9780262511483.pdf |
ausleihindikator_str_mv |
22 23 100:- 370 2015:p |
remote_bool |
true |
author2 |
Gardner, Leland A. |
author2Str |
Gardner, Leland A. |
callnumber-subject |
QA - Mathematics |
mediatype_str_mv |
c |
isOA_txt |
false |
hochschulschrift_bool |
false |
author2_role |
oth |
callnumber-a |
QA280 |
up_date |
2024-07-25T08:56:40.612Z |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000cam a22002652 4500</leader><controlfield tag="001">816667810</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20231004172600.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">150128s2003 xxu|||||o 00| ||eng c</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">0262310929</subfield><subfield code="c">electronic bk.</subfield><subfield code="9">0-262-31092-9</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9780262310925</subfield><subfield code="c">: electronic bk.</subfield><subfield code="9">978-0-262-31092-5</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)816667810</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-576)9816667819</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)GBV816667810</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)827697816</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(MITPRESS)6276887</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(EBP)055121497</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="c">XD-US</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">QA280</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield code="a">519.5/5</subfield><subfield code="2">23</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield 
code="a">519</subfield><subfield code="2">19</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Albert, Arthur E.</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Stochastic approximation and nonlinear regression</subfield><subfield code="c">Arthur E. Albert, Leland A. Gardner</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Cambridge, Mass</subfield><subfield code="b">MIT-Press</subfield><subfield code="c">2003</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="490" ind1="0" ind2=" "><subfield code="a">MIT Press Research Monographs</subfield><subfield code="v">No. 42</subfield></datafield><datafield tag="490" ind1="0" ind2=" "><subfield code="a">MIT Press research monograph</subfield><subfield code="v">no. 42</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">"The MIT Press</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Includes bibliographical references and index</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. 
It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data.Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. The term "differential correction" refers to the fact that the difference between the components of the updated and previous estimators is proportional to the difference between the current observation and the value that would be predicted by the regression function if the previous estimate were in fact the true value of the unknown vector parameter. The vector of proportionality factors (which is generally time varying and can depend upon previous estimates) is called the "gain" or "smoothing" vector.The main purpose of this research is to relate the large-sample statistical behavior of such estimates (consistency, rate of convergence, large-sample distribution theory, asymptotic efficiency) to the properties of the regression function and the choice of smoothing vectors. Furthermore, consideration is given to the tradeoff that can be effected between computational simplicity and statistical efficiency through the choice of gains.Part I deals with the special cases of an unknown scalar parameter-discussing probability-one and mean-square convergence, rates of mean-square convergence, and asymptotic distribution theory of the estimators for various choices of the smoothing sequence. Part II examines the probability-one and mean-square convergence of the estimators in the vector case for various choices of smoothing vectors. Examples are liberally sprinkled throughout the book. 
Indeed, the last chapter is devoted entirely to the discussion of examples at varying levels of generality.If one views the stochastic approximation literature as a study in the asymptotic behavior of solutions to a certain class of nonlinear first-order difference equations with stochastic driving terms, then the results of this monograph also serve to extend and complement many of the results in that literature, which accounts for the authors' choice of title.The book is written at the first-year graduate level, although this level of maturity is not required uniformly. Certainly the reader should understand the concept of a limit both in the deterministic and probabilistic senses (i.e., almost sure and quadratic mean convergence). This much will assure a comfortable journey through the first fourth of the book. Chapters 4 and 5 require an acquaintance with a few selected central limit theorems. A familiarity with the standard techniques of large-sample theory will also prove useful but is not essential. Part II, Chapters 6 through 9, is couched in the language of matrix algebra, but none of the "classical" results used are deep. The reader who appreciates the elementary properties of eigenvalues, eigenvectors, and matrix norms will feel at home</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">This monograph addresses the problem of "real-time" curve fitting in the presence of noise, from the computational and statistical viewpoints. It examines the problem of nonlinear regression, where observations are made on a time series whose mean-value function is known except for a vector parameter. In contrast to the traditional formulation, data are imagined to arrive in temporal succession. The estimation is carried out in real time so that, at each instant, the parameter estimate fully reflects all available data.Specifically, the monograph focuses on estimator sequences of the so-called differential correction type. 
Contents: Introduction … The Scalar-Parameter Case (Probability-One and Mean-Square Convergence … Moment Convergence Rates … Asymptotic Distribution Theory … Asymptotic Efficiency) … The Vector-Parameter Case (Mean-Square and Probability-One Convergence … Complements and Details … Applications … Open Problems) … Appendix (Lemmas … Through …) … References … Index |