πVAE: a stochastic process prior for Bayesian deep learning with MCMC
Abstract: Stochastic processes provide a mathematically elegant way to model complex data. In theory, they provide flexible priors over function classes that can encode a wide range of interesting assumptions. However, in practice, efficient inference by optimisation or marginalisation is difficult, a problem further exacerbated by big data and high-dimensional input spaces. We propose a novel variational autoencoder (VAE) called the prior encoding variational autoencoder (πVAE). πVAE is a new continuous stochastic process. We use πVAE to learn low-dimensional embeddings of function classes by combining a trainable feature mapping with a generative model using a VAE. We show that our framework can accurately learn expressive function classes such as Gaussian processes, but also properties of functions such as their integrals. For popular tasks, such as spatial interpolation, πVAE achieves state-of-the-art performance both in terms of accuracy and computational efficiency. Perhaps most usefully, we demonstrate an elegant and scalable means of performing fully Bayesian inference for stochastic processes within probabilistic programming languages such as Stan.
Detailed description

Author: Mishra, Swapnil [author]
Format: Article
Language: English
Published: 2022
Keywords: Bayesian inference; MCMC; VAE; Spatio-temporal
Note: © The Author(s) 2022
Contained in: Statistics and computing - Springer US, 1991, 32(2022), issue 6, 17 Oct.
Contained in: volume:32 ; year:2022 ; number:6 ; day:17 ; month:10
Links:
DOI / URN: 10.1007/s11222-022-10151-w
Catalog ID: OLC2079748971
LEADER 01000caa a22002652 4500
001 OLC2079748971
003 DE-627
005 20230506093434.0
007 tu
008 221221s2022 xx ||||| 00| ||eng c
024 7 |a 10.1007/s11222-022-10151-w |2 doi
035 |a (DE-627)OLC2079748971
035 |a (DE-He213)s11222-022-10151-w-p
040 |a DE-627 |b ger |c DE-627 |e rakwb
041 |a eng
082 0 4 |a 004 |a 620 |q VZ
100 1 |a Mishra, Swapnil |e verfasserin |0 (orcid)0000-0002-8759-5902 |4 aut
245 1 0 |a $$\pi $$VAE: a stochastic process prior for Bayesian deep learning with MCMC
264 1 |c 2022
336 |a Text |b txt |2 rdacontent
337 |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338 |a Band |b nc |2 rdacarrier
500 |a © The Author(s) 2022
520 |a Abstract Stochastic processes provide a mathematically elegant way to model complex data. In theory, they provide flexible priors over function classes that can encode a wide range of interesting assumptions. However, in practice efficient inference by optimisation or marginalisation is difficult, a problem further exacerbated with big data and high dimensional input spaces. We propose a novel variational autoencoder (VAE) called the prior encoding variational autoencoder ($$\pi $$VAE). $$\pi $$VAE is a new continuous stochastic process. We use $$\pi $$VAE to learn low dimensional embeddings of function classes by combining a trainable feature mapping with generative model using a VAE. We show that our framework can accurately learn expressive function classes such as Gaussian processes, but also properties of functions such as their integrals. For popular tasks, such as spatial interpolation, $$\pi $$VAE achieves state-of-the-art performance both in terms of accuracy and computational efficiency. Perhaps most usefully, we demonstrate an elegant and scalable means of performing fully Bayesian inference for stochastic processes within probabilistic programming languages such as Stan.
650 4 |a Bayesian inference
650 4 |a MCMC
650 4 |a VAE
650 4 |a Spatio-temporal
700 1 |a Flaxman, Seth |4 aut
700 1 |a Berah, Tresnia |4 aut
700 1 |a Zhu, Harrison |4 aut
700 1 |a Pakkanen, Mikko |4 aut
700 1 |a Bhatt, Samir |4 aut
773 0 8 |i Enthalten in |t Statistics and computing |d Springer US, 1991 |g 32(2022), 6 vom: 17. Okt. |w (DE-627)131007963 |w (DE-600)1087487-2 |w (DE-576)052732894 |x 0960-3174 |7 nnns
773 1 8 |g volume:32 |g year:2022 |g number:6 |g day:17 |g month:10
856 4 1 |u https://doi.org/10.1007/s11222-022-10151-w |z lizenzpflichtig |3 Volltext
912 |a GBV_USEFLAG_A
912 |a SYSFLAG_A
912 |a GBV_OLC
912 |a SSG-OLC-TEC
912 |a SSG-OLC-MAT
951 |a AR
952 |d 32 |j 2022 |e 6 |b 17 |c 10
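The abstract's central idea — encode draws of a stochastic process into a low-dimensional latent z via a decoder built from a feature mapping, then run MCMC over z given observations — can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the decoder below is an untrained random-feature stand-in for a trained πVAE decoder, and a random-walk Metropolis sampler stands in for the HMC/Stan inference used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained piVAE decoder: a fixed feature map
# phi(x) followed by a linear layer whose output depends on the latent z.
# In the paper both parts are learned; here they are random, purely to
# illustrate the inference step over z.
D, Z = 50, 5                        # feature count / latent dimension
W_phi = rng.normal(size=D)          # feature-map frequencies
b_phi = rng.uniform(0, 2 * np.pi, size=D)
W_dec = rng.normal(size=(D, Z)) / np.sqrt(Z)

def phi(x):
    # random Fourier features of a scalar input, shape (len(x), D)
    return np.sqrt(2.0 / D) * np.cos(np.outer(x, W_phi) + b_phi)

def decode(z, x):
    # f(x) = phi(x) @ (W_dec @ z): one function draw, indexed by z
    return phi(x) @ (W_dec @ z)

# Noisy observations of some unknown function
x_obs = np.linspace(-3, 3, 30)
y_obs = np.sin(x_obs) + 0.1 * rng.normal(size=30)

def log_post(z, sigma=0.1):
    # log p(z | y) up to a constant: N(0, I) prior on z + Gaussian likelihood
    resid = y_obs - decode(z, x_obs)
    return -0.5 * z @ z - 0.5 * resid @ resid / sigma**2

# Random-walk Metropolis over the latent z (the paper uses HMC via Stan)
z = np.zeros(Z)
lp = log_post(z)
samples = []
for _ in range(2000):
    prop = z + 0.1 * rng.normal(size=Z)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        z, lp = prop, lp_prop
    samples.append(z.copy())
samples = np.array(samples)
```

Because the decoder is a deterministic map from a finite-dimensional z, the posterior over functions reduces to a posterior over z, which is what makes the scheme convenient inside probabilistic programming languages; with a decoder actually trained on, say, Gaussian-process draws, the same loop yields fully Bayesian function inference.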