[Auto-Encoding Variational Bayes (AEVB)]

Diederik P Kingma, Max Welling

[Paper + code (Python): Auto-Encoding Variational Bayes (AEVB)] "Auto-Encoding Variational Bayes", Diederik P Kingma, Max Welling (2014). GitHub: fauxtograph.

(Submitted on 20 Dec 2013 (v1), last revised 1 May 2014 (this version, v10))

How can we perform efficient inference and learning in directed probabilistic models, in the presence of continuous latent variables with intractable posterior distributions, and large datasets? We introduce a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case. Our contribution is two-fold. First, we show that a reparameterization of the variational lower bound yields a lower bound estimator that can be straightforwardly optimized using standard stochastic gradient methods. Second, we show that for i.i.d. datasets with continuous latent variables per datapoint, posterior inference can be made especially efficient by fitting an approximate inference model (also called a recognition model) to the intractable posterior using the proposed lower bound estimator. Theoretical advantages are reflected in experimental results.
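The two contributions can be illustrated with a small numerical sketch. The code below assumes a Gaussian recognition model q(z|x) = N(mu, sigma^2 I) and a standard-normal prior, which is the setting used in the paper's Gaussian example; the specific values of `mu` and `log_var` are illustrative placeholders, not encoder outputs from a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative recognition-model outputs for one datapoint: the mean and
# log-variance of the approximate posterior q(z|x) = N(mu, sigma^2 I).
mu = np.array([0.5, -1.0])
log_var = np.array([-0.2, 0.3])

# Reparameterization trick: instead of sampling z ~ N(mu, sigma^2)
# directly, draw noise eps ~ N(0, I) and compute z as a deterministic,
# differentiable function of (mu, log_var, eps). Gradients of a Monte
# Carlo estimate of the lower bound can then flow through mu and log_var.
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps

# For a Gaussian q and a standard-normal prior p(z) = N(0, I), the KL
# term of the variational lower bound has the closed form
#   -KL(q||p) = 0.5 * sum(1 + log sigma^2 - mu^2 - sigma^2),
# so only the expected reconstruction term needs sampling.
neg_kl = 0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))
```

In a full AEVB implementation, `z` would be fed to a decoder network, and `neg_kl` plus the decoder's log-likelihood would form the single-sample estimate of the lower bound optimized by stochastic gradient ascent.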

Subjects: Machine Learning (stat.ML); Learning (cs.LG)

Cite as: arXiv:1312.6114 [stat.ML]

(or arXiv:1312.6114v10 [stat.ML] for this version)


