Coursera: Neural Networks for ML - Lecture 10

Posted 2014-7-31 13:02 | Category: Coursera: Neural Networks | Research notes | Tags: neural networks, Dropout, MoE

Ways to improve generalization

ICML2012_Bayesian Posterior Sampling via Stochastic Gradient Fisher Scoring: this paper covers full Bayesian learning with mini-batches.
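
The Fisher-scoring sampler in that paper is fairly involved, but the core idea of mini-batch Bayesian learning can be illustrated with its simpler relative, stochastic gradient Langevin dynamics (Welling & Teh, ICML 2011): take SGD steps on the log posterior using mini-batch gradients rescaled by N/n, and inject Gaussian noise matched to the step size, so the iterates become approximate posterior samples. Below is a minimal sketch on a toy Gaussian-mean problem; the problem, names, and settings are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: infer the mean mu of a unit-variance Gaussian with a
# N(0, 1) prior, sampling the posterior from mini-batches of the data.
N = 1000
data = rng.normal(2.0, 1.0, size=N)

def grad_log_prior(mu):
    return -mu                        # d/dmu of log N(mu; 0, 1)

def grad_log_lik(mu, batch):
    return np.sum(batch - mu)         # d/dmu of sum_i log N(x_i; mu, 1)

mu, eps, n = 0.0, 1e-4, 50
samples = []
for t in range(5000):
    batch = rng.choice(data, size=n, replace=False)
    # Langevin step: mini-batch gradient rescaled to the full data set,
    # plus Gaussian noise whose variance equals the step size.
    drift = grad_log_prior(mu) + (N / n) * grad_log_lik(mu, batch)
    mu = mu + 0.5 * eps * drift + rng.normal(0.0, np.sqrt(eps))
    samples.append(mu)

print(np.mean(samples[1000:]))        # close to the posterior mean, ~2.0
```

With a small fixed step size the chain hovers around the posterior (a fixed step adds a small bias; SGLD proper anneals it). The paper's refinement, as I understand it, is to precondition such steps with an estimated Fisher matrix so that sampling remains correct at larger step sizes.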


JMLR2014_Dropout: A Simple Way to Prevent Neural Networks from Overfitting


ICML2008_Extracting and Composing Robust Features with Denoising Autoencoders
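
As a pointer to what the denoising autoencoder paper proposes: corrupt each input (e.g. zero out a random subset of its components), then train an autoencoder to reconstruct the uncorrupted input, which forces the learned features to be robust to the corruption. Here is a toy numpy sketch with tied weights and hand-derived gradients; the layer sizes, noise level, and learning rate are arbitrary choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny denoising autoencoder: encode a corrupted input, decode with tied
# weights, and penalize reconstruction error against the CLEAN input.
d, h, lr, p_corrupt = 20, 8, 0.1, 0.3
W = rng.normal(0, 0.1, (d, h))
b, c = np.zeros(h), np.zeros(d)
X = rng.random((500, d))                            # toy data in [0, 1)

for epoch in range(20):
    for x in X:
        x_tilde = x * (rng.random(d) > p_corrupt)   # masking noise
        hid = sigmoid(x_tilde @ W + b)              # encode corrupted input
        x_hat = sigmoid(hid @ W.T + c)              # decode (tied weights)
        # Backprop of squared error 0.5*||x_hat - x||^2, written by hand
        g_out = (x_hat - x) * x_hat * (1 - x_hat)
        g_hid = (g_out @ W) * hid * (1 - hid)
        W -= lr * (np.outer(x_tilde, g_hid) + np.outer(g_out, hid))
        b -= lr * g_hid
        c -= lr * g_out
```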

1. Why it helps to combine models
2. Mixtures of Experts (see the gating sketch after this list)
3. The idea of full Bayesian learning
4. Making full Bayesian learning practical
5. Dropout (see the sketch after this list)
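
Two of these topics are concrete enough to sketch. First, a mixture of experts: a softmax gating network produces a weight for each expert, and the output is the gate-weighted combination of the experts' predictions, so different experts can specialize on different regions of the input space. Linear experts are used here purely for brevity; all shapes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

d, k = 10, 4                            # input dim, number of experts
W_gate = rng.normal(0, 0.1, (k, d))     # gating network (linear + softmax)
W_experts = rng.normal(0, 0.1, (k, d))  # one linear expert per row

def moe_predict(x):
    gate = softmax(W_gate @ x)          # responsibilities, sum to 1
    outputs = W_experts @ x             # each expert's scalar prediction
    return gate @ outputs               # gate-weighted combination

print(moe_predict(rng.random(d)))
```

Second, dropout. At training time each hidden unit is dropped with some probability, which amounts to training an exponential number of weight-sharing "thinned" networks at once; at test time the full network approximates averaging over all of them. The sketch below uses the inverted-dropout variant, which rescales the surviving units by 1/(1-p) at training time; the JMLR paper instead scales the outgoing weights by the keep probability at test time, with the same effect:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(h, p_drop, train):
    """Inverted dropout: zero each unit with probability p_drop during
    training and rescale the survivors by 1/(1 - p_drop), so activations
    keep their expected value and no rescaling is needed at test time."""
    if not train:
        return h
    mask = (rng.random(h.shape) >= p_drop) / (1.0 - p_drop)
    return h * mask

h = np.ones(8)
print(dropout(h, 0.5, train=True))      # ~half zeros, survivors become 2.0
print(dropout(h, 0.5, train=False))     # unchanged at test time
```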

https://blog.sciencenet.cn/blog-1583812-816127.html

Previous: Coursera: Neural Networks for ML - Lecture 9
Next: Coursera: Neural Networks for ML - Lecture 11
