Shared by 荣斋居士, http://blog.sciencenet.cn/u/dalianwang

Blog post

[Repost] How to choose an activation function for deep learning?

1358 reads · 2019-6-16 04:22 | Personal category: Machine Learning | System category: Research Notes | Tags: deep learning, activation function | Source: repost

How to choose the right activation function?

The choice of activation function depends on the objective of the problem and the properties required of the network. Some guidelines are as follows:

  • Sigmoid functions work well in shallow networks and binary classifiers; in deeper networks they can lead to vanishing gradients.

  • The ReLU function is the most widely used; Leaky ReLU can help avoid dead neurons. Start with ReLU, and move to another activation function only if ReLU does not give good results.

  • Use softmax in the output layer for multi-class classification.

  • Avoid using ReLU in the output layer.
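The four activations above can be sketched in a few lines of plain Python. This is a minimal illustration of their definitions, not a production implementation (frameworks like PyTorch or TensorFlow provide optimized versions); the `alpha` slope for Leaky ReLU is a commonly used default, not a value fixed by the post.

```python
import math

def sigmoid(x):
    # Squashes input into (0, 1); suits binary-classifier outputs,
    # but its gradient vanishes for large |x| in deep networks.
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Default hidden-layer choice; the zero gradient for x < 0
    # is what can produce "dead" neurons.
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # A small negative slope (alpha is a typical default, an
    # assumption here) keeps a gradient flowing for x < 0.
    return x if x > 0 else alpha * x

def softmax(xs):
    # Output layer for multi-class classification: turns scores
    # into a probability distribution over the classes.
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `softmax([1.0, 2.0, 3.0])` returns three probabilities that sum to 1, with the largest probability on the largest score.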




https://blog.sciencenet.cn/blog-2089193-1185209.html
