What is the default activation function for a Dense layer in Keras?
I read the documentation for Keras, and I found that when we omit the activation function, the layer is just a simple linear function:
activation: Activation function to use. If you don't specify anything, no activation is applied (ie. "linear" activation: a(x) = x).
However, by default the bias (use_bias) is True,
and I tried this example, and it solved my problem and gave me the correct weights:
So what actually is the default activation function here, and how can I detect it?
Thanks
If you don't specify an activation function, the value of a neuron will just be the weighted sum of its inputs plus the bias. The activation function is applied after that sum is computed, so if you don't specify one, the output simply remains that sum. In other words, the default "linear" activation is the identity, a(x) = x.
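To make that concrete, here is a minimal pure-Python sketch of what a Dense layer computes when no activation is given (the function name, weights, and shapes are illustrative, not from Keras itself): the output is just `inputs @ kernel + bias`, with no nonlinearity applied afterwards.

```python
def dense_linear(inputs, kernel, bias):
    """One Dense layer with the default 'linear' (identity) activation.

    inputs: list of rows, each with n features
    kernel: n x m weight matrix (list of lists)
    bias:   list of m biases, added to the weighted sum
    """
    out = []
    for row in inputs:
        # For each output unit: weighted sum of inputs plus that unit's bias.
        # No activation is applied after the sum -- this IS the final output.
        out.append([
            sum(x * w for x, w in zip(row, col)) + b
            for col, b in zip(zip(*kernel), bias)
        ])
    return out

# Hypothetical example: 2 input features -> 1 unit
kernel = [[2.0], [3.0]]   # shape (2, 1)
bias = [0.5]
print(dense_linear([[1.0, 1.0]], kernel, bias))  # [[5.5]]
```

This is why omitting `activation` still "works" for regression-style problems: the layer already produces the raw weighted sum, which is exactly what you want when fitting linear weights.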