Neural network binary classification: Softmax, LogSoftmax, and loss functions
I am building a binary classifier where the class I want to predict is present only <2% of the time. I am using PyTorch.
The last layer could be LogSoftmax or Softmax:
self.softmax = nn.Softmax(dim=1)
or self.softmax = nn.LogSoftmax(dim=1)
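For context, here is a minimal sketch of where such a layer would sit. The `BinaryClassifier` name, layer sizes, and two-logit output head are assumptions for illustration, not from the original post:

```python
import torch
import torch.nn as nn

class BinaryClassifier(nn.Module):
    """Hypothetical two-logit classifier; architecture is illustrative only."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),  # two raw logits, one per class
        )
        # Either of the two layers from the question could go here:
        self.softmax = nn.Softmax(dim=1)       # probabilities summing to 1
        # self.softmax = nn.LogSoftmax(dim=1)  # log-probabilities

    def forward(self, x):
        return self.softmax(self.net(x))
```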
My questions:

1. I should use softmax, as it will provide outputs that sum to 1, and I can then check performance at various probability thresholds. Is that understanding correct?
2. If I use softmax, can I use cross_entropy loss? This seems to suggest that it is okay.
3. If I use logsoftmax, can I use cross_entropy loss? This seems to suggest that I shouldn't.
4. If I use softmax, is there any better option than cross_entropy loss? (See the sketch after this list.)
   `cross_entropy = nn.CrossEntropyLoss(weight=class_wts)`
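For what it's worth, PyTorch's `nn.CrossEntropyLoss` applies `LogSoftmax` followed by `NLLLoss` internally, so it expects raw logits; if the model already ends in `nn.LogSoftmax`, the matching loss is `nn.NLLLoss`. Below is a minimal sketch of the equivalence and of the `weight` argument for the class imbalance; the weight values and the 0.3 threshold are assumptions for illustration:

```python
import torch
import torch.nn as nn

# Illustrative shapes and values only; class_wts up-weights the rare class.
logits = torch.randn(8, 2)             # raw model outputs, no softmax applied
targets = torch.randint(0, 2, (8,))
class_wts = torch.tensor([1.0, 50.0])  # assumption: positives are ~2% of samples

# Option A: raw logits + CrossEntropyLoss (LogSoftmax + NLLLoss internally)
ce = nn.CrossEntropyLoss(weight=class_wts)
loss_a = ce(logits, targets)

# Option B: LogSoftmax output + NLLLoss, mathematically the same quantity
log_probs = nn.LogSoftmax(dim=1)(logits)
nll = nn.NLLLoss(weight=class_wts)
loss_b = nll(log_probs, targets)

assert torch.allclose(loss_a, loss_b)

# Softmax probabilities are then only needed at evaluation time,
# e.g. to tune the decision threshold for the rare positive class:
probs = torch.softmax(logits, dim=1)[:, 1]  # P(positive class)
preds = (probs > 0.3).long()                # example threshold, not 0.5
```

Either pairing handles the imbalance through `weight`; what you should avoid is feeding already-softmaxed (or log-softmaxed) outputs into `nn.CrossEntropyLoss`, since the normalization would then be applied twice.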