CNN predicting negative values
I'm not sure what I'm looking for, which is why I might be missing the answer on the internet. I have a fully convolutional neural network, U-Net. The padding is always "same" and the activation function is "relu". I'm feeding it images with pixel values between 0 and 1.
The loss function is binary cross-entropy, since I only have one class. The optimizer is Adam and the metric is "accuracy" by default. When I leave accuracy there, it works fine. However, when I change the metric to IoU, the code crashes with "negative values in predictions":
tensorflow.python.framework.errors_impl.InvalidArgumentError: 2 root error(s) found.
(0) Invalid argument: assertion failed: [`predictions` contains negative values] [Condition x >= 0 did not hold element-wise:] [x (confusion_matrix/Cast:0) = ] [0 0 0...]
[[{{node confusion_matrix/assert_non_negative_1/assert_less_equal/Assert/AssertGuard/else/_10/confusion_matrix/assert_non_negative_1/assert_less_equal/Assert/AssertGuard/Assert}}]]
[[confusion_matrix/assert_non_negative_1/assert_less_equal/Assert/AssertGuard/branch_executed/_17/_87]]
(1) Invalid argument: assertion failed: [`predictions` contains negative values] [Condition x >= 0 did not hold element-wise:] [x (confusion_matrix/Cast:0) = ] [0 0 0...]
[[{{node confusion_matrix/assert_non_negative_1/assert_less_equal/Assert/AssertGuard/else/_10/confusion_matrix/assert_non_negative_1/assert_less_equal/Assert/AssertGuard/Assert}}]]
0 successful operations.
0 derived errors ignored. [Op:__inference_train_function_4589]
So I used "accuracy" for training, checked the prediction image, and there really are negative values predicted. Why?
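For reference, here is a minimal sketch of the kind of setup described above, assuming the IoU metric in question is `tf.keras.metrics.MeanIoU`; the layer sizes, input shape, and model are placeholders I'm assuming, not the actual code:

```python
import tensorflow as tf

# Hypothetical minimal stand-in for the U-Net described above.
# The final Conv2D has no activation specified, i.e. a linear output that can go negative.
inputs = tf.keras.Input(shape=(128, 128, 1))
x = tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
outputs = tf.keras.layers.Conv2D(1, 1, padding="same")(x)  # no activation specified
model = tf.keras.Model(inputs, outputs)

# Training with "accuracy" works, but swapping in an IoU metric crashes, because
# MeanIoU builds a confusion matrix and asserts that predictions are non-negative.
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=[tf.keras.metrics.MeanIoU(num_classes=1)],
)
```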
2 Answers
I found that someone tried changing the number of classes in the IoU metric from Keras, and it helped.
See: https://github.com/tensorflow/models/issues/8138
I tried that as well and changed the number of classes from 1 to 2. However, what you actually need to do is change it to a larger number; 10+ worked for me.
Still, I don't know why.
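As a sketch of what that workaround looks like (assuming the IoU metric is `tf.keras.metrics.MeanIoU`; the tiny model is just a placeholder standing in for the U-Net):

```python
import tensorflow as tf

# Placeholder model standing in for the U-Net from the question.
model = tf.keras.Sequential(
    [tf.keras.layers.Conv2D(1, 1, padding="same", input_shape=(128, 128, 1))]
)

# Workaround described above: pass a larger num_classes than the 1 (or 2) you might expect.
# Going from 1 to 2 was not enough here; 10+ reportedly worked, for reasons left unexplained.
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=[tf.keras.metrics.MeanIoU(num_classes=10)],
)
```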
Found the problem, and it's pretty obvious. For some reason I thought the default activation was "sigmoid", so the last convolutional layer has no "activation" specified. And even though all the other activation units are ReLU, the final convolution itself can still produce negative values. :facepalm:
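A sketch of that fix, assuming a tf.keras U-Net (the layers shown are placeholders): give the last convolutional layer an explicit sigmoid so its output is squashed into [0, 1]:

```python
import tensorflow as tf

# Placeholder encoder/decoder; only the output layer matters for the fix described above.
inputs = tf.keras.Input(shape=(128, 128, 1))
x = tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
# ... the rest of the U-Net would go here ...
outputs = tf.keras.layers.Conv2D(
    1, 1, padding="same", activation="sigmoid"  # explicit sigmoid: outputs stay in [0, 1]
)(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```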