How to add multiple losses to the gradient
I am testing tf.GradientTape. I wrote a model with several output layers, each with its own loss, and I want to integrate the GradientTape there. My question is: are there specific techniques for passing the several losses to the gradient computation as targets?
I know one option is to take the mean of the losses. Is that always necessary? Can't I just pass in a list of losses so that the GradientTape knows which loss belongs to which output layer?
2 Answers
Example 1: Default
The default option adds the gradients of the two losses within the gradient function.
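The answer's original code is not included above, so the following is only a reconstruction sketch of the default option; the two-output model, loss functions, optimizer, and dummy data are placeholders I introduce for illustration. The key point is that passing a list of losses as the target of tape.gradient returns, for each variable, the summed gradient of both losses.

```python
import tensorflow as tf

# Placeholder two-output model; the real model from the question is not shown.
inputs = tf.keras.Input(shape=(8,))
hidden = tf.keras.layers.Dense(16, activation="relu")(inputs)
out_a = tf.keras.layers.Dense(1, name="out_a")(hidden)
out_b = tf.keras.layers.Dense(1, name="out_b")(hidden)
model = tf.keras.Model(inputs, [out_a, out_b])

loss_fn = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.Adam()

# Dummy batch, one target per output head.
x = tf.random.normal((32, 8))
y_a = tf.random.normal((32, 1))
y_b = tf.random.normal((32, 1))

with tf.GradientTape() as tape:
    pred_a, pred_b = model(x, training=True)
    loss_a = loss_fn(y_a, pred_a)
    loss_b = loss_fn(y_b, pred_b)

# Passing a list of targets makes tape.gradient return, for each variable,
# the sum of the gradients of loss_a and loss_b.
grads = tape.gradient([loss_a, loss_b], model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```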
Example 2: Alternative
The alternative option gives you more freedom by allowing you to inspect the relationship between the generated gradients.
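Again a sketch rather than the answer's exact code, reusing the placeholder model, loss_fn, optimizer, and dummy batch from the Example 1 sketch: a persistent tape yields one gradient list per loss, which you can inspect or re-weight before combining and applying them.

```python
# Reuses model, loss_fn, optimizer, x, y_a, y_b from the Example 1 sketch.
with tf.GradientTape(persistent=True) as tape:
    pred_a, pred_b = model(x, training=True)
    loss_a = loss_fn(y_a, pred_a)
    loss_b = loss_fn(y_b, pred_b)

# One gradient list per loss, so they can be inspected or scaled separately.
grads_a = tape.gradient(loss_a, model.trainable_variables)
grads_b = tape.gradient(loss_b, model.trainable_variables)
del tape  # release the resources held by the persistent tape

# Combine manually. A gradient is None when a variable does not affect that
# loss (e.g. the other head's weights), so fall back to the non-None one.
combined = [
    ga if gb is None else gb if ga is None else ga + gb
    for ga, gb in zip(grads_a, grads_b)
]
optimizer.apply_gradients(zip(combined, model.trainable_variables))
```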
I tested both options on TensorFlow 2.5 and they gave me similar results.
From the TensorFlow documentation: unless you set persistent=True, a GradientTape can only be used to compute one set of gradients.
To compute gradients for multiple losses, you need multiple tapes. Something like the sketch below.
Then apply them.
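The snippet from this answer is not reproduced above, so here is a sketch of the "multiple tapes" idea, again using the placeholder model and data from the Example 1 sketch:

```python
# Two non-persistent tapes, one per loss; each tape is used for exactly one
# gradient computation, as required when persistent=True is not set.
with tf.GradientTape() as tape_a, tf.GradientTape() as tape_b:
    pred_a, pred_b = model(x, training=True)
    loss_a = loss_fn(y_a, pred_a)
    loss_b = loss_fn(y_b, pred_b)

grads_a = tape_a.gradient(loss_a, model.trainable_variables)
grads_b = tape_b.gradient(loss_b, model.trainable_variables)
```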
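Applying them could then look like this (still an assumed continuation of the sketch above); None gradients, i.e. variables a given loss does not touch, are filtered out before calling apply_gradients:

```python
# Apply each per-loss gradient list in turn, skipping variables the loss
# does not depend on (their gradient is None).
optimizer.apply_gradients(
    [(g, v) for g, v in zip(grads_a, model.trainable_variables) if g is not None]
)
optimizer.apply_gradients(
    [(g, v) for g, v in zip(grads_b, model.trainable_variables) if g is not None]
)
```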