torch.nn.BCELoss() and torch.nn.functional.binary_cross_entropy

Posted on 2025-01-27 03:34:18


What is the basic difference between these two loss functions? I have already tried using both loss functions.


Comments (2)

人│生佛魔见 2025-02-03 03:34:18


The difference is that nn.BCELoss and F.binary_cross_entropy are two PyTorch interfaces to the same operation.

  • The former, torch.nn.BCELoss, is a class and inherits from nn.Module, which makes it handy to use in the two-step fashion typical of OOP (object-oriented programming): initialize, then use. As the name implies, initialization handles setting up parameters and attributes, which is quite useful with stateful operators such as parametrized layers and the like. This is the way to go when implementing classes of your own, for example:

    import torch.nn as nn

    class Trainer():
        def __init__(self, model):
            self.model = model
            self.loss = nn.BCELoss()  # instantiate once, reuse on every call

        def __call__(self, x, y):
            y_hat = self.model(x)
            loss = self.loss(y_hat, y)
            return loss
    
  • On the other hand, the latter, torch.nn.functional.binary_cross_entropy, is the functional interface. It is actually the underlying operator used by nn.BCELoss, as you can see in the PyTorch source (Loss.py). You can use this interface, but it can become cumbersome when working with stateful operators. In this particular case, the binary cross-entropy loss has no parameters (in the most general case), so you could simply do:

    import torch.nn.functional as F

    class Trainer():
        def __init__(self, model):
            self.model = model

        def __call__(self, x, y):
            y_hat = self.model(x)
            loss = F.binary_cross_entropy(y_hat, y)
            return loss
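As a quick sanity check that the two interfaces really wrap the same operation, you can compare them directly (a minimal sketch; the tensor values below are made up for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative inputs: predicted probabilities in (0, 1) and binary targets.
y_hat = torch.tensor([0.1, 0.9, 0.8, 0.3])
y = torch.tensor([0., 1., 1., 0.])

class_loss = nn.BCELoss()(y_hat, y)           # class interface: init, then call
func_loss = F.binary_cross_entropy(y_hat, y)  # functional interface: one call

assert torch.allclose(class_loss, func_loss)
```

Both default to the same `'mean'` reduction, so the values match exactly.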
    
执手闯天涯 2025-02-03 03:34:18


BCELoss is the binary cross-entropy loss.
torch.nn.functional.binary_cross_entropy computes the actual loss inside torch.nn.BCELoss().
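For intuition, the per-element quantity that binary_cross_entropy evaluates is -(y·log(p) + (1 - y)·log(1 - p)), averaged over elements under the default 'mean' reduction. A minimal pure-Python sketch of that formula (not PyTorch's actual vectorized implementation):

```python
import math

def bce(preds, targets):
    """Mean binary cross-entropy over paired probabilities and 0/1 targets."""
    total = 0.0
    for p, y in zip(preds, targets):
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(preds)

# A confident, correct prediction yields a small loss: -log(0.9) ≈ 0.105
print(round(bce([0.9], [1.0]), 3))
```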
