Convolutional neural network throws TypeError: __init__() takes 1 positional argument but 2 were given



I am passing a batch of 256 images, x_s, with dimensions [256, 3, 560, 448]. However, whenever I try to feed the images to the CNN I get the following error:

TypeError: __init__() takes 1 positional argument but 2 were given

I'm not sure what '1 positional argument' means here. I am passing in the images using an iterator that I created. Below is my training-loop code up to the point where it breaks:

for e in range(num_epochs):
  print(f'Epoch {e+1:04d} / {num_epochs:04d}', end='\n================\n')
  dl_source_iter = iter(dl_source)
  dl_target_iter = iter(dl_target)

  for batch in range(max_batches):
    optimizer.zero_grad()

    p = float(batch + e * max_batches) / (num_epochs * max_batches)
    grl_lambda = 2. / (1. + np.exp(-10 * p)) - 1

    x_s, y_s = next(dl_source_iter)

    y_s_domain = torch.zeros(256, dtype=torch.long)

    class_pred, domain_pred = Cnn(x_s, grl_lambda)  # This is the line which throws the error

Here is my convolutional neural network:

class Cnn(nn.Module):
  def __init__(self):
    super(Cnn, self).__init__()

    self.feature_extract = nn.Sequential(
        nn.Conv2d(3, 64, 5, 1, 1),
        nn.BatchNorm2d(64),
        nn.MaxPool2d(2),
        nn.ReLU(True),
        nn.Conv2d(64, 50, 5, 1, 1),
        nn.BatchNorm2d(50),
        nn.MaxPool2d(2),
        nn.ReLU(True),
        nn.Dropout2d(),
    )

    self.num_cnn_features = 50*5*5
    self.class_classifier = nn.Sequential(
        nn.Linear(self.num_cnn_features, 200),
        nn.BatchNorm1d(200),
        nn.Dropout2d(),
        nn.ReLU(True),
        nn.Linear(200, 200),
        nn.BatchNorm1d(200),
        nn.ReLU(True),
        nn.Linear(200, 182),
        nn.LogSoftmax(dim=1),
    )

    self.DomainClassifier = nn.Sequential(
        nn.Linear(self.num_cnn_features, 100),
        nn.BatchNorm1d(100),
        nn.ReLU(True),
        nn.Linear(100, 2),
        nn.LogSoftmax(dim=1)
    )

  def forward(self, x, grl_lambda=1.0):

    features = self.feature_extract(x)
    features = features.view(-1, self.num_cnn_features)
    features_grl = GradientReversalFn(features, grl_lambda)
    class_pred = self.class_classifier(features)
    domain_pred = self.DomainClassifier(features_grl)
    return class_pred, domain_pred

Does anyone have any guesses as to why this might be happening? I can't seem to figure out what is going wrong. Any help would be greatly appreciated.


Comments (1)

一刻暧昧 2025-01-27 12:07:17


You need to create a Cnn object (an instance) before you can pass data to it. As written, Cnn(x_s, grl_lambda) calls the Cnn class constructor __init__, which takes no arguments besides self, hence the TypeError. What you actually want is to call an instance of the Cnn class, which nn.Module.__call__ routes to its forward method:

# outside of loop
model = Cnn()
 
# inside loop
class_pred, domain_pred = model(x_s, grl_lambda)
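
To make the distinction concrete, here is a minimal, self-contained sketch (using a hypothetical Toy module, not the original Cnn) showing why calling the class directly raises this TypeError while calling an instance works: constructing the object runs __init__, and calling the instance dispatches to forward through nn.Module.__call__.

# Minimal sketch of class-call vs. instance-call in PyTorch (Toy is a made-up module)
import torch
import torch.nn as nn

class Toy(nn.Module):
    def __init__(self):               # takes no arguments besides self
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x, scale=1.0):  # extra arguments belong here
        return self.fc(x) * scale

x = torch.randn(8, 4)

# Toy(x)             # TypeError: __init__() takes 1 positional argument but 2 were given
model = Toy()         # construct the instance once, outside the loop
out = model(x, 0.5)   # __call__ -> forward(x, scale=0.5)
print(out.shape)      # torch.Size([8, 2])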