Why can I use model(x, training=True) when I defined my own call function without a training argument?
Notice that when I created my model, I defined the call function with the argument something=False. When I use the model in the function train_step, I pass "something=True, training=True" into my call; training is not defined in my call, but it is in the default tf.keras.Model call.
Why am I able to execute this with no error? The output also prints a bunch of 'my call's.
import tensorflow as tf
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense, Flatten

mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Add a channels dimension
x_train = x_train[..., tf.newaxis].astype("float32")
x_test = x_test[..., tf.newaxis].astype("float32")

train_ds = tf.data.Dataset.from_tensor_slices(
    (x_train, y_train)).shuffle(10000).batch(32)

class MyModel(Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.fl = Flatten()
        self.d = Dense(10)

    ###### My problem ######
    def call(self, x, something=False):
        if something:
            tf.print('my call')
        x = self.fl(x)
        return self.d(x)

model = MyModel()
loss_object = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam()

@tf.function
def train_step(X, Y):
    with tf.GradientTape() as tape:
        ###### My problem ######
        predictions = model(X, something=True, training=True)
        loss = loss_object(Y, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))

for epoch in range(3):
    for X, Y in train_ds:
        train_step(X, Y)
Comments (1)
In the Model class, see the call method's documentation. And indeed, __call__ can take any input arguments; in the Model class source code its signature is:

def __call__(self, *args, **kwargs):

Keyword arguments that appear in your call's signature (like something) are forwarded to it, while training is recognized by the base __call__ itself, which uses it to set the Keras learning phase even when your call does not declare it. That is why both something=True and training=True are accepted without error. You can find a more detailed answer here.
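
To make the forwarding behavior concrete, here is a minimal sketch (the class name Tiny and the flag argument are illustrative, not from the question): a custom call with no training parameter still accepts training=True, because the base __call__ consumes it, while keyword arguments that call does declare are passed through.

import inspect
import tensorflow as tf

class Tiny(tf.keras.Model):
    def call(self, x, flag=False):  # note: no `training` parameter here
        if flag:
            tf.print("flag reached call()")
        return x

m = Tiny()
# `flag` is forwarded to call() because it appears in call's signature;
# `training` never reaches call(): the base __call__ intercepts it and
# uses it to set the learning phase for layers like Dropout/BatchNorm.
out = m(tf.zeros((1, 2)), flag=True, training=True)

# The permissive signature can be checked directly (exact output may
# vary by TensorFlow version):
print(inspect.signature(tf.keras.Model.__call__))  # e.g. (self, *args, **kwargs)

If your call actually needs the flag, the pattern used in the TensorFlow guides is to declare it explicitly, e.g. def call(self, x, training=None); Keras then passes the value in instead of consuming it.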