How to combine a Keras loss with other intermediate layer output losses using model.add_loss



I am trying to reproduce the COSTA/VAE code and make it run on TF 2.x and Keras 2.x rather than 1.x.
However, I am facing a problem when combining the Keras loss (which is a tensor) with some intermediate layer outputs (which are KerasTensors).

Here is the code with the custom loss function:

def gan_loss(y_true, y_pred, a, ap, b, bp):
    y_true_flat = K.batch_flatten(y_true)
    y_pred_flat = K.batch_flatten(y_pred)

    # Adversarial loss
    L_adv = losses.binary_crossentropy(y_true_flat, y_pred_flat)

    # A to A loss
    a_flat = K.batch_flatten(a)
    ap_flat = K.batch_flatten(ap)
    if is_a_binary:
        L_atoa = losses.binary_crossentropy(a_flat, ap_flat)
    else:
        L_atoa = K.mean(K.abs(a_flat - ap_flat))

    # A to B loss
    b_flat = K.batch_flatten(b)
    bp_flat = K.batch_flatten(bp)
    if is_b_binary:
        L_atob = losses.binary_crossentropy(b_flat, bp_flat)
    else:
        L_atob = K.mean(K.abs(b_flat - bp_flat))

    # Loss of the code discriminator applied to the latent code z
    L_code = losses.binary_crossentropy(np.asarray(1).astype('float32').reshape((-1, 1)), code_d(z))

    return L_adv + beta * L_atoa + alpha * L_atob + L_code
      


a = Input(shape=(1, 512, 512), name='a')
b = Input(shape=(3, 512, 512), name='b')

# A -> A': convert the groundtruth (A) to noise (z), then generate (A') using the VAE
encoder = vae.get_layer('vae_encoder')
decoder = vae.get_layer('vae_decoder')
z = encoder(a)
ap = decoder(z)      # ap is A', short for "A prediction"

# A' -> B': convert A' to a synthesized image (B') via image-to-image translation
bp = atob(ap)

# The discriminator receives the two generated images
d_in = concatenate([ap, bp], axis=1)
output = d(d_in)
input = [a, b]
gan = Model(input, output)

gan.add_loss(gan_loss(y_true, y_pred, a, ap, b, bp))      # <------ HERE IS THE PROBLEM
gan.compile(optimizer=opt, loss=None)

Note: the code works fine if I remove either y_true, y_pred or a, ap, b, bp from the loss calculation, but it throws an error when I use all of them together.

The question: what should I pass for y_true and y_pred, given that they are passed implicitly when following the standard form of loss calculation, i.e. gan.compile(optimizer=opt, loss=gan_loss)?
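
For reference, a minimal sketch of one way this could be wired (the target shape, the extra Input, and the fit() call below are illustrative assumptions, not part of the original code) is to feed the discriminator target as an explicit model input, so that every tensor reaching gan_loss and add_loss is symbolic:

# Sketch only: y_true becomes an extra Input and y_pred is the discriminator
# output, so gan_loss sees symbolic tensors exclusively. Shapes are illustrative.
y_true = Input(shape=(1,), name='d_target')      # hypothetical discriminator target
y_pred = d(concatenate([ap, bp], axis=1))        # discriminator output on the generated pair

gan = Model([a, b, y_true], y_pred)
gan.add_loss(gan_loss(y_true, y_pred, a, ap, b, bp))
gan.compile(optimizer=opt, loss=None)

# The target then has to be supplied at training time as an extra input, e.g.:
# gan.fit([a_batch, b_batch, np.ones((batch_size, 1))], epochs=1, ...)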

Here is the full error report:

TypeError                                 Traceback (most recent call 
last)
<ipython-input-11-208ba3569274> in <module>()
----> 1 train(models, it_train, it_val, params)

4 frames
<ipython-input-9-96f21ec5c376> in train(models, it_train, it_val, 
params)
357 
358         for b in range(batches_per_epoch):
--> 359             train_iteration(models, generators, losses, 
params)
360 
361         # Evaluate how the models is doing on the validation set.

<ipython-input-9-96f21ec5c376> in train_iteration(models, generators, 
losses, params)
282     print(p2p2p_gen)
283 
--> 284     p2p2phist = train_generator(p2p2p, p2p2p_gen, 
batch_size=params.batch_size)
285     print()
286     print('p2p2phist.history')
 
<ipython-input-9-96f21ec5c376> in train_generator(gan, it, 
batch_size)
55     """Train the generator network."""
56     #return gan.fit_generator(it, nb_epoch=1, 
samples_per_epoch=batch_size, verbose=False)
---> 57     return gan.fit(it, epochs=1, steps_per_epoch=batch_size, 
verbose=False)#,initial_epoch=1)
58 
59 def discriminator_generator(it, g, dout_size=(16, 16)):

/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py 
in error_handler(*args, **kwargs)
65     except Exception as e:  # pylint: disable=broad-except
66       filtered_tb = _process_traceback_frames(e.__traceback__)
---> 67       raise e.with_traceback(filtered_tb) from None
68     finally:
69       del filtered_tb

/usr/local/lib/python3.7/dist- 
packages/tensorflow/python/framework/func_graph.py in 
autograph_handler(*args, **kwargs)
1145           except Exception as e:  # pylint:disable=broad-except
1146             if hasattr(e, "ag_error_metadata"):
-> 1147               raise e.ag_error_metadata.to_exception(e)
1148             else:
1149               raise

TypeError: in user code:

File "/usr/local/lib/python3.7/dist- 
packages/keras/engine/training.py", line 1021, in train_function  *
return step_function(self, iterator)
File "/usr/local/lib/python3.7/dist- 
packages/keras/engine/training.py", line 1010, in step_function  **
outputs = model.distribute_strategy.run(run_step, args=(data,))
File "/usr/local/lib/python3.7/dist- 
packages/keras/engine/training.py", line 1000, in run_step  **
outputs = model.train_step(data)
File "/usr/local/lib/python3.7/dist- 
packages/keras/engine/training.py", line 860, in train_step
loss = self.compute_loss(x, y, y_pred, sample_weight)
File "/usr/local/lib/python3.7/dist- 
packages/keras/engine/training.py", line 919, in compute_loss
y, y_pred, sample_weight, regularization_losses=self.losses)
File "/usr/local/lib/python3.7/dist- 
packages/keras/engine/compile_utils.py", line 240, in __call__
total_loss_metric_value, sample_weight=batch_dim)
File "/usr/local/lib/python3.7/dist- 
packages/keras/utils/metrics_utils.py", line 70, in decorated
update_op = update_state_fn(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/keras/metrics.py", line 
178, in update_state_fn
return ag_update_state(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/keras/metrics.py", line 
456, in update_state  **
sample_weight, values)
File "/usr/local/lib/python3.7/dist- 
packages/keras/engine/keras_tensor.py", line 255, in __array__
f'You are passing {self}, an intermediate Keras symbolic 
input/output, '

TypeError: You are passing KerasTensor(type_spec=TensorSpec(shape=(), 
dtype=tf.float32, name=None), name='Placeholder:0', 
description="created by layer 'tf.cast_5'"), an intermediate Keras 
symbolic input/output, to a TF API that does not allow registering 
custom dispatchers, such as `tf.cond`, `tf.function`, gradient tapes, 
or `tf.map_fn`. Keras Functional model construction only supports TF 
API calls that *do* support dispatching, such as `tf.math.add` or 
`tf.reshape`. Other APIs cannot be called directly on symbolic 
Keras inputs/outputs. You can work around this limitation by putting 
the operation in a custom Keras layer `call` and calling that layer 
on this symbolic input/output.
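
The last paragraph of the traceback already points at a workaround: move the offending computation into a custom Keras layer's call(). A minimal sketch of that idea (the layer name and wiring are assumptions, reusing the extra y_true Input from the sketch above) could look like this:

from tensorflow.keras.layers import Layer

class GanLossLayer(Layer):
    # Computes gan_loss on symbolic inputs inside call() and registers it
    # via self.add_loss(), as the error message recommends.
    def call(self, inputs):
        y_true, y_pred, a, ap, b, bp = inputs
        self.add_loss(gan_loss(y_true, y_pred, a, ap, b, bp))
        return y_pred   # pass the discriminator output through unchanged

output = GanLossLayer()([y_true, y_pred, a, ap, b, bp])
gan = Model([a, b, y_true], output)
gan.compile(optimizer=opt, loss=None)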

Thanks for bearing this with me.


Answer from 花桑, 2025-01-25 12:30:28:


There have been some complex changes "under the hood" for TensorFlow 2.x/Keras 2.x. My advice is to add this as the first line:

!pip uninstall -y keras

Then, change all imports of keras to tensorflow.keras. By mixing the 'keras' and 'tensorflow' packages, you could be mixing up types and functions invisibly.
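
As a concrete sketch of that advice (the exact original import lines are assumptions; only the import paths change, the rest of the code stays the same):

# Before: standalone Keras package (remove these imports)
# from keras import backend as K
# from keras import losses
# from keras.layers import Input, concatenate
# from keras.models import Model

# After: the Keras bundled with TensorFlow 2.x
from tensorflow.keras import backend as K
from tensorflow.keras import losses
from tensorflow.keras.layers import Input, concatenate
from tensorflow.keras.models import Model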
