Evaluating the model on the test set after each training epoch

Posted on 2025-01-24 03:25:29


I'm training a TensorFlow model on an image dataset for a classification task. We usually provide the training set and the validation set to the model.fit method, and we can later plot the model's convergence curves for training and validation. I want to do the same with the test set; in other words, I want to get the accuracy and loss of my model on the test set after each epoch (not the validation set: I can't replace the validation set with the test set, because I need graphs of both).

I managed to do that by saving a checkpoint of my model after each epoch using a callback, then loading each checkpoint and computing the accuracy and loss. But I want to know whether there is some easier way of doing that, maybe with another callback or some workaround with the model.fit method.
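
For reference, a minimal sketch of that checkpoint approach (the ckpts/ path, the filename pattern, and the epoch count are placeholders; it assumes a compiled model with a single accuracy metric and datasets named train_ds and test_ds):

import tensorflow as tf

# Save one weights-only checkpoint per epoch during training.
checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    filepath='ckpts/epoch_{epoch:02d}.h5',
    save_weights_only=True)
model.fit(train_ds, epochs=5, callbacks=[checkpoint_cb])

# Afterwards, reload each checkpoint and evaluate it on the test set.
test_metrics = []
for epoch in range(1, 6):
    model.load_weights('ckpts/epoch_{:02d}.h5'.format(epoch))
    test_loss, test_acc = model.evaluate(test_ds, verbose=0)  # [loss, metric]
    test_metrics.append((test_loss, test_acc))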


Comments (2)

逆流 2025-01-31 03:25:29


You could use a custom Callback, pass in your test data, and do whatever you like with it:

import tensorflow as tf
import pathlib
import numpy as np

dataset_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
data_dir = tf.keras.utils.get_file('flower_photos', origin=dataset_url, untar=True)
data_dir = pathlib.Path(data_dir)

batch_size = 5

train_ds = tf.keras.utils.image_dataset_from_directory(
  data_dir,
  seed=123,
  image_size=(64, 64),
  batch_size=batch_size)

# For demonstration only: reuse the first 30 training batches as a
# stand-in test set; in practice this would be a separate dataset.
test_ds = train_ds.take(30)

model = tf.keras.Sequential([
  tf.keras.layers.Rescaling(1./255, input_shape=(64, 64, 3)),
  tf.keras.layers.Conv2D(16, 3, padding='same', activation='relu'),
  tf.keras.layers.MaxPooling2D(),
  tf.keras.layers.Conv2D(32, 3, padding='same', activation='relu'),
  tf.keras.layers.MaxPooling2D(),
  tf.keras.layers.Conv2D(64, 3, padding='same', activation='relu'),
  tf.keras.layers.MaxPooling2D(),
  tf.keras.layers.Dropout(0.2),
  tf.keras.layers.Flatten(),
  tf.keras.layers.Dense(128, activation='relu'),
  tf.keras.layers.Dense(5)
])

class TestCallback(tf.keras.callbacks.Callback):
    def __init__(self, test_dataset):
        super().__init__()
        self.test_dataset = test_dataset
        self.test_acc_metric = tf.keras.metrics.SparseCategoricalAccuracy()
        self.loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True) 

    def on_epoch_end(self, epoch, logs=None):
        losses = []
        for x_batch_test, y_batch_test in self.test_dataset:
          test_logits = self.model(x_batch_test, training=False)
          losses.append(self.loss_fn(y_batch_test, test_logits))
          self.test_acc_metric.update_state(y_batch_test, test_logits)
        test_acc = self.test_acc_metric.result()
        self.test_acc_metric.reset_states()
        # Mean of the per-batch mean losses; this equals the dataset-level
        # mean loss only if every batch has the same size (the last batch
        # may be smaller).
        logs['test_loss'] = tf.reduce_mean(tf.stack(losses))
        logs['test_sparse_categorical_accuracy'] = test_acc

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True) 
model.compile(optimizer='adam',
              loss=loss_fn,
              metrics=tf.keras.metrics.SparseCategoricalAccuracy())
epochs = 5
history = model.fit(train_ds, epochs=epochs, callbacks=[TestCallback(test_ds)])

Found 3670 files belonging to 5 classes.
Epoch 1/5
734/734 [==============================] - 14s 17ms/step - loss: 1.2709 - sparse_categorical_accuracy: 0.4591 - test_loss: 1.0020 - test_sparse_categorical_accuracy: 0.5533
Epoch 2/5
734/734 [==============================] - 13s 18ms/step - loss: 0.9574 - sparse_categorical_accuracy: 0.6275 - test_loss: 0.8348 - test_sparse_categorical_accuracy: 0.6467
Epoch 3/5
734/734 [==============================] - 9s 12ms/step - loss: 0.8136 - sparse_categorical_accuracy: 0.6733 - test_loss: 0.8379 - test_sparse_categorical_accuracy: 0.6467
Epoch 4/5
734/734 [==============================] - 8s 11ms/step - loss: 0.6970 - sparse_categorical_accuracy: 0.7357 - test_loss: 0.5713 - test_sparse_categorical_accuracy: 0.7533
Epoch 5/5
734/734 [==============================] - 8s 11ms/step - loss: 0.5793 - sparse_categorical_accuracy: 0.7834 - test_loss: 0.5656 - test_sparse_categorical_accuracy: 0.7733

You can also just use model.evaluate in the callback. See also this post.
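
A minimal sketch of that model.evaluate variant (the class name EvaluateTestCallback is just for illustration; it assumes the model is compiled with a single accuracy metric, as above):

class EvaluateTestCallback(tf.keras.callbacks.Callback):
    """Evaluates the model on the test set at the end of every epoch."""
    def __init__(self, test_dataset):
        super().__init__()
        self.test_dataset = test_dataset

    def on_epoch_end(self, epoch, logs=None):
        # evaluate() returns [loss, <compiled metrics...>].
        test_loss, test_acc = self.model.evaluate(self.test_dataset, verbose=0)
        logs = logs or {}
        logs['test_loss'] = test_loss
        logs['test_sparse_categorical_accuracy'] = test_acc

The test_* entries written into logs should then also show up in history.history, since the built-in History callback records logs after user callbacks have run.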

回忆凄美了谁 2025-01-31 03:25:29


You can evaluate the model after every epoch by passing the xTest and yTest data as validation_data=(xTest, yTest):

model.fit(xTrain, yTrain, validation_data=(xTest, yTest), epochs=20, batch_size=5)

In the code above, epochs=20 is just an example; you can change it to the desired number of epochs. The batch_size=5 is an example as well.
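
Since the original goal was convergence graphs, here is a short plotting sketch (matplotlib is an assumption; the test_* keys exist only if a callback such as TestCallback from the first answer wrote them into logs, and val_loss only if validation_data was passed to model.fit):

import matplotlib.pyplot as plt

# history is the object returned by model.fit.
plt.plot(history.history['loss'], label='train')
plt.plot(history.history['val_loss'], label='validation')
plt.plot(history.history['test_loss'], label='test')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()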
