Is there a way to reset the learning rate on each fold when using Keras' ReduceLROnPlateau callback?


As the title is self-descriptive, I'm looking for a way to reset the learning rate (lr) on each fold. The ReduceLROnPlateau callback of Keras manages the lr.


2 Answers

忱杏 2025-02-13 02:51:59


With no reproducible example I can only make a suggestion. If you take a look at the source code of ReduceLROnPlateau, you can get some inspiration and create a custom callback that resets the learning rate at the beginning of training:

import tensorflow as tf

class ResetLR(tf.keras.callbacks.Callback):
  def on_train_begin(self, logs=None):
    default_lr = 0.1  # the initial learning rate to restore before each fold
    previous_lr = float(self.model.optimizer.lr.read_value())
    if previous_lr != default_lr:
      print("Resetting learning rate from {} to {}".format(previous_lr, default_lr))
      self.model.optimizer.lr.assign(default_lr)

So with this callback you can train using a for loop over the folds:

custom_callback = ResetLR()
for fold in folds:
  model.fit(..., callbacks=[custom_callback])
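
To show how the pieces fit together, here is a minimal sketch of a full cross-validation loop, assuming hypothetical x/y arrays, a toy model, and scikit-learn's KFold for the splits; none of these specifics come from the original answer:

import numpy as np
import tensorflow as tf
from sklearn.model_selection import KFold

# Hypothetical data and model, for illustration only.
x = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=(1000,))
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
# Compile with the same initial lr that ResetLR restores (0.1 above).
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.1),
              loss="binary_crossentropy")

reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss",
                                                 factor=0.5, patience=3)
custom_callback = ResetLR()  # defined above; restores the lr before each fold

for train_idx, val_idx in KFold(n_splits=5, shuffle=True).split(x):
    model.fit(x[train_idx], y[train_idx],
              validation_data=(x[val_idx], y[val_idx]),
              epochs=20,
              callbacks=[custom_callback, reduce_lr])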

If this does not work (depending on your TensorFlow version), you can try assigning the default learning rate via tf.keras.backend, like so:

class ResetLR(tf.keras.callbacks.Callback):
  def on_train_begin(self, logs=None):
    default_lr = 0.1  # the learning rate to restore at the start of training
    previous_lr = float(tf.keras.backend.get_value(self.model.optimizer.lr))
    if previous_lr != default_lr:
      print("Resetting learning rate from {} to {}".format(previous_lr, default_lr))
      tf.keras.backend.set_value(self.model.optimizer.lr, default_lr)

I would also suggest taking a look at this post for more references.

戈亓 2025-02-13 02:51:59


Below is a custom callback that will do the job. At the start of training, it prompts the user to enter the value of the initial learning rate.

import tensorflow as tf
from tensorflow import keras

class INIT_LR(keras.callbacks.Callback):
    def __init__(self, model):  # initialization of the callback
        super(INIT_LR, self).__init__()
        self.model = model
    def on_train_begin(self, logs=None):  # runs at the beginning of training
        print('Enter initial learning rate below')
        lr = input('')
        tf.keras.backend.set_value(self.model.optimizer.lr, float(lr))  # set the learning rate in the optimizer
        lr = float(tf.keras.backend.get_value(self.model.optimizer.lr))  # read it back to ensure it is set
        print('Optimizer learning rate set to ', lr)

In model.fit, set the parameter

callbacks = [INIT_LR(model), rlronp]

Note: model is the name of your compiled model, and rlronp is the name of your ReduceLROnPlateau callback. When you run model.fit, you will be prompted with:

Enter initial learning rate below # printed by the callback
.001  # user entered initial learning rate
Optimizer learning rate set to  0.0010000000474974513 # printed by the callback
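
For completeness, here is a minimal sketch of the wiring end to end; the toy data, model, and fold slices are illustrative assumptions, not part of the original answer:

import numpy as np
import tensorflow as tf
from tensorflow import keras

# Hypothetical data and model, for illustration only.
x = np.random.rand(200, 4)
y = np.random.rand(200)
model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=keras.optimizers.Adam(), loss='mse')

# rlronp lowers the lr on a validation-loss plateau; INIT_LR (defined above)
# prompts for a fresh starting lr at the beginning of each fit() call.
rlronp = keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=2)
callbacks = [INIT_LR(model), rlronp]

# Two hypothetical folds expressed as train/validation slices.
folds = [(slice(0, 100), slice(100, 200)), (slice(100, 200), slice(0, 100))]
for train_s, val_s in folds:
    model.fit(x[train_s], y[train_s], validation_data=(x[val_s], y[val_s]),
              epochs=10, callbacks=callbacks)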