More epochs make the loss rise
[Plot of loss and val_loss over 200 epochs: https://i.sstatic.net/u3828.png]
I have a time-series dataset that I trained with an LSTM. I trained for 200 epochs, and the resulting loss and val_loss values are pretty good (IMO).
Then I thought the result could be even better if I added more epochs, so I retrained with 400 epochs. But the loss and val_loss kept rising:

[Plot of loss and val_loss over 400 epochs]
Somehow the result is different, and even worse. Is it better to stick with the 200-epoch model, or is there really a condition under which more epochs can worsen the model?
This is probably because your learning rate (lr) is too large; you could try reducing it. From the graph, the training loss is increasing as well, so I don't think this is an overfitting problem.
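A quick way to see why a learning rate that is too large can make the loss climb on its own is a toy gradient-descent run on a quadratic loss. This is a minimal sketch, not the poster's Keras/LSTM setup; `run_gd` and its parameters are illustrative:

```python
# Toy gradient descent on f(w) = w**2 (gradient 2*w), purely illustrative.
# Past the stability limit, each update overshoots the minimum and the
# loss rises instead of falling -- the same failure mode described above.

def run_gd(lr, steps=20, w0=1.0):
    """Run `steps` gradient-descent updates and return the loss at each step."""
    w = w0
    losses = []
    for _ in range(steps):
        losses.append(w ** 2)
        w -= lr * 2 * w  # gradient step on f(w) = w**2
    return losses

small = run_gd(lr=0.1)  # stable: loss shrinks every step
large = run_gd(lr=1.1)  # unstable: loss grows every step

print(small[-1] < small[0])  # True
print(large[-1] > large[0])  # True
```

In a Keras setup, one option along these lines is to pass a smaller `learning_rate` to the optimizer, or to use the `ReduceLROnPlateau` callback so the learning rate is lowered automatically when `val_loss` stops improving.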