Difficulty getting an LSTM to train
I want to train an LSTM model on tabular time-series data. The shapes of my inputs and labels are
((7342689, 50, 5), (7342689,))
I am having a hard time getting the training loss down. Initially I tried the default learning rate, but it didn't help. My class labels are severely skewed, so I added focal loss and class weights to handle the imbalance. I also tried adding one more layer with 50 neurons, but then the loss started to increase instead of decrease. I'd appreciate any suggestions. Thanks!
Here is my current model architecture:
import numpy as np
import tensorflow_addons as tfa
from sklearn.utils import class_weight
from tensorflow import keras
from tensorflow.keras.layers import LSTM, Dropout, Dense
from tensorflow.keras.optimizers import Adam

adam = Adam(learning_rate=0.0001)
model = keras.Sequential()
model.add(LSTM(100, input_shape=(50, 5)))
model.add(Dropout(0.5))
model.add(Dense(1, activation="sigmoid"))
model.compile(loss=tfa.losses.SigmoidFocalCrossEntropy(),
              optimizer=adam,
              metrics=[keras.metrics.binary_accuracy])
model.summary()

# Balanced class weights to counter the skewed labels
class_weights = dict(zip(np.unique(y_train),
                         class_weight.compute_class_weight('balanced', classes=np.unique(y_train), y=y_train)))
history = model.fit(X_train, y_train, batch_size=64, epochs=50, class_weight=class_weights)
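For reference, the two-layer variant that made the loss increase looked roughly like this (a sketch, assuming the extra 50-neuron layer was a second LSTM; the key detail is that the first LSTM needs return_sequences=True to feed a second recurrent layer):

# Sketch of the deeper variant: stacking a second recurrent layer.
# The first LSTM must return the full sequence (return_sequences=True)
# so the second LSTM receives a (50, 100) sequence rather than a single vector.
model = keras.Sequential()
model.add(LSTM(100, input_shape=(50, 5), return_sequences=True))
model.add(LSTM(50))  # the extra 50-neuron layer
model.add(Dropout(0.5))
model.add(Dense(1, activation="sigmoid"))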

The model's loss first decreases and then increases, which may mean the optimization is getting stuck in a local optimum. You could try reducing the learning rate further and increasing the number of epochs.
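One way to implement this, as a minimal sketch: let a ReduceLROnPlateau callback lower the learning rate automatically when the loss stops improving, and raise the epoch budget while guarding against the rebound with EarlyStopping. The sketch monitors the training loss, since your fit() call has no validation data; with a validation_split you would monitor val_loss instead.

from tensorflow.keras.callbacks import ReduceLROnPlateau, EarlyStopping

# Halve the learning rate whenever the loss plateaus for 3 epochs,
# and stop (restoring the best weights) if it keeps failing to improve for 10.
callbacks = [
    ReduceLROnPlateau(monitor="loss", factor=0.5, patience=3, min_lr=1e-6),
    EarlyStopping(monitor="loss", patience=10, restore_best_weights=True),
]

history = model.fit(X_train, y_train,
                    batch_size=64, epochs=200,  # larger epoch budget
                    class_weight=class_weights,
                    callbacks=callbacks)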