How to know whether backpropagation training will be successful?
I have an AI project, which uses a Backpropagation neural network.
It has been training for about 1 hour and it has learned 60-70 of the 100 inputs; that is, the network currently gives the correct output for 60-70 inputs (the count keeps fluctuating between 60 and 70).
So far, more than 10,000 epochs have been completed, and each epoch takes almost 0.5 seconds.
How can I tell whether the neural network will eventually train successfully if I leave it running for a long time, or whether it cannot get any better?
1 Answer
Check out my answer to this question: what is the difference between train, validation and test sets in neural networks?
You should use 3 sets of data: training, validation, and testing.
The validation data set tells you when you should stop (as I explained in the other answer).
A good method for validation is to use 10-fold (k-fold) cross-validation. Additionally, there are specific "strategies" for splitting your data set into training, validation and testing. It's somewhat of a science in itself, so you should read up on that too.
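To make the split and the stopping rule concrete, here is a minimal Python sketch. The network object and its train_one_epoch() / accuracy() methods are hypothetical placeholders for whatever your own backpropagation code exposes, not anything taken from your project:

    import random

    def split_data(samples, train_frac=0.6, val_frac=0.2, seed=42):
        """Shuffle the samples and split them into train / validation / test sets."""
        rng = random.Random(seed)
        shuffled = list(samples)
        rng.shuffle(shuffled)
        n_train = int(len(shuffled) * train_frac)
        n_val = int(len(shuffled) * val_frac)
        train = shuffled[:n_train]
        validation = shuffled[n_train:n_train + n_val]
        test = shuffled[n_train + n_val:]
        return train, validation, test

    def train_with_early_stopping(net, train_set, val_set,
                                  max_epochs=10_000, patience=50):
        """Run backpropagation until validation accuracy stops improving."""
        best_val_acc = 0.0
        epochs_since_improvement = 0
        for epoch in range(max_epochs):
            net.train_one_epoch(train_set)    # one backprop pass over the training set
            val_acc = net.accuracy(val_set)   # fraction of validation inputs answered correctly
            if val_acc > best_val_acc:
                best_val_acc = val_acc
                epochs_since_improvement = 0  # still improving, keep training
            else:
                epochs_since_improvement += 1
            if epochs_since_improvement >= patience:
                break                         # validation accuracy has plateaued, stop here
        return best_val_acc

With something like this in place, a validation accuracy that has stopped improving for many epochs (like the 60-70 plateau you describe) is the signal that simply leaving the network to train longer is unlikely to help.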
Update
Regarding your comment on the error, I would point you to some resources that can give you a better understanding of neural networks (it's somewhat math-heavy, but see below for more info).
Section 5.9 of Colin Fahey's article describes it best:
Backward error propagation formula:
The error values at the neural network outputs are computed using the following formula:
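(The article's equation is an image that is not reproduced here; as a rough reconstruction, the standard form for sigmoid output units with a squared-error cost is:)

    \delta_k = o_k \, (1 - o_k) \, (t_k - o_k)

where o_k is the actual output of output neuron k, t_k is its target value, and \delta_k is the resulting error value.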
The error accumulation in a neuron body is adjusted according to the output of the neuron body and the output error (specified by links connected to the neuron body).
Each output error value contributes to the error accumulator in the following manner:
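(Again the original image is omitted; in the usual textbook formulation, each output error is scaled by the weight of the link that carries it and added to the receiving neuron's error accumulator, which is then modulated by that neuron's own output:)

    e_j \leftarrow e_j + w_{jk} \, \delta_k, \qquad \delta_j = o_j \, (1 - o_j) \, e_j

where w_{jk} is the weight of the link from neuron j to output neuron k, e_j is the error accumulator of neuron j, and \delta_j is its resulting error value. The exact notation in the article may differ.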