To generate reproducible results, you need to manually set the random number generator to the same seed/state at the beginning of the code. This can be done in a number of ways (depending on what version of MATLAB you have):
The old style:
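Something along these lines (a sketch of the legacy seeding syntax; the seed value 1234 is arbitrary):

    rand('twister', 1234)    % seed the uniform generator (Mersenne Twister)
    randn('state', 1234)     % seed the normal generator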
The updated style:
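Something like the following (a sketch; 'mt19937ar' and the seed are placeholders, and releases before R2011a name the method setDefaultStream rather than setGlobalStream):

    RandStream.setGlobalStream(RandStream('mt19937ar', 'Seed', 1234));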
A new function was introduced in R2011a that simplifies the last call:
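For instance (1234 is an arbitrary seed):

    rng(1234, 'twister')     % seed the global random number generator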
The latter syntax is the recommended approach.
As a side note, and not a direct answer, there's something called Nguyen-Widrow initialization, and it's already implemented in MATLAB's Neural Network Toolbox.
In my experience it works pretty well and helps the neural net converge faster. I've also found that it makes the results more consistent. I recommend using it together with the fixed random seed, as per Amro's post.
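A minimal sketch of selecting it explicitly (feedforwardnet and the 10-neuron hidden layer are just placeholders; I believe initnw is already the default for feed-forward layers):

    net = feedforwardnet(10);            % example network with 10 hidden neurons
    net.initFcn = 'initlay';             % initialize layer by layer
    net.layers{1}.initFcn = 'initnw';    % Nguyen-Widrow initialization for the hidden layer
    net = init(net);                     % apply the (seeded) initialization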
MATLAB Neural Network Toolbox results differ between runs for two reasons: 1) random data division and 2) random weight initialization.
For the data-division problem, use the function "divideblock" or "divideint" instead of "dividerand", like this:
net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio = 0.15;
net.divideParam.testRatio = 0.15;
For the random weight initialization problem, it seems (I'm not sure) that all of the MATLAB initialization functions ("initzero", "initlay", "initwb", "initnw") are more or less random. So you should force these functions to produce the same results on each call:
RandStream.setGlobalStream(RandStream('mrg32k3a', 'Seed', 1234));
And then use one of them, for example:
net.initFcn = 'initlay';
net.layers{i}.initFcn = 'initnw';
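Putting both fixes together, a minimal sketch (feedforwardnet, the 10-neuron hidden layer, and the x/t training data are placeholders, not from the original post):

    RandStream.setGlobalStream(RandStream('mrg32k3a', 'Seed', 1234));  % fix the global RNG
    net = feedforwardnet(10);                  % example two-layer network
    net.divideFcn = 'divideblock';             % deterministic data division
    net.divideParam.trainRatio = 0.7;
    net.divideParam.valRatio = 0.15;
    net.divideParam.testRatio = 0.15;
    net.initFcn = 'initlay';
    for i = 1:net.numLayers
        net.layers{i}.initFcn = 'initnw';      % same init function for every layer
    end
    net = init(net);                           % weights now depend only on the seed
    % net = train(net, x, t);                  % x, t: your inputs and targets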