I'm still pretty new to R and AI/ML techniques. I would like to use a neural net for prediction, and since I'm new I would just like to see if this is how it should be done.

As a test case, I'm predicting values of sin() based on the 2 previous values. For training I create a data frame with y = sin(x), x1 = sin(x - 1), x2 = sin(x - 2), then use the formula y ~ x1 + x2.

It seems to work, but I am just wondering if this is the right way to do it, or if there is a more idiomatic way.

This is the code:
require(quantmod) #for Lag()
require(nnet)
x <- seq(0, 20, 0.1)
y <- sin(x)
te <- data.frame(y, Lag(y), Lag(y,2))
names(te) <- c("y", "x1", "x2")
p <- nnet(y ~ x1 + x2, data=te, linout=TRUE, size=10)
ps <- predict(p, x1=y)
plot(y, type="l")
lines(ps, col=2)
Thanks
[edit]
Is this better for the predict call?
t2 <- data.frame(sin(x), Lag(sin(x)))
names(t2) <- c("x1", "x2")
vv <- predict(p, t2)
plot(vv)
I guess I'd like to see that the nnet is actually working by looking at its predictions (which should approximate a sine wave).
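For what it's worth, one common pattern with nnet is to give predict() a newdata data frame whose column names match the variables in the training formula. A minimal sketch along those lines, building the lags with plain vector indexing instead of quantmod's Lag() so there are no leading NA rows (the set.seed() call and grid values here are illustrative, not from the original post):

```r
library(nnet)

# Same lagged-sine setup as above, but with explicit index shifts
x  <- seq(0, 20, 0.1)
y  <- sin(x)
n  <- length(y)
te <- data.frame(y  = y[3:n],          # current value
                 x1 = y[2:(n - 1)],    # one step back
                 x2 = y[1:(n - 2)])    # two steps back

set.seed(1)  # nnet starts from random weights
p <- nnet(y ~ x1 + x2, data = te, size = 10, linout = TRUE, trace = FALSE)

# predict() looks up columns x1 and x2 in newdata
ps <- predict(p, newdata = te)

plot(te$y, type = "l")
lines(ps, col = 2)
```

Passing x1=y positionally (as in the original predict call) does not do what the name suggests; predict.nnet matches columns from the newdata data frame, so building that frame explicitly makes the intent clear.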
I really like the caret package, as it provides a nice, unified interface to a variety of models, such as nnet. Furthermore, it automatically tunes hyperparameters (such as size and decay) using cross-validation or bootstrap re-sampling. The downside is that all this re-sampling takes some time.

It also predicts on the proper scale, so you can directly compare results. If you are interested in neural networks, you should also take a look at the neuralnet and RSNNS packages. caret can currently tune nnet and neuralnet models, but does not yet have an interface for RSNNS.

/edit: caret now has an interface for RSNNS. It turns out that if you email the package maintainer and ask that a model be added to caret, he'll usually do it!

/edit: caret now also supports Bayesian regularization for feed-forward neural networks from the brnn package. Furthermore, caret now makes it much easier to specify your own custom models, to interface with any neural network package you like!
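A rough sketch of the caret workflow described above, applied to the lagged-sine data from the question. The tuneGrid values and the cross-validation setup are illustrative choices, not anything the answer prescribes; linout and trace are simply passed through to nnet():

```r
library(caret)

# Lagged-sine training data, as in the question
x  <- seq(0, 20, 0.1)
y  <- sin(x)
n  <- length(y)
te <- data.frame(y = y[3:n], x1 = y[2:(n - 1)], x2 = y[1:(n - 2)])

set.seed(1)
fit <- train(y ~ x1 + x2, data = te,
             method    = "nnet",
             linout    = TRUE,                       # passed on to nnet()
             trace     = FALSE,
             tuneGrid  = expand.grid(size  = c(5, 10),
                                     decay = c(0, 0.01, 0.1)),
             trControl = trainControl(method = "cv", number = 5))

fit$bestTune                      # size/decay chosen by cross-validation
ps <- predict(fit, newdata = te)  # predictions on the original scale
```

train() fits one model per row of the tuning grid, picks the combination with the best resampled RMSE, and refits on the full data, so predict(fit, ...) uses the tuned model directly.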