Best way to plot interaction effects from a linear model



In an effort to help populate the R tag here, I am posting a few questions I have often received from students. I have developed my own answers to these over the years, but perhaps there are better ways floating around that I don't know about.

The question: I just ran a regression with a continuous y and x and a factor f (where levels(f) returns c("level1","level2")):

 thelm <- lm(y~x*f,data=thedata)

Now I would like to plot the predicted values of y by x broken down by groups defined by f. All of the plots I get are ugly and show too many lines.

My answer: Try the predict() function.

 ## restrict prediction to the valid data used in the model
 ## by taking x and f from thelm$model rather than thedata

 newdat <- expand.grid(x=range(thelm$model$x),
                       f=levels(thelm$model$f))
 newdat$yhat <- predict(thelm, newdata=newdat)

 plot(yhat~x, data=newdat, subset=f=="level1",
      type="l", ylim=range(newdat$yhat))
 lines(yhat~x, data=newdat, subset=f=="level2")
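
In case expand.grid() is unfamiliar: it just builds a data frame holding every combination of its arguments, so the prediction grid above has only four rows (the two extreme x values crossed with the two levels of f). A quick illustration, with 0 and 1 standing in for whatever range(thelm$model$x) happens to be:

 expand.grid(x=c(0,1), f=c("level1","level2"))
 ##   x      f
 ## 1 0 level1
 ## 2 1 level1
 ## 3 0 level2
 ## 4 1 level2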

Are there other ideas out there that are (1) easier to understand for a newcomer and/or (2) better from some other perspective?


Comments (4)

妄想挽回 2024-08-11 08:40:50


The effects package has good plotting methods for visualizing the predicted values of regressions.

thedata <- data.frame(x=rnorm(20), f=factor(rep(c("level1","level2"),10)))
thedata$y <- rnorm(20,,3) + thedata$x*(as.numeric(thedata$f)-1)

library(effects)
model.lm <- lm(formula=y ~ x*f,data=thedata)
plot(effect(term="x:f",mod=model.lm,default.levels=20),multiline=TRUE)
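
If you would rather not name the "x:f" term yourself, allEffects() should give the same picture for every high-order term in the model at once; a minimal sketch with the same model.lm:

library(effects)
## plot all high-order terms of the fitted model in one call;
## for y ~ x*f that is just the x:f interaction
plot(allEffects(model.lm))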
帅气称霸 2024-08-11 08:40:50


Huh - still trying to wrap my brain around expand.grid(). Just for comparison's sake, this is how I'd do it (using ggplot2):

thedata <- data.frame(yhat = predict(thelm), x = thelm$model$x, f = thelm$model$f)

library(ggplot2)
ggplot(thedata, aes(x = x, y = yhat, group = f, color = f)) + geom_line()

The ggplot() logic is pretty intuitive, I think - group and color the lines by f. With increasing numbers of groups, not having to specify a layer for each is increasingly helpful.

请帮我爱他 2024-08-11 08:40:50


I am no expert in R, but I use:

library(lattice)
xyplot(y ~ x, groups = f, data = Dat, type = c('p','r'),
       grid = TRUE, lwd = 3, auto.key = TRUE)

This is also an option:

interaction.plot(f, x, y, type="b", col=c(1:3),
                 leg.bty="o", leg.bg="beige", lwd=1, pch=c(18,24),
                 xlab="", ylab="", trace.label="",
                 main="Interaction Plot")
指尖上得阳光 2024-08-11 08:40:50


Here is a small change to the excellent suggestion by Matt, and a solution similar to Helgi's but with ggplot. The only difference from above is that I have used geom_smooth(method='lm'), which plots the regression lines directly.

set.seed(1)
y = runif(100,1,10)
x = runif(100,1,10)
f = rep(c('level 1','level 2'),50)
thedata = data.frame(x,y,f)
library(ggplot2)
ggplot(thedata,aes(x=x,y=y,color=f))+geom_smooth(method='lm',se=F)
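
Because color=f is mapped inside aes(), geom_smooth(method='lm') fits a separate regression line for each level of f, which is exactly the interaction being visualised. Adding the raw points next to the fitted lines is a one-line extension of the same plot:

ggplot(thedata, aes(x=x, y=y, color=f)) +
  geom_point() +                        # raw observations
  geom_smooth(method='lm', se=FALSE)    # one fitted line per level of f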