Python dummy variable regression: the intercept problem in dummy variable regression

When I run a dummy variable regression and include an intercept, the goodness-of-fit R-squared is fairly low, a bit over 0.7, while the intercept and the other regression coefficients are all significant; but if I drop the intercept, the R-squared reaches 0.998 and the coefficients are all still significant. Could anyone advise whether I should keep the intercept or not?

You need to post exactly how you ran it; based on the limited information above, it is impossible to say.

Here is a simulation for your problem. The coefficient on x and the R-square are the same in either case; the only difference is the interpretation of the dummies (c = 0, 1, 2) plus the intercept. The two parameterizations describe the same model.

/* Simulate 100 observations with a three-level category c and a continuous x */
data t1;
    do i = 1 to 100;
        c = mod(i, 3);            /* category takes the values 0, 1, 2 */
        x = rannor(123);
        error = rannor(123);
        y = c + 1*x + error;      /* true model: level shift by c, slope 1 on x */
        output;
    end;
run;

/* Fit y on c (as a class effect) and x, printing the parameter estimates */
proc glm data=t1;
    class c;
    model y = c x / solution;
run;
quit;
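
To see the "dummies plus intercept" parameterization spelled out, here is a minimal sketch of the same fit with hand-coded indicators (the data set t2, the names d1 and d2, and the use of PROC REG are my additions, not part of the original reply). The coefficient on x and the R-square should match the CLASS-based PROC GLM fit above; only the labelling of the category effects changes.

/* Build 0/1 indicators for two of the three levels of c; level 0 is the baseline */
data t2;
    set t1;
    d1 = (c = 1);
    d2 = (c = 2);
run;

/* Ordinary least squares with an intercept, the two dummies, and x */
proc reg data=t2;
    model y = d1 d2 x;
run;
quit;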

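The reply breaks off at the start of a second PROC GLM step. A plausible sketch of what it may have been heading toward, assuming the intent was to refit the same model without an intercept (the NOINT option here is my guess), is:

/* Assumed completion: drop the intercept so the three levels of c absorb it.   */
/* The fitted model is identical; the class coefficients become per-level       */
/* intercepts instead of differences from a baseline.                           */
proc glm data=t1;
    class c;
    model y = c x / solution noint;
run;
quit;

One caveat worth keeping in mind: when the intercept is omitted, SAS (like most packages) computes the reported R-square from the uncorrected total sum of squares, so a jump from roughly 0.7 to 0.998 does not mean the no-intercept model fits better; the two R-square values are simply not on the same scale.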

Original: https://blog.csdn.net/weixin_36042403/article/details/113988917
Author: 苏安桥
Title: python虚拟变量回归_虚拟变量回归中的截距项问题
