The question is as follows:
The population consists of 6,000 programmers and is assumed to be normally distributed. A sample of size 100 is drawn from the population. Based on the z-statistic, the 95% confidence interval for the mean annual salary, with a sample mean of 32.5 (in thousands of dollars), ranges from 22 (in thousands) to 43 (in thousands). Calculate the standard error of the mean annual salary:
Options:
A.1.96.
B.3.99.
C.5.36.
Explanation:
C is correct.
At the 95% level of significance, the critical value is z = 1.96.
So the confidence interval is X̄ ± 1.96 × s_X̄, where s_X̄ is the standard error of the sample mean.
From the equation 32.5 + 1.96 s_X̄ = 43 (or equivalently 32.5 − 1.96 s_X̄ = 22), we get s_X̄ = 10.5 / 1.96 ≈ 5.36.
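The arithmetic above can be checked with a short calculation (all values are taken from the question: mean 32.5, interval bounds 22 and 43, critical value 1.96):

```python
# Solve for the standard error from the 95% confidence interval bounds.
mean = 32.5
lower, upper = 22.0, 43.0
z = 1.96  # critical value for a 95% confidence level

# The interval is mean ± z * SE, so SE = (half-width) / z.
se = (upper - mean) / z
print(round(se, 2))  # 5.36
```

Either bound gives the same answer, since the interval is symmetric about the mean: (32.5 − 22) / 1.96 = (43 − 32.5) / 1.96.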
I drew a normal distribution diagram: the mean is 32.5, and 22 and 43 are the points 1.96 standard deviations away from the mean, i.e. 32.5 ± 1.96σ. So once σ is solved for, why can't I then divide it by √n, that is √100, to get the standard error? Why is that approach wrong?
The earlier explanation says:
The interval should be X̄ ± 1.96 × (the standard deviation of X̄). Here "the standard deviation of X̄" (also described as the standard deviation of the sample mean) is already the standard error, so after computing it you do not divide by √n again.
I don't quite understand this sentence: here "the standard deviation of X̄" (the standard deviation of the sample mean) is already the standard error.
So why is my approach wrong?
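To make the quoted explanation concrete, here is a quick numerical sketch (values taken from the question; the sample standard deviation s is implied, not given). The quantity multiplying 1.96 in the interval is already s/√n, so solving the half-width equation yields the standard error directly; dividing once more by √n shrinks it by an extra factor of 10:

```python
import math

n = 100
z = 1.96
half_width = 43.0 - 32.5           # 10.5, from the question's interval

se = half_width / z                # standard error of the mean, ~5.36
s = se * math.sqrt(n)              # implied sample standard deviation, ~53.57
extra = se / math.sqrt(n)          # dividing again by sqrt(n): too small by 10x

print(round(se, 2))     # 5.36
print(round(s, 2))      # 53.57
print(round(extra, 3))  # 0.536
```

In other words, the σ recovered from "32.5 ± 1.96σ" is not the population (or sample) standard deviation; it is the standard deviation of X̄, which already incorporates the division by √n.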