- May 21, 2001
Suppose I have 10,000 data points and this gives a total error of ±15 (units aren't needed for this question, but if you really want to know, it's microvolts). Now suppose I use only 100 data points instead. What error should I expect?
It has been ~6 years since my statistics class, but I think the error is proportional to n^(-0.5). If that's right, then going from 10,000 points to 100 points should increase the error by a factor of sqrt(10,000/100) = 10, giving ±150.
Can anyone confirm this? Is my memory correct or am I way off?
Yes I can google. I'm busy and don't have time to read through a bunch of websites.
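Assuming the "error" here is the standard error of the mean of independent, identically distributed samples, it does scale as n^(-0.5), and you can check the factor-of-10 claim numerically. The sketch below simulates many datasets of Gaussian noise; the per-sample spread of 1500 µV is a hypothetical value chosen so that the n = 10,000 case lands near ±15, matching the numbers in the question:

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA = 1500.0  # hypothetical per-sample noise (uV), chosen so that
                # the standard error at n = 10,000 comes out near 15

def empirical_error(n, trials=1000):
    """Standard deviation of the sample mean, estimated by
    averaging `trials` independent simulated datasets of size n."""
    means = [rng.normal(0.0, SIGMA, n).mean() for _ in range(trials)]
    return float(np.std(means))

e_large = empirical_error(10_000)  # ~15
e_small = empirical_error(100)     # ~150
print(e_large, e_small, e_small / e_large)  # ratio ~10
```

The ratio comes out close to sqrt(10,000/100) = 10, confirming the n^(-0.5) scaling. Note this holds only for independent samples; correlated noise or systematic error does not shrink this way.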