DemocratSinceBirth
(1000+ posts)
Send PM |
Profile |
Ignore
|
Mon Nov-01-04 11:54 AM
Original message |
Need Help From A DUer With A Statistics Background... |
|
I understand the Margin Of Error but I have a harder time grasping the confidence level measure in a poll..
It has been explained to me that a confidence level of .95 means there is a one in twenty chance that you are not accurately measuring what you intended to measure...
So far so good...
This is where it gets confusing: does the confidence level mean there is a one in twenty chance you can get truly bizarre results, such as 50% of African Americans voting for Bush, or does it mean there is a one in twenty chance your sample is just wrong?
|
dsc
(1000+ posts)
Send PM |
Profile |
Ignore
|
Mon Nov-01-04 11:58 AM
Response to Original message |
|
The confidence level measures the chance that your poll result will fall outside the margin of error because the sample happens to be unrepresentative. It doesn't address problems with the questions or other methodological issues (i.e., not choosing enough of particular subgroups). So it is closer to your second statement than to your first.
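That can be checked by simulation. A minimal sketch (the true support level and sample size below are invented for illustration): draw many random polls from a population with a known value and count how often a single poll lands within the stated margin of error. At 95% confidence, that fraction should come out near 0.95.

```python
import random

# Hypothetical "true" population value and poll size (assumptions).
TRUE_SUPPORT = 0.52
N = 1000  # respondents per poll
# Standard 95% margin of error for a proportion: 1.96 * sqrt(p(1-p)/n)
MOE = 1.96 * (TRUE_SUPPORT * (1 - TRUE_SUPPORT) / N) ** 0.5

random.seed(42)
trials = 10_000
inside = 0
for _ in range(trials):
    # Simulate one poll: each respondent supports the candidate
    # with probability TRUE_SUPPORT.
    poll = sum(random.random() < TRUE_SUPPORT for _ in range(N)) / N
    if abs(poll - TRUE_SUPPORT) <= MOE:
        inside += 1

frac = inside / trials
print(f"MOE = {MOE:.3f}, fraction of polls within MOE = {frac:.3f}")
```

Roughly 95% of the simulated polls land inside the margin of error; the other ~5% are the "one in twenty" bad samples.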
|
JuniorPlankton
(1000+ posts)
Send PM |
Profile |
Ignore
|
Mon Nov-01-04 11:58 AM
Response to Original message |
|
To illustrate: suppose you are polling AAs. We know that about 10% of them vote for * (we can't comprehend it, but then it's hard to understand why ANYONE would vote for the bastard).
There is some probability that ALL of the people you polled are from that 10%. Clearly, the more people you poll, the lower this probability becomes. But it doesn't mean that all AAs might vote for the chimp.
Is that what you asked about? :)
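A quick sketch of how fast that probability shrinks (using the 10% figure assumed above): the chance that every person in a random sample of n happens to come from that 10% is 0.1 to the power n, which collapses almost immediately as n grows.

```python
# Probability that ALL n randomly sampled people come from the
# 10% subgroup, for a few sample sizes.
p = 0.10
probs = {n: p ** n for n in (1, 5, 10, 50)}
for n, prob in probs.items():
    print(f"n={n:3d}: P(all sampled are from the 10%) = {prob:.3g}")
```

By n = 10 the chance is already around one in ten billion, which is why even modest samples rule out "bizarre" results like the one in the original question.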
|
CityHall
(332 posts)
Send PM |
Profile |
Ignore
|
Mon Nov-01-04 12:31 PM
Response to Original message |
3. Attempted simple explanation |
|
If you pick a sample that is typical of the population, you can expect that if you took the poll 100 times (of 100 voters each time), on average the result would be extremely close to the result you'd get if you polled every person in the country. But if you only take the poll once, you'll see some difference between the poll result and the national "correct" result, and it's equally likely to be off in either direction.
Based purely on the sample size, the margin of error measures how far off you are likely to be just because you randomly happened to pick too many Bush supporters, for example. If you graph the probability of being off by a certain amount (because your sample didn't represent the population), you get a bell-curve shaped graph (at least for samples larger than 20 or so).
The +-3% numbers are the range where it's 95% likely that the sample was within 3% of the result you'd get if you surveyed the entire population. This is computed purely based on the chance of getting a questionable sample.
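The +-3% figure can be checked with the standard formula. A back-of-envelope sketch (the sample size n = 1067 is an assumption, a common size for national polls): at 95% confidence, the margin of error for a proportion near 50% is about 1.96 * sqrt(p(1-p)/n).

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a sample proportion (z=1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1067)
print(f"n=1067 -> margin of error = {moe:.4f}")
```

For n = 1067 this comes out almost exactly 0.03, i.e. the familiar +-3 points; quadrupling the sample size only cuts the margin in half, which is why polls rarely go much bigger.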
You can still be off for reasons other than the chance of getting a bad sample - you can get a greater number of deliberate refusals from one party than the other, the questions could be biased, you could be ignoring a group like cell-phone users, and so forth.
The polls try to make up for these sources of bias by sampling Republicans, Democrats, and independents separately and weighting the results by how many of each are registered and how likely they are to vote. If Republican turnout isn't as high as it normally is, that would be another source of error, one that people keep mentioning for Gallup polls that oversample Republicans.
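The weighting step can be sketched in a few lines. This is a toy example and every number in it is invented: each group's raw poll result is scaled by its assumed share of the electorate, so the blended total matches the expected turnout mix rather than whoever happened to answer the phone.

```python
# Hypothetical raw poll results (share backing the candidate) and an
# assumed turnout mix (shares must sum to 1). All numbers invented.
raw_support = {"Dem": 0.10, "Rep": 0.90, "Ind": 0.45}
turnout_mix = {"Dem": 0.38, "Rep": 0.36, "Ind": 0.26}

# Weighted estimate: each group's result times its electorate share.
weighted = sum(raw_support[g] * turnout_mix[g] for g in raw_support)
print(f"weighted support = {weighted:.3f}")
```

If the assumed turnout mix is wrong (say, Republicans don't show up at their usual rate), the weighted number is wrong too, no matter how good the raw sample was.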
(It's also possible that some polls are reporting margins of error that are themselves guesses and not based on the sample size issue discussed above, but this wouldn't be standard statistical practice.)
|
Tweed
(1000+ posts)
Send PM |
Profile |
Ignore
|
Mon Nov-01-04 12:33 PM
Response to Original message |
4. Where is Truth is All? |
Glenda
(1000+ posts)
Send PM |
Profile |
Ignore
|
Mon Nov-01-04 12:49 PM
Response to Original message |
6. I question the polls for a number of reasons |
|
A 1 in 20 chance that the confidence interval does not contain the true population mean.
However, surveys rest on a lot of assumptions that affect the sampling error, and I think those assumptions are probably being violated lately:
- assumptions about who counts as a Likely Voter
- the bias of not including cell phone users (do the numbers adjust for that?)
- telemarketers ruined it for pollsters; people are unlikely to answer any questions anymore if a stranger calls, and what about the elderly, who are told not to give any info out over the phone?
- assumptions about the make-up of Dems to Repukes in the population
- other biases about who will and won't answer the phone; for example, are Dems or Reps more likely to have Caller ID and not pick up?
When John Zogby (on TDS) said it takes 10,000 phone calls to get 1000 respondents (my numbers may not be exactly what he said but close), that's not a high response rate. The lower the response rate, the more one has to "KNOW YOUR NON-RESPONDENTS."
And do pollsters really know the 90% of people they call who are non-respondents??
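The nonresponse worry above can be made concrete with a toy calculation (every rate here is invented): with a 10% response rate, the poll only sees responders, and if nonresponders lean differently, the poll reading drifts away from the true population value by an amount the margin of error never captures.

```python
# Hypothetical rates (assumptions for illustration only).
response_rate = 0.10
support_responders = 0.50     # what the poll actually measures
support_nonresponders = 0.45  # invisible to the poll

# True population value blends both groups by their share.
true_support = (response_rate * support_responders
                + (1 - response_rate) * support_nonresponders)
bias = support_responders - true_support
print(f"poll says {support_responders:.3f}, "
      f"truth {true_support:.3f}, bias {bias:+.3f}")
```

Even a modest 5-point gap between responders and nonresponders produces a bias bigger than the sampling margin of error of many polls, which is why pollsters need to "know their non-respondents."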
G.
|