About 70 percent of people contacted for a poll do not complete it: they hang up, or simply walk away if approached on the street, and thus are not included in the results.
Pollsters and academics like to produce studies (arguments, really) claiming that the results would be the same even if everyone sampled had participated. These are little more than weak attempts to legitimize their profession.
There is no way to say how that 70 percent feels, and the group who refuse to participate will vary from poll to poll. People inclined to hang up will tend to have a different personality, and hence a different outlook, from those who stay on the line. There is no reliable way to weight the 30 percent who do respond so that the results match what the full 100 percent actually think.
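To see why weighting cannot rescue the results, here is a minimal sketch of post-stratification weighting with entirely hypothetical numbers (the group names, shares, and answers below are illustrative assumptions, not figures from any poll). Weights can only correct for traits the pollster can observe, such as demographics; the personality difference between responders and refusers is invisible to them.

```python
# Sketch of post-stratification weighting, using hypothetical numbers.
# Pollsters re-weight respondents so observed demographics match the
# population, but weights can only adjust for traits they can measure.

population_share = {"group_a": 0.5, "group_b": 0.5}     # assumed census shares
sample_share = {"group_a": 0.7, "group_b": 0.3}         # assumed respondent mix
support_in_sample = {"group_a": 0.40, "group_b": 0.60}  # hypothetical answers

# Weight each group by how under- or over-represented it is in the sample.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Weighted estimate: each group now counts at its population share.
weighted_support = sum(
    sample_share[g] * weights[g] * support_in_sample[g] for g in population_share
)
print(round(weighted_support, 2))  # 0.5 * 0.40 + 0.5 * 0.60 = 0.50
```

Note what the adjustment cannot touch: if the people within group_a who refused to answer differ in outlook from those who responded, no weight computed from group shares will detect or fix it.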
Here is an excellent article from retropoll.org, a classic in debunking the bad science typical of opinion surveys.
http://retropoll.org/polling_fraud.htm

Polls usually report a statistical "margin of error" for their results. That margin depends not on the number of people called but on the number who responded, the sample size. Pollsters typically report a margin of error of about 3% for a sample size of 1,000. But this statistic, which makes polls look highly accurate, is in essence a cover for the 70% who refused to participate. Even if 99% refused and we had to call 100,000 people to find 1,000 who would talk with us, the reported margin of error would still be the same 3%. It would be hiding the problem of non-responders. So the margin of error statistic is not only inappropriate in this circumstance; it suggests a level of certainty that is fraudulent.
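The point about the margin of error can be verified from the standard formula itself, sketched below: the formula's only input is the number of respondents, so the number of refusals never enters the calculation.

```python
import math

def margin_of_error(n_respondents, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of n respondents.

    Note that the formula sees only respondents; the people who hung up
    or refused never appear anywhere in the calculation.
    """
    return z * math.sqrt(p * (1 - p) / n_respondents)

# 1,000 respondents give roughly a 3% margin of error, whether reaching
# them required 1,429 calls (70% refusal) or 100,000 calls (99% refusal).
print(round(margin_of_error(1000) * 100, 1))  # 3.1
```

This is exactly the article's complaint: the statistic quantifies sampling error only, and says nothing about non-response bias.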
While it is always possible that those refusing have similar views to those agreeing to be polled, Retro Poll has found evidence to the contrary. When we asked over a thousand people, "Would you take a few minutes to respond to a poll on the impact of the war on terrorism on the rights of the American people?", one woman responded: "You wouldn't want to hear our view on that. People wouldn't like what we think."
"That's ok", we said, "your views are important; they should be counted and reported as part of the democratic process. We want your opinion to count." "No," the woman said insistently. "We're against the war the way they did it. We think they should just bomb all of them, not send our troops over there...." We didn't ask whether she meant bomb everyone in Iraq or some larger group of Muslims, or nations of people, but the woman's self-awareness that her views were outside the "norm" caused her to refuse to participate. Undoubtedly others have specific and different reasons for non-participation that we have difficulty ascertaining because most won't talk about it.
If the "bomb them all" couple may seem the exception among non-responders, consider this: Fewer African Americans and Latin Americans agreed to be polled in both of our national samples (in the current poll 5.7% were African Americans and in the prior poll 4%; for Latin Americans the corresponding figures were 6.2% and 8%. Each of these groups make up about 12% of the U.S. population, actually 12.5% for Latinos). As a result, our poll sample ended up at 79.4% "caucasian" ( i.e. European American) but the actual White/non-Hispanic European American proportion of the population is 69.1% according to the 2000 Census.