I posted in the past about polls.
Here's how polls can be inaccurate:
Samples can be too small or unrepresentative of the population
It's normally too expensive or time-consuming to survey everyone in a population; thus, we must rely on samples to gauge the opinions of everyone. A reliable, scientific poll questions a large enough sample of people to ensure statistical accuracy and includes a representative selection of respondents. Thus, a poll designed to represent American public opinion wouldn't be very reliable if it only included 10 people or included only white males. News reports rarely mention details of the sample or how the survey was conducted; viewers and readers usually just take the poll results as fact. For example, what if I reported a poll that said 96 percent of Americans are pro-choice? This obviously doesn't reflect American public opinion, but if the source was a survey of readers of the feminist magazine Bitch, the results would be understandable. A clever or sloppy journalist can obscure the source and portray public opinion inaccurately. Think about all the polls that are done today and how easily results can become unrepresentative. Web polls exclude people without web access and those who don't visit that particular site. Polls also exclude those who don't have the time or interest to respond. Think about TV polls: Fox generally has more conservative viewers; CNN generally has more liberal viewers. Thus, their poll results may be skewed to the conservative or liberal side regardless of the issue. The chances for error or bias are endless.
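To make the sample-size point concrete, here's a small simulation (my own sketch, not from any real poll): suppose exactly 50 percent of a population holds some opinion, and we poll random samples of various sizes. Small samples swing wildly around the true figure; large ones cluster near it.

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

def poll(sample_size: int, true_support: float = 0.5) -> float:
    """Simulate polling `sample_size` randomly chosen people from a
    population where `true_support` is the real fraction holding the opinion."""
    yes = sum(1 for _ in range(sample_size) if random.random() < true_support)
    return yes / sample_size

# Repeat each poll 1,000 times and see how far results stray from 50%.
for n in (10, 100, 1000):
    results = [poll(n) for _ in range(1000)]
    spread = max(results) - min(results)
    print(f"sample of {n:4d}: results ranged over {spread:.0%} around the true 50%")
```

The spread shrinks as the sample grows, which is why a 10-person poll tells you almost nothing even when the respondents are chosen fairly.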
Polls can ask leading questions
Questions can be worded in a way that leads a respondent to an answer that may or may not reflect his true feelings. For example, I could ask the question "Do you want to stop the war in Iraq so the lives of innocent civilians can be spared?" Virtually every American wants to prevent innocent loss of life, so many respondents may answer yes to this question, even if they think the war is morally just. But reporters summarizing the results may say "...95 percent of respondents answered yes when asked if they wanted to stop the war". The questioner can also surround the question with information that biases the answer. For example, "Seventy percent of homeless shelter residents are single mothers and their children. Should the next fiscal budget include an increase in funds to local shelters?" Respondents may believe the money is better spent on other areas, but the extra information points people in the direction of one answer.
Polls can omit some of the possible answers, leading to either-or answers that don't reflect reality
Answers to poll questions are often more complicated than a simple yes-no choice or a short list of options. For example, a poll may ask "Do you support a war with Iran?" The only choices may be yes or no. But many people may say "Yes, but only if they are making nuclear weapons" or "Yes, but only if it is sanctioned by the U.N." Another example is a consumer confidence question that asks, "Do you consider yourself rich or poor?" Many people will want to answer something in between, but that isn't a choice.
People recording survey results may be dishonest or sloppy
Whether the poll is done in person, by phone, by mail, or by web, a human being usually has to eventually tally and report the results. This causes problems for two reasons. First, humans are prone to mistakes. If you're tallying thousands of responses, you're bound to make some. Even if a computer handles the tally, computers are still programmed by humans. Second, the person may be dishonest and want to achieve a certain result. For example, assume I'm a passionate advocate for banning the death penalty and am taking a phone survey. A strong poll result showing the public in favor of a death-penalty ban may convince some politicians to take action. When taking a poll, it's easy for me to put some extra chalk marks in the anti-death penalty column even when people are answering pro-death penalty in the phone calls. Eventually, I may just achieve the poll result that I want.
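It doesn't take much of this to matter. Here's a quick sketch with made-up numbers (hypothetical, not from any actual survey) showing how moving just a handful of tallies flips a close result:

```python
# Hypothetical: the true phone responses split 52% pro / 48% anti
# out of 1,000 calls.
true_pro, true_anti = 520, 480

# A dishonest tallier quietly moves just 25 responses (2.5% of calls)
# into the anti column.
shifted = 25
reported_pro = true_pro - shifted
reported_anti = true_anti + shifted

print(f"True result:     {true_pro / 10:.1f}% pro, {true_anti / 10:.1f}% anti")
print(f"Reported result: {reported_pro / 10:.1f}% pro, {reported_anti / 10:.1f}% anti")
```

A 52-48 majority becomes a reported 49.5-50.5 minority, and the headline reverses.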
Poll results can be presented in a misleading way
Most news stories don't present the raw data behind a poll and let you draw your own conclusion. Instead, the results will be presented in summary format as part of an analysis article. For example, a poll question may ask "Do you support military action to unseat the Islamic fundamentalist regime of Iran (Yes | No | Unsure)?" The raw data result may be: 29 percent support, 28 percent oppose, 43 percent unsure. The correct conclusion to draw from this poll is that the public generally hasn't made up its mind or needs more information. However, a biased reporter may selectively draw from the results and give the wrong impression. For example, "The idea of military action against Iran is increasingly unpopular. A recent poll concluded that only 29 percent support action, handcuffing the hawks of the Bush administration."
Even if polls are scientifically accurate and are done by unbiased, professional polling organizations, there are still other problems that make polls unreliable.
Also, media polls can often be used to form public opinion instead of gauging it.
We may believe that polls tell us what Americans are thinking. But polls also gauge the effectiveness of media spin — and contribute to it. Opinion polls don’t just measure; they also manipulate, helping to shape thoughts and tilting our perceptions of how most people think.
Polls routinely invite respondents to choose from choices that have already been prepared for them. Results hinge on the exact phrasing of questions and the array of multiple-choice answers, as candid players in the polling biz readily acknowledge.
I'm not saying to discount the polls entirely. What I am saying, though, is to take a moment and examine them closely to see what they are really saying.
This makes a lot of sense. So many polls have been off. For instance, they had George Smitherman trouncing Rob Ford for Mayor... the result was the opposite. They had the NDP's Adrian Dixon trouncing the Liberals provincially in BC... result... the NDP got trounced. I think it's a way for the media to try to influence voters. I have never seen so many polls as in this election and leading up to it... there seems to be a poll on everything every day. It's getting foolish.
I agree that the media is trying so hard to influence this election. I don't think they're going to be successful in the end.
That's Adrian Dix, not Dixon.
Lilley didn't mention it, but many pollsters are in bed with certain political parties, and their methods reflect that. It's SO easy to manipulate polls, it's become a subject of comedy.
https://www.youtube.com/watch?v=G0ZZJXw4MTA
It's not surprising that polls are so inaccurate and skewed, as it's the corrupt Media that commissions most polling done during an election, and the Media clearly have a preconceived agenda. The Media are hopelessly corrupt and dishonest, so nuancing the polls in order to push the Media's agenda is in no way surprising; it's manipulative and dishonest but not surprising... IMO, Media polls will not produce the desired results the Media are looking to achieve, a leftist victory or a leftist "coalition".
These polls have to be accurate; after all, one said that David Suzuki was the most admired Canadian. Now how could that be untrue?
Sure, it's possible to have biased polls. That does not mean polls cannot be accurate, though. That's why HQ polls like crazy throughout the election. That's why governments of all stripes do LOTS of polls.