
  1. #1
    Senior Member Brian503a
    Join Date: May 2005
    Location: California or ground zero of the invasion
    Posts: 16,029

    Experts caution voters on pollsters

    http://www.azcentral.com/arizonarepubli ... 09030.html

    Experts caution voters on pollsters
    Some surveys biased, not always accurate


    Karissa Marcum
    The Arizona Republic
    Sept. 3, 2006 12:00 AM

    As election season heats up, voters can expect a flood of pollsters measuring how they feel about candidates and issues ranging from immigration to Iraq to same-sex marriage.

    The message to the critical consumer: Read the fine print.

    Polls can yield surprisingly different results on any subject. And the differences can be even more pronounced when it comes to complex topics like immigration, gay marriage and taxes.

    For example, a recent Rocky Mountain Poll in Phoenix found that most Arizonans favor immigration reform that allows foreign workers to enter the U.S. without breaking the law. Many Arizonans questioned the findings, which seemed to contrast sharply with other surveys showing voters want to stop undocumented immigrants at the border.

    Randy Pullen, an anti-illegal-immigration activist and Republican national committeeman for Arizona, said the poll questions were "biased to result in an answer to be very favorable to the pro-immigration."

    "It's all a matter of how you ask a question. I could turn all these questions around and get opposite answers," Pullen said.

    Elias Bermudez, CEO of the advocacy group Immigrants Without Borders, said the poll was fair. "I agree almost with everything that they contend here. I'm very happy that another voice is being heard from someone that is trying to find the truth," he said.

    The poll addressed illegal immigration differently from other surveys. One question asked people if they agree that "securing our borders should be our top priority, but humane and fair treatment of foreign workers is also very important." An overwhelming majority agreed.

    To Earl de Berge, research director with the Phoenix-based Rocky Mountain Poll, the results demonstrated that people don't see immigration as a black-and-white issue. He said immigration generates dramatic responses and sometimes outrage.

    "When you poll it's just part of the territory. People have strong points of view; that's democracy," he said.


    Deciphering opinions


    The job of a pollster is a tough one: deciphering the views of thousands or even millions of people from a small group of them. So, critical consumers should be aware of who is conducting a poll and why. There are two major types of polls: those conducted in the public interest and those conducted for private interests such as non-governmental organizations, special-interest groups and political candidates. Those groups often pay for polls and release the data only if it is favorable to their cause.

    Political consultant Kevin DeMenna often commissions polls for his clients who are running for public office. He said polls can be misleading if they are taken out of context and consumers aren't given all the specifics. DeMenna said the public should be wary when special-interest groups conduct polls to convince politicians or the public that there is support for their cause. He was referring to "push polls," which are becoming increasingly popular and use polling data more to persuade than to inform. DeMenna said they aren't used frequently because they are often dismissed, but people should still be aware they are out there.

    Kurt Davis, a Phoenix-based public-affairs consultant, said pollsters try hard to stay nonpartisan, but most "tend to break with who their clients are." He emphasizes that the "break" is not intentional or intended to sway public opinion. He said most pollsters are ethically responsible.

    Most pollsters analyze data in a few paragraphs of introduction; some provide little analysis while others do a lot of data interpretation. "The essence of it is that the data doesn't lie, it's all about interpretation," DeMenna said. Consumers should read the poll questions and come up with their own conclusions.


    The random sample


    To ensure a poll is scientifically valid, pollsters must choose a random selection technique, a sample population and a sample size that let them decipher the views of the broader population.

    The sample will represent the population if every person has an equal chance of being chosen for the poll.

    Most national polling companies such as Gallup rely on a computer to select a random phone number. If a person does not answer the phone the first time, the number is stored and tried again a few hours later and even a few days later depending on the survey period. The procedure is designed to correct a possible bias that could occur in favor of people who answered the phone the first time they were called.
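
    A rough sketch of that callback idea, in Python. This is a toy illustration under made-up assumptions (the answer probability, retry limit and phone-number range are invented), not a description of how Gallup or any pollster actually schedules calls:

```python
import random
from collections import deque

def simulate_call_sample(phone_numbers, answer_prob=0.6, max_attempts=3):
    """Toy model of the callback procedure described above: unanswered
    numbers are stored and re-dialed later, so the final respondent pool
    is not skewed toward people who happen to pick up on the first try."""
    reached = []
    pending = deque((number, 1) for number in phone_numbers)
    while pending:
        number, attempt = pending.popleft()
        if random.random() < answer_prob:        # simulated "someone answered"
            reached.append(number)
        elif attempt < max_attempts:             # store it and try again later
            pending.append((number, attempt + 1))
    return reached

# Made-up sampling frame of 10-digit numbers, purely for demonstration.
frame = random.sample(range(6_025_550_000, 6_025_560_000), k=1_000)
respondents = simulate_call_sample(frame)
print(f"Reached {len(respondents)} of {len(frame)} sampled numbers")
```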

    The sample size helps determine the reliability of a poll. As the sample size increases the margin of error decreases.

    With a standard sample size of 1,000 adults nationwide, results are likely to be accurate within a margin of error of plus or minus three percentage points. For example, if a poll finds that President Bush's approval rating is 50 percent, the margin of error indicates the rating is likely to be between 47 and 53 percent. Often, when similar polls conflict, the difference can be attributed to the margin of error.

    Pollsters in Arizona often sample 500 to 800 people to keep the cost of a poll down, but they sacrifice a small bit of accuracy.

    According to the National Council on Public Polls, the laws of chance say that because of sampling error one poll out of 20 may be skewed away from the public's true views.
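
    For readers who want to check the arithmetic above: for a simple random sample, the quoted margin of error comes from the standard formula z * sqrt(p(1 - p) / n), where z is about 1.96 at the 95 percent confidence level (the same level behind the one-poll-in-20 figure). A minimal sketch, assuming the worst-case proportion p = 0.5:

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate 95%-confidence margin of error for a simple random
    sample: z * sqrt(p * (1 - p) / n), widest when the proportion is 0.5."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

for n in (1000, 800, 500):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} percentage points")

# Expected output:
# n = 1000: +/- 3.1 percentage points   (the "plus or minus three" cited above)
# n =  800: +/- 3.5 percentage points
# n =  500: +/- 4.4 percentage points   (smaller Arizona samples trade some precision for cost)
```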


    Sources of bias


    The wording and order of questions can affect people's responses.

    In public-opinion surveys, experts say, the wording of questions is probably the greatest source of bias and error in the results, followed by question order. Pollsters need experience to write clear and unbiased questions, but even with that discipline, questions can still carry bias.

    "The public can only react to how the polls are asked, and there are only a limited number of choices," said Bruce Merrill of the Walter Cronkite School of Journalism at Arizona State University. Individual words can also carry bias. Pollsters must decide whether to use terms like "welfare" or "programs for the poor" and "sending" troops vs. "contributing" troops in their questions.

    When possible, pollsters rely on questions that have historically proven unbiased, such as "How do you feel about the direction of the country?" But when pollsters are measuring public attitudes on issues, it can get more complicated. For brand-new question areas, large polling companies such as Gallup will often test several different wordings.

    The longer a poll goes, the less attention people tend to pay to answering the questions thoughtfully, so question order is also important.


    'Snapshots in time'


    Timing is also critical in determining the validity of a poll. Pollsters refer to polls as "snapshots in time." That means that a poll is intended to reflect opinions only at the moment it is taken and does not necessarily forecast the future. Opinions change over time, so as a poll ages it becomes less relevant, although not necessarily wrong.

    Events can also dramatically alter the results of a poll. Political candidates' polls often go up or down depending on how television commercials or direct-mail pieces are perceived. Experts said consumers should use a series of polls to find trends over time.

    "Any one poll I would be careful about, but when you get a dozen polls done by legitimate organizations, then they're probably pretty close to the truth," said Patrick Kenney, chair of political science at ASU.

    Despite all the math and statistics behind polling, people don't always agree with the data. Experts said the majority of polls are an accurate reflection of the public's views at a given moment.

    But across the board, they are concerned with a disturbing trend: people are allowing the opinions others give in polls to shape their own views on key issues. If people don't know much about a candidate who is ahead in the polls, they tend to believe that candidate is more credible. Kenney said there has been a lot of research into the phenomenon, although there is still no definitive answer. The same is true with issue polling. Pollsters wonder whether they are measuring true opinion or just something a person happens to have heard lately.
    Support our FIGHT AGAINST illegal immigration & Amnesty by joining our E-mail Alerts at http://eepurl.com/cktGTn

  2. #2
    Senior Member
    Join Date: Mar 2006
    Location: Dallas, TX
    Posts: 1,672
    I work in this area of research and have long been telling ALIPAC users this very same thing. Studies can be biased. No one should do a political poll unless it's for Gallup; that is my opinion on this matter.
