Five Facts About Election Polling
https://now.fordham.edu/politics-and-society/five-facts-about-election-polling/
October 26, 2012

At this stage of an election, public polls—conducted by the media, universities, and private polling companies—provide us with daily doses of information about how Americans feel about the candidates. Most importantly, they tell us who is ahead and who is behind in the race. Lately there has been some controversy surrounding election polls and their accuracy. To help make sense of that controversy, and of the polls themselves, here are some things Americans should know about election polls.


1. Why pollsters conduct election polls. Election pollsters poll because survey results sell. For media outlets, polls produce content that the public eagerly consumes. For universities and private polling firms, conducting and publicizing election polls provides name recognition and credibility (Quinnipiac who?). As a result, “horserace” numbers—the head-to-head election match-up results—come at a rapid pace at this stage of a campaign. Americans are conflicted over this deluge. In a 2008 Pew poll, a majority of Americans said they would like to see less coverage of horserace polls. At the same time, research shows that the public is fascinated by horserace coverage, and even seeks it out. Horserace results are a guilty pleasure for the public: like reality TV, no one seems to like it, but an awful lot of people consume it.

2. Who pollsters are actually polling. The survey population varies depending on the poll’s purpose, or, in the case of elections, when the poll is conducted. There are three relevant populations in election polling: American adults, registered voters, and likely voters. When conducting a poll well in advance of an election, many pollsters will interview the general population because it is too early to know who will register and vote. During an election year, registered voters are a common population because they are a concrete and eligible group (and more likely to vote than the non-registered). As the election nears, pollsters begin the search for “likely” voters, those who will actually show up to vote.

Monika L. McDermott, Ph.D., is professor of American politics/political behavior. McDermott is also a survey research practitioner who has conducted election surveys at the Los Angeles Times Poll, the CBS News Election and Survey Unit, and the Center for Survey Research and Analysis at the University of Connecticut. She is currently an election and polling analyst for CBS News and The New York Times. Photo by Chris Taggart

3. What a “likely voter” is. Likely voters are a slippery bunch to net. Since pollsters want to interview only those who are actually going to vote (to accurately gauge the state of the race), they need to weed out those who say they will vote, but won’t. There are two common ways to do this: (1) drop individuals from the analysis who are deemed unlikely to vote, or (2) apply a statistical weight to the data, depending on a person’s probability of voting. Both methods typically consider key factors such as an individual’s past turnout record, their level of engagement with the election, and their stated likelihood of voting.
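
To make the two approaches concrete, here is a minimal sketch in Python. The scoring rule, cutoff, and sample below are invented for illustration and are not taken from any actual pollster’s likely voter model.

```python
# Hypothetical likely-voter model: the weights, cutoff, and respondents below
# are invented for illustration, not drawn from any polling organization.
from dataclasses import dataclass

@dataclass
class Respondent:
    voted_last_election: bool  # past turnout record
    follows_campaign: bool     # engagement with the election
    intends_to_vote: int       # stated likelihood of voting, on a 0-10 scale
    candidate: str             # current vote preference

def likelihood_score(r: Respondent) -> float:
    """Combine the three factors into a rough 0-to-1 probability of voting."""
    return (0.4 * r.voted_last_election
            + 0.2 * r.follows_campaign
            + 0.4 * r.intends_to_vote / 10)

sample = [
    Respondent(True, True, 10, "A"),
    Respondent(False, False, 3, "B"),
    Respondent(True, False, 8, "B"),
    Respondent(False, True, 9, "A"),
]

# Approach 1: drop respondents whose score falls below a cutoff.
likely = [r for r in sample if likelihood_score(r) >= 0.5]
share_a_screened = sum(r.candidate == "A" for r in likely) / len(likely)

# Approach 2: keep everyone, but weight each respondent by their score.
total_weight = sum(likelihood_score(r) for r in sample)
share_a_weighted = sum(likelihood_score(r) for r in sample if r.candidate == "A") / total_weight

print(f"Candidate A, screened sample: {share_a_screened:.0%}")
print(f"Candidate A, weighted sample: {share_a_weighted:.0%}")
```

Either way, the published horserace number depends on who the model treats as likely to vote, which is exactly where the disputes described in the next point arise.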

4. What poll bias is and where it comes from. The nature of a likely voter sample directly impacts the results of a survey. Counter to accusations, however, reputable pollsters do not try to bias their samples. In fact, it is in their best interests to be as accurate as possible—credibility depends on accuracy. At the same time, likely voter models are not always as accurate as a pollster might like. This year the big danger is over-representing Democratic voters. Democrats turned out in exceptionally high numbers in 2008—making up 7 percentage points more of the electorate than Republicans did, a larger gap than usual. Few experts expect Democratic turnout in 2012 to match that of 2008, leaving open the possibility that likely voter models based on 2008 turnout will be biased towards Obama. That said, pollsters are unlikely to rely on the 2008 election alone when constructing their voting models, and no two models are exactly alike. As a result, it is implausible that these models are all systematically skewed in the same direction.
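
To see how much the assumed party mix alone can move the topline number, here is a back-of-the-envelope sketch. The loyalty figures are invented, and the D+7 electorate simply mirrors the 7-point 2008 gap cited above; none of these numbers come from an actual 2012 poll.

```python
# Illustrative only: invented party-loyalty figures, comparing a D+7 electorate
# (mirroring the 2008 gap cited above) with an even party split.

def obama_margin(mix: dict[str, float], obama_support: dict[str, float]) -> float:
    """Obama-minus-Romney margin in a two-way race, given each party group's
    share of the electorate and Obama's support within that group."""
    obama = sum(share * obama_support[party] for party, share in mix.items())
    return obama - (1 - obama)  # whatever Obama doesn't get goes to Romney

# Hypothetical Obama support within each party group.
obama_support = {"D": 0.90, "R": 0.10, "I": 0.50}

d_plus_7 = {"D": 0.39, "R": 0.32, "I": 0.29}  # a 2008-style electorate
even_mix = {"D": 0.35, "R": 0.35, "I": 0.30}  # an even party split

print(f"D+7 sample:  Obama {obama_margin(d_plus_7, obama_support):+.1%}")
print(f"Even sample: Obama {obama_margin(even_mix, obama_support):+.1%}")
```

On these invented numbers, the party mix by itself is worth roughly five to six points of margin, which is why the turnout assumptions baked into a likely voter model, rather than the interviewing itself, are where bias would creep in.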

5. Attacking pollsters’ methods is the last refuge of a trailing candidate. As sure as the sun rises every day, the candidate who is trailing in the polls will cast aspersions on specific polling organizations, polling methods, or both. For example, the Romney campaign, trailing in the polls recently, launched a national fight against the likely voter models of the major polls, claiming they are biased towards Obama because they include too many Democrats. But before anyone dismisses this as sour grapes from the Romney camp alone, remember back to May of this year, when an Obama aide said (on MSNBC) of a CBS News poll that had the President trailing: “We can’t put the methodology of that poll aside. Because the methodology was significantly biased. It is a biased sample.” Campaigns are usually silent about polling methods when their candidate is in the lead.

Professor Finds Voters are More Likely to Return to Polls After Being Thanked
https://now.fordham.edu/politics-and-society/professor-finds-voters-are-more-likely-to-return-to-polls-after-being-thanked/
November 3, 2010

As scholars and pundits deplore the low rates of voter participation in the United States, particularly when compared to other industrialized democracies, Fordham political scientist Costas Panagopoulos, Ph.D., has discovered a deceptively simple way of getting more citizens to the polls:

Say thank you.

Research by Costas Panagopoulos, Ph.D., indicates that voters return to the polls more frequently after having been thanked.

In three randomized field experiments conducted over the past year, Panagopoulos found that thanking voters for having participated in a prior election boosts the likelihood that they will vote in a subsequent election. His research showed that turnout rates were 2 to 3 percentage points higher for prior voters who had been thanked, compared to those who were not.

The three experiments took place in Staten Island for a February 2009 special election to fill a City Council vacancy; in New Jersey for the gubernatorial contest of November 2009; and in Georgia for primary elections in July 2010.

In each case, one group of voters, chosen through random assignment, received postcards that reminded them about the upcoming election and encouraged them to vote. A second group received postcards that also thanked them for voting in a previous election and urged them to vote in the upcoming one. A control group received no mailing.

Both mailings were strictly nonpartisan and timed so that the voters would receive them approximately three to seven days before the election.

On Staten Island, Panagopoulos found that voters who received the so-called “gratitude postcard” turned out at a rate 2.4 percentage points higher than those who received no mailing, and 2 points higher than those who received the so-called “reminder postcard.” The results were similar in New Jersey, where voters receiving the gratitude postcard turned out at a rate 2.5 points higher than those who received no postcard. In Georgia, voters receiving the gratitude postcard turned out at a rate 2.4 to 3.1 points higher than those who did not.
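
For readers curious about the mechanics, here is a minimal sketch of how an effect like the Staten Island result is read off a randomized mailing experiment. The group sizes and vote counts are invented; the actual figures are in the published study.

```python
# Invented counts for a single hypothetical election, not Panagopoulos's data.
# Because the groups were randomly assigned, differences in turnout can be
# attributed to the mailings themselves.
from math import sqrt

groups = {
    "control (no mailing)": (10_000, 3_100),  # (voters assigned, voters who turned out)
    "reminder postcard":    (10_000, 3_140),
    "gratitude postcard":   (10_000, 3_340),
}

ctrl_n, ctrl_voted = groups["control (no mailing)"]
ctrl_rate = ctrl_voted / ctrl_n

for name, (n, voted) in groups.items():
    rate = voted / n
    effect = rate - ctrl_rate
    # Standard error of the difference between two independent proportions.
    se = sqrt(rate * (1 - rate) / n + ctrl_rate * (1 - ctrl_rate) / ctrl_n)
    print(f"{name:22s} turnout {rate:.1%}  effect vs. control {effect:+.1%}  (SE {se:.1%})")
```

On these made-up counts, the 2.4-point gratitude effect is several times its standard error, which is the kind of evidence that lets a field experiment call a small effect real rather than noise.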

“It turns out that gratitude expression is an effective motivator of pro-social behavior like voting,” said Panagopoulos, an assistant professor of political science at Fordham, where he directs the master’s program in Elections and Campaign Management, and the Center for Electoral Politics and Democracy.

“The effect may not be huge, but differences of this magnitude can be consequential, especially in close contests,” he added, noting that “turnout is especially critical in midterm elections, in which participation tends to lag 15 to 20 percentage points behind that of presidential elections.”

Preliminary numbers from the 2010 midterm elections confirm his assertion. Voter turnout was projected to be 42 percent, as opposed to almost 57 percent in the presidential election year of 2008.

Panagopoulos’s work has found its way into the popular media. A New York Times Magazine article published the Sunday before Election Day 2010 mentioned his research in New Jersey and how it was adopted by political operatives across the country.

The study, “Thank You for Voting: Gratitude Expression and Voter Mobilization,” will be published in the Journal of Politics in 2011.
