AJR Features
From AJR, January/February 2001

Polled Enough for Ya?   

News organizations subjected their audiences to a relentless barrage of polls during the campaign. Sure, everyone wants to know who's winning the horse race, but to many this seemed like overkill.

By Lori Robertson
Lori Robertson (robertson.lori@gmail.com), a former AJR managing editor, is a senior contributing writer for the magazine.      



PEOPLE LIKE COMPETITION. Winners and losers. The underdog that still has a fighting chance. The dominating winner that rolls over anything in her path. And how do we determine the victor and the vanquished? We award them points.
In the contest we call the 2000 presidential campaign, the pre-election points didn't matter a whole lot. It was simply close, as in "too close to call." Yet Americans were polled, polled and re-polled. And the media faithfully reported what were often statistically insignificant two- and three-point leads. And with vigor. After all, when you've got George W. Bush ahead at 48 percent and Al Gore knocking on the door at 46, we've got ourselves a presidential election.
Voters wishing to keep up to speed on this ball game needed only to flip on the TV or the computer, or pick up a paper and find a poll. Like the CBS News poll, or the CNN/USA Today/Gallup poll, or the Reuters/MSNBC/Zogby poll. Or maybe the Pew Research Center poll, the Washington Post poll, the ABC poll. Oh yeah, and then there's the Voter.com poll, Harris poll, Fox poll and NBC/Wall Street Journal poll. And the Hotline poll.
And the Los Angeles Times poll. And the Marist College (does anybody even know where that is?) poll. And--OK, so during this election journalists and the public saw an increase in the number of polls measuring the race. "Did we," says Larry Bivins, a Gannett News Service correspondent. He sounds weary.
The poll phenomenon intensified the closer the calendar got to November 7, though that usually happens as voters get ready to mark official ballots. (Oh, didn't mean to leave out the Rasmussen Research poll and the Investor's Business Daily/Christian Science Monitor/TIPP poll.) "By the end of the campaign, there were not only multiple national tracking polls...but multiple state battleground tracking polls," says Craig Gilbert, Washington bureau chief at the Milwaukee Journal Sentinel.
It's difficult to quantify the increase. But a count of polls available in the Roper Center for Public Opinion Research database shows about 130 taken in the two months preceding the 2000 election. In the two months before the 1980 election, there were about 20.
With so many polls, at least reporters can compare and contrast, and "discern a middle ground," Gilbert says. Other reporters and pollsters agree. But there are inherent problems with numbers flying left and right. Polls face many of the same troubles they've always faced--except now those drawbacks are magnified times 10. These are estimates, everyone needs to be reminded, and "snapshots in time," not predictions of what's going to happen on Election Day. The last few election cycles have also seen a proliferation of the daily tracking poll--once a tool campaign managers used to measure instant reaction to events, such as ads, it has found its way, some say unnecessarily, into the press.
Why have we seen such a poll explosion, and how is it affecting journalism? Do the media truly focus too much on the horse race and report polls haphazardly? Is there any truth to the argument that campaign reporters write to the polls, creating a slew of favorable coverage for the candidate who's ahead?

IT COSTS AT LEAST $15,000 to survey about 1,500 people using an eight- to nine-minute questionnaire over a three-day period. That's CBS News Director of Surveys Kathleen A. Frankovic's estimate for an in-house job. Commissioning a poll from an outside firm would probably be more expensive.
Polling isn't cheap, but it certainly costs less than in the days when interviewers traveled door to door. Because the cost has gone down considerably, starting in the 1970s, says Frankovic, the number of media-sponsored polls--not just during presidential elections--has gone up. Add to that the increase in the number of media outlets, and you've got quite a pollfest.
The closeness of the 2000 contest partly contributed to an increase as well, says Howard Fienberg, a research analyst with Statistical Assessment Service, an organization that helps journalists understand scientific and social research. "People decided that we had to track it every day," he says. But Fienberg gives two other explanations for burgeoning polls: One, "people want to know what everyone else thinks," and two, journalists use them as a "crutch."
"It's hard to make politics interesting," he says. "Adding the poll numbers used to be a sure-fire way to spark some interest."
That's true, but the plethora of numbers this year may have outpaced necessity. To some, the onslaught of poll after poll after poll seemed like overkill. To Brookings Institution senior fellow Stephen Hess, it seemed like a waste of dollars. "This is not the best way to spend [the media's] money," he says. "But I'm not against the polls. It's just I'd rather they spend the money on something else, like reporters."
Gannett News Service's Bivins says the volume of polling "confuses the heck out of the electorate." Daily tracking polls and instant polls taken after the debates went too far this year.
Of course, the media don't have to publish or broadcast the numbers. It's not that the sheer amount of polling is bad, Fienberg says. The problem arises when it becomes "the basis for all political reporting, or at least the majority of it.... We don't need to know it all the time."
Every four years, media critics and journalists cry for less horse race coverage and more focus on issues. Ron Hutcheson, a national political correspondent for Knight Ridder, remembers one such news meeting early in the campaign. Everyone was making the usual "we want to look at the issues" plea. Hutcheson stood up and said, "Look, I think the horse race is interesting. When you talk to people, they all want to know, 'Well, who do you think is going to be the next president?' "
Most critics and journalists agree that reporting on polls is a necessary component of campaign coverage, and that the much-pummeled horse race--especially in such a close presidential contest--is a story well worth pursuing. And, not to worry: The media gave the horse race adequate coverage. Perhaps (yes, let's revisit the criticism again) too much. Hutcheson does think some issue coverage was sacrificed for the game. "I thought it broke down this election cycle more than ever before," he says. Because the race was so close, the tendency was natural. "The pull was stronger than ever to look at polls."
Hutcheson says he didn't write more horse race pieces during this election, and his editors discourage that kind of reporting. But, he says, "there were times when poll data was edited into my story." Hutcheson adds that such editing didn't bother him, because sometimes "people in Washington have a better perspective on things than people inside the campaign bubble."
Marvin Kalb, executive director of the Washington office of the Joan Shorenstein Center on the Press, Politics and Public Policy, and Andrew Kohut, director of the Pew Research Center For The People & The Press, say they believe in horse race coverage. But not when it smothers everything else. "If the reporting of polling is done in the absence of substantive reporting as well, on the candidates' domestic and foreign policy positions, then that is a huge failing," Kalb says. "But if the polling is done as a supplement to substantive coverage, then it is an asset, and when it is a close election, fully understandable."
Most of the criticism this time around falls on the networks. Of course, there is less space in a half-hour newscast than on a 24-hour cable news network or in a newspaper. As Jill Zuckman, a national correspondent with the Chicago Tribune's Washington bureau, says, newspapers "have a lot of inches to fill, and you can't do that with just polls." In fact, most print outlets went beyond the poll numbers, filling their pages with issue story upon issue story.
Zuckman criticizes both networks and cable for going overboard on the presidential scorecard. "I just think that was their sole focus," she says.
Well, that wasn't the sole focus, but a good percentage of it. Brookings' Hess found evidence that NBC, ABC and CBS have indeed devoted a larger percentage of their total campaign coverage to horse race-type stories than in the past. Hess, with data from the Center for Media and Public Affairs, conducted a content analysis of nightly newscasts during the two months preceding the 2000 election. The horse race garnered 71 percent of the coverage in 2000, as compared with 48 percent in '96 and 55 percent in '92.
Why? "Because it was the easiest way to frame the story," Hess says. He's not bashing the formula, but says "the problem is, there has to be some balance.... What do you learn from a horse race story? The truth is, not very much."
Some critics point out that it costs less to report on polls than to do heavy-duty coverage of weighty matters, a not insignificant consideration to today's budget-conscious networks. "It all ends up being a financial decision," says Kalb, once a widely known network correspondent. "This is journalism." But it's journalism "as practiced through the prism of budgetary constraints."

ANDY KOHUT SAYS THE day-in-and-day-out tracking polls of last year were "loopy." He, Kalb and others interviewed for this article object most strongly to these daily measurements, which littered the media on top of the already large number of conventional polls. Pew did not conduct tracking polls, which, Kohut says, tried "to measure something that didn't exist...movement in the race."
Kalb says the increasing use of tracking polls "represents less than the best of contemporary journalism--putting it most charitably."
What's a tracking poll? Different organizations have different methodologies for conducting them, but the general format goes like this: A smaller sample, maybe 400 people or so, is polled each day. (A regular poll samples anywhere from 1,000 to 3,000.) The daily results are averaged over a three- to four-day period and presented as the latest day's tracking poll. The next day, new results are averaged in, and the earliest day is dropped off.
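For the statistically inclined, that rolling-average mechanic is simple enough to sketch in a few lines of Python (the daily numbers below are invented for illustration; actual polling operations weight their samples and vary their methods):

```python
from collections import deque

def tracking_poll(daily_results, window=3):
    """Yield the rolling average of the last `window` days of results.

    daily_results: a sequence of (bush_pct, gore_pct) tuples, one per day,
    each from a small daily sample (say, 400 respondents).
    """
    recent = deque(maxlen=window)  # the oldest day drops off automatically
    for day in daily_results:
        recent.append(day)
        bush = sum(b for b, _ in recent) / len(recent)
        gore = sum(g for _, g in recent) / len(recent)
        yield round(bush, 1), round(gore, 1)

# Four days of invented small-sample results: (Bush pct, Gore pct).
days = [(48, 43), (46, 45), (47, 44), (45, 47)]
for i, (bush, gore) in enumerate(tracking_poll(days), start=1):
    print(f"Day {i}: Bush {bush}, Gore {gore}")
```

Because each day's number is an average of the last three samples, a single odd day's result keeps tugging on the reported figure for three days running--one reason the lines jump around.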
Political junkies may feed on these polls, gauging voters' reactions to every little blip in the campaign. But Kohut and Kalb say they're not very stable and not very reliable--the small sample sizes mean less accuracy, particularly over the long term. Fienberg calls tracking polls "the best example of overdoing it with polling."
Bivins and Hutcheson tried to avoid looking at such polls. That wasn't easy. Bivins calls the daily tracking polls, with wild fluctuations, "worthless." And Hutcheson says of CNN's tracking poll: "They ought to just shoot that and put it out of its misery."
Every day it seemed the cable networks were reporting another lead change, he says. "It's kind of ludicrous in the end."
Hotline columnist Howard Mortman throws more fuel on the fire by pointing out that the CNN/USA Today/Gallup tracking poll had Bush up by 13 points on October 27, the same day the CNN/Time magazine poll had W. up by 6.
Tracking polls "probably will not be evaluated as being as useful for predicting the outcome of the presidential elections," says Richard C. Rockwell, executive director of the Roper Center for Public Opinion Research at the University of Connecticut. He's not positive that will be the result, but with small sample sizes, he says, "you can see shifts that don't stand up over time."
In 1992, the Gallup Poll, along with its media partners CNN and USA Today, decided it was worth the investment to compile daily tracking polls, says Gallup's editor in chief, Frank Newport. He does expect the surveys to jump up and down, reflecting the indecision of the public. But Gallup's tracking poll and others were accurate the day before the election in demonstrating that the race could go either way, he says. And, he theorizes, if Gallup's final polls were correct, then the polls two or three weeks out were probably fairly accurate as well. Gallup, like other organizations, did increase the number of people polled daily, from 400 to 1,150, close to the election. Of those samples, Gallup would report results over three-day periods for likely voters (about 700 likely voters in the initial tracking polls, then about 2,000 in later polls). "A tracking poll can be very accurate," says Newport, "but it can also be measuring a moving target."
Or a statistically insignificant target. On Monday, November 6, USA Today reported that Bush's lead over Gore had narrowed in the latest tracking poll--47 percent to 45. That was down from the previous day's Gallup poll of 48 to 43. But with a margin of error of plus or minus 2 percentage points, that lead could have been nonexistent--say, 47 to 47. Or, since the margin of error applies to each candidate's number, Gore could have been leading 47 to 45. The earlier poll could have been 47 to 45 as well, or even 46 Bush, 45 Gore. Newport is right: Gallup, just before the election, correctly showed that the outcome was too close to call. However, the question remains: Did the media have to keep reporting the numbers?
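The arithmetic takes only a few lines (a simplified sketch; pollsters derive their error margins from the sample design rather than this back-of-the-envelope approach):

```python
def lead_range(a_pct, b_pct, margin):
    """Possible (min, max) lead of candidate A over B, in points, when the
    margin of error applies to each candidate's number independently."""
    widest = (a_pct + margin) - (b_pct - margin)
    narrowest = (a_pct - margin) - (b_pct + margin)
    return narrowest, widest

# The November 6 numbers: Bush 47, Gore 45, margin of error +/- 2 points.
low, high = lead_range(47, 45, 2)
print(f"Bush's 'lead' could be anywhere from {low:+} to {high:+} points")
```

Run it and the reported 2-point lead turns out to span everything from Gore up by 2 to Bush up by 6.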

IN THE SPRING OF 2000, Dean Cappello, vice president of programming and operations at New York's WNYC radio, posed a question to his staff. What if the AM/FM station banned polls in its own reporting?
The staff, says Cappello, characterized him as "overreactive." But every time he switched on the radio, it seemed newscasts were leading with polls. "We were getting into the habit of reporting poll movement as news," he says. Cappello thought it was "a runaway train, and everybody's just jumping on board."
WNYC Managing Editor Mark McDonald and Cappello had long conversations to figure out how to put polls in context, not be misleading and make sure issues get the most prominence in the news. Cappello thinks the NPR member station was much more cautious during this election than it had been before. "I do think that we have to take responsibility for what the accumulated effect of the coverage is," he says.
Ken Bunting, executive editor of the Seattle Post-Intelligencer, says his paper is always cautious about polls, but it was particularly skeptical this year. The wild swings in the tracking polls were a big tip-off that the numbers weren't that stable. "We limited mention of polls in our coverage," he says. "We didn't ban them, but we didn't run them every day, because we didn't believe them."
The Seattle paper has commissioned its own polls in the past, but this presidential election, it didn't order up any. Bunting says he doubted the pollsters could produce an accurate snapshot, and with so many undecided voters, "it just didn't seem like a great idea."
That philosophy certainly bucked the trend--as does the Philadelphia Inquirer's thinking on polls, year after election year. The Inquirer is one of the few major newspapers that does not commission polls, a policy that's been in place since former Executive Editor Gene Roberts joined the paper in 1972. As current Editor Robert J. Rosenthal explains, the paper doesn't want to be part of the story. By commissioning surveys, he says, "you're really interjecting the Philadelphia Inquirer poll results into the campaign. [The results] obviously hurt or help somebody, and...the thinking was to try and cover the campaign, not only the horse race element but also the issues, personalities, the records of the candidates." Roberts also says that he didn't want to stake the paper's credibility on a poll--an Inquirer poll would be identified as the paper's work regardless of what organization conducted the survey.
The Inquirer will report the results of other polls, but Rosenthal feels the paper gives more weight to issues, adding comments from actual voters and moving away from "the tightness of the race reporting." One of the best stories the Inky published in the fall, he says, was an explanation of polls and why results were swinging so wildly.
To assist journalists in understanding the numbers, many major news organizations employ pollsters who can separate the good from the bad. It's often difficult for individual reporters to do so. In general, however, well-done surveys can add something to a journalist's work: Polls provide some expert, scientific opinion on what Americans think. They can add weight to theories and inspire new ideas. Hutcheson says a New York Times poll in Florida about 10 days before the election changed his way of thinking about the state, when it showed Gore doing better with likely voters than with registered voters.
Both Hutcheson and the Tribune's Zuckman say polls often reinforce reporters' own assessments about the dynamics of a campaign. But there's a danger in that as well. "If I have a working premise from talking with voters," says Hutcheson, "and I compare it with a poll, and it doesn't line up.... I'd probably jettison my premise." It's difficult, he acknowledges, to write against the polls.
That means it's easy to write with them, a phenomenon explored in stories by a handful of news outlets when charges of political bias cropped up (see "Playing Favorites?"). The Milwaukee Journal Sentinel's Craig Gilbert says there is truth to the idea that when George W. Bush was up in the polls, he received favorable coverage, and when Al Gore sped ahead, he suddenly was portrayed as a far better candidate than he was when his numbers were down. "When the polls showed Gore surging," he says, "it generated a ton of stories on why Gore was doing well and Bush was doing badly."
Of course, once the numbers rise, a candidate's confidence may surge, and he may become more effective on the hustings. "Success breeds success," says Hutcheson, noting that Bush would be energized by encouraging poll results. While the candidates may preach that polls don't matter, when one guy's up, "the candidate's in a better mood, might be better on the stump," Zuckman says. "A lot of it is very ephemeral." So, it may not be the reporters who are writing to the polls, but the candidates who are playing to the polls and whom the journalists then write about.
Thankfully, for both reporters and the public, there are a lot of good polls out there. "In general," says the Roper Center's Rockwell, "they have gotten more accurate." Since the 1948 presidential election, the final polls have correctly predicted the showings of the major candidates within an average of 1.9 percentage points, according to the National Council on Public Polls.
For estimates, that's a good track record, but it's not perfect.
CBS' Frankovic says that, after years of exposure, people have become more comfortable with polls--perhaps too comfortable. "You can find many examples of people, journalists and others, expressing perhaps too great a belief in poll results," she says. The hard numbers lend a false air of precision. "It may be an improvement if people were to be a little more skeptical and a little more cautious."
Some news organizations, such as the Seattle Post-Intelligencer and the Hartford Courant, ran stories last year that emphasized the limitations of polls--they're snapshots, they're not always correct, don't forget the margins of error. (The Courant's piece was an editorial.) But these technicalities aren't always spelled out when the polls make the headlines, and many times, readers find them in the small print at the bottom of a chart or graph.
Do such disclosures actually belong in the fine print? Maybe. Imagine a radio newscast that included: "George W. Bush holds a narrow lead over Al Gore in the latest Zogby poll, 48 percent to 45. But with a margin of error of plus or minus 2 percentage points, this really means Gore could have the lead. Or it could be tied. Or Bush could be leading 50 to 43. And with a confidence level of 95 percent, there's a 5 percent chance that this poll is just plain wrong."
As Cappello says, that kind of context is "unwieldy in the newscast." Instead, journalists need to think more about the importance of individual polls before reporting them.
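Where do the 2-point margin and the 95 percent figure come from? For a simple random sample, the textbook approximation is 1.96 times the square root of p(1-p)/n--a rough sketch, since real polls adjust for weighting and design effects:

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate margin of error, in percentage points, at a 95 percent
    confidence level (z = 1.96), for a simple random sample."""
    return 100 * z * math.sqrt(proportion * (1 - proportion) / sample_size)

for n in (400, 1000, 1500):
    print(f"n = {n}: roughly +/- {margin_of_error(n):.1f} points")
```

Which also shows why a 400-person daily tracking sample bounces around so much more than a 1,500-person poll: its margin of error comes out to roughly plus or minus 5 points, about double the bigger survey's.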
It's not clear whether polls affect voters' behavior. Newport points out that if polls had a powerful impact, everyone would vote for the front-runner every time. Reporting on polls does, however, create that winner/loser mentality, and if one candidate is too far behind, supporters' energies may wane. If an election is portrayed as a close contest, the public may be more motivated to stand in line at the polls.
Gallup's data shows that people like polls. But, adds Newport, when you ask them if they think a survey of 1,500 to 2,000 people can represent the views of all Americans, "they say no. They find that hard to believe.... I do think that the average American does not understand the principles" behind averages and polling.
Despite the poll-ridden climate, it's doubtful most Americans are going to sign up for probability and statistics classes or hold debates on sampling error at water coolers and local bars. Nor should they. Unless, maybe, they happen to be journalists.

###