The election is going to be close — or is it? CNN political data reporter Harry Enten on the science and art of polling.
The U.S. presidential election is five days away, and it’s going to be close. Or is it?
CNN Senior Political Data Reporter Harry Enten noted last week that despite polling showing a historically close race, either Donald Trump or Kamala Harris could still capture more than 300 out of 538 electoral votes – a relative landslide by recent standards. Storybench spoke to Enten about his approach to polling analysis and what the data might reveal about this election’s final stretch.
Enten began his career as a journalist at FiveThirtyEight, where he also co-hosted its politics podcast. While studying at Dartmouth, he published his blog, Margin of Error, which has since evolved into a CNN podcast. Enten’s data-driven insights on polling and political trends span various platforms, earning him a reputation for clear, analytical commentary.
You’ve been involved in election analysis as far back as the early 2000s with your ‘Margins of Error’ blog, at FiveThirtyEight and then at CNN. How has polling evolved in the past couple of decades, and how has polling’s influence in politics evolved?
I think coverage of polling has gone in many different directions. One of the key little nuggets that has changed, certainly since I was a young lad, is that we’re much more interested now in the aggregation of data than in individual polls. That started off with RealClearPolitics in 2000 and 2004 and then really exploded in 2008, when you had pollster.com as well as FiveThirtyEight. All of a sudden … you could get polling 24/7. You didn’t have to wait until the nightly news or until a cable segment came on. You could see it all at one time, you could go get it, and you had people who were distilling it down.
I think my old boss, Mr. [Nate] Silver, had much to do with that. But then some interesting things happened after 2012, when the polling averages were pretty gosh darn strong. They were pretty gosh darn strong in 2008 and 2004, and people began to expect a lot from the polling data and thought that it would be right on, perfect, every single time. And then obviously 2016 happened, and that was not the case. And then 2020 happened, and again, that was not the case. And so what I think has changed a little bit over the last decade is that people are much more skeptical that the polls are going to be right on, as they should be.
Polling is a tool. It’s not going to get it exactly right every time. These polls come with margins of error; the aggregation, the averages of the polls, come with margins of error. And those are important to note. And I think we are getting a little bit better with that, where people aren’t expecting the polls to be right on. But I think there’s still work to be done in understanding that if a candidate’s up by 20 points, they’re very, very likely going to win, but if a candidate’s up by two points, it’s worth noting that the historical average error for a state polling average since 1972 is about 3.4 points, that these are close races, and that folks shouldn’t necessarily expect too much from the polling data.
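(A brief illustrative aside, not part of the interview: the margin-of-error arithmetic Enten is pointing to can be sketched in a few lines of Python. The sample size and candidate share below are hypothetical, chosen only to show why a two-point lead sits well inside typical polling error.)

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error, in percentage points, for a share p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# Hypothetical example: an 800-person poll showing a candidate at 49%.
moe = margin_of_error(0.49, 800)
print(f"Margin of error on one candidate's share: +/- {moe:.1f} points")

# The error on the lead (the gap between two candidates whose shares sum to
# roughly 100%) is about twice that, which is why a two-point edge in a single
# poll, or even in a polling average, is effectively a toss-up.
print(f"Approximate margin of error on the lead: +/- {2 * moe:.1f} points")
```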
So you mentioned the horse race, and I know that the media sometimes gets criticized for focusing too much on the polls and not on policy issues or the candidates themselves. Do you think that’s a fair criticism of how we cover polls?
I think you can do both, and I think a good news organization does both. If anything, I’ve actually seen the ball go back the other way. We’re sitting here on Sunday, Oct. 20. I don’t think a mainstream media outlet published a poll today, and we are, what, 16 days out from the election? That’s a real shocker; at least for me it’s a shocker. So when people say they’re focusing too much on the horse race, I would make the argument that we’re doing it less than we used to. In some ways we’re doing it with less data, but I think it’s important to do both.
Look, there are a few things. Number one, polling can cover many things outside of the horse race. It can cover issues. And I think it’s important to know, when these candidates say something, whether or not the American people are behind them. But it’s also the case that people are always going to be interested in who wins and loses, and the question is: do we do that with the best data available to us? Do we spend the money so that we really have the best idea, or are we going to sit back and allow that void to be filled by outside actors who may have different intents and purposes than fully explaining, in an accurate and unbiased way, where the race is? So I think it’s important that we recognize that there is a thirst for this type of coverage, but that we do it in an accurate, nonpartisan, and unbiased way.
Why is polling data important? What should viewers understand about polling data?
Any time, no matter what industry you’re in, it’s important to show your work; I think [using the] scientific method is important. Being able to reproduce the results is important, and I think it’s important for the audience to understand where you’re getting that data from, so that they trust the data to the fullest extent possible.
I think the horse race itself is kind of important. I think we’ve actually seen why it’s important over the last four years. I mean, Donald Trump lost the 2020 election and has gone on to say, over and over and over again, that he won it, which is not the case. So part of the reason I think a lot of people believe that he actually did lose it – the majority of people, two-thirds of the public, believe it – is because the polling data beforehand suggested he was going to lose that race. It’s something that actually reinforces faith in democracy: that the results accurately reflected the will of the American public. And so I think we have a very good sort of case study right there for why the polling is important.
Do you think polling could impact voter turnout?
It could. Obviously, when races are more competitive, more tightly fought, you would think that turnout would go up, especially when there’s a lot more on the line, and a poll may in fact influence you into thinking, “the race is close, therefore I want to go out there and vote.” But turnout was lower 20, 30 years ago, in 1996 and 2000. It’s funny to think how much lower the turnout was in 1996 or 2000, in terms of the voting-eligible population, compared to where it is today or where it was four years ago, which obviously featured record-high turnout, at least since 18-year-olds got the vote.
Throughout your career, you’ve used a wide variety of platforms and methods to share polling analysis and data-driven insights, including blogs, podcasts, social media and broadcast television. Have you found that one medium is more effective in engaging certain audiences?
I think they’re all effective in different ways, and that’s why I like doing all of them. Obviously, when you’re on TV, you have the widest audience watching you at any one time. But of course you get maybe two-and-a-half, three minutes, and you have to speak it. So you have to be concise, and you want to simplify it as much as possible without losing the nuance. And so that’s fun and interesting and unique. You can write a piece and really get lost in the nuance, but with the recognition that there aren’t going to be as many people concurrently reading you…. And then you could do a podcast, where you’re having a discussion, and that’s sort of a different type of thing: someone’s really going to take it with them and kind of hold it. You can talk with different people and it can come alive in a different way.
The key, when you’re delivering the message, is to deliver it in the way that’s most impactful for the medium that you’re actually on. So if you’re on TV, you can’t drone on the way you might be able to in a podcast, but you might lose a little bit of the wider explanation and the asterisks that go along with it. They’re all challenging in their own different ways, but I enjoy doing all of them.
How do you simplify polling analysis for a TV audience without losing nuance?
So, what’s the most interesting? Can I form a thesis around this data point? Is this going to be the type of thing people are going to want to talk about? What is the audience going to be interested in? Because just because I’m interested in something doesn’t mean they’re going to be interested in it. You want to try and find the middle ground between the two. I can’t tell you how long I spend on the graphics that I put up on the air. It is a constant fight. How big is that font? I want that font to be as big as possible.
I’m of the belief that you have three different audiences when you’re doing a TV segment: you have the folks who are watching and listening, you have the folks who are just watching with the mute on, and you have the folks who are just listening to it on SiriusXM. So I’m always trying to develop the segment to hit all of those audiences, and that is the great fight. I’m a big believer that you should have faces, but I’m also a big believer that the numbers can’t be very, very small compared to the faces, which are so large, because everyone knows who Kamala Harris and Donald Trump are. Those images are burned into our audience’s minds. What’s not burned into their minds is the number. I’m always saying to myself, “bigger font, bigger font, bigger font.” Because you don’t know: they could be watching on a large-screen TV, but it’s very different from putting a graphic on a computer, where the person could zoom in.
How do you think your experience covering the 2020 election will inform your coverage of the upcoming election?
It is very important to just be as transparent as possible about the results and about when we expect the results. It doesn’t really matter when they’re counted as much as how accurate those counts are, and those counts are accurate. So we make sure that we explain to the audience that it may take ‘x’ number of days, but we’re going to take as long as needed to be as accurate as we can be. I think remembering that from 2020 continues to be important. Probably the biggest difference between 2020 and 2024 in terms of the coverage will be how close I can be to my colleagues while we’re reporting the results.
Do you think that the upcoming election is as close as it seems?
Look, it’s the closest race I’ve ever covered. It’s the closest race in the polls in 20 years. Is the result actually going to match up? I don’t know the answer to that question. It very well could. Arguably, the most likely outcome is that one candidate wins all the swing states, and the second most likely outcome is that the other candidate wins all six or seven key states, depending on how big you want to make your map; I would call it seven. Polls are not perfect. And usually, if they miss in one state in one direction, they’re going to miss in similar states in that same direction. So the idea that it’s necessarily going to be 270 to 268 or 280 to 258 [electoral votes for either candidate] – it could be, but it could easily be north of 300 for either of the candidates. And that result shouldn’t really surprise us at all.
One last question, out of my own curiosity: I was watching a TED Talk that you did at Dartmouth in 2011. You talked about predicting the results of the presidential election using an equation of economy plus deaths plus congress plus term. And you were saying that you could use this kind of equation to predict the next election, which was 2012, with about 97% accuracy. Were you able to?
We got it right. If you watch that TED Talk, I had Obama winning the popular vote by two or three points, I don’t remember exactly what it was. And he won it by four? I mean, it was pretty gosh darn close. TED Talks are always a little bit provocative, but there’s no doubt that the fundamentals definitely matter. If you look at the economic fundamentals right now, it’s a little bit split. Disposable income, which is what that model used, is, I think, among the worst metrics for the incumbent party. I’ve not rerun the equation, so I don’t know what it says about 2024. But I mean, look, if you look at right direction versus wrong track, if you look at shifts in party registration, which are mirrored in shifts in party identification nationwide, and if you look at the president’s approval rating, those are all bad for the incumbent party.
But the differences between now and 13 years ago are not just that I think I wear better-fitting pants; it’s also that we’ve entered an era in which elections have become exceedingly close, and these fundamentals-based models have wide margins of error associated with them. And I myself have become much more curious about the world, because I recognize how little I know compared to back then. And so I find it very interesting to go to myself, “well, I see X, Y, and Z favoring this candidate, and Q, R, and S favoring that candidate, so which way is it going to go?” That’s kind of the fun of it all. I don’t know which way it’s going to go. And this is probably the first time I’ve covered an election in which I really don’t know. Four years ago, I thought Joe Biden was more likely to win. Eight years ago, I thought Hillary Clinton was more likely to win. This one, I really don’t know. That’s what makes it interesting, at least from my point of view going forward. But the beautiful thing is we’ll find out soon enough.