
Only 17 Percent of Americans Believe Trump is Telling the Truth About Comey

Benjamin Wittes
Thursday, July 13, 2017, 1:00 AM

For some time, I have been frustrated by the poor state of public opinion data on national security matters. There’s virtually no long-term temperature polling of public attitudes towards major national security policy areas, and I’ve been talking to a variety of people about how Lawfare might fix that.

One instrument I’m particularly interested in is Google Surveys, which allows virtually anyone to become a pollster. A few months ago, Emma Kohse and I used it to produce this paper on public attitudes towards privacy matters. And I’ve been thinking more recently about developing a consistent stream of data on national security law and policy attitudes.

Partly because I’ve been thinking about this, more generally, this week, when President Trump tweeted his latest attack on former FBI Director James Comey, I put together a quick survey on the following question: “Who do you believe is telling the truth about the interactions between President Donald Trump and former FBI Director James Comey?”

The results are, in my judgment anyway, fascinating. For one thing, they show a remarkable degree of public confusion and uncertainty as to what or whom to believe. Nearly 36 percent of the public responded “I don’t know”—more than any other answer. An additional 18 percent of respondents said they believe “neither.” That’s 54 percent of a national sample that either mistrusts both men or just doesn’t know what to think.

The other striking feature of the poll is that the mistrust of the President is pretty extreme. Only 17 percent of Americans believe their president is telling the truth about his interactions with his FBI Director. If you exclude those who “don’t know” or believe “neither” and focus only on respondents who choose between the two men, 64 percent believe Comey over Trump. That’s an extraordinary lack of confidence in the President’s honesty, one that numerically must reflect at least some significant number of Republicans who don’t believe the President on this matter.
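For readers who want to check the arithmetic, here is a minimal sketch of how the head-to-head figure follows from the shares reported above. The “believe Comey” share is not stated explicitly, so it is inferred here as the remainder of the sample; with these rounded inputs the result lands a point or so below 64 percent, which reflects the unrounded survey data.

```python
# Approximate shares of the full sample, in percent (rounded from the text).
dont_know = 36.0   # "I don't know" (reported as nearly 36 percent)
neither   = 18.0   # believe "neither"
trump     = 17.0   # believe Trump

# Inferred remainder: respondents who said they believe Comey (~29 percent).
comey = 100.0 - dont_know - neither - trump

# Head-to-head share among respondents who chose between the two men.
head_to_head = comey / (comey + trump) * 100.0
print(f"Believe Comey (inferred share of full sample): {comey:.0f}%")
print(f"Believe Comey among those choosing either man: {head_to_head:.0f}%")
```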

These results are broadly consistent with, if somewhat more dramatic than, other polling on the matter. An ABC/Washington Post poll conducted June 2-4 found that 21 percent of respondents trusted Trump’s comments on Russian election interference and similar issues “a great deal” or “a good amount,” versus 36 percent who trusted Comey to that degree. Seventy-two percent trusted Trump “just some” or “not at all” on the matter, while 55 percent felt the same toward Comey. There are also polls that, like mine, have Comey and Trump facing off directly: a Politico/Morning Consult poll conducted June 8-12 found that 45 percent of respondents trusted Comey more “to tell you the truth,” while 32 percent trusted Trump more and 23 percent didn’t know or had no opinion. Likewise, a Rasmussen poll conducted June 6-7 reported that 45 percent of respondents trusted Comey more than Trump, 37 percent trusted Trump more than Comey, and 18 percent were undecided. Notably, these polls all agree that Comey enjoys substantially greater public trust than Trump does, but the Google Surveys poll shows a far larger share of respondents who remain undecided and a far smaller share who believe the President. This could reflect different question wording (this poll asked not about trust but about whom the respondent believes), or it could reflect an erosion of confidence in the President’s truthfulness over time.

Just to be clear, this is not one of those unscientific internet polls of the sort you can run on Twitter. Google Surveys is a real public opinion instrument. I had nothing to do with identifying the sample of people who responded to this poll, and they had no idea the poll had come from me. All I did was write the question. For those who are interested in how Google Surveys works, here is how Emma Kohse and I described it in our paper:

Sample sizes are relatively small, and the methodology is still controversial. On the other hand, Google Surveys allows for inexpensive polling on single questions by people and organizations who are not public opinion professionals.

Google Surveys is a public opinion platform that piggybacks off of the world’s addiction to Google. It works by providing small incentives to users to answer individual questions on either their smartphones or in the course of their web browsing. Some web sites use Google Surveys as a gateway to premium content: answer a question in exchange for access. And Google also has a mobile app that allows users to answer questions in exchange for credits on Google Play, the company’s entertainment platform. The idea is to leverage the gigantic sample of people using the internet into a public opinion research tool available to companies that pay Google for access to it. Surveys are easy to create on a simple and user-friendly interface. And results come in quickly.

Studies assessing the accuracy of Google Surveys data have concluded that it is relatively comparable to more traditional survey techniques. Nate Silver’s post-2012 election evaluation of polling accuracy ranked Google Surveys as the second-most-accurate of 23 polling firms, with less than half the average error of well-respected polls like Quinnipiac and Gallup. More recently, his website has given Google Surveys a “B” in its “pollster rankings”—a ranking that is not top grade but certainly respectable. It has been used in academic research in a variety of fields ranging from psychology to computer science.

The Pew Research Center did a comprehensive study in 2012 comparing the results of Google Surveys with those from Pew dual-frame (landline and cell phone) telephone surveys, and found that the median difference between the two groups of results was three percentage points, and the mean difference was six percentage points. The researchers attributed some of the difference to differences in the structure and administration of the questions. Because Google Surveys does not use a true probability sampling method (i.e., random selection of respondents), Pew expressed concern about differences in the composition of the sample, but actually found that the sample “appears to conform closely to the demographic composition of the overall internet population.”

Of course, only approximately 84 percent of American adults use the internet, but it seems that the Google Surveys sample is a representative sample of at least this portion of the population. And though heavy internet users are slightly overrepresented in Google Surveys samples, this bias appears to be small. Google’s inferred demographic information was not very accurate for individual respondents, especially with respect to age, Pew found, but the overall pool is nonetheless representative. . . .

Google itself has done substantial research on the accuracy of its methods. In one study, it compared data obtained from Google Surveys, along with data from probability- and non-probability-based internet surveys, to national media and health benchmarks. Specifically, it compared Google Surveys results and the results of other, more traditional surveys to very accurate data: viewership information from Video on Demand, Digital Video Recorder, and satellite dish records, and health information from the Centers for Disease Control and Prevention (CDC). The study found that Google Surveys deviated least from the benchmarks in terms of average absolute error; it also had the smallest absolute error and the highest percentage of responses within 3.5 percentage points.

Still, Google noted the limitations of the platform. The internet population tends to be younger, better educated, and wealthier than the population at large, for example. What’s more, because Google Surveys asks each respondent only one or two questions, it can be hard to assess relationships between responses, which do not always come from the same survey samples. In addition, some questions might be regarded as suspicious when they appear to be blocking content on a website, leading to bias in the responses.

To be sure, some reports are more critical, particularly about the unreliability of inferred demographic information, and they suggest that Google Surveys is more properly used as a supplement to probability-based surveys. Still, a recent report cautiously concluded that the inferred demographics “may be sufficiently sound for post-stratification weighting and explorations of heterogeneity.” The researchers were able to reproduce the results of four canonical social science studies using Google Surveys, and determined that Google Surveys was likely to be a useful tool for survey experimenters.
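To make the post-stratification weighting mentioned at the end of that excerpt concrete, here is a minimal sketch with invented numbers: respondents in demographic cells that are underrepresented relative to known population shares receive weights above one, overrepresented cells receive weights below one, and estimates are then computed from the reweighted sample. The cell definitions, shares, and response rates below are hypothetical and are not drawn from any of the studies discussed above.

```python
# Minimal post-stratification sketch with invented numbers (not from any
# study discussed above). Target shares would normally come from census
# data; sample shares from the platform's inferred demographics.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.40, "35-54": 0.35, "55+": 0.25}

# Weight each cell so the weighted sample matches the population margins:
# underrepresented cells get weights above 1, overrepresented cells below 1.
weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}

# Compare weighted and unweighted estimates for a hypothetical yes/no item.
cell_yes_rate = {"18-34": 0.20, "35-54": 0.15, "55+": 0.12}
unweighted = sum(sample_share[c] * cell_yes_rate[c] for c in sample_share)
weighted = sum(sample_share[c] * weights[c] * cell_yes_rate[c] for c in sample_share)

print(weights)  # roughly {'18-34': 0.75, '35-54': 1.0, '55+': 1.4}
print(f"Unweighted 'yes' share: {unweighted:.1%}")
print(f"Weighted 'yes' share:   {weighted:.1%}")
```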


Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance Studies at the Brookings Institution. He is the author of several books.
