26

Various websites are repeating the claim in this tweet, which says:

Completed analysis of database of 180 million voter registrations.

Number of non-citizen votes exceeds 3 million.

Consulting legal team.

and this tweet which says:

We have verified more than three million votes cast by non-citizens.

both of which were tweeted by Greg Phillips from VoteStand.com.

Sites repeating the claim include:

So I wonder if it's true or not?

user21795
  • 1,111
  • 12
  • 15
  • 6
    When asked to explain his methodology, the source of this claim [said](https://twitter.com/JumpVote/status/797996056881414144): "We will publish all for review." He has not done that as of right now. – ff524 Nov 15 '16 at 01:16
  • 4
    Also, the sources repeating the claim don't appear to have done anything to independently verify it; in fact, they mostly get Greg Phillips's affiliation [wrong](https://twitter.com/JumpVote/status/798307287492227072). – ff524 Nov 15 '16 at 01:19
  • 1
    Well, if states do not require ID for voting, how can the guy know it's a citizen that's voting? – user21795 Nov 15 '16 at 06:02
  • He says he has an [enhanced database](https://twitter.com/JumpVote/status/797948714052681729) of some kind that allows him to provide ["irrefutable evidence"](https://twitter.com/JumpVote/status/798364544024223744). When asked for explanation, he says [we will explain it in the report](https://twitter.com/JumpVote/status/798226080805044224), [in court](https://twitter.com/JumpVote/status/797977434968555520), or [to the public in an open format](https://twitter.com/JumpVote/status/798343188914245632). (But his method is ["proprietary"](https://twitter.com/JumpVote/status/797949158028152833).) – ff524 Nov 15 '16 at 06:13
  • 15
    But when asked if there is an expected timeline for that release he said [No. We have to be right. It's worth the time.](https://twitter.com/JumpVote/status/798318145043775489). Apparently it's important to make sure you're right before releasing evidence, but it's not necessary to make sure you're right before tweeting. – ff524 Nov 15 '16 at 06:14
  • 9
    As of the 2010 census, 22,480,000 US residents were foreign-born and non-naturalized (i.e. not US citizens); of these, 7% were under the age of 18. For this claim to be true, approximately 14%, or 1 in 7, of all non-citizen US residents over the age of 18 must have both **successfully** registered to vote *and* voted (presumably without detection, prior to this tweet). That's a very high barrier to cross. – tallus Nov 15 '16 at 06:26
  • http://www.census.gov/prod/2012pubs/acs-19.pdf - source for stats – tallus Nov 15 '16 at 06:33
  • @tallus "How many non-citizens participate in U.S. elections? More than 14 percent of non-citizens in both the 2008 and 2010 samples indicated that they were registered to vote." https://www.washingtonpost.com/news/monkey-cage/wp/2014/10/24/could-non-citizens-decide-the-november-election/ (just passing this Washington Post article along, not necessarily agreeing with it) – DavePhD Nov 15 '16 at 15:43
  • 6
    The linked article attempts to answer that question (2.2% in 2010); it also has a prominent note that the methodology has been criticized in a peer-reviewed article. That article concludes: "the likely percent of non-citizen voters in recent US elections is 0". http://www.sciencedirect.com/science/article/pii/S0261379415001420. [It's clear](https://www.washingtonpost.com/news/monkey-cage/wp/2014/10/27/methodological-challenges-affect-study-of-non-citizens-voting/) there are problems: 42% of alleged non-citizen 2010 voters were US-born, and 71% had been previously identified as citizens. – tallus Nov 15 '16 at 17:41
  • Isn’t voting as a non-citizen a criminal offence? (I did read that a woman was jailed for voting in two states, which would be about the same severity). So that would be 2 million criminals. Surely we would have heard of at least a few ten thousand arrests and convictions. – gnasher729 Aug 26 '20 at 09:10
  • @gnasher729 well, it also seems the twitter account linked to above and in the comments is abandoned now (no tweets visible to me, 7 followers). – Hulk Aug 26 '20 at 14:55
  • @gnasher729 [yes it is, and grounds for removal](https://www.usatoday.com/story/news/2020/02/21/rosa-maria-ortega-texas-woman-sentenced-8-years-illegal-voting-paroled-and-faces-deportation/4798922002/). – phoog Aug 27 '20 at 03:47
  • 1
    @phoog That's what I thought. So with 3 million illegal votes, surely you wouldn't have found one case, but thousands. – gnasher729 Sep 02 '20 at 12:48
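tallus's back-of-the-envelope figure in the comments above (roughly 1 in 7 adult non-citizens) can be reproduced directly from the cited census numbers; a minimal sketch in Python, taking the 3 million figure from the tweet at face value:

```python
# Reproduce tallus's estimate from the cited 2010 census figures.
foreign_born_non_citizens = 22_480_000  # US residents, not naturalized
under_18_share = 0.07                   # fraction under age 18

adults = foreign_born_non_citizens * (1 - under_18_share)
claimed_votes = 3_000_000               # the figure in the tweet

share = claimed_votes / adults
print(f"{share:.1%}")  # roughly 14%, i.e. about 1 in 7
```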

1 Answer

28

This will be long, so I'll include a summary at the end.

Harvard's Cooperative Congressional Election Study (http://projects.iq.harvard.edu/cces/home) is made up of multiple groups of researchers who query over 50,000 people before and after American elections. In the 2010 election survey*

The 2010 CCES involved 30 teams, yielding a Common Content sample of 55,400 cases. The subjects for this study were recruited during the fall of 2010. Each research team purchased a 1,000 person national sample survey, conducted in October and November of 2010 by YouGov/Polimetrix of Palo Alto, CA. Each survey has approximately 120 questions. For each survey of 1,000 persons, half of the questionnaire was developed and controlled entirely by each individual research team, and half of the questionnaire was devoted to Common Content.

Source: CCES 2010 Guide.pdf Page 4, downloadable here: https://dataverse.harvard.edu/dataset.xhtml?persistentId=hdl:1902.1/17705

In 2014, Richman, Chattha, and Earnest published a paper in Electoral Studies examining the rate of non-citizen voting suggested by the CCES data:

Of 339 non-citizens identified in the 2008 survey, Catalist matched 140 to a commercial (e.g. credit card) and/or voter database. The vote validation procedures are described in detail by Ansolabehere and Hersh (2012). The verification effort means that for a bit more than 40 percent of the 2008 sample, we are able to verify whether non-citizens voted when they said they did, or didn't vote when they said they didn't.

Source: "Do non-citizens vote in U.S. elections?", section 2. Data

In that paper, they estimate:

How many non-citizen votes were likely cast in 2008? Taking the most conservative estimate – those who both said they voted and cast a verified vote – yields a confidence interval based on sampling error between 0.2% and 2.8% for the portion of non-citizens participating in elections. Taking the least conservative measure – at least one indicator showed that the respondent voted – yields an estimate that between 7.9% and 14.7% of non-citizens voted in 2008. Since the adult non-citizen population of the United States was roughly 19.4 million (CPS, 2011), the number of non-citizen voters (including both uncertainty based on normally distributed sampling error, and the various combinations of verified and reported voting) could range from just over 38,000 at the very minimum to nearly 2.8 million at the maximum.

Source: "Do non-citizens vote in U.S. elections?", section 3.3 Voting
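The arithmetic behind that range can be checked directly; a quick sketch using only the figures quoted above (the 0.2% and 14.7% rate bounds and the 19.4 million adult non-citizen population), which lands roughly on the quoted endpoints:

```python
# Check the extrapolation in the quoted passage: rate bounds times
# the adult non-citizen population (CPS, 2011).
adult_non_citizens = 19_400_000

most_conservative = 0.002   # 0.2%, verified-vote-only estimate
least_conservative = 0.147  # 14.7%, any-indicator estimate

low = adult_non_citizens * most_conservative
high = adult_non_citizens * least_conservative
print(f"{low:,.0f} to {high:,.0f}")  # about 38,800 to 2,851,800
```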

Two Principal Investigators of CCES and a representative of YouGov responded to the paper in late 2014:

Suppose a survey question is asked of 20,000 respondents, and that, of these persons, 19,500 have a given characteristic (e.g., are citizens) and 500 do not. Suppose that 99.9 percent of the time the survey question identifies correctly whether people have a given characteristic, and 0.1 percent of the time respondents who have a given characteristic incorrectly state that they do not have that characteristic. (That is, they check the wrong box by mistake.) That means, 99.9 percent of the time the question correctly classifies an individual as having a characteristic—such as being a citizen of the United States—and 0.1 percent of the time it classifies someone as not having a characteristic, when in fact they do. This rate of misclassification or measurement error is extremely low and would be tolerated by any survey researcher. It implies, however, that one expects 19 people out of 20,000 to be incorrectly classified as not having a given characteristic, when in fact they do.

Source: http://projects.iq.harvard.edu/cces/news/perils-cherry-picking-low-frequency-events-large-sample-surveys
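The arithmetic of that hypothetical is simple enough to sketch; the numbers below are the ones in the quoted passage, with a comment noting why misclassified citizens who legally voted would then surface in the data as apparent "non-citizen voters":

```python
# The hypothetical from the quoted passage: a 0.1% misclassification
# rate applied to the citizens in a 20,000-person sample.
citizens = 19_500
error_rate = 0.001  # 0.1% check the wrong box

phantom_non_citizens = citizens * error_rate
print(round(phantom_non_citizens, 1))  # 19.5, i.e. ~19-20 citizens mislabeled

# Any of these mislabeled citizens who (legally) voted then show up
# in the data as apparent "non-citizen voters".
```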

Indeed, they go further to identify the rate at which we can expect citizens to be classified as non-citizens in the survey:

First, the citizenship classification in the CCES has a reliability rate of 99.9 percent. The citizenship question was asked in the 2010 and 2012 waves of a panel study conducted by CCES. Of those who stated that they were citizens in 2010, 99.9 percent stated that they were citizens in 2012, but 0.1 percent indicated on the 2012 survey form that they were non-citizen immigrants. This is a very high reliability rate and very low misclassification rate for self-identification questions. See Table 1.

Table 1 of "The Perils of Cherry Picking Low Frequency Events in Large Sample Surveys"

Source: "The Perils of Cherry Picking Low Frequency Events in Large Sample Surveys"

Third, the panel shows clear evidence that the respondents who were identified as non-citizen voters by Richman et al. were misclassified. Clearly misclassified observations are the 20 respondents who reported being citizens in 2010 and non-citizens in 2012. Of those 20 respondents, a total of 3 respondents are classified by Catalist as having voted in 2010. Additionally, exactly 1 person is estimated to have voted in 2010, having been a non-citizen in 2010 and a citizen in 2012. (Note: This might not be an error as the person could have legally become a citizen in the intervening two years.) Both of these categories might include some citizens who are incorrectly classified as non-citizens in one of the waves.

Importantly, the group with the lowest likelihood of classification errors consists of those who reported being non-citizens in both 2010 and 2012. In this set, 0 percent of respondents cast valid votes. That is, among the 85 respondents who reported being non-citizens in 2010 and non-citizens in 2012, there are 0 valid voters for 2010. 1

Table 2 of "The Perils of Cherry Picking Low Frequency Events in Large Sample Surveys"

Source: "The Perils of Cherry Picking Low Frequency Events in Large Sample Surveys"

In summary:

  1. It is true that a certain number of respondents on a large election survey reported both that they had voted and that they were not citizens.
  2. It is not true that the citizenship status of any of these people has been verified - it is entirely self-reported. Voting status is verified through Catalist's databases of voters.
  3. It is entirely possible that a tiny error rate of self-reported citizenship could create a very large and impressive number of supposed non-citizens who are voting, just through extrapolation from the data set. The number of people in the survey who reported that they were citizens in 2010, but were not citizens in 2012, indicates the viability of this explanation.
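Point 3 can be made concrete by combining the hypothetical numbers from the "Perils of Cherry Picking" quote with an assumed 60% turnout among the misclassified citizens (the turnout figure is illustrative, not from the sources):

```python
# Illustrative only: how a 0.1% misclassification rate can manufacture
# a large extrapolated count even if zero true non-citizens voted.
citizens, true_non_citizens = 19_500, 500   # from the quoted hypothetical
error_rate = 0.001                          # 0.1% check the wrong box
assumed_turnout = 0.60                      # assumption, for illustration
adult_non_citizens = 19_400_000             # CPS, 2011 (quoted earlier)

phantoms = citizens * error_rate            # citizens mislabeled as non-citizens
phantom_voters = phantoms * assumed_turnout # mislabeled citizens who voted
apparent_rate = phantom_voters / (true_non_citizens + phantoms)

extrapolated = apparent_rate * adult_non_citizens
print(f"{extrapolated:,.0f}")  # hundreds of thousands of phantom "non-citizen votes"
```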

*I looked for a similar 2008 CCES guide and summary, but did not find it on the CCES website - it might improve this answer if anyone can find one.

Esteve
  • 859
  • 6
  • 7
  • Interesting, do you know how/if studies like these account for people who *are* legally citizens, voted, but don't *describe* themselves to be conventional citizens? I'm thinking people for whom it might be a political statement like the "freeman on the land" movement and others – user56reinstatemonica8 Nov 15 '16 at 18:22
  • The CCES questionnaire does not provide a good option for people who might identify in this way. Here are the ways answers to the most recent version of the citizenship question are categorized: – Esteve Nov 15 '16 at 18:29
  • Immigrant citizen / Immigrant non-citizen / First generation / Second generation / Third generation / Skipped / Not asked – Esteve Nov 15 '16 at 18:29
  • The guide to that may be found here, https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/XFXJVY Guide to the 2014 CCES Common Content.pdf Page 38 – Esteve Nov 15 '16 at 18:30
  • 2
    While this answer addresses the claims of Richman et al that 38,000-2.8 million non-citizens voted in 2008, it's not clear whether the "3 million non-citizen votes cast in 2016" claim is based on the same methodology. (Since the source of the latter claim has not provided enough information for anyone to critically evaluate the claim.) – ff524 Nov 15 '16 at 19:11
  • 1
    @ff524 I agree, but the claims are very similar, although it is impossible without more information to determine how similar the methodologies are. – Esteve Nov 15 '16 at 19:43
  • 7
    [Welcome to Skeptics!](http://meta.skeptics.stackexchange.com/questions/1505/welcome-to-new-users) Nice first answer! Hope you'll hang around to answer some more. – Oddthinking Nov 16 '16 at 00:19
  • Thanks! This is a fascinating place - I'd like to stick around :) – Esteve Nov 16 '16 at 15:05
  • Out of 20,000 people, you would expect some to just tick the wrong box. Being a citizen in 2010 but not in 2012 would be very, very, very rare. The other way round would be slightly uncommon. – gnasher729 Aug 26 '20 at 09:12
  • Given that about 7% of US residents are not citizens, a random survey should have had 1400 non-citizens instead of about 100. Any information how the people were picked for this survey? – gnasher729 Aug 26 '20 at 09:42
  • Survey respondents are not random: "How are respondents recruited into the CCES? A large portion of the CCES respondents are YouGov panelists. These are people who have made an account on yougov.com to receive periodic notifications about new surveys." more here: https://cces.gov.harvard.edu/frequently-asked-questions – Esteve Aug 27 '20 at 13:46