You may have heard that the LaCour and Green study on using canvassing to change opinions was retracted. If not, that’s actually kind of good, because it makes debunking a bit easier when you don’t already have the wrong idea in your head. I almost had to write my own retraction, because I was pondering a post based on LaCour and Green’s findings when I learned that the data was manipulated to get a headline-making result. I find those “everything you think is wrong” stories to be irresistible click bait, so when I heard one of the reports on the study, in a recent This American Life episode, The Incredible Rarity of Changing Your Mind, and being someone who does a lot of canvassing (by volunteer standards) and has run some doorknocks myself, this just screamed near-future blog post. I don’t know which is worse, admitting that I procrastinated about writing, or admitting that procrastinating really helped. So I didn’t write up how amazing these findings were and how we might use them, but I did discuss them in some private conversations, and I’m really hoping those individuals are reading this.
The study came from a good impulse. Proposition 8 in California in 2008 put a ban on marriage equality in the state constitution after it had already been legalized. The “no” campaign expected to win, given its lead in the polls, the large turnout the Obama campaign was generating, and California’s general liberal leaning, so the defeat was a surprise. After its unexpected loss, the “no” campaign cooperated in an experiment to see if it could send canvassers into areas where it lost and sway opinion face to face.
FiveThirtyEight summarized the study in its article on the retraction:
The article, published last December in Science Magazine by UCLA graduate student Michael J. LaCour and Columbia University political scientist Donald P. Green, appeared to show that an in-person conversation with an openly gay person made voters feel much more positively about same-sex marriage, an effect that persisted and even spread to the people those voters lived with, who weren’t part of the conversation. The result of that purported effect was an affirmation of the power of human contact to overcome disagreement.
By describing personal contact as a powerful political tool, the paper influenced many campaigns and activists to shift their approach to emphasize the power of the personal story. The study was featured by Bloomberg, on “This American Life” and in activists’ playbooks, including those used by backers of an Irish constitutional referendum up for a vote Friday that would legalize same-sex marriage.
LaCour and Green also published a study on abortion rights that found the same thing (the Bloomberg link in the quote), namely that canvassers who were directly affected, meaning women who had an abortion and mentioned this at the door, had a long-lasting persuasive effect (the This American Life story has a recording of a canvasser’s conversation with a pro-life voter). Read the rest of the FiveThirtyEight article for the political science parts, like how the fabricated numbers were detected and how the peer review process works. I’m more interested in the ramifications where we’re actually doing the doorknocking. And yes, I’m ticked off enough to engage in a bit of a rant.
What LaCour did by faking the numbers (it looks like Green’s culpability lies in not catching what LaCour did) was not only deceive the publisher and readers of the article; he also wasted the time and work of the canvassers who carried out the experiment. Presumably they were careful about following the experiment’s protocols, and this deceiver threw away all that effort by making up a spectacular result. Maybe the real result showed nothing, but that happens. Experiments aren’t failures because they show no effect or tell us what we already knew, but such experiments don’t get headlines and don’t make careers. To compound the wasted effort, think about those campaigns FiveThirtyEight mentioned that used these fake results to plan their canvassing. The Irish pro-marriage campaign won easily anyway, but imagine working on that campaign if it had lost narrowly, and then learning the strategy was based on fake research. I’d like to make LaCour think about that, to be sure. At least, it appears, he blew his assistant professorship at Princeton, though I’m guessing this article was part of how he got it, and I’m guessing that’s why he did it, so as much as I hate to see anyone lose their job, I’m not bothered this time.
That doesn’t quite feel like a rant. LaCour, you’re a useless wanker. There, that’s better.
So what do we really know? There has been research showing something different, and the similarity of the results among different researchers was part of what raised suspicions. It turns out canvassers have their best effect when they’re demographically similar to the people they’re talking to. The Washington Post’s Monkey Cage blog explains it here and here. From the first link:
Responding to the retracted LaCour and Green study, Eitan Hersh noted in this space [that’s the second Monkey Cage link] that the findings notably contradicted a significant body of research on get-out-the-vote canvassing. That research has consistently found that the most effective campaigners are like the target population: either from the same neighborhood, or sharing racial or ethnic identity. Hersh concludes that probably the best canvassers on behalf of same-sex marriage would not be LGBT canvassers but actually canvassers who share an identity with their targets. This is precisely what we found in 15 experiments over the past five years as part of a book manuscript under contract with Yale University Press, tentatively titled “Listen, We Need to Talk.”
I’m not saying you can’t try to persuade people the way LaCour and Green’s retracted study did, though I am saying that if you do, you can’t cite their study as evidence that your approach will work. This American Life suggested this shows a need to rethink conventional wisdom, which in this case was described as the canvassing strategy of finding your supporters and then getting them to vote. Put me down for conventional wisdom in this case, since that’s absolutely how I approach canvassing. I try to persuade when I run into someone on the other side, but I don’t try to find such persons and don’t hold out much hope that a stranger at the door can change an opinion in a few minutes of conversation. Even when persuading, I’m really just seeking points of agreement and hoping to extend a bit past that. Given how many “conventional wisdom is wrong” posts I’ve written, I’m usually inclined to presume conventional wisdom is based on old information, habit, or a few people’s experience, or even just their perceptions of their experiences, which is a limited dataset even if accurate. So to be sure, challenge conventional wisdom, and test whether it’s really right, but sometimes, after testing, it turns out to be right. Sometimes you get exactly the result anyone with a bit of logic might have predicted. I’m thinking of when my senate district experimented with registering new voters by going door to door to addresses not on the list of registered voters, and we guessed we would get the most registrations in new apartment buildings where all the residents would be recently moved in. After trying different areas, what we got was — exactly that. There’s the comfort of knowing that you’re actually knowing instead of guessing, but still, not exciting.
Now let’s ask some questions about how to apply what we know. As real life tends to do, it has some complications for us. It’s not as simple as just sending out canvassers demographically similar to the people they’ll be talking to. Especially when your canvassers are volunteers, you use whomever you have, and assigning turfs demographically isn’t practical. That’s assuming you even have the demographic information about the voters, which you might not if you’re still building your voter database. I also noted that all the research mentioned in the linked articles is in regard to referenda or specific issues, which I suspect is different from an election. An election has a third demographic involved: the candidate’s. I have a feeling the candidate’s demographics are more important than the canvasser’s, though I know of no research to that effect.
I also suspect canvassing for an election differs in that we’re asking voters to perhaps violate their party loyalty. People vote their identity. They vote the way their family votes. We’re asking them to vote for someone they think they disagree with, and they’re likely right, whereas asking someone to change their mind on just one issue is much less to ask. A referendum might not even require that much; for example, “no” campaigns on state constitutional amendments sometimes gain votes from people who support the concept but come to disagree with putting it in the constitution.
It’s probably become clear I’m skeptical about persuading voters in elections. It’s not just that partisans are hard to persuade; swing voters and ticket splitters, groups I suspect overlap considerably, have decreased as a proportion of all voters and are less likely to turn out than strong partisans. That means that especially in a low-turnout election, campaigns are spinning their wheels pursuing such voters. Yes, try to persuade once you have such a voter at the door, but I question the wisdom of targeting such voters, even though campaigns try it election after election. There are Democratic campaigns which try to comb the haystack for the needles of ticket splitters. I’ve heard that there are ticket splitters who support one party heavily but, for their own reasons, pick some race where they cross parties, and candidates try to be the one Democrat they pick, which actually puts them in competition with other candidates on the same ticket. If you’ve really found the Democratic voters and there just aren’t enough of them, I suppose you’re stuck, but I’ve seen candidates in competitive races skipping the unregistered eligible voters and working the same turfs over and over, so I get skeptical of campaigns which decide on a strategy of persuading Republican leaners to flip. Yes, that skepticism applies even to red districts, where a strategy of increasing Democratic turnout is least likely to work, because campaigns might as well pick the low-hanging fruit before setting up the ladders.
Moreover, a persuasion strategy may fail to recognize that a swing district might not be full of swing voters; it can instead be full of partisans in roughly equal numbers. I can think of state legislative districts where my best guess is that’s exactly the case. Digging for the last maybe-they’ll-vote ticket splitters seems like a losing strategy. I certainly don’t understand looking for such voters before looking for every Democrat. If someone finds a successful persuasion strategy, terrific, but the strategy of digging for Democrats amid unregistered eligible voters or registered infrequent voters is much more the proven strategy, if Democrats will just do it. Rep. Keith Ellison, CD5, just wrote this article in The Nation about a better GOTV strategy. I worked with his campaign in my role as DFL chair of a state senate district in his congressional district, so I’ll vouch that he tried this stuff before recommending it. I have a guess that he’s aiming this at other Democratic elected officials, particularly those in deep blue districts who do enough to keep getting reelected, and that’s about it. Rep. Ellison has had a huge impact because rather than do the minimum to avoid an intraparty challenge and then cruise through the general election, he puts his resources into a ground game in a deliberate attempt to help the rest of the ticket. His district has gone from consistently the lowest turnout to matching any other CD in the state, and Minnesota is usually the highest-turnout state. Given how blue this district is, those additional voters are heavily DFL. In other words, what he’s doing worked, much to the benefit of not only statewide candidates; DFLers are now consistently winning the swing legislative districts in the 5th CD.
The Twin Cities inner suburbs may have gotten bluer anyway, but there’s no doubt that his focus on digging for Democrats has increased the number of Democratic votes in both urban and suburban parts of the district, helping the whole ticket including statewide candidates (in other words, no improved ground game in the 5th CD, no Sen. Franken or Gov. Dayton).
If this strategy works in a district with decent turnout to start with, how much better where turnout is low? Even where turnout is generally good, the odds are heavy that Democrats are passing up votes of non-voters and infrequent voters who would vote Democratic if they voted. Other writers here on MPP have run campaigns where they dug for Democrats in tough races and won, but I’ll leave it up to them how much they care to reveal in such a public forum.
Given the evidence that digging for Democrats works, and the lack of evidence that persuasion works, forget persuasion as anything more than an experiment — though I do enjoy campaigning experiments, so if that’s your intent, don’t let me discourage you. But if that’s how you plan to win, please stop. Find the Democrats who aren’t reliably voting. Start now, even if your race isn’t until next year (and if you’re running this year and haven’t started your ground game yet, ouch). Accept that ticket-splitting is much less common than it used to be and run as a ticket, not as one candidate. Remember that if even just irregular registered voters turn out, Democrats win. Also keep in mind that each additional Democrat you get to the polls means a vote for the other Democrats on the ticket, and wouldn’t you rather have elected officials who will support your policies rather than try to thwart them?