Monday, October 31, 2022

Pollsters are getting a bit nervous. Something is not computing. Updated after the midterms.

Update: What actually happened and why. Pollsters got it wrong in 2018, 2020, and 2022. Here’s why political polling is no more than statistical sophistry | Fortune 

The ‘Red Wave’ Washout: How Skewed Polls Fed a False Election Narrative - The New York Times (nytimes.com). An analysis of why the polls got it wrong and the red wave was a ripple.

The original blog post of Oct. 31, 2022, follows. I was on target, it appears.

If the GOP touts polls showing that voters favor its positions on the economy, crime, and immigration, it should be crushing the Democrats, yet the races are closer than they should be, an experienced pollster remarked this morning. Something is going on "out there" that they are not measuring. It may be that what is "out there" is missed because pollsters ask the wrong questions, or because their screening and litmus-test questions are misread. We will not know how much of what goes unmeasured "out there" will change the predicted and expected outcome of the midterms, but if the pollsters turn out to be badly off base, there are some possible explanations. Opinion | Frustrated With Polling? Pollsters Are, Too - The New York Times (nytimes.com)

Here are some post-election-day explanations and excuses if the polls turn out to be badly wrong this year. Turnout among certain large demographic groups may have been misjudged in a way that made a one- or two-percent difference in key races. Women, assumed to have gotten used to the end of Roe v. Wade and to be more concerned about inflation, may have been a sleeper vote; they were angrier than the pollsters thought. GOP opposition to gun control and fear of school shootings may also have figured into an angrier suburban women's vote than predicted. Students who appreciated a more-than-token reduction in their debt, and who resented GOP attempts to turn back the clock on gender, race, and student debt relief, may vote in larger numbers than pollsters predict. Seniors may have realized late in the game that Social Security and Medicare were at risk.

There are other underlying emotions, cultural and values factors, that make voter decisions hard to measure just by asking about public policy issues. Asking, for example, as a few pollsters do, "are you worried about democracy?" as an issue that motivates voters misses the point without also testing for "do you believe the 2020 election was stolen?" or "do you fear your rights or your vote count are jeopardized by partisan interference?" or "are you disgusted by political violence?" Those follow-ups might have revealed very different attitudes among voters who all "worry about democracy" but blame different parties for the danger. Some voters are driven by love or hatred of Donald Trump and the candidates he backs, which has little to do with the issues themselves. How can pollsters measure that? Some pollsters do not even screen for these imponderables and rely solely on attitudes toward legislative public policy issues. Some polls pose questions designed to elicit a specific answer that benefits a candidate's or party's strategy. Those are called "push polls"; they do not measure voter preferences in their actual proportions and may even shape voters' thinking.

For these reasons, I am treating polls published for public consumption with a grain of salt. There is another reason for my skepticism. In my early years of political activism, I conducted polling and constructed questions myself, or worked under the direction and tutelage of professional pollsters. Experience has jaded my trust in polls as accurate predictors when they find no overwhelming wave or consensus and everything is within the margin of error, as it is this year.
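A footnote on that last point: "margin of error" has concrete arithmetic behind it. The sketch below uses the textbook 95 percent margin-of-error formula for a simple random sample, which is a simplifying assumption on my part; real polls apply weighting and turnout models that make the effective error larger, not smaller. It shows why a typical 1,000-person poll cannot tell a two-point lead from a tie.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Textbook 95% margin of error for a proportion p estimated
    from a simple random sample of size n (p = 0.5 is the worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical published horse-race poll samples about 1,000 likely voters.
n = 1000
moe = margin_of_error(n)
print(f"n = {n}: margin of error is about +/- {moe:.1%}")  # roughly +/- 3.1%

# A 48%-46% race: the gap between two candidates is estimated with
# roughly twice the single-candidate margin of error, so a 2-point
# "lead" is statistically indistinguishable from a tie.
lead = 0.48 - 0.46
print(f"2-point lead statistically meaningful? {lead > 2 * moe}")  # False
```

When nearly every key race polls inside that band, as they did this year, the honest summary of the data is "too close to call," whatever the horse-race headlines say.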
