Ask Away! Kristen answers your March mailbag questions.
Friends don't let friends field 161-question surveys, but they do encourage them to grow chili peppers, and other musings.
Welcome to Codebook, a newsletter that decodes our world through polling and research. This edition—"Ask Away!"—will be a new feature, rounding up the best reader questions.
Please subscribe here and follow my page on Facebook for the latest. And don't forget: Every Monday is a premium subscriber-only roundup of the numbers you need heading into the week.
The Mailbag is back! We've got questions on "push polls", overly long questionnaires, and chili peppers.
Read on, friends. If you want to get a question in for a future Ask Away, there are a few ways to do it.
1. Comment on Bulletin under any post. I see all the comments!
2. Comment on a post on my Kristen Soltis Anderson writer Facebook page or send a message to that profile. I read your notes!
3. Reply to me on other social media platforms (Instagram and Twitter: @ksoltisanderson).
4. Send presents.
On to your questions!
***
First question comes from Scott K. via Facebook, as a follow-up to last month's post on the shifting politics of COVID-19 measures:
Hey Kristen Soltis Anderson, really enjoy the data-driven approach to understanding what's going on. But I wonder, as most people have sided, and are within the narrative bubble of their side, how willing they are to deviate from the narrative publicly or in surveys, no matter what they personally think? Are they repeating their side's narrative in their replies?
This is a big issue in issue polling. Are people responding based on their true views on an issue, or are they just sort of outsourcing their views to whatever they perceive "their side's position" to be?
The answer is a bit of "yes" and a bit of "so what?" Here's what I mean by that: voters are busy people. They have a lot going on besides developing opinions on every political issue under the sun. Even big issues that make the news, like "should we have a no-fly zone in Ukraine," require more time and devotion to research, study, and analysis than most average people have in their lives.
Even something like "should we keep requiring masks" is something that people might have a gut instinct about but may not have spent a ton of time figuring out.
So what do we do? We look for cues in questions that tell us how our side would probably break on an issue, and we outsource that decision-making to those pre-existing views. You see this most prominently in polls that specifically name leaders; if I do a survey asking voters what they think about "The First Step Act, a bill enacted with bipartisan support that aims to reduce the number of people in prison by better supporting prisoners' transition out of incarceration and by increasing partnerships with prison education programs," I would be willing to bet a lot of Democrats would say that sounds wonderful!
But if I asked whether they supported "The First Step Act, a bill enacted with bipartisan support and signed by President Trump..." you'd see Democratic support drop. Not because they like the bill or the idea any less, but because the T-word signals "maybe this isn't for your side." That's a very obvious example, of course, but there can be all sorts of subtler cues where people just figure "this is what my side thinks" and respond accordingly.
At the same time...that's OK! People make judgments based on cues and perceptions of what "their team" would or wouldn't do all the time. In our democracy, your vote counts whether you've put in hours studying the issues or whether you show up and randomly pick names on the ballot. So I'm somewhat OK with polls where people are giving answers that are perhaps loosely held, so long as I keep in mind that's part of the deal and interpret the results accordingly, with humility.
Next question!
From "Nexialist", whose cover photo is of two Goldens and who therefore gets priority in having their question answered:
What precautions do you take against push polling? Once a phony narrative begs the question, answers are compromised.
First, thank you for the correct use of "begs the question". A rarity!
Before I go further, I think it is important to define the term "push poll" because, much like "begs the question," it is a phrase thrown around online a lot incorrectly.
The technical definition of a "push poll" is when people are contacted under the guise of survey research and the goal of the survey is not to learn what people think but rather to push a message. I might even argue that "push poll" is a bad term for this, because it isn't a "poll" at all!
A push poll might go: "Hello, I'm calling from So-and-So to do a survey. First question: were you aware that Kristen Anderson is a terrible person who sometimes fails to get birthday cards dropped in the mail to family members in a timely fashion? Second question: were you aware that Kristen Anderson is a monster who sometimes makes her dog Wally wait until 5 pm to eat dinner when he is clearly hungry at 3:30 pm?" and so on. In all likelihood, the person doing the calling isn't even recording any answers, they're just saying mean things under the guise of it being a poll. This is a "push poll" and there is nothing research-oriented about it.
However! There are times when a legitimate poll being conducted for research purposes might ask people for their views on my birthday card tardiness or Wally's feeding habits! If you are planning to run a campaign against me, and you want to know whether you should run a negative ad about me, you might ask about both of those things in a survey and see which one people react to most intensely. Just because a poll asks how people feel after hearing a biased message does not mean the poll is a "push poll," even if the whole point is to gauge whether a particular message actually does push people's views!
Sometimes, people get up in arms about this if they are surveyed and asked legitimate research questions that include message testing of negative messages. They might take to Twitter and complain that they were "push polled." But if the survey is being done by a legitimate pollster, it is almost certainly not a "push poll"; it is a message test.
What I think you, Nexialist, might also be asking is separate from the technical definition of "push poll" and instead is more about poll questions that are leading in nature. They "push" people to respond a certain way through the use of language and prompts that nudge or lead respondents to a particular answer. They're "legitimate" in that they are actually collecting responses, but they're also trying to manufacture a result. I got a similar question from Matt Homan here:
How much effort is put into framing of questions to prevent biased outcomes? Esp. in situations where the pollster wants to put a thumb on the scale (e.g. a party's internal polling) but have it appear to be unbiased in an ad?
This one is a little tricky. Almost all internal polling (the stuff for strategic and private use) very much wants to not be biased. What would be the point of doing a poll just to create bad data for yourself? This is also why a message-testing poll, even though it "pushes" respondents, is still legitimate: it is pushing people intentionally, for a valid research purpose, to see if and how they move.
If I'm interpreting Matt and Nexialist's question correctly, though, this is more about concern around the data that does get released publicly. And yes, data is often released with some goal in mind: to educate and inform, sure, but also to perhaps guide allies to do more effective messaging, to push back against the conventional wisdom about an issue, etc.
I always want my questions to be worded in a way that is credible, because if a poll is clearly biased, people will just throw it out. But Matt is right that there are subtle ways a question can still be biased even if I don't intend it to be.
For instance: you can do a poll that finds a large number of people "agree" with a lot of stuff because of acquiescence bias - people like to say "yes," "support," and "agree" more than the negative responses. That's why some pollsters try to avoid such questions when they can. (One of my favorite pollsters, YouGov UK's Anthony Wells, literally has "Hates agree/disagree statements." in his bio.)
One alternative then is to ask questions that pit two sides of a debate against one another as fairly as possible. (A good example: the YouGov UK team's efforts at gauging UK public support for a no-fly zone!)
One other way pollsters keep bias in check is by having others review their work before a survey goes into the field. A client or partner pollster can check your biases.
The next question is one that wasn't directly submitted to me, but I am interested enough to swipe it from Targeted Victory's Logan Dobson, noted scooter enthusiast.
I had seen the Economist/YouGov poll often and was using it heavily in a post a few weeks ago when I scrolled to the top of the crosstabs and noticed what Logan noticed: they ask a lot of questions. (The most recent release is 161 questions, as Logan notes.)
I want to preface by saying I think YouGov is a strong pollster who does great work. Back a few years ago, there was a Pew analysis of "non-probability based polling samples" and they blinded out the identity of the pollsters in the study, but one - "Sample I" - rose above the rest. At the big conference of pollsters that spring, it came out that YouGov had been that clear winner. So this is very much not a commentary on the quality of their panel or rigor as a pollster.
But. 161 questions?
This is a lot. Too many. The longer a survey takes, the more people will drop out because they're busy and can't be bothered. I'm a pollster and I've bailed out of overlong questionnaires before. And these aren't 161 cupcake questions, either. Question 37 is whether you think Ukraine's President Zelensky will still be president in a year. Question 61D is whether you think Ukraine should be able to join NATO. Question 76 is a fav/unfav of Alexander Lukashenko. (Do you not know who Alexander Lukashenko is? That's OK!) You've got a few dozen COVID questions, a question about whether the Boston Marathon bomber should get the death penalty, and one question about whether you like Chuck Schumer followed by another, a few dozen spots down the list, about whether you approve of the job Schumer is doing.
I once had a prominent, household-name pollster tell me that he'd run an experiment to see how long he could keep people on the phone: a survey where people were polled about their opinions on every single team in the National Football League. Most people are not interested in answering questions about this and find it tedious after a certain point, but NFL die-hards were happy to stay on the line almost indefinitely talking about the topic. The longer your poll is, the more you are effectively pushing out the normal people who just don't care that much.
You could definitely keep me on the phone for an hour asking me questions about Formula 1 or chili pepper growing. But you would not end up with a representative sample if that was your research design.
So while I am grateful for the massive volume of information that Economist/YouGov poll produces...holy moly, Batman.
Speaking of peppers, last question! From friend of the newsletter Password Is Taco:
Not so much a polling question, but I've noticed both @scottlincicome & you are homegrown pepper enthusiasts, so obv there is something to it. Sell me, a person who has lived a fine life w/ mostly bell peppers and ground cayenne, on the advantages of all these other varieties.
Perfect timing, friend. It's pepper sprouting season and so I've got thoughts for days.
My entry point into pepper growing was mostly about the ease of growing peppers versus other things. Root vegetables are tough if you live in an apartment or a townhouse. Some other fruits and veggies can be very sensitive and require a lot of care and maintenance. Peppers? They practically beg you to neglect them. My first season growing stuff was on a small apartment patio that was little more than a brick pizza oven with an outdoor sofa in it, and despite murdering basil, coriander, and a whole host of other poor little plants I brought home from Home Depot, the one little habanero plant I'd bought absolutely thrived. Harvesting it at the end of the summer was so satisfying. I was hooked.
The next benefit of growing peppers is that the variety you can get in stores just isn't anywhere near what you can grow easily on your own. Your average grocery store probably has the peppers you listed - bell peppers, cayenne powder, jalapenos, maybe a few others - but not a very wide range. And chili peppers aren't just about heat and spice. Consider one of my favorite weirdo peppers, the habanada, which is a habanero that has been genetically noodled with to remove the heat! Without the nuclear-level heat, you can taste the cool floral and sweet notes that are also there in a regular habanero.
I don't drink wine very much, so if you set a glass of red wine in front of me and asked me if it's a pinot noir or a cabernet franc I will basically be flipping a coin to give you an answer. And if you're not a pepper person, the difference between a jalapeno and a serrano is probably not a huge deal.
But if you like spicy food at all, growing your own chilis gives you access to a wider range of cool, subtle flavors. For instance, I love making a salsa that leans on the chocolate habanero (I basically take this J. Kenji Lopez-Alt Serious Eats recipe and throw an heirloom tomato in the blender with it). It wouldn't have the same rich taste with the plain orange habaneros you get from the store.
Even a pretty easy-to-find pepper like a serrano is just a whole different ballgame when you grow it at home. I make homemade sriracha (my "Red Menace" sauce, which is...appropriately or inappropriately named depending on your point of view given *gestures around at everything*) using home-grown serranos that I let ripen on the plant in the fall until they're red. That makes them sweeter and prettier when I blend them up with palm sugar and vinegar and so on. I can get big green serranos at Whole Foods, but I can't get extra-flavorful ripe red serranos there.
Bottom line:
Growing your own chilis isn't too hard (they aren't super finicky and thrive on benign neglect), AND there are so, so many varieties that are very useful in home cooking but just aren't in stores.
That's it for this Ask Away! Thanks to everyone who submitted. I'm sorry I couldn't get to all of your questions, and I hope to tackle more in the coming weeks! And if you missed the boat this time, send your questions along and I'll hang on to them for next month's edition.
***
Thanks again for being a reader of Codebook. Do you have a question for the next "Ask Away!" feature? Hop to the comments to start the discussion.
If you enjoyed this post, please share Codebook with a friend or colleague!