“Just give me the facts.” We started our last missive with this line and discussed how designed experiences lead us to specific outcomes. Thinking more about this subject, we realized that one of the most frequent types of data we see and interact with is the poll: a survey telling us what the public, or specific groups within the broader population, think about topics and issues.
The last few years have not been “kind” to pollsters. As you can read here or here or here, there’s a lot of distrust of political polls floating around out in the ether. How can something as simple as asking a question and getting a response be so complicated?
This video from Pew Research Center’s Courtney Kennedy goes over a few ways that something as simple as the wording of a survey question can significantly impact its outcome.
The video highlights a few basic things Pew Research does to help guide respondents toward useful answers. Among the problems they say questionnaire creators should avoid:
Giving too much or too little background information in the question itself
Priming and pushing specific responses through how the question is framed
Using double negatives and other confusing question structures
These and the other items listed in the video are ways that question-askers introduce response bias into their questionnaires, and those biases skew the results the question or prompt produces.
Even beyond the way the questions are worded, the way the poll is administered can introduce bias in the response. On a recent episode of the FiveThirtyEight Politics Podcast, Kennedy discussed the ways polls are administered and how that impacts their results.
Lest we think this only applies in the political space, here’s an interesting short video on the impact question wording can have on the customer-support questionnaires companies send out:
We all come by information in different ways. Sometimes we’re aware that we are being presented with information in a way that frames our perspective; sometimes we’re not. And sometimes we notice the framing but assume the information is being given to us in a way that is intentionally free from bias.
The truth is that the form of the questionnaire always shapes the results it produces; said another way, the medium is always affecting the message.
We are not powerless in this situation. The information we are getting at any moment is likely just one side of the story and, without further research, should not be wholeheartedly accepted at face value. In political polling, FiveThirtyEight puts out a list of pollster rankings after every major cycle, but even then, we have to decide whether we trust FiveThirtyEight. In every case, we must look for the institutions that show their work.
How are people getting to the conclusions they are drawing? None of us is likely to become an expert at everything in the world, and many things are very complicated. We have to reach a place where we trust someone versed in those fields to give us the information, while staying aware that that information is influenced at every step. Through that awareness, and a knowledge of the ways information can be shaped, we can learn whom to trust and whom to ignore.
So the next time we find ourselves asking for “just the facts,” we must remember that, right down to the method used to gather the facts themselves, the medium shaped the message before any conclusion was drawn. There’s no such thing as context-free information.
A few extra videos and articles that didn’t make the main article: