Tell me a Story. 515 1a. week #3

“Wilmi en haar Wolfman” by Arnelle Woker is licensed under CC BY-NC-ND 4.0

Tell Me A Story.

“Left hand driving is much safer than right hand. That’s scientifically proven”

This excerpt comes from “Kitchen Stories,” written and produced by Bent Hamer and Jörgen Bergmark and directed by Bent Hamer. The film tells the tale of a research project whose goal was kitchen optimization for Norwegian men living alone. The research plan was to situate a male observer on a towering chair in the corner of the kitchen to observe a single Norwegian man as he moved around it. The researched (Isak) is lured into participating through the offer of a horse; sadly, the horse given to him is not a living, breathing animal but a small figurine that might be housed on a shelf somewhere. This initial incident, what would now be considered an ethics violation, leads to a multitude of further research and ethics violations, given the way the research was structured. It also leads the researched (Isak) to become a researcher himself: he stops using his kitchen and instead observes the researcher (Folke) at his high station through a hole he has made in the ceiling. These violations, while set in a humorous context, illustrate a deep dysfunction, and one cannot help but feel sadness for both the researched (who ultimately dies) and the researcher, who develops a strong friendship with him. The movie does not specifically shed light on the experience a reader might have had of this particular study, although one can imagine that, had the study not been deemed invalid due to its many critical errors, readers might have come away with a perspective on the situation full of inaccuracies. Or perhaps the “reader” in this case could be applied in multiple ways: as the viewer of the film; as the researcher who comments at the end of the movie, “I have some observations here that should interest you. He’s hit on something essential here.”; and as the reader previously discussed. The movie brilliantly addresses themes connected to friendship and human nature while simultaneously addressing the inherent flaws of research in a satirical manner.

Although today there are ethics boards, research committees, and policies and procedures to prevent these types of violations from occurring, research may still be fallible. According to Brown, Kaiser, and Allison (2018) in “Issues with data and analyses: Errors, underlying themes, and potential solutions,” a variety of errors may occur within a study.

“We have noted errors related to measurement, study design, replication, statistical analysis, analytical choices, citation bias, publication bias, interpretation, and the misuse or neglect of simple mathematics, reporting, collaboration, data collection, and study design”.

(Brown, Kaiser, & Allison, 2018, p. 2564)

This quote reinforces the idea that just because something is “scientifically proven” does not mean it is true, or true forever, or true in all circumstances. It may be true at a particular time and place, or for a particular person or group, without being a universal truth. Some of the reasons for these errors are simple to explain: a calculation error may come down to a single incorrect number. Other errors are far more complex, such as researcher bias. Interestingly, a few articles I read this week, connected to social media, provided some thoughtful discourse around this phenomenon.

The first article, “How to Convince Someone When Facts Fail” by Michael Shermer (2017), discusses belief systems and how people hold on to beliefs despite evidence to the contrary. This could directly impact the work of researchers, particularly those who are very passionate about their beliefs, because their convictions about what the outcome should be or how the participants should respond may be realized through intended or unintended actions. It may also impact readers: no matter what they read and no matter how much evidence supports an idea, the mere fact that they hold another truth in their minds may not allow them to believe anything that opposes it. In fact, according to Shermer, the more an individual reads, listens to, and sees opposing views, the more strongly they will grasp and hold on to their previously set beliefs. Given this, it would be fair to say that a researcher may be blinded by their beliefs.

Mike Caulfield addresses this same idea in his article “Network Heuristics” (2019), where he shares a personal experience he had with colleagues and students. He would show them a site and then ask them to identify all the components that made it fake. After they had come up with several items “proving” the site was fake, he would reveal that the site was, in fact, real. What he discovered was that once people had made up their minds about something, it could be very hard to change them, despite all the evidence he provided. Indeed, he found that a certain number could not be convinced that the site was real and would become argumentative. Caulfield referred to this as cognitive dissonance.

Thinking back on the speaker we had in class this week, Dr. Shauneen Pete, I realize the term cognitive dissonance could also be used to explain the long-held biases some Canadians have and why they may refuse to let these go despite evidence to the contrary. It may also be why, when stories of injustice regarding Indigenous people arise, and when stories about their history are told, some people still hold to these beliefs and biases about Indigenous people and may in fact become argumentative. I recall a comment that arose in a discussion about Indigenous people that shocked me but could perhaps be explained by this phenomenon. The argument made in response to a discussion about residential schools was that “their children were taken away from them because they couldn’t take care of them; they are/were drunks.” Long-held biases like these need to change but, as previously mentioned, are very hard to change. These biases, again, may affect all aspects of research.

At this point I stop and reflect, wondering what biases I may hold and how they impact me as the reader, the research, the researcher, and the researched. I search my mind for arguments I have had with others about beliefs I hold, and I begin to look at them as I think about bias and cognitive dissonance.

And again, this idea of bias appears in an article by Bowers, C. A. (2018), “The Digital Revolution and the Unrecognized Problem of Linguistic Colonization,” in which Bowers discusses in great detail the concept of print and the value we place on a method of documentation that is “abstract” [and] “is inadequate in communicating ongoing relationships, and reduces the importance of learning from all the senses and giv[es] special attention to local contexts” (Bowers, 2018, p. 193). Bowers believes this reliance on print (he also connects this to digital systems, as they are all print based as well) and this belief in the written word as a “high-status knowledge” system leads to “abstract and surface thinkers” (Bowers, 2018, p. 194). He also discusses the “importance of oral communication,” stating that lived experience results in more complex and context-based knowledge (Bowers, 2018, p. 193). Bowers notes that this intricate and complex form of communication is represented as inferior to data because it is not objective. Bowers explicitly discusses bias within this context when he says,

“The long-standing bias against oral traditions can be seen in how the word “illiterate” carries the connotation of backwardness and ignorance.”

(Bowers, 2018, p. 196)

Again, those who hold this bias may be influenced by it and may struggle to accept this alternate viewpoint. The bias Bowers discusses directly relates to another paper viewed this week, “Indigenous knowledge systems and science and technology education: A dialogue” by Onwu, G., & Mosimege, M. (2004). The dialogue between the two speakers is a discussion of Indigenous knowledge systems and education. They discuss the lack of documentation of belief systems and practices and agree that this is a step that needs to be taken. This connects to Bowers’ paper, and I would be curious to know what he would think about the documentation of these ideas. After reading his paper, I suspect he might have some concerns about transferring what would be “complex and context-based knowledge” to the abstract and disconnected print form (Bowers, 2018, p. 194). I also think, based on the ideas included, that transferring this knowledge to print form could be impossible, as some aspects of it would be lost. I think Bowers might appreciate the following comment by Mosimege, who says,

“I am suggesting that the two systems are different and therefore require different forms of verification. These verification methods and processes can actually be equated and be made to be of similar standards, however they have to be appropriate for each system, otherwise we would compromise one system at the expense of another and in the process lose the beauty of what the two systems could provide alongside each other.”

(Onwu & Mosimege, 2004, p. 6)

This comment resonates with me, in particular the ending, where it discusses the beauty that could be lost by trying to apply the same verification process to two different systems. To hear it expressed in this manner creates clarity; it seems reasonable and logical. I have found in discussions over the last few weeks that there may be many right ways rather than just one right way. This idea is reiterated within the Onwu and Mosimege dialogue, where one of the questions that arises is: why is Western science considered to be the “only true science,” and why is it considered to be the superior science? (Onwu & Mosimege, 2004, p. 11). This again connects to the idea of bias: a researcher who holds these biases, that of the printed word and of Western science, may occlude things as a result and may create research that reflects and perpetuates this bias. It may be true that all aspects of research are subject to bias: the research itself (the methodology chosen or research question), the researcher, the reader (preconceived bias that affects the meaning they may make), and the researched (selection of participants, derived meaning/translation of oral to print language).

And I pause again as I think about my history, my story connected to these ideas, and my beliefs around them. And I wonder again about biases: my biases connected to science, connected to written language, connected to the very word “illiterate.”

While I did not set out to engage in an intense discourse around bias when I began this post, ultimately that was the link connecting each idea to the next, and so I was led down this path. For the research, the researcher, the researched, and the reader, understanding bias and the effects it may have is important. Perhaps some of our beliefs are purely “stories.” Perhaps some of our “stories” are purely facts.

“We’ve often convinced ourselves in higher education that there is something called “critical thinking” which is some magical mental ingredient that travels, frictionless, into any domain. There are mental patterns that are generally applicable, true. But so much of what we actually do is read signs, and those signs are domain specific. They need to be taught.”

– Mike Caulfield (2019)

“Network Heuristics”

 

2 thoughts on “Tell me a Story. 515 1a. week #3”

  • Cognitive dissonance is fascinating, especially if we consider it on a spectrum with open-mindedness at the opposing end. We want students to be able to form justified opinions supported by observation and rational reasoning, but we also want them to be able to set their ideas aside and accept others’ views. I’m curious how you feel about this: should we lean more to one end of the scale, or is there some possibility of reconciling the two?

    • Thanks for your reply!
      That’s a great question! I wonder, does understanding someone else’s viewpoint mean you need to leave or forget your own? Can you accept another’s viewpoint while still maintaining your own? Can we teach students to critically evaluate their own viewpoints for cognitive dissonance? For bias? That would be such a valuable skill!
      I’m not sure I have the answers to these questions… but I am fascinated by them. I would love to talk with you more about some of these ideas!
