Home For Fiction – Blog

for thinking people


March 4, 2020

Unintended Misinformation: Dynamics and Dangers

Society

internet, manipulation, misinformation, society

A mythological king was once asked what his most precious asset was: his health, his army, his wealth, or his offspring. “Information!” he said. “Give me information, and I can easily get back the rest.” Of course, information also comes with misinformation. And unintended misinformation, in particular, can be an especially insidious, dangerous phenomenon.

Unlike fake news or intentional propaganda, unintended misinformation neither involves malice nor has ulterior motives. Those spreading misinformation aren’t aware they’re doing so. Instead, they believe they’re helping (either a cause or the general dissemination of knowledge).

But, as I mentioned above, unintended misinformation can be an insidious process, causing significant damage and having far-reaching repercussions. The problem, as we’ll see in more detail below, lies precisely in its lack of intention or malice. Put simply, its dynamics preclude detection of the misinformation until it has effectively displaced the truth.

Once the door of unintended misinformation is opened, it’s very hard to close it

The Dynamics of Unintended Misinformation

In George Yule’s The Study of Language (an excellent book for anyone interested in how language operates), the author – referring to H. P. Grice – mentions the so-called co-operative principle:

An underlying assumption in most conversational exchanges seems to be that the participants are, in fact, co-operating with each other. (p.145)

In particular, an aspect of the co-operative principle (one of its maxims, as they are known), states:

Do not say that which you believe to be false or for which you lack evidence. (Ibid)

Obviously, someone spreading fake news or interested in manipulating their audience (I’m referring to nonfiction, not fiction, where manipulation is a legitimate technique) doesn’t follow this maxim. They know what they’re saying is false. At best, they lack evidence.

But where does unintended misinformation fit into this?

The Problem of Expertise

Let’s assume I am about to write a post on, say, space technology. I have no technical background, so I’m extra-careful about gathering my sources and double-checking my facts. For all intents and purposes, I’m honestly trying hard to compose a factual, accurate post.

In other words, I think I obey the maxim mentioned earlier: I think I say things that are true and for which I have evidence.

The problem is, since I lack the required expertise, I also lack the critical thinking needed to evaluate my sources and everything I consider as fact.

Perhaps, in good faith, I used a source that is flawed (perhaps itself deliberately violating the maxim). Someone with expertise – say, a space propulsion specialist, or an astrophysicist – would have the necessary experience and knowledge to critically approach the source and dismiss it accordingly. But I wouldn’t.

In a way, unintended misinformation is a bit like the Dunning-Kruger effect: You might have good intentions, but you are too ignorant (in the given topic) to realize your own ignorance.

What Are the Repercussions of Unintended Misinformation?

As I briefly mentioned in the introduction, unintended misinformation can be particularly dangerous – danger here defined as a hard-to-detect falsity.

Naturally, propaganda can also be hard to detect – if the person spreading the information is sophisticated enough. But the insidiousness of unintended misinformation is that it “hijacks”, so to speak, a well-meaning, perhaps even trusted agent to propagate itself.

Unintended misinformation can shift public opinion, accidentally concealing the subtlety required for accuracy

Let’s take a real-life example – indeed one that was partly the inspiration behind this post.

The Truth Is Out There (but We’ve Misplaced It): The Boeing 737 MAX Saga

If you haven’t been living on Mars for the past year or so, you must know that Boeing’s 737 MAX planes have been grounded for almost a year (and counting, at the time this post goes online), due to two accidents attributed, at least partly, to a flawed component – strictly speaking, a piece of software.

I won’t go into technical details, not wanting to tire you (not to mention, wouldn’t it be ironic if I got something wrong in a post about unintended misinformation? See more about this in the concluding section 🙂). But being fairly knowledgeable about aviation matters from an enthusiast’s perspective, I have been following the discussions and analyses closely. And I discovered an interesting detail, relevant to our topic.

The software in question, called MCAS (Maneuvering Characteristics Augmentation System), has been habitually referred to by virtually all mainstream media as an “anti-stall” device.

Now, both from my own limited, non-specialist perspective and from reading what specialists (pilots, engineers) have been saying on the matter, I knew this wasn’t entirely accurate. The purpose of MCAS wasn’t primarily to prevent stalling, but rather to mimic the handling behavior of the predecessor model, the 737NG.

Part of the confusion (from the perspective of the journalists composing those articles) probably lies in the fact that, as a result of the different handling behavior, the plane could conceivably enter a stall without MCAS. Indeed, certain critics have observed that the degree of authority MCAS has could be the deciding factor in whether it counts as an anti-stall system or not.

“Well, alright”, you might say. “So the journos simplified a complex topic. Is that so serious?”

I’m afraid it might be. More often than not, there are no simple answers to complex issues.

Unintended Misinformation Can Shift Focus

You see, for Joe Journalist, “anti-stall” is handy, easy to use. It conveys a lot of information. “Without this system, the plane might stall”, they write – and it’s kinda true. “So, it’s an anti-stall system”.

However, this innocent (here I choose to assume that there was no intention involved – that is, journalists were not influenced to opt for one term over another) linguistic choice becomes crucially important if you consider one more thing.

From the perspective of how aircraft certification works, stall protection is not quite the same as stall identification. Nor is either the same as mimicking the handling of another aircraft. Each category has specific certification requirements. Considering the immensely complex framework – where what company A did, what authority B promised, and what client C expected can clash – such issues matter.

Perhaps it’s an exaggeration to claim a couple of words can shift public opinion so easily. Then again, remember how euphemisms work. Repeat a flawed fact enough times, and it becomes the default position.


What Kind of Responses Do We Have Available?

Presumably, the best gauge is the expertise of the writer. When I write about literature and writing, or the Gothic, you would be justified in trusting me. That is, it would be highly likely that I know what I’m talking about. On the other hand, when I write about psychology or society, it would be a good idea for you to be a little more skeptical.

After all, all these elements – literature, society, culture – are not set in stone. Unlike physics, math, or engineering, literature is about arguments, rather than facts.

Consequently, when I write about science and engineering, you need to be even more skeptical about what I say. Although I try to be as accurate as possible, it’s more likely that I might unintentionally misinform you. This includes the present article, by the way.

Ultimately, it all comes back to critical thinking. Seek information from various sources. It’s alright to start from, say, Wikipedia, and then search for media or publications specializing in the topic.

You would already be miles ahead of those getting their facts from Twitter alone!