
EDITORIAL: Beware the insidious manipulators

August 3, 2022 1:00 AM

If someone accused you of being brainwashed, you would:

a) Punch them in the nose

b) Try to destroy them on social media

c) Accuse them of being a bully and then cry your eyes out

d) Carefully consider why they might think that

The correct answer, of course, is a). But seeing as how that’s likely to land you in jail, let’s go with d).

Seriously, what would you do? Would you try to communicate with the accuser to understand why they came to that conclusion?

Because here’s the thing: To varying degrees, we are all shaped by strong outside influences on how we see the world and what we think. Brainwashing involves pressure or influence that leads to adopting radical ideas and beliefs, but what, these days, is radical? And how often do we critically examine why we think the way we do, especially when visceral reactions should set off warning bells that something isn’t right?

Coeur d’Alene-based researcher and author Uyless Black has rendered a valuable public service by getting at the heart of one of the strongest influences on what we think: social media. His five-part series was published last week in The Press. If you ignored it or missed an installment or two, fear not. We’ll provide links to all five segments at the end of this editorial.

Even if you’re not terribly interested in the topic, we will ask you — beg you, if we must — to read just one piece anyway: the third installment, headlined “Changing an opinion a far cry from easy.” The information is so important at every level of modern society that we believe it should be standard high school reading.

Black acknowledges that people naturally want to be right, to believe that what they think is correct or completely justified, a perception supported by many studies, including one published in Scientific American.

“This tendency is extremely difficult to correct,” the study says. “Experiments consistently show that even when people encounter balanced information containing views from differing perspectives, they tend to find supporting evidence for what they already believe.”

So the task for truly objective consideration of information is already daunting. The hill gets much steeper, however, thanks to internet manipulators.

Black explains how massive vendors like Google and Facebook understand a user’s preferences based on what that person has been looking at online.

“They (can) prioritize information in our feeds that we are most likely to agree with — no matter how fringe — and shield us from information that might change our minds,” Black quotes from the Scientific American article.

The practice is manipulative marketing at the least. And we agree with Black that it is far worse than that: The practice of bolstering radical beliefs through misinformation and disinformation while restricting contrary evidence or verifiable information is a potent global threat.

Please, take time to read Uyless Black’s series below. The series is also available on Black’s blog: https://bit.ly/3zg7Hc0

Part 1: https://bit.ly/3Sas4A1

Part 2: https://bit.ly/3zIWGS5

Part 3: https://bit.ly/3oEZCbQ

Part 4: https://bit.ly/3zmmJwS

Part 5: https://bit.ly/3oIDKfK

• • •

CORRECTION

The Coeur d'Alene School District levy vote is Aug. 30. Sunday's editorial gave the wrong date.