
ANALYSIS: A social media danger to society

by UYLESS BLACK/Special to The Press | July 28, 2022 1:00 AM

Editor's note: This is the third report in a five-part series.

An American pioneer living in the 18th century said, “Lord, grant that I may always be right, for thou knowest I am hard to turn.” That quote sums up much of this article.

The pioneer was describing himself as well as almost every human who walks this earth. His “hard to turn” phrase is known today as cognitive bias. It identifies a trait of human behavior in which people accept information coming from a person or group they know, but view information from an unknown person or group with suspicion or reject it outright.

The term is also used to identify another human trait: people seek out sources of information that confirm what they already believe. What is more, people remember information that conforms to their viewpoints more easily than information that contradicts their beliefs.

You and I might be saying, “Not so, I am open-minded!” The evidence shows otherwise. A study published in Scientific American disclosed, “This tendency is extremely difficult to correct. Experiments consistently show that even when people encounter balanced information containing views from differing perspectives, they tend to find supporting evidence for what they already believe.”

I think all of us know that when people receive information counter to their own beliefs, they dig in. They become even more committed to their established views and more doctrinaire, which is likely one reason for barroom brawls.

Many Internet vendors are using this behavior for their own benefit, not ours. Large-scale vendors, such as Google and Facebook, rely on user sessions to learn users’ browsing preferences. This invaluable pool of information lets the major search engines personalize what is displayed to a user based on the user’s history, that is, on the user’s preferences.

As Scientific American makes clear, if Internet vendors are so inclined, “They [can] prioritize information in our feeds that we are most likely to agree with — no matter how fringe — and shield us from information that might change our minds.”

This activity means that users increasingly have their options narrowed. Their beliefs are reflected back to them during a query: sites that match their views are displayed first, say on the first page of results on their PCs or smartphones, with less agreeable information shown further down in the feed, often on later pages.

Keep in mind that the priority of what is displayed has nothing to do with its quality. These cues, meant to influence what we shop for, whom we vote for, and so on, rarely bear any relationship to the quality (accuracy) of the information.
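For readers who want to see the mechanics, here is a minimal sketch, in Python, of the kind of preference-based ranking described above. Every name, score, and headline in it is hypothetical, invented purely for illustration; it is not any vendor’s actual algorithm.

    from dataclasses import dataclass

    @dataclass
    class Item:
        headline: str
        stance: float    # -1.0 to +1.0: the position the item takes
        accuracy: float  # 0.0 to 1.0: how factually reliable it is

    def agreement(user_stance, item):
        # Score an item purely by how closely it matches the user's views.
        return 1.0 - abs(user_stance - item.stance) / 2.0

    def rank_feed(user_stance, items):
        # The sort key uses only agreement; accuracy never enters in,
        # mirroring the point that display priority ignores quality.
        return sorted(items, key=lambda it: agreement(user_stance, it),
                      reverse=True)

    feed = [
        Item("Balanced report", stance=0.0, accuracy=0.9),
        Item("Fringe piece the user agrees with", stance=0.9, accuracy=0.2),
        Item("Careful piece the user disagrees with", stance=-0.8, accuracy=0.95),
    ]

    for item in rank_feed(0.8, feed):
        print(item.headline)
    # Prints the fringe-but-agreeable piece first and the accurate,
    # disagreeable one last.

Run against this toy feed, the ranking rewards agreement and ignores accuracy entirely, which is precisely the narrowing effect described above.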

An Internet user might not have an opinion about a subject. This neutral user, sitting on the fence, is a prime target for social media vendors, who work to move the user off the fence and persuade him or her to form an opinion. These “persuadables” then receive tailored messages crafted to move them toward a certain way of thinking.

The upshot of these manipulative strategies by many Internet and smartphone vendors is to reinforce our beliefs in order to sell us more products and, of more serious consequence, more political snake oil.

Bombarded with enough of that snake oil, the fence-sitter will likely get off the fence. The 1700s American pioneer, if he had access to modern social media, might very well find he is not so “hard to turn.”

How so? By echo chambers, bots and memes spreading false-but-negative information, all designed to manipulate our behavior — the subject of the next article in this series.


During his career, Uyless Black consulted and lectured in 16 countries on computer networks and the architecture of the internet. He lives in Coeur d’Alene with his wife, Holly, and their ferocious three-pound watchdog, Bitzi.