Who’s that algorithm in the mirror?

by SHOLEH PATRICK | August 15, 2023 1:00 AM

There’s a lot of stink about the social media algorithms that dictate what content we see online. Computer code pays attention to what we click and how long we linger, then feeds us more of it until we gorge ourselves on sameness. It’s no coincidence we begin to see narrowed themes in ads, videos and auto-generated news (whether real or dubious).
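
For the curious, the gist can be sketched in a few lines of Python. To be clear, this is a toy illustration of engagement-weighted ranking in general, not Meta’s actual code; every name, topic and number in it is invented for the example.

    # Purely illustrative sketch (not Meta's real system) of engagement-based
    # ranking: items resembling what a user already clicked and lingered on
    # score higher, so the feed drifts toward "more of the same."
    from collections import Counter

    def build_interest_profile(history):
        """history: list of (topic, seconds_lingered) pairs from past clicks."""
        profile = Counter()
        for topic, seconds in history:
            profile[topic] += seconds  # longer linger -> stronger signal
        return profile

    def rank_feed(candidates, profile):
        """candidates: list of (item_title, topic) pairs.
        Returns items sorted so the most profile-matching topics come first."""
        return sorted(candidates, key=lambda item: profile.get(item[1], 0), reverse=True)

    history = [("politics", 120), ("politics", 90), ("gardening", 10)]
    feed = [("Local garden show", "gardening"),
            ("Partisan hot take", "politics"),
            ("Science explainer", "science")]

    # The politics item rises to the top; topics never clicked sink out of sight.
    print(rank_feed(feed, build_interest_profile(history)))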

Does feeding at biased troughs influence our beliefs?

Four recent studies, the first of 16 in a collaboration between Meta (which owns Facebook and Instagram) and universities including Princeton, Dartmouth, the University of Pennsylvania and Stanford, analyzed the social media activity of millions of consenting Meta users over a three-month period. The researchers found that while Facebook’s algorithms considerably influenced the spread of information, they didn’t significantly affect users’ underlying beliefs.

But social media’s echo chamber does have an impact.

The studies, published in the two major scientific journals Nature and Science, examined different aspects of how social media has been shaping our knowledge, beliefs and, in some cases, behaviors.

Here’s what they found:

1) Social media decides what we see. The data confirmed that Facebook’s algorithms had a big influence over what information users saw and how much time they spent scrolling and clicking. That, in turn, affected users’ knowledge of news events, showing, sadly, that the average Facebook/Instagram user gets most of their news from controlled social media feeds instead of seeking it directly and openly from unbiased news sources.

2) Our information “diet” is too exclusive. We feed mostly on what we already agree with and get hammered repeatedly by the same information, and that sameness keeps us ill-informed and ignorant of other relevant facts. Facebook’s algorithms tended to serve up information from sources users had already agreed with or clicked, creating political “filter bubbles” that reinforced the same worldviews and served as a vector for misinformation and bias.

3) It’s not changing our minds. It’s reflecting them back. The studies didn’t find what some expected: Despite Facebook’s influence on the spread of information, and possibly misinformation, there was no evidence that the platform had a significant effect on users’ underlying beliefs. Nor did it broaden or change them.

Stagnation is a wisdom-killer.

Thus we dig in our heels, under the delusion that the world reflects our opinions. What we’re really experiencing is limited exposure to a world of our own opinions, one that normalizes them above all others, to the exclusion of people who actually think and experience things differently (because their online world mirrors them, too).

Which makes those who think differently seem more foreign to us: less familiar, less understood and thus less trustworthy.

Which, over time, has made us (no surprise) far less comfortable with, tolerant of, whatever, anyone who holds a different opinion. “How can they …?” “Don’t they care about …?”

Other viewpoint holders no longer feel as “human” or “reasonable” or ethical or compassionate or – name your desired moral judgment. All the while forgetting that we are using different information sources, which are feeding us different realities.

That’s why the opposite side is such a head-scratcher. We aren’t feeding at the same trough anymore.

None of this should be a shocker. Yet, despite the study authors’ conclusions, social media is nonetheless influencing our beliefs, and for the worse. Not by changing them politically or socially, perhaps (with the notable exception of the megaphone it provides for misinformation), but by changing our attitudes toward our fellow man.

And not for the better.

Those of us who are older may remember social gatherings before YouTube, Facebook/Meta, Twitter and so on. Before computers and Apple Watches. When face-to-face conversation made it harder to “otherize” anybody.

We swung left, right, red, blue and purple. We disagreed, even loudly and judgmentally. But we did it more often across a lunch table or chess board or barbecue grill. We shook our heads at each other but kept a broad list of party guests. We lent an ear or a hand, or extended a New Year’s invitation, even when each thought the other was nuts for their choices at the polls.

We otherized each other a lot less. Was it perfect? Not even close. But it wasn’t so bad that we moved to get away from each other the way we do now, and not just among everyday folk. Ask any veteran congressman whether members used to break bread together after debates on the Hill ended, and then ask whether they do it now, across the aisle. I’ve been told directly by several that Rs and Ds did go out together after work, and now they mostly don’t.

America (the world?) has forgotten how to play well with others. And it’s hard not to point fingers at the World Wide Web and its algorithms and news blurring.

Do we dump it? Of course not. But we could stand to be more aware of its workings, its influences and ourselves. And maybe, just maybe, seek out news and information (yes, that’s incumbent on us now that there are thousands of sources instead of just a few) that doesn’t look like a giant mirror reflecting us back at ourselves.

Let me repeat that, including for myself: Don’t just read like-me news.

Call it a more balanced diet for a more balanced world. I have a dream.

• • •

Sholeh Patrick, J.D., is a columnist for the Hagadone News Network who still has hope. Email sholeh@cdapress.com.