Google with child in mind

November 19, 2013 8:00 PM

Today's topic is neither pleasant nor, let's hope, necessary given most readers' online habits, but few issues are more important. Even so, lofty purpose notwithstanding, this news doesn't come without controversy.

That Internet activity has never been 100 percent private is once again in the spotlight, but this time the watchers mean well. It started in the United Kingdom with Prime Minister David Cameron, who wants to do more about child pornography and its link with other pedophilic crimes. This summer he asked Google and Microsoft, whose search engines (Google, and Bing, which also powers Yahoo searches) together represent about 95 percent of the search market, to block results for certain search terms. At first they said no, but after a few months of software development they relented.

Despite the plan's British origin and the corporate giants' usual rivalry, on Monday, Nov. 18, the companies announced the move will affect users worldwide within six months, beginning with Britain and other English-speaking nations.

Yes, that means the U.S.

When users enter the forbidden search terms, a warning will appear, noting that such searches are linked to criminal activity and pointing toward possible help. Pedophilia is more than a crime; it's considered a mental disorder by the Diagnostic and Statistical Manual of Mental Disorders.

Which term combinations should be avoided, including in innocent use by child abuse researchers like me? At 13,000 terms in 100,000 or so combinations, they're too numerous to list, and in any case I couldn't easily find them. Perhaps in time Google will publish them in a news release.

Still, preventing completion of suspicious searches is not entirely new. Partly to accommodate government policies and laws, Google and Bing already use "autocomplete" algorithms (i.e., programming) to scrub forbidden words and searches, assuming you mean something else and offering that instead. Perhaps the difference is that the old way around this default behavior will now itself be blocked.
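For the technically curious, here is a minimal sketch, in Python, of how a term-combination blocklist might work in principle. To be clear, the companies haven't published their actual methods; every "term," message, and function name below is a hypothetical stand-in invented for illustration, not the real list of 13,000 terms.

# Illustrative sketch only. The real systems reportedly cover some
# 13,000 terms in roughly 100,000 combinations; the placeholder terms
# and warning text here are hypothetical, not the actual list.

BLOCKED_COMBINATIONS = [
    {"placeholder_term_a", "placeholder_term_b"},
    {"placeholder_term_c", "placeholder_term_d"},
]

WARNING = ("Warning: child sexual abuse imagery is illegal. "
           "Help and reporting resources are available.")

def check_query(query):
    """Return a warning if the query contains every term of any
    blocked combination; otherwise return None, letting the search run."""
    words = set(query.lower().split())
    for combo in BLOCKED_COMBINATIONS:
        if combo <= words:  # all terms of this combination are present
            return WARNING
    return None

print(check_query("placeholder_term_a placeholder_term_b") or "Search proceeds.")

Even this toy version shows the critics' point: rephrasing a query can slip past an exact-match list, which is one reason the programming will need continual tweaking.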

The companies will do more than cull terms and block search results. They say they'll use new programming to block or remove illegal photos/imagery and prevent the sharing of links to child porn (in peer-to-peer networks). FBI reports complain of the great difficulty of tackling child pornography and sexual abuse in the Internet age, which, with its multiple venues for hiding real identities, can make locating victims and perpetrators nearly impossible.

The Internet has exponentially expanded the child-exploitive underworld, allowing its dangers to thrive compared with yesteryear. The so-called "dark Internet" may be used even by strangers to share exploitive images online away from public view. Prime Minister Cameron said this is his next target.

Critics say the new move is more show than substance, doing more to reduce the companies' liability exposure than to actually help kids, because many seekers of illegal content don't use searches to find it. They also worry that child protection organizations and workers may be inadvertently swept up by the filters.

Time will tell. Google says a prior deterrence campaign led to a 20 percent drop in illegal content results (which one hopes leads to a drop in demand and supply), and it has promised tech support to protection organizations. No doubt the programming will require continual tweaking. If even a little headway leads to a child spared, isn't "a little" better than none?

Certainly for that child, it is. Few algorithms are perfect, but in this high-tech world of eroding privacy, certain efforts are worth the trade, even if it will take some retraining for the rest of us.

Sholeh Patrick is a columnist for the Hagadone News Network. Contact her at sholeh@cdapress.com.