Here’s a deeply alarming story from BBC News: Algorithms are taking over our thought processes! Run for your lives.
Actually, this is funny, even hilarious. Journalists, always on the lookout for new cultural catastrophes, have lately latched onto the word “algorithm” to symbolize machine (computer) control of the human environment. “Algorithm” sounds mysterious, technological, inhuman. Algorithms are going to think for themselves, take over the world, and eventually push humans and human thought out. Does this sound familiar? “Algorithm” and the fear thereof is just another moment in the long fantasy of automatons turning into humans, of machines learning to think and feel, or, in reverse, of humans becoming machines (de la Mettrie and Deleuze hold two ends of this stick).
The flaw in this terrifying scenario is that algorithms are nothing more than sets or sequences of procedures (rules, logical functions) WRITTEN BY HUMAN BEINGS. For example, Amazon.com might have an algorithm that says something like: if dg buys a book by Leo Tolstoy, then our computer will offer him a list of five other Leo Tolstoy titles. That’s a marketing algorithm. It’s not mysterious or weird or even complex, and it’s not a mathematical enigma. And the actual programming (which must be pretty dull) is done by a human. Except on some very simple-minded level, algorithms can’t write themselves. So the idea that algorithms control humans is idiotic.
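To make the point concrete, here is a minimal sketch of that “if you buy Tolstoy, show five more Tolstoy” marketing rule. The catalog, titles, and function names are invented for illustration; real retailer code is doubtless fancier, but the logic is this humble:

```python
# A hypothetical catalog; authors and titles are just illustrative data.
CATALOG = {
    "Leo Tolstoy": ["War and Peace", "Anna Karenina", "Resurrection",
                    "The Death of Ivan Ilyich", "Hadji Murat", "The Cossacks"],
    "Jane Austen": ["Pride and Prejudice", "Emma"],
}

def recommend(author, purchased_title, limit=5):
    """If a customer buys a book by `author`, offer up to `limit`
    other titles by the same author -- a rule written by a human."""
    others = [t for t in CATALOG.get(author, []) if t != purchased_title]
    return others[:limit]

print(recommend("Leo Tolstoy", "War and Peace"))
```

A handful of lines, no enigma: the “algorithm” is a lookup plus a filter that some person typed in.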
Google uses algorithms to place those little ads you get next to your emails. But even here they can’t be astonishingly complex. If I write the phrase “Jonah’s erector set” in an email, I will find ads for Viagra and erectile dysfunction next to my email thread. Of course, this gets tricky when, say, Google Search tries to use algorithms (which, as per the examples above, are often pretty simple-minded) to predict the sort of answers I might want in my searches. I might get, to my mind, distorted search results. But that would end up making Google a less and less helpful service, and I would go elsewhere. Same goes for that deeply comic little algorithm Google uses to auto-complete search terms (I turn off this function and the Google search history function).
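The “erector set” mishap above can be sketched as crude keyword-triggered ad matching. This is not Google’s actual system; the keyword table and function are hypothetical, chosen only to show how a simple-minded rule produces exactly that comic misfire:

```python
# Hypothetical keyword-to-ad table, invented for illustration.
AD_KEYWORDS = {
    "erector": ["Viagra", "erectile dysfunction clinic"],
    "tolstoy": ["classic literature sale"],
}

def match_ads(email_text):
    """Return ads whose trigger keyword appears anywhere in the email.
    Naive substring matching can't tell that an 'erector set' is a toy,
    which is the simple-mindedness in question."""
    text = email_text.lower()
    ads = []
    for keyword, ad_list in AD_KEYWORDS.items():
        if keyword in text:
            ads.extend(ad_list)
    return ads

print(match_ads("Jonah's erector set arrived today"))
```

The rule never “understands” the email; it pattern-matches strings, which is why the results can be absurd.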
The fear of algorithms reflects an oddly human suspicion of science (and, basically, fear of science these days means fear of the unknown as fewer and fewer people find themselves educated enough to keep up with scientific discovery–um, yes, think: intelligent design). Actually, people should be wary of other people first.
If you were expecting some kind of warning when computers finally get smarter than us, then think again.
There will be no soothing HAL 9000-type voice informing us that our human services are now surplus to requirements.
In reality, our electronic overlords are already taking control, and they are doing it in a far more subtle way than science fiction would have us believe.
Their weapon of choice – the algorithm.
Behind every smart web service is some even smarter web code: from the web retailers calculating what books and films we might be interested in, to Facebook’s friend-finding and image-tagging services, to the search engines that guide us around the net.
It is these invisible computations that increasingly control how we interact with our electronic world.
At last month’s TEDGlobal conference, algorithm expert Kevin Slavin delivered one of the tech show’s most “sit up and take notice” speeches where he warned that the “maths that computers use to decide stuff” was infiltrating every aspect of our lives.