When you ask Google or Bing or DuckDuckGo to find something for you, how does ‘it’ filter the results? How do advertisers, retailers and financial markets decide what product to offer you? How do banks decide who to offer a loan or a bank card to? How do insurance companies assess risk and set prices? How do employers and dating sites use personality tests to find matches? The answer is, they use algorithms!
An algorithm is a set of detailed instructions that a computer program follows to produce a result, or set of results, from the information it is given.
On the internet, algorithms determine what we see first, or most. For example, when I input the letters ‘do’ into Google search, the autocomplete algorithm suggests ‘donald trump’, ‘dominos’ and ‘donald trump news’. The search engines DuckDuckGo and Yahoo suggest ‘domino’s pizza’, ‘dorothea hurley bongiovi’ and ‘donald trump’. Bing suggests ‘domino s’, ‘download chrome’ and ‘download google chrome’ before getting to ‘donald trump twitter’.
Why do they all make such similar suggestions? It is because search engines surface what they believe is most relevant, and relevance is judged largely by two signals: how frequently a term is searched, and the way pages containing that term link to, and are linked from, other pages on the Web.
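The link-based half of that idea can be sketched in a few lines of Python. This is purely illustrative: no search engine publishes its actual ranking code, and the pages, links and damping factor below are invented for the example, loosely in the spirit of the original PageRank formulation (a page ranks higher when well-linked pages point to it).

```python
# Illustrative sketch only (not any real search engine's code):
# rank pages by the links pointing at them, via a simplified
# PageRank-style power iteration.

def rank_pages(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}   # start with equal scores
    for _ in range(iterations):
        # every page keeps a small baseline score...
        new_rank = {p: (1.0 - damping) / n for p in pages}
        # ...and passes the rest of its score along its outgoing links
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# A tiny invented web: two pages link to 'news', so it ranks highest.
web = {
    "news": ["home"],
    "home": ["news"],
    "blog": ["news"],
}
scores = rank_pages(web)
print(max(scores, key=scores.get))  # prints "news"
```

Real rankers blend hundreds of signals on top of this, but the sketch shows why heavily linked pages, like heavily searched terms, float to the top for everyone.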
There are other factors at work too. Because algorithms are written by people, it is not uncommon for those people to write their personal biases into them:
The dustup over Facebook’s “trending topics” list and its possible liberal bias hit such a nerve that the U.S. Senate called on the company to come up with an official explanation, and this week COO Sheryl Sandberg said the company will begin training employees to identify and control their political leanings.
– Nanette Byrnes, Why We Should Expect Algorithms to Be Biased, MIT Technology Review, June 24, 2016 –
Sometimes algorithms are simply mercenary in nature. Facebook may claim that its algorithm is personalized for your benefit, but it would be fair to say that Facebook’s algorithm is also optimized for Facebook, and thus for the advertisers.
More disturbing, it has been demonstrated that people’s emotions can be manipulated by the algorithms behind their social feeds. In 2012, Facebook and data scientists from two universities (in a study published in Proceedings of the National Academy of Sciences) tweaked the news feed algorithms of roughly 0.04 percent of Facebook users, or 689,003 people, for one week in January. During the experiment, half of these people saw fewer positive posts than usual, while half saw fewer negative ones. When positive expressions were reduced, people produced more negative posts; when negative expressions were reduced, the opposite occurred. (In a note of contrition, the Proceedings of the National Academy of Sciences later acknowledged that the decision to manipulate the content without the users’ consent might have violated some principles of academic research…)
It is one thing to know and accept that sites like Google or Facebook (both primary news sources for people under 35) can manipulate what you see and potentially influence how you feel. Are you also willing to accept that they could be isolating you from other viewpoints, thus exacerbating your biases?
If you believe, as I do, that trending news can often be incomplete news – will you search for better information if you see ‘Red Flags’ like the following?
– does the story contain facts that seem to be inflated?
Example from Greenpeace USA: “The Arctic is one of the most unique places on Earth. It spans eight countries, is home to more than 13 million people…” Fact check: The National Snow and Ice Data Center, which is supported by NASA, the National Science Foundation (NSF) and the National Oceanic and Atmospheric Administration (NOAA), says “In total, only about 4 million people live in the Arctic worldwide.”
– is the story attempting to appeal to your emotions, or to the emotions of young people who may not be old enough to understand the inaccuracy of the story?
Example: Canadian environmentalist David Suzuki, soliciting donations for the Suzuki Foundation, did a “live from the North Pole” broadcast in front of some faux Arctic scenery in 2011. “Santa’s workshop is sinking! Climate change is melting the snow and ice and the rising water is getting too close for comfort. Santa must relocate – fast – to make sure all the nice boys and girls still have a happy holiday.”
– does the headline contain exaggerated language that attempts to make you fearful?
Example: “Sir John A. Macdonald: 5 Frightening Facts About Our First Prime Minister” – Rachel Décoste, HuffPost –
– does the story swear at someone or make derogatory comments?
Example: “Soon enough, he will be alone, surrounded only by his admiring fellow racists. But he will still be governing from the Oval Office. It bears repeating. Americans got what they asked for. And it oozes.” – Neil Macdonald, opinion piece on Donald Trump for CBC News –
Who is to blame for the dismal state of journalism today? Do you think algorithms have played a role in forming your opinions, or have they affected your life in other ways?