[ by Charles Cameron — keeping a wary eye on algorithms, with the sting in the three links at the end ]
Here’s a tweet alerting readers to a newspaper piece on the Munich terror attack:
Police investigate claim Muinch gunman lured victims to their deaths with fake Facebook post https://t.co/LQh0KYpjKF pic.twitter.com/oj82EoQdnq
— Evening Standard (@standardnews) July 23, 2016
Here, entirely by chance, is the tweet that followed it as I scrolled down my feed:
Reach your customers on a personal level with the right message, on the right platform, at the right time. https://t.co/TnaJVrqQng
— FoxMetrics (@FoxMetrics) July 17, 2016
That tweet, for what it’s worth, was promoted.
I’m sure the people doing the promoting wouldn’t have chosen to have me read it immediately after reading the newspaper tweet just above it, but that’s the sort of thing that happens when “thought” gets automated. It reminds me of the time I was researching al-Awlaki on the pro-jihadist site Revolution Muslim, and some algorithm suggested I’d like an ad offering “bold Christian clothing”:
No sale there, I’m afraid.
It gets more serious, though:
3 Quarks Daily, Algocracy: Outsourcing governance to Algorithms
WSJ, Google Mistakenly Tags Black People as ‘Gorillas,’ Showing Limits of Algorithms
ProPublica, How We Analyzed the COMPAS Recidivism Algorithm