I went to an event tonight at Postlight, a digital product studio down in the Flatiron district. It was a conversation with Cathy O'Neil about her new book, Weapons of Math Destruction (subtitle: "How Big Data Increases Inequality and Threatens Democracy").
Let's just say: timely & necessary. She rightly (imho!) dismissed the notion that Facebook and companies like it are beholden only to the profits fed by their proprietary algorithms: they also have to assume a moral responsibility when those algorithms are tied to the corporeal existence of actual human beings. The messy stuff of life will get you every time, as much as mathematicians and data scientists and corporate bottom lines would like to believe otherwise, and benign intentions are no excuse when the data is applied toward unexpected ends.
Also, most algorithms are bullshit. Everybody agreed on that one. And I can confirm, because earlier today Facebook revealed that it was basing my ad preferences on the following "hobbies and activities." I mean, who doesn't like to sit around and hear/see/lick/smell/sign things, but calling those "interests" is a step too far.
Gina Trapani, who conducted the interview, is a web (not literal) giant.