Facebook’s Hidden Persuaders

Major internet platforms are constantly trying new things out on users to refine their interfaces. Perhaps they’re interested in changing their users, too. Consider this account of Facebook’s manipulation of its News Feed:

If you were feeling glum in January 2012, it might not have been you. Facebook ran an experiment on 689,003 users to see if it could manipulate their emotions. One experimental group had stories with positive words like “love” and “nice” filtered out of their News Feeds; another experimental group had stories with negative words like “hurt” and “nasty” filtered out. And indeed, people who saw fewer positive posts created fewer of their own. Facebook made them sad for a psych experiment.

James Grimmelmann suggests some potential legal and ethical pitfalls. Julie Cohen has dissected the larger political economy of modulation. For now, I’d just like to present a subtle shift in Silicon Valley rhetoric:

c. 2008: “How dare you suggest we’d manipulate our users! What a paranoid view.”
c. 2014: “Of course we manipulate users! That’s how we optimize time-on-machine.”

There are many cards in the denialists’ deck. An earlier Facebook-inspired study warns of “greater spikes in global emotion that could generate increased volatility in everything from political systems to financial markets.” Perhaps social networks will take on the dampening of inconvenient emotions as a public service. For a few glimpses of the road ahead, take a look at Bernard Harcourt (on Zunzuneo), Jonathan Zittrain, Robert Epstein, and N. Katherine Hayles.
