By Mark Schaefer
Like most of the online world, I’m stunned by the Facebook experiment designed to surreptitiously toy with the emotions of its customers. But it goes beyond the simple shock value of a company manipulating people.
The news emerged that in 2012, Facebook conducted a study to determine whether it could alter the emotional state of its users. For one week, the company’s data scientists enabled an algorithm that automatically omitted content containing words associated with either positive or negative emotions from the News Feeds of 689,003 users. They then examined whether the subjects’ own posts became more positive or negative based on the manipulated tone of their feeds.
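For the technically curious, the filtering described boils down to scanning each post for words on an emotion word list and suppressing the matches. Here is a minimal illustrative sketch in Python; the word lists and feed structure are hypothetical stand-ins (the actual study used the LIWC word-counting software to classify words), so treat this as a picture of the idea, not the real system:

```python
# Illustrative sketch only: a naive version of the word-list filtering
# described above. The word lists and feed structure are hypothetical;
# the actual study used the LIWC software to classify emotional words.

# Hypothetical emotion word lists (the real ones contain thousands of terms).
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible"}

def contains_emotion(text: str, word_list: set[str]) -> bool:
    """Return True if any word in the post matches the emotion word list."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not words.isdisjoint(word_list)

def filter_feed(posts: list[str], suppress: set[str]) -> list[str]:
    """Omit posts containing any word from the suppressed emotion list."""
    return [p for p in posts if not contains_emotion(p, suppress)]

feed = ["What a wonderful day!", "Feeling sad today.", "Meeting at noon."]
# For a user in a "reduced negativity" condition, negative posts are dropped:
print(filter_feed(feed, NEGATIVE_WORDS))  # keeps the first and third posts
```

Crude as it is, the sketch shows why the experiment scaled so easily: a simple word match decides what nearly 700,000 people do or do not see.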
There are broad reasons for concern that are deeper than what we see on the surface.
1. This was a corporate decision
I’m not a person who hates Facebook and is looking for a reason to ding them. I actually think it is understandable that Facebook wanted to conduct this kind of fundamental research.
However, the rational decision would have been to pay a university to obtain the same results under controlled and honest conditions. No company should ever decide to turn its customers into lab rats. The more disturbing issue is that Forbes reported this research was approved by an internal Facebook review board. So this was not the case of a lone wolf embarrassing the company; the breach reflects Facebook’s dysfunctional corporate culture. That makes my head spin.
2. Facebook is hiding behind legalese
At this moment, days after the furor erupted, Facebook still has not issued an apology. One of the researchers, Adam Kramer, published a Facebook post explaining the methodology and describing the impact on people as “minimal.”
Facebook justified the news feed mind game by saying it was covered by the company’s “Data Use Policy” (part of the terms and conditions nobody reads), which contains one cryptic line about how your information could be used for research. Christopher Penn did a nice job distinguishing between “legal” research and “ethical” research in his post about the Facebook emotional testing.
So the message here is that the company will do whatever it wants as long as it can cover its ass legally.
3. They haven’t learned their lesson
Facebook’s arrogant approach to customers and privacy was so extreme that it became the subject of a U.S. Federal Trade Commission investigation; the resulting 2012 settlement subjected the company to 20 years of privacy audits by the government. The government essentially ruled that Facebook needs a babysitter. This Facebook experiment shows that the company still has the attitude and maturity of a petulant 5-year-old, doing whatever it wants unless it has adult supervision.
What if somebody was already experiencing depression and this experiment made them more depressed … even dangerously depressed? What is the probability that, among more than 689,000 people, somebody was pushed into an inescapably dark place? Did they even THINK about the fact that their “users” are real people who may already be suffering?
4. Its arrogance will be its undoing
Facebook made a terrible error in judgment. But it gets worse. It published the study in the June 2014 issue of the Proceedings of the National Academy of Sciences. The message here is, “We screwed our customers and we also want to stroke our egos by getting academic credit for it.” Facebook put its ego above its customers.
Here’s the chilling thought: This is the only experiment we KNOW about because it was published.
5. Facebook: The world’s Valium?
One takeaway of the study was that removing all emotional content from a person’s news feed caused a “withdrawal effect.” Facebook concluded that it should serve you happy content to keep you coming back. The implication is that to increase usage (i.e., maximize profits through ads), Facebook must not just edit your news feed through EdgeRank; it should tweak the emotional tone of its world like a digital Valium.
The actual experiment is only the tip of the iceberg. What are they going to DO with the results of this research? I doubt the answer is “nothing.”
Implications of the Facebook Experiment
One camp has emerged supporting Facebook, claiming that we are all subject to digital manipulation by every company and that Facebook has the right to do whatever it pleases with its data. Some contend this is simply the normal A/B testing conducted by any company involved with eCommerce. It is more complex than that. Intentionally making sad people sadder crosses an ethical line beyond the day-to-day work of improving a user experience.
Before the Facebook IPO, I wrote a post called “Why Facebook Will Become the Most Dangerous Company on Earth.” The premise was that under the unrelenting pressure to increase profits, quarter after quarter without end, the company would eventually be forced to use its only real asset, our personal information, in increasingly bold and risky ways.
I think this is proving to be true.
The implication of a strategy that disrespects customers is not just a temporary emotional furor. There is an economic implication, too. Facebook is the world’s dominant social network, and its only significant threat is itself. As history proves, corporate arrogance is a sure path to self-destruction.
What are your thoughts on this experiment and its implications?
Mark Schaefer is the chief blogger for this site, executive director of Schaefer Marketing Solutions, and the author of several best-selling digital marketing books. He is an acclaimed keynote speaker, college educator, and business consultant. The Marketing Companion podcast is among the top business podcasts in the world. Contact Mark to have him speak to your company event or conference soon.