"The meat would be shoveled into carts, and the man who did the shoveling would not trouble to lift out a rat even when he saw one -- there were things that went into the sausage in comparison with which a poisoned rat was a tidbit."
-- Upton Sinclair's The Jungle, 1906
It's been over a century since Upton Sinclair's The Jungle detailed the unsanitary conditions of the meatpacking industry. That peek behind the scenes shocked the public and led to monumental reform, creating what is now the Food and Drug Administration. Facebook (NASDAQ:FB) just gave the public a peek at what the online advertising industry is capable of, and similarly outraged it.
In a psychological experiment on about 700,000 unwitting Facebook users, the social media giant showed that seeing more positive posts leads users to post more positive content themselves; the story was the same with negative posts. It also showed the public that a profit-maximizing corporation might not operate with basic standards of disclosure or in the best interests of society.
But what does this mean for Facebook today?
This isn't the first time Facebook has landed on the wrong side of acceptable behavior.
In September 2006, the company released its "news feed" which published all friends' activities to a central stream with no accompanying privacy controls. A few days later, CEO Mark Zuckerberg published this apology:
We really messed this one up. When we launched News Feed and Mini-Feed we were trying to provide you with a stream of information about your social world. Instead, we did a bad job of explaining what the new features were and an even worse job of giving you control of them.
In November 2007, the company launched Beacon, which tracked and posted users' activity across other websites. It took a little longer, but Zuckerberg published another apology a month later:
About a month ago, we released a new feature called Beacon to try to help people share information with their friends about things they do on the web. We've made a lot of mistakes building this feature, but we've made even more with how we've handled them. We simply did a bad job with this release, and I apologize for it.
In November 2011, Zuckerberg wrote yet another mea culpa:
I'm the first to admit that we've made a bunch of mistakes. In particular, I think that a small number of high profile mistakes, like Beacon four years ago and poor execution as we transitioned our privacy model two years ago, have often overshadowed much of the good work we've done.
But no lasting damage
In 2010, after Facebook altered its privacy options, a Kickstarter project called Diaspora aimed to raise $10,000 to build a decentralized social network in which a user retained total control over privacy and data. The project raised $200,000 and at the time was the most funded project on the site. With momentum and money, it seemed Diaspora might threaten Facebook.
But the latest Diaspora statistics show only 1 million total users, and just 12,000 monthly active users.
We've gotten glimpses of the data Facebook keeps about users and shares with others, and what the company can do with it. However, none of the outrage has ever seriously threatened Facebook's dominance. Perhaps it's because we don't consume social media in the same fashion as meat, and privacy is less tangible than a hamburger.
It seems Facebook can do no wrong. The lock-in of users and network effects create too much friction for the typical user to switch to a competing social network. As Facebook finds ways to increasingly monetize its users, it will only build up resources to battle or acquire any smaller competition. It will keep making mistakes, keep apologizing, and keep making billions of dollars, which brings us to the latest psychological study that it carried out on unwitting users.
After the uproar over the study, one of Facebook's data scientists wrote an apology that spelled out the core of Facebook's motivation behind it: "we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook." The company is incentivized to keep people on the site for as long as possible, and it was simply acting accordingly. The employee continued, "my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused."
What will Facebook have to apologize for next?