Facebook (NASDAQ:FB) has come to dominate social media. Billions of people use Facebook and Instagram every day to share with friends, family, and colleagues all over the world. In very meaningful ways, Facebook has helped open up the world and create connections that transcend borders. 

Facebook shareholders have done incredibly well. Shares are up 582% since the May 2012 IPO, more than double the total returns of the SPDR S&P 500 ETF Trust. Investors who bought in the months following the IPO -- I was lucky to buy in September 2012 -- have enjoyed even better gains. Facebook stock bottomed out in August 2012, and shares have gained nearly 1,400% since. 

Sadly, Facebook has also caused substantial damage along the way, choosing the easy path of monetization and breaking its social contract with users and society at large. A company like Facebook should represent the very best of environmental, social, and governance (ESG) investing, a philosophy that focuses on doing well by doing good.

Why one investor "unliked" Facebook as an investment. Image source: Getty Images.

For this reason, I haven't enjoyed the full path of gains Facebook has generated, since I made the hard decision to sell my Facebook stock in March of 2018. 

Simply put, I lost faith in Facebook's management. Founder and CEO Mark Zuckerberg is a visionary, no doubt, but the stranglehold he maintains on the company's voting shares is a serious barrier to any meaningful change in what Facebook allows -- and even chooses to monetize -- on its platform. As long as that remains the case, I don't expect I'll have any interest in being a Facebook investor. 

Social media and shifting the Overton window

Zuckerberg has regularly pointed to the Facebook platform as a way for people to connect to the world. That sounds great in theory, but the sad reality is Facebook's tools seem to work in reverse. In very meaningful ways, Facebook has become an echo chamber. The social network's own algorithm plays a big role here, tailoring user feeds in part based on what users "like" and share.

This creates a distorted view of reality, feeding users a constant stream of rhetoric and noise from Facebook pages and groups that often drown out posts and shares from real people. As a result, the "Overton window" has shifted, and extreme views on everything from politics to medicine to science -- even whether the earth is a sphere or flat -- have been given greater credence. 

Facebook's responsibility, and where it has failed us

I'm not here to blame Facebook for the extreme views of society. That's unfair, and it's unreasonable to expect any social platform to police such things. To a very large extent, users should bear responsibility for what they share and post, and Facebook shouldn't act as an "arbiter" of truth on everything that's posted. 

Facebook does have an obligation, however, to better control who is able to use its platform. There are plenty of groups with the motivation and incentive to use the platform to distort and mislead for their own benefit. Facebook hasn't done enough to combat this. 

There has been some movement in the right direction. Facebook has made it easier for users to report posts that violate the company's policies, including fake news, hate speech, spam, or other objectionable content. Facebook has also added a button on posts with shared links that gives information about the content and the website it is from. 

Yet Facebook continues to fail in a key way: what it monetizes. While Twitter made the decision in late 2019 to no longer accept political advertising, Facebook has shown zero interest in following suit. To the contrary, 2020 is a presidential election year in the U.S., and the social media giant likely expects this to be its biggest political ad-spend year ever. 

Facebook could be heading toward a repeat of the 2016 debacle, in which it accepted millions of dollars for promoted posts from foreign groups making a concerted effort to disrupt the political process in America. As long as Facebook agrees to accept political ad money, its own financial incentives will run counter to the platform living up to its mission to "...give people the power to build community and bring the world closer together."

Doing well, but not doing good

Every company has environmental, social, and governance obligations, and so-called ESG investing has become more important to millions of people in recent years. Facebook's social mission matters more than most, simply because of what its business is. Facebook has failed all stakeholders in this area, including investors, users, and advertisers. 

We've already seen many advertisers recently step away from the platform because of concerns about Facebook's inaction on these issues. How long the "boycott" lasts remains to be seen, but it's evident that I'm far from the only concerned party. 

Looking ahead, Facebook could continue to beat the market. It's absolutely dominant, and marketers will go where the customers are watching. 

But investing isn't just about making money; it's also about participating in building something that makes the world better. Facebook absolutely has the power to do that, and it certainly does good for millions of people. But at the same time, the harm that the platform causes -- evidenced by the kinds of advertising it will take money for -- outweighs the good for me. And until that changes, I won't own a single share.