"I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves," Facebook (META 0.43%) CEO Mark Zuckerberg wrote on the platform back in the wake of the 2016 U.S. presidential election. The social networking giant had been widely criticized for its role in allowing misinformation to spread on the platform, potentially swaying the results of the election.

In the years since, management has repeated this refrain. COO Sheryl Sandberg similarly said Facebook is not an "arbiter of truth" in 2017, and Zuckerberg echoed his previous sentiments less than two months ago. Yet Facebook is starting to do just that, implicitly recognizing its responsibility in the viral spread of misinformation in the age of social media.

[Image: "Fake News" in all caps over a background of letters and numbers.]

Image source: Getty Images.

Supporting "high quality news"

Yesterday, Zuckerberg said that Facebook is exploring adding a new section to the platform dedicated to "high quality" and "trustworthy" news. In a blog post discussing a recent conversation he had with Axel Springer CEO Mathias Döpfner, the chief executive wrote, "We talked about the role quality journalism plays in building informed communities and the principles Facebook should use for building a news tab to surface more high quality news, including the business model and ecosystem to support it."

Facebook is even open to supporting the effort financially, potentially paying publishers a licensing fee. Many have criticized Facebook for exacerbating the decline of local journalism, siphoning away eyeballs and ad dollars, but now the company wants to help improve the economics for publishers that it helped wreck. Facebook isn't looking to create a paid product but would absorb any associated costs. "This isn't a revenue play for us," Zuckerberg said.

Checking facts on WhatsApp

Meanwhile, Facebook subsidiary WhatsApp just rolled out a new fact-checking service in India ahead of this month's elections there: users can forward questionable messages to a dedicated tip line called Checkpoint Tipline, operated by local start-up Proto. The service will attempt to mark messages as true, false, misleading, or disputed in an effort to limit the spread of misinformation.

WhatsApp is incredibly popular in emerging markets and has similarly played a role in the proliferation of fake news around the world. That includes last year's election in Brazil, as well as numerous cases of questionable rumors that led to deadly violence in India, Myanmar, Sri Lanka, and Mexico, among other countries.

[Image: Two chat bubbles with a lock where they overlap.]

Image source: Facebook.

Ironically, WhatsApp's end-to-end encryption, which Facebook wants to bring to all of its messaging services once it integrates them, makes misinformation easier to spread and harder to police, since the company can't see the contents of what users are sharing.

Getting sick from Facebook

Last month, Facebook also said it was taking new steps to fight misinformation around the safety of vaccines. The company will try to limit the distribution of vaccine hoaxes, which have contributed to the growth of anti-vaccine communities around the world and created public health crises in the form of preventable outbreaks, such as the measles outbreaks that have now spread to 15 states in the U.S., according to the Centers for Disease Control and Prevention.

"Leading global health organizations, such as the World Health Organization and the US Centers for Disease Control and Prevention, have publicly identified verifiable vaccine hoaxes," global policy exec Monika Bickert wrote. "If these vaccine hoaxes appear on Facebook, we will take action against them."

Those actions include reducing the ranking of Pages and groups that share hoaxes, removing them as recommended content, and prohibiting the content from being monetized with ads.

The arbiter of truth we need, but not the one we deserve

Combined, all of these actions show that Facebook does indeed recognize that it needs to be an "arbiter of truth," even if it doesn't necessarily want to. Investing in "high quality" journalism is an implicit endorsement of the underlying veracity, particularly if Facebook is paying for the content. A fact-checking service is an overt effort to distinguish truth from lies. The spread of vaccine hoaxes on Facebook is creating public health emergencies, and the company can no longer turn a blind eye.

Scale has always been Facebook's greatest enemy, and people have been lying on the internet since its earliest days. "Of all the content on Facebook, more than 99% of what people see is authentic," Zuckerberg wrote in that 2016 post. "Only a very small amount is fake news and hoaxes." But with 2.7 billion people logging in to one of Facebook's core services every month, even 1% of malicious content can adversely affect millions of people.

Given all of Facebook's controversies over the past couple of years, the tech giant may not necessarily deserve the role, but Facebook users need something -- anything -- in the battle against fake news.