Facebook (NASDAQ:FB) has made it possible to share your stunning vacation pictures, a shot of your meal at a gourmet restaurant, or any other image from your life. When done as intended, that sharing is a wonderful thing that connects friends and family no matter where they are.
Of course, that ability to share images has a dark underside. On the innocent end of the scale, your parents might share a mildly embarrassing childhood photo of you, or your friends may post a picture of you in an outfit you wish had stayed forgotten.
But not all sharing is designed to be positive, or to poke only mild fun at people you love. Some people use Facebook to humiliate others, and the company has begun a new effort to stop that from happening.
"When someone's intimate images are shared without their permission it can be devastating," wrote Facebook's global head of safety Antigone Davis in a blog post. "To protect victims, it's long been our policy to remove non-consensual intimate images (sometimes referred to as revenge porn) when they're reported to us -- and in recent years we've used photo-matching technology to keep them from being reshared. To find this content more quickly and better support victims, we're announcing new detection technology and an online resource hub to help people respond when this abuse occurs."
What exactly is Facebook doing?
The social-media site isn't just making an effort to detect nudity, according to Davis. It's also using machine learning and artificial intelligence to "proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram."
That, she explained, helps the company find the inappropriate (and often illegal) content before it even gets reported. Doing so helps victims by taking the process out of their hands and making it happen faster -- before the image or video has been widely seen.
"Often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared," she wrote.
Potentially improper posts are reviewed by a "specially trained" member of Facebook's Community Operations team. If the image or video violates the company's standards, it will be removed.
"In most cases, we will also disable an account for sharing intimate content without permission," Davis wrote. "We offer an appeals process if someone believes we've made a mistake." In addition to the new detection technology, Facebook has also begun a pilot program that's being run jointly with victim-advocate organizations.
"This program gives people an emergency option to securely and proactively submit a photo to Facebook," Davis wrote. "We then create a digital fingerprint of that image and stop it from ever being shared on our platform in the first place."
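Facebook hasn't published the details of how that digital fingerprinting works, but the general idea behind photo-matching systems can be sketched with a simple perceptual hash: reduce an image to a compact fingerprint of its brightness pattern, then compare fingerprints so that near-identical copies (re-encoded or lightly altered) still match. The tiny grids and threshold below are illustrative assumptions, not Facebook's actual algorithm.

```python
# Illustrative sketch of fingerprint-based photo matching (an "average hash"),
# assuming images have already been decoded to small grayscale pixel grids.
# NOT Facebook's actual system -- just the general technique.

def average_hash(pixels):
    """Return a bit string: '1' where a pixel is at least the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p >= mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# Two nearly identical 2x2 "images" and one very different one (made-up data)
img_a = [[10, 200], [30, 220]]
img_b = [[12, 198], [29, 221]]   # slightly re-encoded copy of img_a
img_c = [[200, 10], [220, 30]]   # a different image entirely

fp_a, fp_b, fp_c = map(average_hash, (img_a, img_b, img_c))
print(hamming(fp_a, fp_b))  # 0 -> fingerprints match, upload would be blocked
print(hamming(fp_a, fp_c))  # 4 -> no match, upload proceeds
```

The key property is that the platform only needs to store the fingerprint, not the photo itself, and can check every new upload against the stored fingerprints with a cheap bitwise comparison.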
A plan for privacy
These programs come as Facebook CEO Mark Zuckerberg has talked about improving privacy at the social-media company. That's a move necessitated by the many questionable decisions the company has made about how it shares the data it collects from its users.
To continue to grow, Facebook and Instagram need to be safer places. That means the company has to use every tool at its disposal to fight improper posts, while giving victims and potential victims everything they need to protect themselves.
Facebook has alienated some users by selling data. Zuckerberg has addressed those concerns, but it will take time to win back consumers' trust. The company is taking an active role in policing its own space and making it safer for users, and these are important steps.
This is a case where doing the right thing is also the right business move. Facebook and Instagram need billions of users to feel free to share without having to worry about those platforms being used to humiliate them. Taking these steps won't guarantee that never happens, but it should cut down on people using the platforms for revenge.