A number of big technology companies rely on user-generated content, and under current U.S. law they're generally not liable for what users post. That has become an increasing concern now that deepfake videos exist -- that is, videos in which people appear to be saying or doing things they never said or did. That has led to calls for increased regulation, or at least for more effort to make sure these videos aren't shared. And, of course, the calls for regulation go beyond deepfakes, asking that these companies make a much stronger effort at self-policing.



This video was recorded on June 21, 2019.

Dylan Lewis: The focus so far in this conversation has been bias, but I think the reality is, it doesn't matter what side of the aisle you're on: people seem pretty annoyed at these platforms. Representatives on both sides have spoken out about these platforms needing to be policed more, needing some vetting process for the content that's out there. A lot of that came after the deepfake video of Nancy Pelosi circulated online. I think the reality for these businesses is, they've enjoyed all of this user-generated content for a very long time with a free pass, and they're now going to have to start dealing with the fact that there are serious repercussions for spreading content virally in a way that maybe they weren't prepared for.

Dan Kline: And the technology is getting better. Let me explain what a deepfake is. There's a lot of video of Dylan and myself out there; we've done a lot of these shows. In theory, a not-that-talented programmer, videographer, whatever you want to call them, could take video of us talking about Facebook and turn it into a video of us saying terrible things about the U.S. men's national soccer team. I picked the most benign topic I could possibly think of. And in this video, we're just terrible. We're trashing the coach, we're doing everything. And we get a huge backlash. Facebook doesn't know it's fake. We don't know it's out there. And all of a sudden, there's this whole community of people who don't like us.

What happened with the Pelosi video is that a doctored video made her look like she was giving a speech while intoxicated. It wasn't true, but Facebook didn't actually take the video down; they didn't really have the correct procedures in place to deal with it. So, forgetting legislation for a moment, there need to be very high-tech methods of figuring out whether a video is real. There's already been a Mark Zuckerberg deepfake video. It would be very easy to create a video of almost anyone saying almost anything, and you can see, easily, why that would be very bad.

Lewis: Yeah, that was probably one of my favorite responses to the Nancy Pelosi issue -- someone decided to go out there and make one of Mark Zuckerberg, just to see: OK, this is how you handled it when it was a political representative; how would you handle it if it were your own CEO saying things that obviously don't look great in the public light?

Kline: Right. Are you a Howard Stern fan, Dylan?

Lewis: I'm not, but I know our man J-Mo is.

Kline: Howard Stern has historically taken people's audiobooks and cut them up to have them say things they clearly would not say, but it sounds fake. [talking choppily] "The person is talking like this!" So you get the joke. When you watch a deepfake video, you do not get the joke. You think, "Oh, my God, did Nancy Pelosi give a drunk speech? Is Dylan Lewis, for some reason, in the Industry Focus chair going on a terrible rant?" That has to be dealt with. Then there's figuring out what counts as legal but odious speech that has to be protected: you're allowed to have unpleasant views; you're not allowed to make libelous claims or push conspiracy theories that have been disproven. How do you police that? What's the happy medium between self-policing and the government stepping in and saying, "This is how it has to be done"?

Lewis: Yeah. I think the worst-case scenario for these companies is that regulation makes it much harder to host the user-generated content people want to engage with; that makes it harder to get people to come to the platform, which in turn makes it harder to serve up ads. Regardless, we've gotten very used to executives and management from Facebook, Alphabet, etc., appearing on Capitol Hill, and it seems like that's going to continue.

Kline: It's going to continue. And look -- Facebook, YouTube, all these companies have to invest more heavily in figuring this out. They have to be able to show, "Yes, we kicked this person off our service. Here are the terms they violated. And here's how those terms are being applied uniformly." Because, yes, if you kick off a right-wing extremist for violating the terms, that's OK; but if that person can come back and say, "Hey, wait a minute, here's this guy on the other side of the aisle, and what he's doing violates the terms as well," you have to be able to defend that. And right now, they're not spending the money.

What's not going to work is trying to do this by hand -- and I can speak from experience; I used to edit two daily newspapers where I had to moderate the comments. Those papers were read by maybe 20,000 people total, and there were not enough hours in the day to moderate the comments and deal with the complaints. When I rejected a comment, usually because it personally insulted someone else in violation of our policy, I would spend hours a day on the phone with people upset about it. You have to figure out how to keep this automated and real time, and that means investing a lot more than these companies have been investing.

Lewis: Yeah. And the reality is, a couple of years ago, a company like Facebook could plausibly have said, "Oh, we didn't realize the scale we've reached in terms of people accessing information, and our role in spreading information and making it viral."

But for at least the last three or four years, it's been pretty clear: you know what? You guys know what you're doing, and you need to be held accountable for it, because there's too much bad information out there, and it's really ruining public discourse.