It was recently reported that about 50 million people had their Facebook (NASDAQ:FB) data scraped without their consent or knowledge. In this episode of Industry Focus: Tech, host Dylan Lewis and Motley Fool contributor Evan Niu take a closer look at the timeline of the privacy scandal, Facebook's terrible response to it, and what this could mean for the company going forward.

Find out how worried Facebook should be about the #DeleteFacebook movement, why management's response to the scandal has been so discouraging, what to keep in mind about CEO Mark Zuckerberg's ability to run the business, which particular metrics shareholders should keep an eye on in light of the scandal, and more.

A full transcript follows the video.

This video was recorded on March 23, 2018.

Dylan Lewis: Welcome to Industry Focus, the podcast that dives into a different sector of the stock market every day. It's Friday, March 23rd, and we're talking Tech, and whether to defriend Facebook. I'm your host, Dylan Lewis, and I'm joined on Skype by senior tech specialist, Evan Niu. Evan, it's been a pretty brutal week for Facebook and Mark Zuckerberg.

Evan Niu: [laughs] That's an understatement.

Lewis: I can't really think of a worse playbook for how to handle a corporate crisis than what we saw play out over the past couple of days. It's been a little bit all over the place. We're going to run through the timeline of what's happened with this Cambridge Analytica story, but the thing that was so tough for me to see is the immediate denial of any trouble or wrongdoing and saying, "This wasn't a hack, this wasn't a breach." When you see a company do that from the get-go, that always makes you scratch your head a little bit.

Niu: Yeah, their response to this story from the very beginning has not been the best way to handle a scandal like this. [laughs] 

Lewis: So, for people who may not be familiar with what's been going on, this is something that's been years in the making. Facebook used to allow third-party apps to scrape user information, and there would be some element of user opt-in for a lot of these. It would be part of surveys or apps that were available on the platform at the time. So, the information that would be grabbed would be things like, things that you like on the platform, maybe some information on your friends as well.

You go back to 2014, Cambridge University researchers were using this for academic work with personality tests, and they were approached by this political consulting firm, Cambridge Analytica, and they decided not to work with them. And then, one of the researchers from that group, Alexander Kogan, basically went off and built his own version of it and then worked with Cambridge Analytica to give them this information, which was actually in violation of the terms and conditions that developers accepted when they worked on Facebook's platform at the time.

Niu: Right. And I think this was one of those personality quizzes, where you just check a box that consents to share your data. No one reads these things anymore. But the thing is, at the time, in 2014, Facebook's tools were set up so that you could consent to share your friends' data. So, even if you never took one of these personality quizzes yourself, if you have a friend who did, and there are a lot of people who like taking these types of quizzes on social media, that could inadvertently expose your data.

Lewis: And that's part of what makes this so controversial. I think the number that I've seen is, of the millions of people who had their profile scraped, only a tiny portion, I think it was 270,000 people, had agreed to let researchers scrape their data to begin with. So, you talk about the add-on effect of letting your friends' information come through, too, and that's where people start to get really mad. It's like, Betty, their high school classmate or someone like that, decided to take this survey, and now they might have my information. Of course, some of the issue here, too, is that this was being collected under the guise of academic research and that it wasn't going to be used for commercial purposes. And at the time, had Kogan simply collected the data and kept it for academic purposes, it would have been fine, according to Facebook's developer policies. The problem is, he then gave it to Cambridge Analytica, which violated the developer agreement.

Niu: Right. It's crazy, because the multiplier effect is huge: 50 million people's information from around 300,000 people, because people have hundreds of Facebook friends. It's pretty crazy that they even let that happen in the first place back then. That doesn't sit right with me. Just, common sense, it seems silly that you would allow one person to share all of their friends' data, too. It's kind of a big oversight on Facebook's part, there.

Lewis: Yeah. And over time, they've worked toward something that's maybe a little bit more palatable for people. What's amazing is, this story goes so far back, and the versions of it are so different depending on what your time reference is. If you go back to 2015, Facebook learned that information was being turned over to Cambridge Analytica, so they removed Kogan's app from their platform and requested confirmation that all the data had been deleted. And you think that's that. What winds up happening is, it resurfaces several years later, because there's a possibility that maybe that data wasn't deleted and it's still out there -- these raw files, these profiles, are still available. And given some of the issues, the perception that there's been election meddling and that type of stuff, the idea that you could target people on social media in a hyper-focused way is very unsettling to a lot of people, understandably so.

Niu: Right. And there's been quite a lot of backlash now, which we'll get to later. It's been a pretty wild week. They initially threatened to sue some of these news outlets, and now they're like, "OK, we shouldn't have threatened to sue you." [laughs] Just, their whole response has been terrible. They really fought hard to try to keep this out of the public eye, but of course, now it's out there, and now they have to deal with not only the fact that it happened, but also their own response to it, which has been sorely lacking in a number of ways.

Lewis: Yeah, this came back into the light because of some reporting done by The New York Times and, I believe, another outlet. And Facebook did not respond particularly well. They moved to ban Kogan and Cambridge Analytica from the platform, but the initial responses from company officials were saying, "This isn't a hack, this isn't a breach, this is something that these companies were authorized to do at the time that they were doing it. It's just that they then decided to violate our terms and conditions with what they did with the information that was collected." And, as someone who may have just had a ton of information about them exposed, that's not reassuring at all.

Niu: Yeah. If anything, it's actually even more troubling, that third-party developers could get this information all using the Facebook tools in the way that they were designed at the time. Of course, those developers shared it in an improper way. But, the point is, Facebook tools back in '14 were designed in a way that straight up allowed this to happen in the first place. So it's almost even worse than getting breached by hackers maliciously.

Lewis: And what didn't help, either, was that Mark Zuckerberg waited four days to respond. So, you had all of these other people at Facebook making public comments and talking on the record about the issue. Four days later, you get a Facebook post from Mark Zuckerberg, and he gives a timeline of the events and kind of the plan for the platform, and this is what people wanted all along. They want the CEO to stand up and say, "We're going to be accountable for this." I think maybe we got that a little bit. But I also noticed, in looking at the formal post, he never actually said sorry or apologized.

So, there's some element where he's accountable for this, and there's this "with great power comes great responsibility" element to it, and he clearly understands that they need to make some changes to the platform. But, he wasn't like, "Hey, guys, we really messed up. I'm so sorry about this." He was still a little guarded in what he said. 

Niu: Right. Definitely, that was a big omission in his statement. I will say, I think he did a good job of at least laying out the timeline in a pretty clear, concise way, because some of these news reports are kind of hard to follow, but he laid it out very straightforwardly. He also laid out their plan for how to move forward, which is to investigate and audit apps that had access to your data before they made these changes to how these tools work. They're going to restrict the access that developers have going forward, and they're going to automatically remove access for any app you haven't used in three months, I think. They're also going to make these privacy tools more prominent, because right now they're buried in the settings menu on Facebook. They're going to put them right up front and center on the News Feed so that people know they're there, reminding people that they have more control over their privacy settings than they might think. It's a way to make sure people understand how their data is being shared, and to try to give them more control.

Lewis: So, we talked a little bit about the media backlash that happened with this. There's also the user backlash that comes with an event like this. And perhaps not surprisingly, #DeleteFacebook became a trending topic on Twitter. In my view, at least, one of the more damning public comments on the whole topic came from WhatsApp co-founder Brian Acton, who posted that he was deleting his Facebook. He's not currently with WhatsApp, but Facebook wrote WhatsApp a pretty big check as validation for their work.

Niu: Yeah, he definitely got rich off of it. [laughs] 

Lewis: So, for someone that has some ties to the business to be like, "You know what? I'm deleting my Facebook," I don't think that bodes particularly well for public perception of your platform.

Niu: Right, exactly. I think, if we want to look at it from the angle of investors, there are three main angles to consider: the user impact, the advertiser impact, and the regulatory risk. 

On the user side, like you said, people are outraged. This #DeleteFacebook is trending. If you look on Google Trends, the term "Delete Facebook" is rising there, too. So, it's clear that there's a lot of user backlash, and a lot of people are talking about deleting, deactivating. What's not clear is if this backlash is going to be sustained, or if it's just the initial gut reaction, and if you fast-forward a few months it fades away. We don't know. It's too early to say. 

But, investors will definitely want to keep an even closer eye on user metrics in the coming quarters. User metrics are obviously one of the headline metrics anyway, but now it's even more important to really see what happens to these numbers in the wake of this scandal. We've already seen that users in North America are kind of saturated, maturing, so those numbers are flatlining already. Which is not a bad thing, but if that number starts to head down, that's a trend investors are going to want to keep an eye on.

It's also worth noting, in certain parts of the world, deleting Facebook isn't really even an option. In some parts of the world, Facebook is the internet, particularly in some emerging markets like in Southeast Asia, where users rely heavily on Facebook as their primary way to interact with government representatives, their communities, things like that. So, Facebook has its work cut out for it to regain users' trust.

Lewis: Yeah. And, actually, it doesn't seem like you're buying the trust story, because you deactivated your Facebook account, right, Evan?

Niu: Yeah. I actually personally deactivated my account because this whole thing is really creepy. I'm not super active on Facebook anyways, and I certainly don't take these quizzes. But, as you mentioned before, maybe one of my friends took a quiz and ended up sharing my data. Who knows? But, I haven't really been getting a whole lot of value out of Facebook lately. In recent months, they've really ramped up their notification spam to this really annoying, obnoxious level, where basically, you're getting notified when anyone anywhere on your friends list does anything on Facebook at all. It's really obnoxious. I didn't go as far as to delete my account, so I know they still have a lot of my data. But, I'm definitely going to take a break from the platform and sit on the sidelines for a little bit.

Lewis: I think those push notifications, in a way, speak to how we've interacted with Facebook and how that's changed over the past five, 10 years. Back when I was in college, you'd post statuses all the time, you'd write on people's walls all the time. And that's really not how a lot of people use it these days. It's almost more like this consuming thing. You scroll through your feed, you see videos, but that engagement isn't quite there in the same way. I get updates sometimes like, someone updated their status, and it's like, gosh I haven't done that in probably a year. And it was a major life announcement when I did that, it was like I moved or something. So, I wonder if those are there because they're realizing that people aren't working with the platform the way they used to.

On the note of North America, some recent data came out from Edison Research. They do their Infinite Dial research note, and it just came out for 2018. They survey a couple of thousand Americans. They found that use of Facebook among Americans dipped for the first time ever in the most recent survey, and that drop was most noticeable with users between 12 and 34, where usage fell 12% from 2017. So, we're looking at what's going on. This is, in some ways, a platform that's struggling with a lot of bad news, but I think there are also some demographic shifts happening that may be impacting its relevance.

Niu: Sure. And, of course, we always talk about Snapchat getting that demographic. I think it all plays into what Facebook is trying to do. Recently, you've heard Mark Zuckerberg talk a lot about trying to reduce passive consumption of content on the site, which is, like you mentioned, what people use it for nowadays. You read links, you're not really engaging with people as much anymore. So, I think this shift in their notifications is trying to get people to interact with their friends again. 

But, the hard thing is, I don't think they're good at picking who you actually want notifications about, so they're just letting you know any time anyone posts a photo, updates anything, puts up a comment. It's just crazy, the amount of stuff they do now. I think they're trying to pursue this whole idea of meaningful interactions and engagement, but I don't think they're executing on it very well, and then this whole thing comes along. So, I think another aspect to look at is the advertising side that we mentioned, which is arguably the most important for investors, because that's where the money is coming from, right?

Lewis: And we're going to do that, Evan, on the second half of the show. Evan, we talked a little bit about the user side of things. Why don't we talk a little bit about the advertiser side? All of these elements of the story that are unseemly to consumers, it's exactly what makes Facebook such a compelling platform for marketers.

Niu: Right, exactly. I would argue that the advertiser impact is the most important aspect for investors to look at, because that's where all the money comes from. And thus far, over the past few years, this whole controversy over Russian meddling and using social media and Facebook to interfere with the election has been this huge overhang on Facebook as far as the coverage, the sentiment, and the perception of the company. But we haven't seen any impact on the financials. For example, last year -- and again, this controversy has been going on for about two years -- advertising revenue was up 49% to $40 billion in 2017. So, there's clearly not a whole lot of impact on the financials as far as advertisers are concerned. 

I think the irony here is, the whole situation really just shows how effective Facebook ads really are, so much that they could have potentially helped sway the election. That tells advertisers that their ad dollars are still well-spent on this platform, because the ads are effective and they're doing their intended purpose. Of course, advertisers follow users, and users also steadily increased throughout 2017 as well, despite all this ongoing controversy. So, it certainly does tie back to the user impact. If users start to actually leave Facebook in large numbers on a sustained basis, that could subsequently impact the advertising business and the financials. But, again, it's still a little too early to call. Mark Zuckerberg was on TV yesterday, and he said that he hasn't seen a lot of people deleting in large numbers --

Lewis: Who knows what that means? [laughs]

Niu: Right, exactly.

Lewis: On a two billion user number, who knows what large numbers mean to Mark Zuckerberg?

Niu: Exactly, and where, geographically, those users are located. So, there's a lot of details that we don't know yet, but the potential is still there that the financials could hurt if there's massive user backlash.

Lewis: And something that's kind of interesting in thinking about user composition on a social media platform is, I've long had this theory that it's good for a platform to have folks outside of the millennial demographic on it. You look at click-through rates, you look at people's willingness to engage with ads, and I think that tends to be higher as you get above 35. Most millennials have come of age in an era where they've been inundated with ads online, so you have ad blindness in a way that maybe you don't for people who came to the internet later in life. So, think back to that stat from before, where the drop was most felt among users between 12 and 34; get outside of that range, and those might be the people who are staying. And everyone who seems to be leaving in that 12-34 group is going to Instagram. That's another property that Facebook owns.

Niu: Right. If you're going to delete Facebook, technically you should delete Instagram, too.

Lewis: Yeah, you can't just protest one, right? You're still using their products.

Niu: Throw away your Oculus Rift, delete WhatsApp. [laughs] 

Lewis: Yeah, you have to go whole hog. You can't just do part of it. So, I think from an investor perspective, though, something that's on my mind with this story is the regulatory impact. Any time you have a lot of people throwing their hands up and wondering about the privacy of their data or the security of their data, that's where lawmakers start to get interested, and they start to knock on your door and say, "Hey, we need to have a couple of chats." We're seeing that that's happening already. People want to talk to Mark Zuckerberg. Thinking about the digital media business in general, that's what I'm looking at for Facebook and even Google, too.

Niu: Right. Amidst all this Russian election stuff, there's been a lot of talk over the past year or two about regulating political ads specifically. But now, the conversation is expanding to talking about other forms of regulation, more broadly around privacy in general. And regulatory risk is a part of many industries, and now that risk is becoming a very real possibility for Facebook. 

Fundamentally, no company likes being regulated, even though regulations are generally pro-consumer. Not only are there always substantial costs associated with compliance, but companies basically lose a lot of control over how they can run their businesses. And that can be very burdensome, particularly for tech companies, which put a lot of value on agility, because tech evolves so rapidly that you need to be able to move fast and keep up. Facebook's motto has always been "move fast and break things," but that doesn't really work out very well when the thing you're breaking is democracy. [laughs] 

Lewis: [laughs] Or if you're breaking user confidence, that's not particularly great, either, right?

Niu: Yeah. And to bring this home, the E.U. actually passed a regulation back in April 2016 called the General Data Protection Regulation, or GDPR, that goes into effect in May of this year, which is just two months away. It imposes rules across all E.U. member states that apply to all companies processing data on E.U. residents. And violations can come with potentially massive fines, up to 4% of revenue. That's the theoretical maximum, and a lot of companies, if they violate these types of things, can usually settle for a lot less. But the point is, a lot more countries are becoming cognizant about taking the privacy of their citizens seriously, and this whole controversy speaks to that. Facebook is certainly aware of this E.U. regulation, because it was passed almost two years ago. But there's the possibility that the U.S., or other countries, could do something similar. So, it's another thing that's going to be on the minds of investors; it's a much more prominent risk now.

Lewis: Another question this has bubbled up is whether Mark Zuckerberg should be at the helm of Facebook. And I think, looking at the tour of the United States that he has done over the past year and a half or so, and his repeated trips to China, there's been speculation that his ambitions maybe go beyond Facebook, in that he has some political aspirations at some point. But, thinking about where he is now, some people are saying that, given the way the platform has been handled over the past year and a half, two years, maybe he shouldn't be in charge. You may feel that way, but I think the reality is, as long as Mark Zuckerberg wants to be at the helm, he is going to be running the show there.

Niu: [laughs] Right, exactly. It's pretty common in scandals and crises like this for people to call for the CEO to resign or step down. But you have to remember, Zuckerberg controls about 60% of the voting power, so no one can really oust him. Technically speaking, that's what a board of directors is there for, to fire a CEO if they have to. But since he has so much voting power, he can single-handedly vote directors in or out. So, those directors are still ultimately beholden to him. There's really not a whole lot of accountability that public investors can demand here. 

So, like you said, he's only going to step down if he personally feels like he's not the right person for the job. But I don't think that's the case, and I think it's pretty clear that he does plan on leading Facebook for the long term. He's still young, certainly, and he hasn't expressed any doubt in his own abilities, other than to try to own these problems. I wouldn't expect him to relinquish the CEO position anytime soon, and there's really not much you can do about it.

Lewis: And speaking of the investor perspective here, I am a Facebook shareholder, as are you, right, Evan?

Niu: Yeah. 

Lewis: How are you thinking about this? What's your take going into the next couple of months with this company?

Niu: I don't know. It definitely has me questioning Facebook's role. Again, from an investing perspective, the numbers are still there, the financials are still there. But then, you have to question it: they've clearly screwed up really badly, and they're not handling it well. But lots of companies face huge scandals and issues, and if they can overcome them, they can still thrive in the long term. I'm still kind of in wait-and-see mode. I'm certainly questioning it, but I'm not making any decisions quite yet. I'm going to wait a little bit and see how this plays out over the next few months, but I'm certainly not feeling happy about my holdings right now. [laughs] 

Lewis: I think what's a little frustrating, too, if you're looking for indicators as an investor, is the way that the MAU number is calculated. It might be that any serious exodus from the platform won't really bear out until calendar Q2 results, and that in Q1, the numbers are strong enough that it doesn't really matter. So, if you're looking at that as an indicator, there might be an information lag there. Which is why -- putting this out to listeners -- if you have decided to deactivate or delete your Facebook account, please write in to the show: tweet us @MFIndustryFocus or email industryfocus@fool.com. I would love to get your rationale for it in a sentence or two. Please do that.

My perspective as an investor is, I think about how many people are on the platform. People get really annoyed all the time when this type of stuff comes up. The reality, though, is that we've entered this social contract with these companies, where we're not willing to pay for what they're offering, so we're willing to be the product. And we get up in arms sometimes when it doesn't quite go the way that we'd like it to, or there's some unseemly element to that relationship. But, at the end of the day, we're not willing to pay to be connected with our friends online. It's the case with Facebook, it's the case with Instagram, Snapchat, Twitter, you name it. There really isn't a paid-for social media experience out there. And so long as that's the case and they're the big titan here, I don't really know that anything is going to change that much.

Niu: And it makes you appreciate this crusade that Apple has been on for the past two or three years, really highlighting how they approach privacy. They've really antagonized these advertising companies, including both Google and Facebook. But, at the core of this privacy debate, exactly like you mentioned, we've agreed to this, implicitly or explicitly. So, how angry can you be when things like this happen, when you knew all along that this was a possibility, vs. with a company like Apple, which works much harder to safeguard your data and has no business built on selling it? So, it does resurface this ongoing debate that has always been there. But, yeah, we have given them this data willingly, so we're also kind of complicit when things go wrong.

Lewis: Yeah. And if you have issues with the idea that you're then being targeted with very specific messages, that's still a message you have to be susceptible to. It's still just seeing a marketing message.

Niu: [laughs] Yeah. It's targeted.

Lewis: And you have to decide that that's something you want to react to. In some ways, it's the nature of our digital world. I think it's kind of unfortunate. And in some ways, Facebook handled this well: when they got news of it, they reacted, though maybe they didn't respond as strictly as they could have when it came to laying out a platformwide response and doing things to scale back and make sure this is less likely to happen again, or that Cambridge Analytica can't continue doing what they had already been doing. My issue with them as a company really was how they responded this most recent time, where a lot of bad news came out and they immediately started finger-pointing. Most people didn't know the backstory, that the company had already done quite a bit of preventative work on this, so Facebook immediately looked pretty obnoxious, and that's just not a good look for a corporate enterprise. Just own it. If you make a mistake, just own it.

Niu: Yeah, exactly.

Lewis: That's my investing takeaway, and my management takeaway. If I could say anything to Mark Zuckerberg, it would be, just own it.

Niu: He's trying. He's not doing a good job, but he's trying.

Lewis: I don't know about that. That was a long-winded rant to end the show. Evan, do you have anything to add before I let you go?

Niu: I think I've said enough. [laughs] 

Lewis: [laughs] I think we both have. I'm going to cut it here. Listeners, that does it for this episode of Industry Focus. If you want more of our stuff, subscribe on iTunes or check out The Fool's family of shows over at fool.com/podcasts. As always, people on the program may own companies discussed on the show, and The Motley Fool may have formal recommendations for or against stocks mentioned, so don't buy or sell anything based solely on what you hear. Thanks to Austin Morgan for putting up with our rants behind the glass. For Evan Niu, I'm Dylan Lewis. Thanks for listening and Fool on!

Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Dylan Lewis owns shares of Alphabet (A shares), Apple, and Facebook. Evan Niu, CFA owns shares of Apple and Facebook. The Motley Fool owns shares of and recommends Alphabet (A and C shares), Apple, Facebook, and Twitter. The Motley Fool has the following options: long January 2020 $150 calls on Apple and short January 2020 $155 calls on Apple. The Motley Fool recommends The New York Times. The Motley Fool has a disclosure policy.