After the unmasking of Facebook (META) whistleblower Frances Haugen, her explosive testimony before Congress, and the release of The Wall Street Journal's months-long investigation titled "The Facebook Files," the eyes of the world are on Facebook more than ever before. In this segment of Backstage Pass, recorded on Oct. 6, Fool contributors Brian Withers, Demitri Kalogeropoulos, and Rachel Warren discuss these recent events and share their thoughts about the future of the big tech company.

Brian Withers: We're going to continue on with the Facebook thing because I think it's important. Sometimes your companies, the stocks that you own, get in the news for reasons you're not excited about. [laughs] This is a little bit of approaching things from that direction. Certainly Frances Haugen, I think that's how you say her name.

The Facebook whistleblower, and I know you guys mentioned you watched some of the testimony yesterday, came across to me as extremely knowledgeable and credible. When she didn't know the answer, she didn't make stuff up; she said, "That's not my department, but here's what I've seen." She was just really a star witness. Now, Facebook comes out and is trying to discredit her, saying that she wasn't in the know, that she'd only been there for two years.

Zuckerberg even came out and said her claims don't make any sense. You've got management on one side saying one thing, and you've got this whistleblower who seems incredibly credible on the other side. How should investors know who to believe? Demitri, why don't you take this one.

Demitri Kalogeropoulos: Sure. Yes. That is a tough question. I wasn't able to watch the testimony, but I did read some of it, and like you said, she did seem knowledgeable and credible. She was commenting on some areas that she was working on, for sure. But what I'm picking up is that this clash seems to come down to a disagreement about Facebook management's intentions and priorities. That's what I'm watching as a shareholder; it's definitely worth following.

If you're going to be an investor in a company, whatever company it is, that's what's in question here. Zuckerberg did say that his team definitely isn't trying to hide anything about the internal research they've been doing on the impact of social media on different groups. He's saying that their team does try to prioritize the well-being of the platform's users, and that's important.

On that note, it's definitely a good thing, I think, that the company is doing this research on its own. It's looking into ways it can improve the experience. You would expect them to do that, for sure, and to look for ways to have people leave the platform feeling better than when they got onto it.

That's the tricky part, I think. If I could recommend anything to management, it would be to get as public as possible about this stuff. Get it out. Get these research tools that they're looking into out in the open. Facebook, as you said, has something like 2 billion users and growing right now. They are the only company I know of that counts active users in the billions.

Whether they like it or not, they're definitely the leader in this controversial social media space. They're going to have to get out and lead on this stuff. I'd prefer them to get ahead of these things rather than run from behind, which is what it seems like they've been doing. Otherwise they're just going to have more really tough periods for the brand like they've been going through lately.

Rachel Warren: I actually listened to a good portion of the testimony yesterday. I agree that she appeared to be very sincere and thoughtful about what she was saying. There were moments, for example, when she was under questioning where questions posed to her were phrased in a way where the answer would seem to be automatically disparaging to Facebook.

She would add a caveat, maybe, "Oh, I wasn't in that department, I don't know," or she'd clarify, "Well, no, that's actually not what their process is." She wasn't trying to trash anything Facebook does right out of the gate; she had very thoughtful, seemingly data-based answers for what she was saying. She seemed to be speaking in a fair and balanced manner.

But like I said before, at the same time, I think Zuckerberg and his team have a right to be heard as well and to share their side. But I think the key point is that they need to get ahead of this, because there's so much data alleging that these are things that were known to the company and were causing meaningful harm to users.

You need to get ahead of that and say, OK, did we know or did we not know? I think these are questions that need to be answered. I think it's to be expected that perhaps they would try to discredit what she's saying.

Although I do think that if Facebook tries to simply dismiss her claims as not making sense, that maybe isn't really going to hold water for very long. I don't think it's enough to dissuade users from wanting questions answered about what Facebook knew and when they knew it, and whether they prioritized the bottom line over product safety. I think investors have a right to know that as well.

I think the interesting thing is, with any other type of business, you wouldn't really be asking these questions of whether or not the company might be prioritizing its profits to a certain extent. I mean, obviously it's a business; it operates to hopefully turn a profit.

But at the same time, there's a responsibility to ensure your product is safe. In this case, it's a digital product, one that connects billions and billions of people around the world. I think investors should keep their minds and eyes open as more information comes out.

I honestly think it's too soon to really know. I think the underlying core of Facebook's business is still a quality business. I think there are a lot of pitfalls to social media that are well known, but it also has a lot of benefits. Again, it really comes down to how you use it. I do think when you have a company that is on the verge of a monopoly like Facebook, you do need better accountability and clarity where some of these issues are concerned.

But like I mentioned, at the same time, this is a business; it's going to operate according to what benefits the top and bottom line.

I think the question is, when you have a company that wields really so much influence in people's everyday lives around the world on a level that we haven't really seen elsewhere in human history, when does an ethical responsibility come into play to moderate and curb these new and existing products? I think the general public deserves clarity on that. I definitely think that's what investors want to know and deserve to know.

Withers: It's really interesting what you're mentioning, Rachel, about the responsibility piece. That's what's different, I think, about software compared with physical products. When you talk about things that can hurt other people once they are sold out into the marketplace, Tesla can only do so much to encourage people not to speed, to wear their seat belts, to use Autopilot responsibly, etc.

The same can be said for any other product that could potentially cause harm. But with a software product, there's a lot that Facebook knows. Somebody may have lied to get on the platform, but how often they use the platform is absolutely something that could potentially be managed in some way.

The fact is that Facebook knows this. Now, they don't necessarily know how people are feeling after using the platform and whatnot. I think you guys both made references to third-party studies or other studies.

That was something the whistleblower brought up, too: many other industries have these third-party bodies set up where you can get independent views and perspectives, and maybe those need to be better publicized, or Facebook needs to embrace some of those and recognize them. But I'm with you guys; they need to get out in front of this and continue to take it seriously.

Even more so just for the people who are impacted by the platform in negative ways. I would just think that a company wouldn't want to be the sole reason why somebody does something or has mental health problems because of the product.

It's a tough one. It will be interesting to watch over the next few months or however long this goes on.