Motley Fool host Deidre Woollard recently caught up with Tom Kemp, a cybersecurity expert and author of Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy.

They discuss:

  • The implications of biometric data collection in a world where big tech runs rampant.
  • Which companies are getting the privacy game right (and wrong).
  • Angel investing, and workarounds to weak links in existing cybersecurity systems.

To catch full episodes of all The Motley Fool's free podcasts, check out our podcast center. To get started investing, check out our quick-start guide to investing in stocks. A full transcript follows the video.

This video was recorded on Sept. 24, 2023. 

Tom Kemp: That tells you that there is a hunger for privacy. That people do not want to have third-party entities tracking them and selling the data to data brokers. Do we really need to have all these apps leaking out our precise geo-location to conduct commerce in the United States? The answer is no.

Mary Long: I'm Mary Long, and that's Tom Kemp, a cybersecurity expert, angel investor, and author of the new book Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy. Deidre Woollard caught up with Tom to talk about biometric data and new frontiers in the battle for privacy, how AI regulators could find inspiration at the grocery store, and some perfect-world solutions that balance data collection and personal privacy.

Deidre Woollard: I really enjoyed the book, and I just want to start with the most basic thing, the title. So, Containing Big Tech: what is it, and why is it necessary?

Tom Kemp: Absolutely. Yeah, as you mentioned, we have five Big Tech companies: Meta, Apple, Amazon, Microsoft, and [Alphabet's] Google, and they've certainly built innovative products that improve many aspects of our lives. But what we're now seeing, and becoming more aware of, is that their intrusiveness and our dependence on them have created some pressing threats to our civil rights, our economy, and our democracy. Those threats include the over-collection of our data, the potentially problematic ways that AI can be used, and the stifling of competition and entrepreneurship due to their dominant market positions. With this book, I wanted to expose the consequences of Big Tech's digital surveillance, their use of AI, and their monopolies. But I also wanted to provide solutions, because I believe that by containing the excesses of Big Tech, we can ensure that our civil rights are preserved, our economy is competitive and healthy, and our democracy is protected.

Deidre Woollard: Well, we're having this conversation at a very opportune time. In the book, you talk a little bit about Google's acquisition of DoubleClick and their monopoly on both sides of the ad equation. This week we've got the big lawsuit between Google and the Department of Justice, which is all about Google's dominance in search. What's happening here? Do you believe that Google's power in both search and in ads needs to be broken up?

Tom Kemp: Yeah, you're right. There are actually two antitrust lawsuits. The first one is happening as we record this, I think we're in day two, and it regards search. The second one will be about their ad tech platform, which they got through the acquisition of DoubleClick, but that's going to happen later. The first one, US v. Google, focuses narrowly on the company's search engine, and the government alleges that Google, with its 90% market share, is leveraging that position to throttle competition in search. What the government is basically arguing is that Google has maintained this monopoly not by making better products, but by locking down shelf space, basically the places where consumers might be able to find a different search engine. So, for example, it pays out billions of dollars to Apple to become the default search engine, and then someone like DuckDuckGo complains, hey, it takes 15 clicks on an Android device to switch over from Google to their search engine. Basically, it's almost a replay of the lawsuit the US government had against Microsoft in the '90s, where Microsoft was bundling the browser with the platform. To answer your question of whether it should be broken up, et cetera: in the search market, I think you can do what was done in the Microsoft case, which is to say, hey, you can't have all these exclusionary agreements that basically take up all the shelf space and lock people in. You really need to give people choice by making it one click to switch to another search engine instead of 15 clicks. Then, in the case of the ad tech antitrust suit, because Google is the pitcher, the catcher, and the umpire, owning both the supply side and the demand side, yes, I do actually think that one probably warrants a breakup.

Deidre Woollard: In terms of the suit that's going on right now, it sounds like part of the answer is the relationships with Apple and Android, and making it so that you have another option. Is that what you're saying?

Tom Kemp: Yeah, I mean, basically Google will say, oh, you have options, it's a click away, but it really isn't a click away. It takes you 15 clicks to do that, according to DuckDuckGo, and I'm just quoting them. Then in the case of the licensing agreements, DuckDuckGo doesn't have $15 billion. The argument the government has is, if Google is truly such a great search engine, why do you have to pay so much money to Apple to ensure that you're the default right there? I think what's probably going to happen is that Judge Mehta is going to ding Google on certain contractual practices they have, but I doubt that he will do what some people have called for, which is requiring the split-up of the browser from Google. I think if there's going to be any splitting, it's going to happen on the actual advertising platform that they got through DoubleClick, because it's clear that Google owns all the components in that market, and they're taking about 50% of every dollar being spent, if you do the analysis. I think that is really where they're going to face a bigger hammer, as opposed to Judge Mehta and the Google search lawsuit.

Deidre Woollard: I want to switch topics a little bit and talk about some other ways that tech is a little bit intrusive. One of the things I find fascinating is biometric data. We're starting to see more and more uses of this. I know Amazon has been testing out paying with your palm. How is this different from just typing in a password, and what makes this concerning?

Tom Kemp: Well, the big difference is that we can always change a password or pick a new one, but we can't pick a new fingerprint or iris. Using our fingerprint or iris or voice can be faster, and it can be more secure than typing or guessing a password, so I think everyone likes having their phone unlock with their face. I just think we need to be careful about using that technology and actually selling people's irises, selling people's biometrics. There need to be guardrails, because again, if your password gets stolen or someone buys your password, you can always change it. But if someone sells your biometric information, you can't change that, and then they have the keys to your kingdom, because increasingly we will be using our face, our voice, our iris, our fingerprint to access services.
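The guardrail Kemp describes, keeping the biometric on the device, is already how the WebAuthn standard behind passkeys is designed: the fingerprint or face match happens locally, and the website only ever receives a public key. Below is a minimal TypeScript sketch of browser-side registration; the site name, account details, and inline challenge are hypothetical placeholders (a real flow would fetch the challenge from the server).

```typescript
// Minimal WebAuthn registration sketch. The biometric check (Face ID,
// fingerprint) happens entirely on the user's device; the site receives
// only a public-key credential, so no biometric template can leak or be sold.
async function registerPasskey(): Promise<void> {
  // In a real flow the server generates and tracks this challenge.
  const challenge = crypto.getRandomValues(new Uint8Array(32));

  const credential = await navigator.credentials.create({
    publicKey: {
      challenge,
      rp: { name: "Example Site" },                        // relying party (hypothetical)
      user: {
        id: crypto.getRandomValues(new Uint8Array(16)),    // opaque user handle
        name: "user@example.com",                          // hypothetical account
        displayName: "Example User",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      authenticatorSelection: {
        userVerification: "required", // triggers the local biometric prompt
      },
    },
  });

  // Only the resulting credential (public key + metadata) goes to the server.
  console.log("Created credential:", credential?.id);
}
```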

Deidre Woollard: It's interesting, because we are really trading privacy for ease of use. One of the things I've noticed is that maybe there's a generational shift here, because I've talked to younger people, and they don't seem as concerned about this as I am. I'm wondering if the concern over privacy is something that varies by age or demographic.

Tom Kemp: I think historically that's been the case. The mindset was, well, I don't have anything to hide, so it's OK that they gathered the information. Frankly, in the early days of Big Tech, the mining of data was all about serving ads, and we made this trade-off. It was pretty annoying that if I was looking for red shoes, the red shoes followed us around for the next month. Maybe, Deidre, if you did research on a topic, then a friend of yours sees the ads being served and asks, what were you looking at? [laughs] Why are you searching for that? But the reality is that we're now in a post-abortion-rights America. I think people are all of a sudden saying, wait a minute, the stuff that I used to do, the search information about certain topics, the places I visited, et cetera, can actually be used against you. I now think there's going to be a shift, which also corresponds with the support for specific laws as they relate to reproductive health, because people are concerned about their personal and sensitive data being collected and sold. I think it's also going to get worse with AI, because we now know that musicians and writers are concerned about their IP being scraped by AI. Actors and screenwriters today are actually striking because of AI. An actor doesn't want to go in, get paid 10 bucks to have some pictures taken of them and their voice recorded, and never be able to monetize their picture or voice or face ever again. To me, your face, your voice, your biometrics are your copyrighted material and your personal and private data, and I think people will start saying, wait a minute, I didn't like how that person took my face and put it in a video. I didn't approve that. I think it's actually going to shift rapidly in the other direction, especially when generative AI kicks in and people are freaking out about their face and voice being used in ways they didn't approve.

Deidre Woollard: Thinking about that from a policy perspective, I know you've done some work on privacy policies and data policies. How can we enforce that? Is it just fines, or is it something else? What are the ways it could go?

Tom Kemp: Well, first of all, I think at the end of the day we do need a federal privacy law. I mean, it's ridiculous that we don't have basic rights over our data. I also think that now that one-third of Internet users are kids, we really need to carefully look at whether it is really healthy for our society, and for the kids, to be continuously tracked and have all their behavioral information collected. If that happened in the physical world, those people would be arrested as stalkers. But for some reason we just allow all of kids' sensitive information to be collected, and as we're all very familiar with, the personalization leads down rabbit holes and into negative body images and the other things that happen because of that. I do think we need a rethink here, given what's happening. And yes, as part of a federal privacy law, I do fully believe we need some agency, maybe the FTC or a part of the FTC, that can actually provide some enforcement, to give people the right to know what's being collected and the right to say no to the sale of their data. I think that's really important for us to have.

Deidre Woollard: Well, I wanted to get into AI a little bit, because you talk in the book about AI bias and its wide-ranging implications. A lot of people are calling for AI regulation, but you also mention AI certifications, which I found fascinating. Why AI certifications?

Tom Kemp: I mean, first of all, I want to be very clear that I'm not calling for airline-level safety when it comes to privacy or AI, where there's never a crash. I'm calling more for car-level safety, where, yeah, you have to put the baby in the back seat in a baby seat, you have to meet some basic emissions standards, you need to have airbags. And look at the innovation that's happening in the automobile industry right now with all the electric cars. Specific to AI, what I'm talking about is high-risk AI, which is AI that can impact people's lives. I'm not talking about AI being used in a game or something like that. Now, specific to certifications: the finance industry has Certified Public Accountants and certified financial audits and statements, and, man, shouldn't we have something similar for high-risk AI, where someone actually does some basic auditing, and the people doing the audit are qualified and have gone through some certifications themselves, just like a CPA? Then I also think we need codes of conduct for the use of AI, including industry standards. For example, the International Organization for Standardization should have certain quality codes and standards put forth as well. We do this in the real world for different industries. There are standards in the financial industry; we have financial statements, CPAs, et cetera. Well, we probably should do this for AI, because AI is taking over that work, and we should apply the same approach to it as well.
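As a toy illustration of one check a certified AI auditor might run, here is a TypeScript sketch of the "four-fifths" disparate-impact test borrowed from US employment law. The data shape, function names, and the application of the 80% threshold to AI are illustrative assumptions, not an established AI audit standard.

```typescript
// Toy audit check: compare a model's approval rates across groups and
// flag the model if any group falls below 80% of the best-off group
// (the "four-fifths rule" from US employment law, applied here to AI).
interface Decision {
  group: string;     // a protected attribute, e.g. an age band
  approved: boolean; // the model's decision
}

function approvalRates(decisions: Decision[]): Map<string, number> {
  const tallies = new Map<string, { approved: number; total: number }>();
  for (const d of decisions) {
    const t = tallies.get(d.group) ?? { approved: 0, total: 0 };
    t.total += 1;
    if (d.approved) t.approved += 1;
    tallies.set(d.group, t);
  }
  const rates = new Map<string, number>();
  for (const [group, t] of tallies) rates.set(group, t.approved / t.total);
  return rates;
}

function passesFourFifthsRule(decisions: Decision[]): boolean {
  const rates = [...approvalRates(decisions).values()];
  const best = Math.max(...rates);
  return rates.every((rate) => rate >= 0.8 * best);
}
```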

Deidre Woollard: Well, in terms of the AI bias aspect, with the large language models, do you feel like there's a need to examine what goes into them?

Tom Kemp: Yes. My take is that we just need transparency, like food labels. Here's a story. Friends of my wife and mine have a high school kid, and she was told to write an essay about the best way to get into college. She wrote an essay that said, I'm going to move to Montana and play the bassoon, playing the demographics angle right there. It was a funny essay. The teacher said, you didn't write this, this was written by ChatGPT. But how can you prove it? Did she write it, or did ChatGPT? She really did write it. So from my perspective, we as consumers should have the ability to go to ChatGPT with an image, a video, some text, and simply ask, did you create this or not? Yes or no. A verification, that's it. Or similarly, when I'm on the phone, I should have the opportunity to hit 411 and ask, am I talking to a human or a robot? Same thing in chat as well. I think we should just have basic transparency. Is it a human or a machine that we're dealing with? Was this created by a machine? Or, 80% of this was created by a machine versus a human. That's the stuff I'm talking about. It's like the nutrition labels we have for medicine and food. I mean, if we know how many calories are in a Whopper, why don't we know whether a certain image was created by AI or not? I think this is just basic rights.

Deidre Woollard: That's fascinating. It would certainly change how students feel about ChatGPT right now if they knew it could be verified. I want to switch back and talk a little bit more about advertising, because things have been changing in terms of cookies and the like. But I'm also seeing this increase in retail ads. Companies like Walmart and Amazon are in the game, and Kroger is now going to put ads on the freezer doors in its stores. The ad thing is just everywhere. Is there any good in this for the consumer if we're being sold to everywhere we go?

Tom Kemp: It's good if we consent to it happening, and if we have the right to know our data is being collected and the right to say no. It's simply about having basic rights. There are some people who may go into a grocery store and want personalization based on their past purchases, and there are people who may want to get discounts if they give up their data. That's perfectly fine. It simply should be: I want to know what you're collecting about me, and at some point I may want to say no, or I may want to limit it. Kroger or Walgreens or Walmart can use it to personalize, but that doesn't necessarily mean they should have the right to go around and sell it to anyone with a credit card, to data brokers. I just think, in general, it does get a little scary if they start collecting sensitive data and selling it. If I go into a Walmart and buy over-the-counter products, whether I buy adult diapers or prenatal pills or things of that nature, that may offer some insight into my health, even though I may be buying them for a friend or a relative, or an elderly adult, or a teenage kid, or whatever. I'm not sure I want people having access to that information. I also still think there should be limitations on the type of data that can be collected and sold. I don't want to walk into the pharmacy section and have something flash on the screen that says, Mr. Kemp, here are the adult diapers you need. That's crazy. But I think it's all about consent, knowledge, the ability to say no, and limits on the use of sensitive data about people.

Deidre Woollard: I think it's interesting, too, that privacy is now becoming a selling feature. We've certainly seen it with Apple, which has spent a lot of money on ad campaigns talking about how it protects your data. Is there a potential that it shifts to where you pay more for privacy, or that privacy basically becomes a selling feature?

Tom Kemp: I don't think you should have to pay for privacy. I mean, privacy should be an inalienable right, and we don't have to pay for freedom of speech and the other rights we have. Again, the original business model of Google was contextual ads, and they did great. You can also do behavioral advertising, but do you really need to know my exact location, or can you back off from a precise location to a zip code, or an acre, or something like that? That's what we're talking about here. In the case of Apple, they're doing a very effective job of differentiating themselves from the others. They came out with this feature called App Tracking Transparency, or ATT, not to be confused with AT&T, and 96% of people turned it on. That tells you that there is a hunger for privacy, that people do not want third-party entities tracking them and selling the data to data brokers, and Google should do something similar on Android. Do we really need to have all these apps leaking out our precise geolocation to conduct commerce in the United States? The answer is no. They don't need to know where I am, within five feet, all the time.

Deidre Woollard: The scope of these companies is so massive, and one of the things you talk about in the book is the way these companies have really squeezed out the competition. It's obviously what's happening with the Google case. Are there companies that are handling data in a way you admire, and are there other ways that we can encourage smaller companies and a greater tech diversity?

Tom Kemp: Yeah. Look, I think the problems we have with privacy, and the problems we could potentially have with AI, bias, and exploitation, are exacerbated by the fact that we have large monopolies that do not feel competitive pressure to do things differently. For example, both Apple and Google charge 30% on their app stores. They require you to use their transaction systems and charge 30%. In a normal market, the merchant pays what, 1% or 2% to Visa? Here you have to pay 30%. That is an example of a monopolistic practice that does not help innovation, and one area where innovation can occur is better privacy. Similarly, Meta and all the others don't provide interoperability, so once you're in their walled gardens, if you leave, you can no longer communicate with the people still in there. The motivation then becomes, hey, you're a captive audience, so we're going to continue to collect and mine more and more of your information. So there could be calls to mandate interoperability. Ironically, Meta scraped the crap out of MySpace, and that's how they got going. But if you try to do the same thing with Meta, they'll sue you big time for doing what they did to MySpace in the early days. What I'm simply saying is that if we actually start having real competition, instead of being charged 30% in transaction fees, and if we actually require interoperability, I think we will see more competition that leads to better privacy and cybersecurity for consumers.

Deidre Woollard: Well, you just mentioned the word I want to talk about as we wrap up, which is cybersecurity. It's not just a Big Tech problem; it's a big everything problem. MGM Resorts had a cybersecurity problem this week. You're an investor in cybersecurity. What are you looking for as a cybersecurity venture investor, and what should investors in the publicly traded cybersecurity companies be looking for and asking?

Tom Kemp: I certainly have ownership stakes in a number of public cybersecurity companies, because I generally believe that protecting businesses and consumers from hackers is good. I also do small angel investing, tiny checks to two people and a dog [laughs], so that's the size. I'm not a high-roller venture capitalist or anything like that. I do what a lot of Silicon Valley people do for entrepreneurs, which is, hey, my friend's starting a company, can you give them a couple thousand, $10,000, to get jump-started? What I look for when I make these small seed or angel investments is: do they have great co-founders? How big a pain point is the problem, and is their solution an aspirin or a painkiller? How big is the market? You want a large market, where if you make mistakes you can still be successful. And how crowded is the market? It's great to be in a billion-dollar-plus market, but if there are already 10 players that have raised a lot of money there, it may be very difficult even if you have a better mousetrap. That's what I look for in my little tiny angel investments. Then, as it relates to the larger cybersecurity companies, if I were to give any of these companies advice, not that they're calling me up necessarily: I really think they need to consider the fact that a lot of the attacks on corporations come through people's personal devices and personal passwords. Because we're all working from home, something on your kid's iPad could actually hop over into the corporate network, or into your Salesforce, or something like that. I really think the shift needs to occur toward also protecting the blurring line between personal and professional usage of home networks, devices, et cetera. I think that could be an interesting area.

Deidre Woollard: Absolutely. It's fascinating to me how many of the big hacks start with something very small, exactly like that.

Tom Kemp: Absolutely. Look, 80% of breaches involve a password: a stolen password, an easily guessed password, or someone being phished or fooled into giving up their password. Then once they get in, the hackers hop around. Identity is the top attack vector, because we as humans are the weakest link when it comes to cybersecurity.
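One practical workaround for the stolen-or-guessed-password weak link Kemp describes is screening passwords against known breach corpora. Here is a TypeScript sketch using the free Pwned Passwords range API from haveibeenpwned.com, which uses k-anonymity so only the first five characters of the password's SHA-1 hash ever leave your machine (the function name is ours).

```typescript
// Check whether a password appears in known breach dumps using the
// Pwned Passwords k-anonymity API. Only the first 5 hex characters of
// the SHA-1 hash are sent; the full password never leaves this code.
// Runs in Node 18+ or a modern browser.
async function isPasswordBreached(password: string): Promise<boolean> {
  const bytes = new TextEncoder().encode(password);
  const digest = await crypto.subtle.digest("SHA-1", bytes);
  const hex = [...new Uint8Array(digest)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("")
    .toUpperCase();

  const prefix = hex.slice(0, 5);
  const suffix = hex.slice(5);

  // The API returns every breached hash suffix sharing this 5-char prefix.
  const res = await fetch(`https://api.pwnedpasswords.com/range/${prefix}`);
  const body = await res.text();
  return body.split("\n").some((line) => line.startsWith(suffix));
}
```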

Deidre Woollard: Last question, and I want to leave it on an optimistic note. Five years in the future, maybe even 10 years in the future, how do you see data and privacy being handled in a perfect world, where you see everybody protected? What does that look like?

Tom Kemp: In a perfect world, we're able to use universal signals on our devices and browsers that communicate our privacy settings ahead of time as we visit a website, so I don't have to go through it every time. We're all sick of dealing with cookies. Do you accept the cookies, and it's like, blah, blah. No, I just want to see that one article about my sports team. I don't want to spend two minutes on, do I want marketing cookies, do I want analytics cookies, et cetera. I'm hoping we can be in a world where we have opt-out signals that express what we want, so that privacy is built into browsing as we go along. That would be a great situation. I hope that we continue to have greater awareness, like we talked about with cars. There is now significant concern that too much data is being leaked out of cars. There was even a report that Tesla engineers were allegedly looking inside people's garages and saying, this guy's got a really cool car next to the Tesla, et cetera. I hope there's greater awareness. I hope there's a federal privacy law, and I hope we start putting some guardrails around AI, because I think this could spin out of control. I am an optimist. In Containing Big Tech, I actually provide things that people can do as consumers right now to protect themselves and reduce their data footprint, and I also provide a specific roadmap that policymakers can follow to get us to a point where, look, we take advantage of the goodness of the large tech players, but we contain the downsides associated with them. That's what I'm simply trying to get at with Containing Big Tech.
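One such universal opt-out signal already exists: Global Privacy Control (GPC), which participating browsers send as a Sec-GPC: 1 header on every request and expose to page scripts as navigator.globalPrivacyControl. Below is a TypeScript sketch of how a site might honor it; the ad-handling functions are hypothetical stand-ins for a site's real logic.

```typescript
// Hypothetical stand-ins for a site's real ad and consent logic.
function showContextualAdsOnly(): void { /* ads based on page content only */ }
function askForTrackingConsent(): void { /* the usual cookie-consent flow */ }

// Client side: read the Global Privacy Control signal from the browser.
// (Cast needed because the property is not yet in standard TS DOM types.)
function loadAds(): void {
  const optedOut =
    (navigator as { globalPrivacyControl?: boolean }).globalPrivacyControl === true;
  if (optedOut) {
    showContextualAdsOnly(); // respect the opt-out: no tracking, no data sale
  } else {
    askForTrackingConsent();
  }
}

// Server side: the same signal arrives as a request header.
function userOptedOutOfSale(headers: Record<string, string>): boolean {
  return headers["sec-gpc"] === "1";
}
```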

Mary Long: As always, people on the program may have interests in the stocks they talk about, and the Motley Fool may have formal recommendations for or against, so don't buy or sell stocks based solely on what you hear. I'm Mary Long. Thanks for listening. We'll see you tomorrow.