On Oct. 30, the White House issued its first executive order to help regulate the development of artificial intelligence (AI). It features a sweeping set of directives allocated under eight subheadings:

  1. New standards for AI safety and security
  2. Protecting Americans' privacy
  3. Advancing equity and civil rights
  4. Standing up for consumers, patients, and students
  5. Supporting workers
  6. Promoting innovation and competition
  7. Advancing American leadership abroad
  8. Ensuring responsible and effective government use of AI

As you can probably tell, the executive order covers a lot of ground. Ultimately, however, its goal is to prevent existential threats and data breaches, protect the rights of consumers and workers, and ensure the safety of AI models before they are released to the public.

The order comes after President Biden spent months discussing AI with other world leaders, experts, and industry executives, all while experimenting with the technology himself. It also builds upon previous guidance issued by the White House to AI developers and voluntary commitments made by some of America's largest tech companies to deploy their models safely.

Below, I'm going to highlight how the directives under the first two subheadings listed above could directly impact AI industry leaders Amazon (AMZN), Microsoft (MSFT), and Alphabet (GOOG) (GOOGL).

Image source: Getty Images.

Amazon, Microsoft, and Alphabet have invested billions in AI

Amazon, Microsoft, and Alphabet, in that order, are the top three providers of cloud computing services to businesses. The cloud is where companies store their valuable data and run their online operations, from their sales channels to their global workforces. Since AI development, training, and deployment rely upon mountains of data, the cloud is also home to that technology.

All three of the tech giants have invested heavily in their physical data center infrastructure recently. They have purchased advanced AI semiconductors from Nvidia, and they have designed their own hardware in-house as they jostle for an edge over one another. But chips -- while critical -- are only one part of the equation.

Businesses are also demanding foundation AI models they can build upon to help accelerate their own AI development, and cloud providers are delivering in spades.

Microsoft invested $10 billion in ChatGPT developer OpenAI earlier this year, and OpenAI's latest GPT-4 model is available to Microsoft Azure cloud customers. Amazon bet $4 billion on start-up Anthropic, which is also developing AI models that will be made available on Amazon Web Services. And in the third quarter of 2023 (ended Sept. 30), Alphabet said it had over 100 different models available to Google Cloud customers.

Going forward, cloud providers will rely upon the continued development of those models by third parties, start-ups, and their own in-house teams so they can keep up with customer demand.

Rules surrounding safety and data privacy could slow AI development

The White House's goals to create new standards for AI safety and security and to protect Americans' privacy should be music to the public's ears. After all, some leading voices in the AI industry have expressed concerns about how quickly the technology is advancing and the dangers of unchecked development.

Earlier this year, Tesla CEO Elon Musk helped lead an effort to pause the development of advanced AI models for six months to give governments around the world time to implement regulations. He promoted a petition that attracted over 33,000 signatures from industry experts and academics, although it ultimately didn't achieve the desired result.

Up until now, AI developers could push new models to their users without any guardrails except those they imposed upon themselves. But the executive order will change that (to a degree).

The order requires the National Institute of Standards and Technology to create a rigorous testing regime, which the Department of Homeland Security will apply. Any AI model considered a potential threat to national security, the economy, or public health and safety must pass those tests before it can be released to the public.

But in reality, most of the industry's leading models could fall into one of those categories. OpenAI's GPT-4 model is already used by 18,000 businesses across dozens of industries through Microsoft Azure, as I touched on earlier, so it's clearly having an impact on different segments of the economy. As a result, future models (like an eventual GPT-5) could face substantial delays before release because of the executive order.

Moreover, an AI model is only as good as the data on which it's trained. The government wants to ensure consumers' personal data is protected, but that won't always be easy (or practical) when models compete with one another on accuracy, because the winner could be decided simply by the recency of the information each model ingests.

The executive order focuses mainly on how the government sources data for its own models, but it calls upon Congress to pass more comprehensive rules to protect the public. Any restrictions on the type of data developers can use will ultimately affect the AI models themselves.

Shares of Amazon, Microsoft, and Alphabet are a buy anyway

The executive order creates a risk that one AI model might not pass the government's tests while competing models do. If one cloud provider is more reliant on that model than its rivals, it could suffer a temporary loss of revenue.

However, so long as the rules of the road are applied equally, a broad slowdown in development probably won't cost Amazon, Microsoft, or Alphabet any cloud or AI market share. I would expect each company to be affected to a similar degree over time.

For that reason, owning shares in the three tech giants will likely remain among the best ways for investors to gain exposure to AI, so the executive order shouldn't be a reason to avoid them.