Self-driving cars get lots of buzz these days, and deservedly so. Autonomous vehicles could revolutionize the way we live and get around, with huge consequences for auto, tech, and even semiconductor companies.

In this series, I look at what leading companies are doing with self-driving cars. First, let's look at the basic technology, timeline of adoption, and big-picture opportunities.

Image source: Waymo.

Self-driving car technology

Current self-driving cars use four different technologies: LIDAR, radar, ultrasonic, and passive visual.

LIDAR (Light Detection and Ranging) uses laser light to identify objects and measure distances. It can gauge distance and speed out to about 200 meters, in light or dark. However, today's systems are mounted on the outside of the car, so they struggle to detect very close objects. And though LIDAR works well in low light, it performs poorly in fog, rain, or dust, because the light wavelengths it uses scatter off water droplets and particles, and it doesn't distinguish color or contrast. LIDAR is also currently very expensive.
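At its core, LIDAR ranging just times a laser pulse's round trip. Here's a rough, hypothetical sketch of that arithmetic (the speed of light is standard physics; the function name is mine, not any vendor's API):

```python
# Hypothetical back-of-the-envelope sketch of LIDAR time-of-flight ranging.
SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def lidar_distance_m(round_trip_seconds: float) -> float:
    """Distance to a target: the laser pulse travels out and back,
    so halve the round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse returning after roughly 1.33 microseconds corresponds to
# about 200 m -- the practical range cited above. The timing has to
# be that precise, which is part of why LIDAR hardware is expensive.
print(round(lidar_distance_m(1.334e-6)))  # 200
```

The takeaway for investors: the sensor itself is simple physics; the cost sits in the precision optics and electronics needed to time light at nanosecond scales.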

Radar uses radio waves to identify objects and determine velocity and angles. It has good range, but low resolution. Radar is better than LIDAR in snow, fog, and rain, and better at detecting objects at close range.

Ultrasonic systems emit ultrasonic sound waves and determine distance by how long these waves take to return to the source (this is how bats echolocate). Ultrasonic sensors are good for close-range detection in all weather, but do not have the range of LIDAR or radar.
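That echo-timing idea is the same time-of-flight calculation LIDAR uses, just with sound instead of light. A hypothetical sketch (343 m/s is the speed of sound in air at about 20 °C; the function name is illustrative):

```python
# Hypothetical sketch of ultrasonic echo ranging -- the same
# time-of-flight idea bats use, but with sound instead of light.
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 C

def ultrasonic_distance_m(echo_seconds: float) -> float:
    """Distance from how long the ultrasonic pulse takes to return.
    The pulse travels out and back, so halve the round trip."""
    return SPEED_OF_SOUND_M_S * echo_seconds / 2

# Sound is almost a million times slower than light, so even a 20 ms
# wait covers only about 3.4 m -- fine for parking and blind spots,
# but no match for the range of LIDAR or radar.
print(ultrasonic_distance_m(0.02))
```

The slow speed of sound is exactly why these sensors are cheap, all-weather, and strictly short-range.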

Passive visual technology uses cameras combined with sophisticated image-recognition algorithms to understand what the cameras are seeing. These systems capture color, contrast, and fine detail better than other sensors. They also have long range, but they only work well in good light; performance diminishes as light dims.

These technologies are constantly evolving, and different companies use different sensors in combination. Alphabet's (GOOG) (GOOGL) Google, for instance, is developing its own LIDAR system as its main sensor. Tesla (TSLA), on the other hand, feels it can get the necessary capability from a combination of radar, ultrasonic, and passive visual systems, without the excess cost of LIDAR.

Again, things could change and companies can achieve new breakthroughs, but it's important for investors to understand these technologies to determine which bets to make.

Fully autonomous or semi-autonomous?

The Society of Automotive Engineers (SAE) classifies driving automation into six levels:

Level 0: No self-driving features.

Level 1: The car controls one system at a time, such as cruise control or automatic emergency braking.

Level 2: The car can control two functions simultaneously, such as speed and steering. Tesla's Autopilot is level 2, and other manufacturers such as BMW offer level 2 features, provided a driver is paying attention.

Level 3: The car is essentially fully autonomous, but it will warn the driver when human control is needed. This is incredibly difficult to get right, because once humans get used to the self-driving function, we tend to doze off or get distracted. Because this level won't deliver an appreciable jump in safety over level 2, many companies are skipping it.

Level 4: Nearly autonomous. A human may still be able to control the car, but the car can drive itself completely and bring itself to a safe stop if something goes wrong.

Level 5: Completely autonomous; these cars lack pedals or a steering wheel and are not meant to be driven by humans.

Semi-autonomous features span levels 1 through 3 and, like Tesla's Autopilot, are already available in some luxury cars today.

Fully autonomous cars, at levels 4 and 5, do not require a driver to be present at all. This type of vehicle is very far off, and some question its viability, but wide adoption of fully autonomous vehicles would be the true revolution some are anticipating.

Many car companies, unsurprisingly, are adding autonomous features to their vehicles. However, the leap from having a human as backup (levels 2 and 3) to full autonomy (levels 4 and 5) is much bigger. That's because human beings are not good at turning off their brains, then turning them back on when suddenly called upon to drive. There are also large regulatory and liability hurdles once the human element is taken out completely.

That's why tech companies like Google are forgoing level 3 entirely and going straight for levels 4 and 5. Many believe the company that reaches fully autonomous technology first will become the lead platform upon which future ride-sharing services are built, even if wide adoption is 10 or even 20 years out. That's why self-driving is so important to Lyft and Uber (which recently got into some trouble on this issue).

Stay tuned to this series as I dig into the leading self-driving companies in greater detail.