After well over a million miles of autonomous driving, Google's fleet of cars has only been involved in a handful of accidents, and all of them were due to human error -- until last month. Now, a self-driving car has been part of an accident which was, according to Google's parent company Alphabet (NASDAQ:GOOGL) (NASDAQ:GOOG), at least partially the software's fault.

In this clip from Industry Focus: Tech, Dylan Lewis and Sean O'Reilly explain what happened, and how accidents like this might be avoided in the future.

A transcript follows the video.

This podcast was recorded on March 4, 2016. 

Dylan Lewis: Just kind of as an update on what has been going on with Google's self-driving car program: so far, they have logged, I believe, over 1.3 million miles on roads in Mountain View, California; Austin, Texas; and Kirkland, Washington. Their current fleet is a mix of modified Lexus SUVs and the bespoke driverless cars that they've created.

Sean O'Reilly: Little pod thingys.

Lewis: Yeah, I don't even know how to describe them to somebody.

O'Reilly: They're like the monorail cars from 40 years ago at Disney World.

Lewis: Yeah, there's something kind of European-looking about them. They're cute, I guess. That is their flagship prototype, and those have been out on the roads as well. Just on the track record of safety so far: their vehicles have been in some minor accidents in the past. Up until mid-February, it was somewhere in the teens, I think, around 15 or so accidents. The car and the tech on Google's side was not at fault for any of these. It was drivers doing something stupid, or people stopping short, or what have you. "Up until mid-February" is the operative thing to hone in on there.

O'Reilly: Vince and I did our tech/CG crossover show and of course we talked about driverless cars. And a week later -- a week later! -- the Google, I think it was one of the little pod things, I don't know which type of car it was.

Lewis: It was a Lexus RX.

O'Reilly: Oh, I'm sorry. Okay. It gets into an accident with a stinking bus. A week later, after 1.3 million miles. Then I do a show about it, and it hits a bus.

Lewis: Yeah, just don't talk about any airlines anytime soon.

O'Reilly: Right. The thing that I took away from it though is "partially at fault." Google says we're probably partially responsible, but everybody walked away.

Lewis: Yeah. To give you an idea of what happened: basically, Google's car was in the right lane of a city street, and it was going to make a right turn. I think that there were some sandbags around a storm drain.

O'Reilly: Which of course messed with the sensors and they do that to keep leaves and stuff out of the drainpipes.

Lewis: I think the car veered left a little bit so that it could go around them and make the right. There was a bus coming up on its left. The bus was moving at about 15 miles an hour and the Google car at about two miles an hour, and I think it was one of those situations where the Google car thought the bus was going to yield. The bus did not expect the Google car to make the cut. In their statement, Google said, "This type of misunderstanding happens between human drivers on the road every day. This is a classic example of the negotiation that's a normal part of driving -- we're all trying to predict each other's movements."

O'Reilly: Not only that, but -- and this lends itself to the previous accidents, where it's just a computer interacting with a human, and humans are unpredictable -- 10, 20 years from now, if and when all this stuff happens, the bus would have been communicating with the car. It probably would have been avoided. This does not concern me in the least.

Lewis: Yeah, that is something that people cite all the time. The biggest problem with these very sophisticated, smart devices is when you add the human element to them.

O'Reilly: Right.

Lewis: If they're communicating on the same plane, speaking the same language, they know what each other is going to do.

O'Reilly: Then all of a sudden it's like a roller coaster ride or whatever. Maybe that's a bad example, but all of a sudden it's like a monorail ride, like with the Google car.

Lewis: Yes.

O'Reilly: Because it's safe and everybody knows what's going on and all that stuff.

This article represents the opinion of the writer, who may disagree with the “official” recommendation position of a Motley Fool premium advisory service. We’re motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.