Image adapted from photo by Michael Shick (Licensed via CC BY-SA 4.0)

Given Ashton & Price’s vested interest in all things related to the road, it’s not surprising that the topic of Google’s driverless cars comes up quite often. Back in October of last year, All Things Legal discussed what was merely a hypothetical situation at the time: who would be responsible if a driverless car caused a crash? At that time, Google’s cars had been involved in several accidents, but in every case, the driver of the other vehicle had been to blame.

Well, Google finally blew its perfect record. On Valentine’s Day, one of Google’s driverless cars ran into a city bus in Mountain View, California.

The All Things Legal team took advantage of the recent Google car crackup to revisit the legally fraught topic of autonomous vehicles.

Craig started off by quoting from Google’s report of the accident, while using his expertise—and keen sense of sarcasm—to fill in the gaps. Google’s report starts by discussing how they had been recently experimenting with emulating a very human behavior—squeezing around cars in a right-hand lane in order to be able to make it to a turn more quickly:

“Our self-driving cars spend a lot of time on El Camino Real, a wide boulevard of three lanes in each direction that runs through Google’s hometown of Mountain View… El Camino has quite a few right-hand lanes wide enough to allow two lines of traffic. Most of the time it makes sense to drive in the middle of a lane. But when you’re teeing up a right-hand turn in a lane wide enough to handle two streams of traffic, annoyed traffic stacks up behind you. So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane. This is the social norm because a turning vehicle often has to pause and wait for pedestrians; hugging the curb allows other drivers to continue on their way by passing on the left. It’s vital for us to develop advanced skills that respect not just the letter of the traffic code but the spirit of the road.

On February 14, our vehicle was driving autonomously and had pulled toward the right-hand curb to prepare for a right turn…”

Craig: “So there’s traffic in the lane, if both cars can share the lane safely, then you can go. The Google car went to the far right of the right lane to make a right-hand turn. Traffic could still pass on its left. Then it detected sandbags near a storm drain. And we talked about this last week. [Autonomous cars don’t work well] when it’s raining, and puddles and splashing throws [their systems] off too. So far this technology only works in fair weather.”

Craig then returned to Google’s report:

“It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2 mph…”

Craig couldn’t help but break off again: “You hear this a lot from Google. ‘Yeah, we were just going 2 miles per hour.’ Yeah, how many beers have you had to drink? ‘Uh, two?’ So every time I read an article about them they’re going 2 [miles per hour] when the accident happens… So yeah, we probably will have a job for a while in regards to, unfortunately, auto cases, because this technology does not seem to be as fool-proof as they were making it sound.”

Ed Schade took the opportunity to point out that perhaps Google’s A.I. has gotten a bit too lifelike: “I think it’s exactly what they’re making it sound, because in the article it goes on and says, ‘several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do.’ And that’s what they do: they crash once in a while. So, they’re living up to the standard.”

After this break to discuss the resemblance between Google’s autonomous cars and a drunk driver creeping down the shoulder of the road with a few empties rolling around the floorboards, Craig once again returned to Google’s description of the accident:

“After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2 mph – and made contact with the side of a passing bus traveling at 15 mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it.”

As it turns out, Google’s code probably didn’t adhere to California’s Vehicle Code.

Craig then set aside Google’s account of the accident to take a closer look at his team’s expertise—California’s rules of the road: “So Ed, you know what the Vehicle Code says, right?”

Ed: “Yeah, and that’s the interesting thing, because this vehicle’s supposed to abide by all aspects of the Vehicle Code, it’s programmed in. The Vehicle Code says if you’re the merging vehicle, you’re the one that is supposed to yield. And in this event, Google didn’t yield. The vehicle, it violated—probably—its own program, and it definitely violated the vehicle code when it merged in and hit the bus.”

Craig: “Yeah, it’s an unsafe turning maneuver under the Vehicle Code. Once you’re stopped and other traffic is moving, you cannot move over until they’re not an immediate hazard. And when Google hits the bus, it’s an immediate hazard. So, chalk one up against Google on that one.

“If there were injuries, the bus—it’s probably the city of San Jose or Mountain View, I don’t know whether it’s a city or a county bus—but [the bus owner] would be entitled to the cost to repair the vehicle. As it stands right now, Google’s going to be responsible, because they’re the owner of the vehicle.

“California law, the proposal for the DMV is, Google wanted total driverless. So no pedals, no brakes, no steering wheel. And California, the DMV said, ‘Nope. We’re going to have a steering wheel and brakes, and the responsible person… is going to be the person behind the wheel.’ So the insurance issues will be exactly the same. So if Google goes nuts and goes on a rampage and takes you to Tijuana, gets in an accident in San Diego just before the border, you’re going to be responsible, you can’t blame it on your car.

“Federal guidelines are, it’s going to be autonomous. So no steering wheel. So there’s going to be a difference in federal regulations versus state if they’re adopted that way.”

“…The end of the Verge article, which I thought was pretty interesting, and you know, this is editorializing obviously, this is the quote: ‘We are many, many years away from a road free of human drivers. And until then, self-driving cars are occasionally going to hit things.’ So, I think that’s true.”

Tim Hodson stepped in to point out that while the specific circumstances of the accident are rather entertaining, the larger situation is a very serious one: “No, it’s true… it doesn’t seem like anybody was hurt, which makes the situation a lot easier to talk about. Because I’m glad this happened. Because it seems that Google and a couple of the other companies are quite arrogant in how good their technology is… and it’s clear, it’s a long ways away. And it’s all fun and games now, when these types of things are happening, but when there’s a bunch of cars like this on the road, if they can’t ensure our safety, and they can’t ensure the cars aren’t going to be hacked into and somebody’s going to be able to take it over, it’s not going to be a funny matter and something easy to talk about. So, the more [often] stuff like this happens, and the more they can [improve] the technology and make it safer, it’s a good thing.”

As we said in our blog post from back in October, “the conundrum posed by driverless cars is a difficult one to address.” This recent accident shows that, just as the technology isn’t going to go away, neither are the legal questions that the technology’s growth will raise.

It’s going to be incredibly fascinating to watch how all of this plays out, and how state and federal laws will develop to accommodate the entrance of AI drivers onto the nation’s roadways. Ashton & Price certainly plans to still be here when Google’s cars roam the roads, so, if you’re reading this in the future, and if you’ve been involved in an accident with an automated vehicle, give us a call or contact us through our site. Our team of experienced personal injury lawyers will be more than happy to help you with your case against Google’s growing army of robotic cars.