solarz
Brigadier
Manual overrides are meant to be available and used in case the automated system malfunctions or fails for any reason. Their presence says nothing about whether humans or automated systems perform a task better. Humans need to be properly equipped and trained to recognize when the automated system is malfunctioning or failing.
Just wondering about hypothetical situations: if a driverless car loses control on a wet or icy road in bad weather and injures someone or damages property, is it the fault of the driverless car for losing control, or the fault of the person who sent the car out in bad weather?
Cars are already prone to failure right now, and there's no "override" to keep them working beyond coming to a stop on the side of the road. I suspect the control software would be less likely to fail than the car parts themselves. There can also be redundancies built into the automation.
If Google is expecting some kind of service built on this to launch pretty soon, then this is probably pretty close to what the production model will look like. I foresee a lot of vandalism, especially if it's used as some kind of taxi service. Besides, I think Google announcing this is like Amazon announcing package delivery by drone: it's not happening anytime soon, because there's a lot of government regulation and study that needs to happen in this unexplored territory.
I believe Google is using this to develop their driving software. I don't think Google is going into the automotive industry; their primary interest in this is still software (and, of course, data).