
A mechanical loom can be made so it doesn't kill people when it makes a minor mistake, and doesn't have to dodge unpredictable humans and random wildlife or other obstacles while working.

That freedom to mess up is a luxury that self-driving cars don't have, and makes all the difference.

People on here seem to think that a self-driving car only has to be safer than a human in order to be viable (a bar that hasn't come close to being reached in itself), but in reality humans are held legally responsible when their driving kills people.

If a self-driving car kills someone, then the manufacturer is responsible, not the owner of the car, who wasn't driving and so can't possibly be responsible.

Otherwise we need to have a conversation as a society about what liability we will accept, because I know that I don't want to be the one paying for the deaths caused by these car companies playing fast and loose with people's safety in order to capture the "autopilot" market.



You seem to be bringing up the danger and uncertainty around liability as an obstacle to the adoption of self-driving vehicles, thus making them "far from a reality" / "complete hype", right?

As sad as it is, I'm not so sure that the economic incentives around automating truck driving won't win out over a few lives in the end. I'd be curious what the Vegas odds would be on self-driving cars. I wouldn't bet against it.



