I’ve noticed amongst people I know from out of town that when they visit San Francisco, they are eager to take a Waymo autonomous taxi. It’s almost a tourist attraction in and of itself. That is, until it proliferates into other cities and regions. I myself have yet to hail a Waymo ride, just like I’ve yet to take a ferry to visit Alcatraz Island.
As a person of introverted proclivity, I am on paper a big fan of autonomous taxis. To not have another stranger (the driver) there at all - never mind interacting with them - is serene music to my ears. But as with everything in life, there are tradeoffs.
Robots may be predictable, but humans definitely are not. On public roads there are multitudes of negative potentialities you must account for, and I don’t see how a driverless taxi is capable of handling those situations. For example: what if a gang of dudes walks over to your stopped Waymo in a menacing fashion? If I were driving, the law would give me protection to mash the gas and get the hell out of there, even if that meant causing harm.
Would a robot do the same? Has Waymo coded in calculations for when it is appropriate to run people over? There’s got to be a hierarchy of which life is more valuable, right? Perhaps the person paying for the autonomous ride should be supreme. If the outside world is threatening the occupant(s) inside a Waymo car, stopping and locking the doors cannot be the only option!
You can bet that I too would run over a gang of bikers in my Range Rover, if so provoked. Would an autonomous car do the same? I would like to know the answer before getting into one.
The late night filings.