What Tesla And Google’s Approaches Tell Us About Autonomous Driving

Jul 1, 2016

By Alina Selyukh

U.S. transportation authorities are investigating the deadly collision of a Tesla Model S car. And many reports say the fatal crash has heightened concern about self-driving cars. Which may be true. Except that the Model S isn’t a self-driving car.

As NPR’s Sonari Glinton points out, what Tesla’s Model S has are self-driving features, autonomous elements meant to assist drivers rather than replace them.

Virtually all major car and tech companies are pursuing self-driving technology as the future of transportation. But Tesla and Google are among the earliest innovators, and they are taking very different approaches.


6 comments on “What Tesla And Google’s Approaches Tell Us About Autonomous Driving”

  • It’s unrealistic to expect a perfect record for driverless cars, especially at the beginning. What counts? Are the cars safer than they would have been if driven solely by humans?

    The CBC went all shrill, acting as if this one accident meant the end of driverless cars.

  • I have driven many times on El Camino in Mountain View, CA, and have seen the Google self-driving cars in action. They are very non-aggressive and exhibit “model” driving behavior. I thought the discussion at The Verge on the circumstances surrounding the only known accident caused by a Google car was very interesting.

    The Google car is programmed to move as far right as possible when approaching an intersection to make a right turn, out of courtesy to drivers in the same lane who might want to pass while the Google car is waiting to turn. In the case of the accident, the Google car decided it had to “take back the middle of the lane” because there were some sandbags around a gutter along the curb. It was at this point that the Google car (going 2 mph) drove into the side of a public bus (going 15 mph) that was trying to pass it. Legally, it’s debatable whether taking back the center of a lane is improper; I guess if there is a bus occupying that space, it is. The human occupant of the Google car said he saw the bus behind the Google car and assumed, just like the Google program, that the bus would yield.

    Obviously, at that low speed, there were no injuries, and the Google car made a mistake very similar to one human drivers often make: assuming the best intentions of other drivers. But I think a world of self-driving Google cars, where polite driving would be paramount, is devoutly to be wished.

  • prietenul @ #2.

    Given the appallingly low standards of driving in the UK, anything would be an improvement on the current situation.

    Basically, there are two ways to drive: defensively or offensively; for the most part the latter is the case.

    If driverless vehicles are designed with safety first and foremost, which I believe is the case, then as long as the system can’t be overridden by the owner, on balance I think they will be a boon.

  • Ted Foureagles says:

    When Dad was teaching me to drive way back in the 1950s, his first advice was to try not to do anything that relies on an action by someone else to work. I have tried, and doing so may have saved my life at times, but it’s difficult in heavy traffic where others are aggressively jockeying for position.

    I’ve not been in a self-driving car, but many modern cars have some sort of “driver assist” features. For example, Dear Li’l Sis’s new Toyota has “smart” cruise control. If you come up behind a slower car, it backs off to maintain a safe following distance of about two seconds (a rough sketch of this rule follows the comment). If a car cuts in front of you, it will back off or even apply the brakes to re-establish the interval. It doesn’t do it quite as quickly as I would, and it apparently doesn’t consider whether there’s a tailgater on your ass who might be engrossed in a phone conversation, but it does it.

    The result in heavy multi-lane traffic is that the car is constantly slowing down to accommodate other drivers and moving back in line while being tailgated by the next passer. But that’s pretty much what happens when I’m in control too, except that I’ll occasionally hit the throttle and break the law to escape the clump if I find a reasonable gap.

    The problem with self-driving cars right now, aside from some arcane ethical arguments, is that they must exist in a field of human-driven cars, and so a game of “prisoner’s dilemma” is playing out on the roads (a toy version appears after this comment). Do we, in the interim, imbue the programming with an option to defect: to hit the gas, break the law, and get out of a bad situation for the greater good? Or do we program them to always cooperate, in which case they would need their own special utopian lane while everyone else whizzes by in their decidedly sub-optimal, human-controlled cars?

    By the way, I drive a Miata that doesn’t have so much as a radio.
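
A rough Python sketch of the “two-second rule” the commenter above describes. This is a toy model built on my own assumptions: the function name, inputs, and constants are invented for illustration and are not Toyota’s actual control logic.

    # Toy "two-second rule" adaptive cruise control. All names and numbers
    # here are illustrative assumptions, not any carmaker's real logic.
    def target_speed(own_speed_mps, gap_m, lead_speed_mps):
        """Pick a speed that keeps roughly a two-second gap to the car ahead."""
        FOLLOW_TIME_S = 2.0                       # desired time headway
        desired_gap_m = own_speed_mps * FOLLOW_TIME_S
        if gap_m >= desired_gap_m:
            return own_speed_mps                  # gap is safe; hold speed
        # Gap too short (e.g. a car just cut in): back off toward the lead
        # car's speed, more aggressively the larger the shortfall.
        shortfall = (desired_gap_m - gap_m) / desired_gap_m   # between 0 and 1
        slower = min(lead_speed_mps, own_speed_mps)
        return max(0.0, own_speed_mps - shortfall * (own_speed_mps - slower))

    # Example: doing 30 m/s (~67 mph) with only a 30 m gap to a 25 m/s lead car.
    print(target_speed(30.0, 30.0, 25.0))         # 27.5 -- backs off toward 25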

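And a minimal prisoner’s-dilemma framing of the cooperate-or-defect choice the same commenter raises. The moves and payoff numbers are invented purely for illustration.

    # Toy payoff table: (my_move, their_move) -> my payoff (higher = better trip).
    # The numbers are invented for illustration only.
    PAYOFF = {
        ("cooperate", "cooperate"): 3,   # everyone yields; smooth traffic
        ("cooperate", "defect"):    0,   # I yield, they cut in; I am stuck
        ("defect",    "cooperate"): 5,   # I break the law and escape the clump
        ("defect",    "defect"):    1,   # everyone jockeys; gridlock and risk
    }

    def best_response(their_move):
        """My payoff-maximizing move given what the other driver does."""
        return max(("cooperate", "defect"), key=lambda me: PAYOFF[(me, their_move)])

    # Defecting pays against either move, which is the commenter's point: a
    # fleet of always-cooperating cars loses out in mixed human traffic.
    print(best_response("cooperate"), best_response("defect"))   # defect defect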
