王飞跃 (Wang Fei-Yue)'s personal blog: http://blog.sciencenet.cn/u/王飞跃

Blog post

Is the Road Still Long for Self-Driving Cars?

3567 reads · 2015-10-29 23:15 | Personal category: Research Notes | System category: Overseas Observations | Keywords: Self-Driving

Thanks to Tomi and Kazuhiro for pointing me to this news report.


http://www.businessinsider.com/self-driving-cars-have-a-long-way-to-go-in-the-us-according-to-this-chart-2015-5

1 in 3 Americans say they will never consider a self-driving car, according to a new poll



Self-driving cars may have a long road ahead before winning over mainstream American consumers.

According to a Harris Poll survey, charted for us by BI Intelligence, 33% of all US adults indicated they will never consider buying or leasing a self-driving vehicle. The distaste for self-driving cars was even higher among older groups, with 36% of both Generation X (ages 38-49) and Baby Boomers (ages 50-68), and 50% of Matures (69+) indicating they would never buy/lease a self-driving vehicle.

We're still years away from a time when self-driving cars are available to the general public. The fact that two-thirds of Americans are not completely averse to the technology is promising.

Still, there are clearly concerns about the safety and reliability of self-driving vehicles — 22% of respondents said they would consider buying a self-driving car when the “bugs” have been worked out. All age groups except the Millennials cited concerns about technical bugs as the main reason for not buying a self-driving car.

Google will have to allay those concerns if it wants its autonomous cars to catch on with the public. On Monday, Chris Urmson, the director of Google's self-driving car program, announced that the company's prototype vehicles have been involved in only 11 minor accidents, with no injuries, during 1.7 million miles of driving. “And not once was the self-driving car the cause of the accident,” Urmson noted, though he did not provide any details about the accidents.

[Chart: US consumers who would buy or lease a self-driving car, by generation. Source: BI Intelligence]



Why You Shouldn’t Worry About Liability for Self-Driving Car Accidents

[Photo: Volvo. Håkan Samuelsson, President & CEO, Volvo Car Group]

Volvo president Håkan Samuelsson caused a stir earlier this week when he said that Volvo would accept full liability whenever its cars are in autonomous mode. Samuelsson went further, urging lawmakers to solve what he called “controversial outstanding issues” over legal liability in the event that a self-driving car is involved in a crash.

“If we made a mistake in designing the brakes or writing the software, it is not reasonable to put the liability on the customer,” says Erik Coelingh, senior technical leader for safety and driver support technologies at Volvo. “We say to the customer, you can spend time on something else, we take responsibility.”

This, says Samuelsson, makes Volvo “one of the first car makers in the world to make such a promise.” Google and Mercedes-Benz have recently made similar assurances. But does that mean that if your future self-driving Tesla or Volkswagen gets into a crash instead, you’re going to be on the hook for all the damages?

Not at all, says John Villasenor, professor of electrical engineering and public policy at the University of California, Los Angeles, and the author of a paper titled, “Products Liability and Driverless Cars.” According to Villasenor, “Existing liability frameworks are well positioned to address the questions that will arise with autonomous cars.” He told IEEE Spectrum that, “If an autonomous car causes an accident, then the manufacturer was already going to be squarely in the liability chain.”

The University of Washington’s Technology Law and Policy Clinic agrees. In a submission earlier this year to the Uniform Law Commission, a body that aims to standardize laws between U.S. states, the group said, “Product liability theories are highly developed, given the advance of technology in and out of cars for well over a century, and are capable of covering autonomous vehicles.”

[Photo: Volvo. IntelliSafe Auto Pilot interface]

As cars have become increasingly automated, with antilock brakes, electronic stability control, crash prevention radars and lane-keeping assistance, legal precedents have naturally developed in step. U.S. law now provides multiple routes for anyone seeking redress for any defective product, whether a simple toaster or a fully autonomous SUV.

For a start, manufacturers must exercise a reasonable degree of care when designing their products. It makes sense that any company selling a self-driving car that, for instance, was not tested in bad weather, might be sued for negligence if one crashed during a snowstorm. But that is, perhaps, a poor example. “Snow is difficult because it limits visibility,” says Volvo’s Coelingh. “And it is low friction and so limits braking ability. Snow’s not impossible but it’s really difficult.”

Volvo intends to be one of the first carmakers to get self-driving vehicles into the hands of real customers, with a fleet of a hundred autonomous XC90 SUVs planned for the roads of Gothenburg, Sweden, by 2017. Initially, none will be allowed to drive in snowy conditions.

Even if not pronounced negligent, manufacturers can still be found ‘strictly liable’ for any problems discovered in their final products, or can be sued for design or manufacturing defects. They can also be held liable if they fail to warn consumers of the risks of using (or misusing) products or services.

To reduce the chance of any mishaps, Volvo intends to give its first customers special training for their self-driving cars. The company will seek out a diverse range of drivers representative of its customer base, including older motorists and those suspicious of new technology. “One of the really interesting things is to see if people who are skeptical in the beginning will change their minds once they have used it for a while,” says Coelingh.

Judging by Volvo’s latest video featuring its Drive Me autonomous XC90s, motorists will be encouraged to watch television or do some work while the car is in charge. The video does not mention that the car might occasionally need to hand control back to the driver, as most experimental vehicles do today. For its pilot program in Gothenburg, says Coelingh, Volvo can even remotely disable the autonomous technology. “We might want to make the technology unavailable if something really critical occurs,” he says. But if a production self-driving car actually required more human oversight than a manufacturer claimed, and this led to an accident, the driver might have a legal case for misrepresentation.

“There are grey areas, involving disputes regarding whether an accident was caused by a failure of autonomous technology, an error by the human driver, or some combination,” says Villasenor, “But these are not areas that [Volvo’s] pronouncement will resolve. But it’s still good to see manufacturers stepping up and recognizing their liability obligations.”

The takeaway? While carmakers’ promises to accept liability are probably unnecessary, they’re not a signal to steer your old wreck into an autonomous Volvo in the hope of a fat payout. “We do not take responsibility for all potential crashes with a self-driving car,” warns Coelingh. “If a customer misuses the technology or if there is another road user that causes an accident, it’s not we or our customer who are to blame, it’s the third party.”

  • The more interesting question is: when a self-driving car is forced to make an ethical decision between protecting the driver and protecting pedestrian(s) in the road, what decision does it make? Should algorithms be written to preserve the maximum number of lives, or always prioritize the driver?



      Another interesting question: if all self-driving cars are designed to operate very conservatively, i.e. below speed limits and with large spacings between themselves and other cars, and this known behavior induces other drivers to frequently 'cut off' the self-driving cars, causing more disruptive / less safe driving on the highway - how will highway traffic regulators respond?



        Large spacing means reduced total flow of cars. I thought self-driving cars were supposed to bunch close together, increasing capacity. There seems to be a transition challenge here.




        That is my point in another post here. The overall speed limit should be reduced. Self-driving cars will respect it, and human-driven cars will be ticketed until they do. In the end, (low) speed and spacing, as you mention, will be the key elements.




      I think that, if the manufacturer is going to be financially responsible for injuries, the cars will be optimized to put the manufacturer on the hook for as little as possible.




      That's an interesting point, and not one I've heard discussed. Thanks for your comment.




      It should always be to preserve the maximum number of lives, with pedestrians getting higher priority in ambiguous situations.



        "with pedestrians getting higher priority in ambiguous situations." -- And that will make the streets of New York a parking lot, as every pedestrian in the city need never fear stepping out into traffic ever again.

        As to the priority, I will do the research and buy (paying more if I have to) the car that will prioritize the well-being of my family over strangers.



          So your solution is just to have the cars automatically run them down and kill them? Really? Wow. I also find your notion that your family's lives are somehow worth more than other people's lives, simply because you don't know them, to be rather odious. Sorry, but no, you are not better than everyone else, just because you happen to think so.



            Not a question of DStuff thinking others are worth less. It's a question of whom he is responsible for protecting. His family has the strongest call on his protection. I would not trust anyone with supervising a child who did not view that child in such a manner.

            While we are all equal before the law, we are *not* equally responsible for all others.



              Which is exactly why this type of decision can't be left to individual drivers to do as they please. Naturally, people will look to maximize saving their own hide, everyone else be damned. They'll select/program their cars so that they will plow into 8 pedestrians if they have to, just as long as it saves their own life. Self-driving car protocols must come from the top down, and be uniformly applied across the board so that drivers don't endanger countless others in order to maximize their own personal safety. Saving the maximum number of lives in any situation should always be the mandated protocol.



                And when the car buyer fully understands that the car he's buying will not put his/her children's safety first and foremost, they will walk away and buy the competition's car that will. If by force of law (and the guns that back it up) it is required that all cars discount the customer's safety, buyers won't buy (see the Volt and other mandated flops).



                  So, you are by yourself, alone in your self-driving car. You encounter a situation where you (and only you) are probably going to die unless your car veers off the road and into a large crowd of pedestrians waiting on the sidewalk. Several of those pedestrians will die, and several more will be severely injured as a result. But you will live. Do you think it's OK for your car to be programmed to take that course of action, in order to put YOUR safety first and foremost?



                    Yep, I have the right of self preservation, and I expect my equipment to assist me in exercising that right. I also expect my estate to go after the manufacturer of a defective product if said product (self driving car in this case) fails to assist and instead inhibits those efforts.

                    But I also expect that a computer will not be able to do sufficiently complex ethical calculus to do anything more than freeze up in the moment of crisis, and Blue Screen of Death (literally) / reboot.



                      It's a pretty universally accepted principle that one's individual rights extend only up to the line where they begin to infringe upon another's rights, and then they stop. You may have the right to self-preservation, but you certainly do not have the right to kill and maim innocent people to exercise it, because then you are quite clearly infringing on their rights. Frankly it's sickening and appalling that you are just fine with killing a whole bunch of innocent people just to save yourself. Society would not accept this behavior, and yes, there would eventually be legislation prohibiting such wanton, reckless and self-serving actions. If not from the get-go, then certainly soon after the first incidents where innocent people are killed by the actions of self-driving cars that are trying to save their drivers without any regard for the safety of anyone else.



                        And the day that legislation goes into effect, I'll start making some serious cash opening a grey market shop that alters the programming. Probably won't even be that hard, it'll most likely be 4th rate code done in 3rd world code shops.

                        Human Nature will always override Utopian pipe dreams.




                That is for sure. And if that means the streets will become a parking lot, so be it. And at that point, we will recognize that the city needs a new architecture.



                  Or the self-driving cars will go unbought.



                    The new generation does not want to spend time driving anymore. They want to spend that time interacting with their smartphones and let the car handle getting to the destination (guided by Waze). Getting there 10, 20, or 30 minutes sooner or later does not matter. The only problem is that this safety-first driverless technology is a decade behind schedule.




    There are a couple of things that articles like this never address. Are these cars going to be fully autonomous, or is a human required as backup? The article mentions not being able to drive in snow, but what exactly does that mean? Does the car just pull over and stop, or does a human have to take over immediately? If a human also has to be ready to take over whenever the computer faces an issue it can't solve (like ice, snow, hydroplaning, or a deer jumping out), it would really put a damper on demand. What if the human behind the wheel falls asleep? The same issue is why there are still human pilots in airplanes.

    There is also the question of maintenance. Would this be fully covered for the life of the car at purchase? What happens 5-10 years in, when sensors start failing? Does the manufacturer still have liability in that situation? Would this potentially mean that people would not truly own the car, but just rent it from the manufacturer, so the manufacturer guarantees it is maintained to a certain standard?




    My only concern, in light of the VW scandal, is what would be the best way to ensure the algorithms are always doing what they are supposed to do, rather than gaining a questionable competitive advantage for the company. As an engineer, I always thought doing the right thing was cheaper for the company in the long run, as the VW case is now demonstrating again, and yet a corporate culture was capable of systematically misapplying technology.




    Simple minds need a simple example. If an accident occurs with a self-driving Volvo car and it is the car's fault, Volvo will accept all liability. This is a misleading way of saying that all Volvo customers will have to pay for all damages related to said liability.
    There's no free lunch, folks. If Volvo pays, it simply passes the cost on to the customer. It doesn't cost Volvo anything other than a tarnished brand, which is generally short-lived.
    Perhaps they need to face a fine of 5% of the net worth of the company. Then they might spend a few extra hours reviewing their software and hardware.



      That's a pretty dumb idea. When a human being causes a car crash we don't fine them 5% of their net worth so they learn to drive more responsibly.




      I'm sure that some insurance company will be happy to take money to cover the "infinitesimal" chance that a program will get something wrong. That's a no-brainer compared to covering most human drivers.




      So if that cost becomes too high, you can choose not to buy Volvo. Not really a life-altering event. Unlike being hit with liability in an accident which can ruin someone financially. For a lot of people having that security insurance may be well worth a modest increase in the cost of the car.




    It is going to be interesting to see what happens to the auto insurance business over the next 10-20 years with increasing driverless functionality. Rates should go down, and maybe auto insurance won't even need to exist at some point.



    Suppose you've been drinking and ask your car to take you home. While you are between towns (or in a rough neighborhood) it starts to snow. What to do? Where is the legal responsibility?




    Wonder what happens to driver skills if they atrophy while most of their time is spent as passenger. What if you live in LA and are driven everywhere for 11 months and then suddenly have to drive because it rains? And every other car on the road is now piloted by equally poor drivers?




    The max speed limit in my town (São Paulo) is 50 km/h. If a self-driving car stays at 40 km/h, that is reasonably slow to avoid accidents, and even if one happens through some malfunction, it will not be that bad and can be fixed, even if that means the speed needs to go down to 30 km/h.



      Please have a friend hit you/run you down with their car at 30Km/h and let us know exactly how "not that bad" it is and how well you "will be fixed."



        Sure, there will be damage; by definition there is no 100%-safe system. But at 30 km/h it is dramatically smaller (or "not that bad") than at 90 km/h. In the city where I live, the speed limit was reduced from 90 km/h to 50 km/h and the number of accidents dropped significantly (officially confirmed). But my point here is: (low) speed will be the key factor.
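The speed claim above can be made concrete: impact (kinetic) energy grows with the square of speed, so lower limits pay off quadratically rather than "exponentially." A minimal sketch of the arithmetic, with a hypothetical helper name not taken from the thread:

```python
# Kinetic energy is E = 1/2 * m * v^2, so the ratio of impact energies
# between two speeds is (v1 / v2) ** 2 -- the mass cancels out.
def relative_impact_energy(v1_kmh: float, v2_kmh: float) -> float:
    """How many times more kinetic energy a car carries at v1 than at v2."""
    return (v1_kmh / v2_kmh) ** 2

# A 90 km/h impact carries nine times the energy of a 30 km/h impact.
print(relative_impact_energy(90, 30))  # → 9.0
```

Dropping from 90 km/h to 30 km/h thus cuts impact energy by a factor of nine, which is consistent with the large drop in serious accidents the commenter reports after the limit was lowered.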




    Makes sense. The carmaker was going to get sued anyway, so they might as well lean in and accept some liability so they can get the ball rolling. Over time, I expect it will pay off since an autonomous car should get into fewer accidents and carmakers already get sued in major accidents.




http://blog.sciencenet.cn/blog-2374-931998.html
