
① Research in human-vehicle interaction has shown even systems designed to automate driving are far from being error-proof. Recent evidence points to drivers’ limited understanding of what these systems can and cannot do as a contributing factor to system misuse. A recent study tackles the issue of over-trusting drivers and the resulting system misuse from a legal viewpoint. It looks at what the manufacturers of self-driving cars should legally do to ensure that drivers understand how to use the vehicles appropriately.
② One solution suggested in the study involves requiring buyers to sign end-user license agreements (EULAs), similar to the terms and conditions that require agreement when using new software products. But this is far from ideal. The agreement may not provide enough information to the driver, leading to confusion about the nature of the requests for agreement and their implications. Further, most end users don’t read EULAs. A 2017 study shows 91 percent of people agree to them without reading. Among young people, 97 percent agree without reviewing the terms.
③ The issue is that, unlike using a smartphone app, operating a car has serious safety risks, whether the driver is human or software. And human drivers need to consent to take responsibility for the outcomes of the software and hardware.
④ “Warning fatigue” and distracted driving are also causes for concern. For example, a driver, annoyed after receiving continuous warnings, could decide to simply ignore the message. Or, if the message is presented while the vehicle is in motion, it could become a distraction. Given these limitations and concerns, even if this mode of obtaining consent moves forward, it likely won’t fully protect automakers from legal liability should the system malfunction or an accident occur.
⑤ Driver training for self-driving vehicles can help ensure that drivers fully understand system capabilities and limitations. This training needs to continue beyond the point of vehicle purchase, as recent evidence shows that relying solely on the information provided by the seller leaves many questions unanswered. All of this considered, the road forward for self-driving cars is not going to be a smooth ride after all.
1. What do we learn from research in human-vehicle interaction?
A Automatic driving systems are by no means immune to errors.
B Driverless vehicles are likely to be misused by some people.
C Self-driving car manufacturers are not aware of the legal matters involved.
D There is a long way to go before humans can interact with driverless vehicles.
2. What is the problem with requiring buyers to sign end-user license agreements?
A End users, young and old alike, find the terms complex to interpret.
B Most end users sign them without bothering to read the terms.
C Many people are often confused by the wording of the terms.
D Most end users do not understand the terms after reading them through.
3. What would drivers do when they suffer from “warning fatigue”?
A Waste no time keeping the car moving.
B Rest a while to avoid fatigue driving.
C Take no action despite repeated warnings.
D Take note of the message though fatigued.
4. What does the author think of continuing to ask buyers to sign end-user license agreements?
A It will probably not guarantee the safety of drivers in case of accidents.
B It likely won’t ensure that the automatic driving system functions properly.
C It likely won’t fully protect automakers against accusations of deliberate cheating.
D It will probably not provide manufacturers adequate protection from legal responsibilities.
5. What should be done to help drivers fully understand system capabilities and limitations?
A Training them to be experts in vehicle automation.
B Familiarizing them with the systems through training.
C Broadening their knowledge of accident-prevention mechanisms.
D Facilitating their access to the information provided by the seller.