#Case Study
Dangers of Autonomous Vehicles
Case Studies
Updated: 2022.09.06
5 min read · Intermediate

Experts from across fields have long predicted that autonomous vehicles will be the future. Although we haven’t quite reached the point [1] of development for fully autonomous vehicles, the future is near.

Criticisms of autonomous vehicles persist, and many complicated scenarios are bound to come to the forefront [2] as autonomous vehicles begin to populate our streets and homes. Indeed, accidents involving autonomous vehicles have already happened. The case studies below provide detailed breakdowns of past autonomous vehicle accidents.

Tesla’s driverless car accident: In May 2016, in broad daylight, a Tesla Model S was operating in Autopilot mode (a driver-assistance mode that is not fully autonomous). The driver, who had directed the Model S onto a highway, was fully reliant on this mode and did not react in time when a large white semi suddenly cut in front of the Model S. The car collided with the semi. It is worth noting that, per the design of Autopilot mode, the Model S should have begun braking instantly.

According to the police statement, the Model S’s vehicle detection system failed to distinguish the white of the semi from the brightly lit sky. The police report ultimately concluded that the deceased driver was at fault for not keeping his hands on the wheel despite the car’s warnings.

About two years later, in March 2018, a Model X suffered a similar malfunction, resulting in a fatal crash. Autopilot again failed, and the vehicle crashed directly into a barrier. It then caught fire, and two other cars collided with it.

There has yet to be a consensus on the latter case, as police are still investigating the ultimate cause of the crash.

Google’s driverless car accident: On a sunny September day in 2016, a truck driver ran a red light and collided directly with one of Google’s self-driving vehicles. Though there were no casualties, the side of Google’s car was completely crushed.

Shortly thereafter, Google issued a statement, saying: “In accidents involving Google’s self-driving cars, it has almost always been the other driver who was at fault,” and added, “our task at hand is developing autonomous vehicle technology that can react appropriately to other, unpredictable drivers on the road.”

Although it is fortunate that nobody was harmed, Google’s statement left some people skeptical.

Uber’s driverless car accident: On the particularly dark night of March 18, 2018, a woman was jaywalking across a street. At the time, Uber’s self-driving Volvo was traveling along the same street at about 60 kilometers per hour. An Uber employee was at the wheel, but the car did not detect the person crossing the road in the dark. The Volvo struck [3] the pedestrian, killing her instantly.

The authorities issued a statement asserting that “it would have been difficult for even a human driver under the same circumstances to avoid an accident.” However, after video footage of the accident was released, many blamed the vehicle’s detection technology for the incident.

Experts argue that even though the victim was not at a crosswalk, the sensors and radar the Volvo was equipped with should have been more than sufficient to bring the vehicle to a full stop.

Others argue that there was a bug in the system and that Uber is ultimately to blame, since autonomous vehicles are not yet advanced enough to handle unpredictable situations.

Having read about these three incidents of autonomous vehicles gone wrong, what are your thoughts? Do you feel comfortable facing a future in which autonomous vehicles fill the roads?


Discussion Questions
Q1
In your own words, please briefly summarize the article.
Q2
Do you have any experience riding in a partially autonomous vehicle, such as a Tesla? Would you be interested in one?
Q3
Before reading this article, did you know that there were accidents related to autonomous vehicles? Has your attitude toward autonomous vehicles changed upon knowing these accidents?
Q4
On the whole, do you think autonomous vehicles will be safer, or less safe?
Q5
What is most exciting or worrisome to you about the future of autonomous vehicles?
Q6
What type of expertise do you think the authorities need to develop to investigate these accidents?
Q7
How much do you know about the features of your car? Which features do you use most often?
Q8
Are you willing to purchase a fully autonomous vehicle in the future? If not, what would make you consider buying one?
Q9
If you have a question or questions that you'd like to discuss during your class, please write them down.
Expressions
reach the/a point
to arrive at a particular stage or state in a process or development
Example 1

I reached a point where I couldn’t talk to him anymore; that’s when I knew we had to break up.

Example 2

That is the point of no return: once you get there, you cannot turn back.

come to the forefront
to move to a position of prominence or attention
Example 1

The issue of disabled people came to the forefront during our engaging conversation.

Example 2

His debate brought his internalized stigmas to the forefront.

struck
verb: past tense of strike; to hit forcibly
Example 1

The clock struck noon, but my heart wasn’t ready for it.

Example 2

He struck the ball with such force that it exploded instantly.

This material is the property of Ringle, produced by our editorial team, and is protected by copyright law. Prior consultation with Ringle is required to use this material outside the Ringle platform.


*This material is designed for the exclusive use of Ringle students on the Ringle platform.