Tesla preferred to hit the oncoming car instead of hitting the pedestrian who fell on the road.
by u/1heavyarms3 in r/legal
Whose fault is this?
Reactions from overseas
・Asked whose fault it is... we just don't have enough information
Someone might have shoved the pedestrian off the sidewalk, or they might have tripped and fallen on their own
Nobody would say it's Tesla's fault
・If you watch the security camera footage, the pedestrian slips and stumbles entirely on their own
The pedestrian is completely at fault here
・The hole in the sidewalk is the city's responsibility
・The city should have repaired it sooner
・It's not the Tesla driver's fault
・If you ask the insurance company, the Tesla driver counts as negligent
They saved a person's life, but the Tesla driver is still responsible for crossing into the oncoming lane and hitting another car
・It's simple
The person who fell caused the accident
・Was it in self-driving mode?
Maybe the driver was the one actually driving
・The Tesla deserves credit for at least saving a person's life
・If I had been driving my own car, the person who fell would have died or been seriously injured
At least 13 fatalities have been linked to Tesla's "Autopilot" driver-assistance feature, which uses self-driving technology, according to a report released by U.S. federal authorities on April 26 (U.S. time). The report also points out that Tesla could have foreseen such situations and should have taken more measures to prevent them.
Moreover, Tesla's driver-assistance feature lacked some basic safeguards that competitors have introduced, and the report went as far as calling Tesla an outlier within the industry. Tesla has since updated Autopilot to address basic design problems and prevent fatal accidents, but regulators question whether the fix was sufficient. Of the 109 "frontal collision" accidents closely examined by government engineers, cases in which a Tesla struck a vehicle or object directly in its path, at least half involved a hazard that was visible more than five seconds before impact. The government's engineering team concluded that an attentive driver would have had enough time to avoid the collision, or at least to mitigate the worst of it.