Even handing control back to a human is prone to accidents; this passage from the wiki is worth thinking about:
Two human-factor challenges are important for safety. One is the handoff from automated driving to manual driving, which may become necessary due to unfavorable or unusual road conditions, or if the vehicle has limited capabilities. A sudden handoff could leave a human driver dangerously unprepared in the moment. In the long term, humans who have less practice at driving might have a lower skill level and thus be more dangerous in manual mode. The second challenge is known as risk compensation: as a system is perceived to be safer, instead of benefiting entirely from all of the increased safety, people engage in riskier behavior and enjoy other benefits.
Both of these problems are bound to appear in semi-autonomous vehicles: when the vehicle pushes responsibility back onto the human, the human driver, for the reasons above, cannot react properly in time.
【 In the post by whistlingMe (哈哈), it was said: 】
: You could put it that way. Fundamentally, today's machine learning systems are by design unable to recognize things they have never seen; they can only classify inputs into the categories present at training time, which is why they make mistakes. Both L3 and L4 require the system to hand control to a human in situations it cannot handle, but in practice the system has great difficulty telling which situations those are,
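
A minimal sketch of the quoted point, with made-up logits and hypothetical class names (not taken from any real perception stack): a closed-set classifier has to pick one of its training classes even for an object it has never seen, so a naive confidence threshold for deciding "I can't handle this, human please take over" may never fire.

import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Hypothetical class list and logits, only to make the point concrete.
classes = ["car", "pedestrian", "cyclist", "traffic_cone"]

in_dist_logits = np.array([6.0, 1.0, 0.5, 0.2])   # an ordinary car, well represented in training data
ood_logits     = np.array([3.5, 3.1, 0.4, 0.1])   # something the model was never trained on

for name, logits in [("familiar input", in_dist_logits),
                     ("unseen input", ood_logits)]:
    p = softmax(logits)
    print(f"{name}: predicted {classes[int(p.argmax())]} with confidence {p.max():.2f}")

# A naive handover rule: ask the human to take over when the top softmax
# probability drops below a threshold. The unseen input above still comes
# out as "car" with roughly 0.57 confidence, so a 0.5 threshold never fires
# and no handover is requested.
THRESHOLD = 0.5
print("handover requested:", softmax(ood_logits).max() < THRESHOLD)
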
--
FROM 73.63.245.*