"Silky smooth abroad" my foot — Chen Zhen has tested it in the US too, and YouTube is full of complaints about it.
Tesla's Full Self-Driving (FSD) technology has long been hyped as the next big thing in the automotive world. While Tesla CEO Elon Musk frequently touts the system's capabilities, independent testing and data have painted a more sobering picture. Now, a new evaluation by AMCI Testing, an independent automotive testing firm, has further exposed the limitations of Tesla's FSD, raising significant questions about its readiness for widespread deployment.
AMCI's testing, conducted over 1,000 miles in a 2024 Tesla Model 3 Performance equipped with Hardware 4 and running FSD versions 12.5.1 and 12.5.3, revealed a disconcerting reality: human intervention was required on average once every 13 miles. This translates to over 75 interventions during the testing period, far exceeding the expectations of even the most critical FSD observers.
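As a rough sanity check on those figures (a minimal sketch; the numbers come straight from the article above, and the variable names are my own):

  # Quick consistency check of AMCI's reported figures (values assumed from the article)
  test_miles = 1000              # total distance covered in the evaluation
  miles_per_intervention = 13    # reported average distance between human interventions

  estimated_interventions = test_miles / miles_per_intervention
  print(round(estimated_interventions))  # prints 77, consistent with the "over 75" figure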
The findings stand in stark contrast to crowdsourced data, which suggested a much higher average distance between disengagements. While AMCI acknowledges that Tesla's FSD system is impressive in its ability to mimic human-like driving behaviors, especially for a camera-based system, it also warns against the dangers of complacency.
Guy Mangiamele, Director of AMCI Testing, cautions that the system's initial "infallibility" can create a false sense of security, leading drivers to take their hands off the wheel or become distracted. This, he emphasizes, is incredibly dangerous, as even professional drivers operating with a testing mindset had to remain vigilant to catch split-second miscalculations by the FSD system.
Perhaps even more concerning is the unpredictable nature of FSD's failures. Mangiamele notes that the system can successfully navigate a particular scenario multiple times, only to inexplicably fail the next time around. Whether this is due to a lack of computing power, buffering issues, or shortcomings in the system's assessment of its surroundings remains unclear.
AMCI's testing also highlighted persistent failures stemming from basic programming inadequacies. For instance, the system often initiated lane changes towards a freeway exit mere tenths of a mile before the exit itself, hindering its overall functionality and raising doubts about the quality of its underlying programming.
As the automotive industry continues its march towards fully autonomous vehicles, Tesla's FSD remains a fascinating yet controversial example of the challenges and complexities involved in achieving this goal. With AMCI planning to release more videos and test future FSD updates, the ongoing scrutiny of this technology is likely to continue, keeping both Tesla and the broader automotive community on their toes.
【 In the post by CCERCCUS, it was mentioned: 】
: It will definitely work out. Abroad it is already silky smooth, with an accident rate far below that of human drivers. But the overseas AI can't be brought directly into China; it has to be trained from scratch domestically, and yet domestic real-world driving data can't be used either, so they can only run simulated adversarial training on public data. Besides, traffic signals abroad aren't as complicated as China's — for example the small signal lights in the middle of the road, left-turn lights, U-turn lights — so this kind of traffic violation is only to be expected.
: So far, the so-called violations haven't included anything like forcing its way through traffic, hard braking, or phantom stops.
--
FROM 219.232.8.*