U.S. regulator probes Tesla's self-driving mode after crashes
Raw Story

The U.S. auto safety regulator said Friday that it has opened an investigation into Tesla's Full Self-Driving software after receiving four reports of crashes, one of which involved a pedestrian being struck and killed. The crashes all occurred when "a Tesla vehicle traveling with FSD engaged entered an area of reduced roadway visibility conditions and Tesla's FSD continued operating," the statement from the National Highway Traffic Safety Administration said.

In April the company settled with the family of an engineer killed when his Model X -- which used Tesla's Autopilot driver assistance software -- crashed in Silicon Valley in 2018. Last year the company was forced to recall nearly 363,000 cars equipped with FSD Beta technology, and more than two million vehicles over risks associated with the Autopilot software.