cross-posted from: https://fedia.io/m/[email protected]/t/2201156
In case you were worried about the roads being too safe, you can rest easy knowing that Teslas will be rolling out with unsupervised “Full Self Driving” in a couple of days.
It doesn’t seem to be going great, even in supervised mode. This one couldn’t safely drive down a simple, perfectly straight road in broad daylight :( It veered off the road for no good reason. Glad nobody got badly hurt.
We analyze the onboard camera footage and try to figure out what went wrong. Turns out, a lot. We also talk through how camera-only autonomous cars work, Tesla’s upcoming autonomous taxi rollout, and how AI hallucinations figure into all of this.
There’s a lot wrong with Tesla’s implementation here, so I’m going to zoom in on one issue in particular: it is outright negligent to decide against using LIDAR on something like a car that you want to be autonomous. Maybe if this car had sensors that actually map out 3D space, it would have an easier time moving through 3D space?
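To make the “mapping out 3D space” point concrete, here’s a minimal sketch (nothing like Tesla’s or any lidar vendor’s actual pipeline, and the numbers are made up) of why a lidar return is so useful: each return is a direct distance measurement, so it converts straight into a 3D point, whereas a camera pixel only gives you a direction and leaves the distance to be guessed.

```python
import math

def lidar_return_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (measured distance plus beam angles) into an x, y, z point.

    This is an illustrative toy, not any real sensor's API: the key idea is that
    the depth is measured directly, so no inference is needed to place the point.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left
    z = range_m * math.sin(el)                  # up
    return (x, y, z)

# Example: an obstacle measured 12 m ahead, slightly to the right, near road level.
print(lidar_return_to_point(12.0, -5.0, -1.0))
```

A camera-only system has to estimate that same distance from pixels, which is exactly the kind of inference that can go badly wrong.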
I’ll go further and say that LIDAR should be mandatory by law whenever vision is used for autonomous driving. Plenty of industries have safety regulations to follow, yet this guy can just slap shitty webcams in his cars. That’s a big problem.