I find it hard to believe that a group of engineers didn't consider stationary objects when _designing_ an autopilot system. And if they did consider it and chose this behavior deliberately, I would no longer call it a flaw.
Here is a more technical explanation of the limitation.
You send a radar signal out, it bounces off of stuff, and it comes back at a frequency that depends on your motion relative to whatever it bounced off. Since there is so much stationary stuff all around, a huge fraction of the returning signal corresponds to "stationary", so the very first processing step is a filter that cancels out anything at zero relative velocity.
This lets you focus on things that are moving relative to the world around them, but it makes seeing things that are standing still very hard.
Many animal brains play a similar trick, with the result that a dog can see you wave your hand half a mile off, but a treat lying on the ground 5 feet away might as well be invisible.
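Concretely, that "cancel the stationary stuff" step is essentially a zero-Doppler notch. Here is a minimal sketch in Python (array shapes, bin widths and names are all illustrative, not any particular automotive radar's pipeline):

```python
# Minimal sketch of the "cancel anything stationary" step in a Doppler radar
# pipeline. Everything here (shapes, guard widths) is illustrative only.
import numpy as np

def range_doppler_map(cube: np.ndarray) -> np.ndarray:
    """cube: complex samples shaped (num_chirps, num_range_samples)."""
    rng = np.fft.fft(cube, axis=1)                           # fast-time FFT -> range bins
    dop = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)   # slow-time FFT -> Doppler bins
    return dop

def notch_zero_doppler(rd_map: np.ndarray, guard_bins: int = 1) -> np.ndarray:
    """Suppress returns whose Doppler is ~0, i.e. clutter that is not moving
    relative to the radar. This is the filter that makes stationary targets
    hard to see."""
    out = rd_map.copy()
    center = out.shape[0] // 2            # zero-Doppler row after fftshift
    out[center - guard_bins : center + guard_bins + 1, :] = 0
    return out

# Toy usage: random "returns" stand in for real ADC samples.
cube = np.random.randn(64, 256) + 1j * np.random.randn(64, 256)
rd = notch_zero_doppler(range_doppler_map(cube))
```

Note that this notch removes whatever is stationary relative to the radar; a car radar that wants to drop things that are stationary relative to the ground has to account for its own speed, which is the gating described further down the thread.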
That is not accurate. In fact, one of the key features of radar processing is finding objects in the clutter, usually by filtering on Doppler shifts.
As far back as the 1970s, helicopter-mounted radars like Saiga were able to detect power lines and vertical masts. That system could do so at speeds of up to 270 knots and weighed 53 kg.
Actually it needed to detect a crash barrier against a background of lots of other stuff that was also stationary.
The more easily you can distinguish the signal you want from the signal you don't, the easier it is to make decisions. For radar, that is far, far easier with moving objects than with stationary ones.
Right. Besides, pulse-Doppler can detect stuff that is stationary relative to the radar anyway.
The real issue is false positives from things like soda cans on the road, signs next to the road, etc. You can't have the car randomly braking all the time for such false positives. As a result, they just filter out objects that are stationary relative to the ground and focus on moving objects (which are assumed to be vehicles) together with lane markings. This is why that one Tesla ran right into a parked fire truck.
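That filtering step is simple to sketch. Something along these lines (thresholds and field names are made up for illustration; real trackers are far more involved):

```python
# Rough sketch of the gating described above: drop radar targets that are
# stationary relative to the ground, keep the ones that are moving.
from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float          # distance to target
    range_rate_mps: float   # measured closing speed (negative = approaching)

def is_world_stationary(target: RadarTarget, ego_speed_mps: float,
                        tolerance_mps: float = 1.0) -> bool:
    # A target that is not moving in the world frame closes on a moving car
    # at exactly the car's own speed, so its range rate is about -ego_speed.
    return abs(target.range_rate_mps + ego_speed_mps) < tolerance_mps

def filter_targets(targets, ego_speed_mps):
    # Keep only targets that appear to be moving in the world frame; a parked
    # fire truck fails this test just like a soda can or a road sign does.
    return [t for t in targets if not is_world_stationary(t, ego_speed_mps)]
```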
Interestingly, I've discovered one useful trick with my purely camera-based car (a Subaru equipped with EyeSight): if there is a stationary or nearly stationary vehicle up ahead that it wasn't previously following, it won't detect it and will treat it as a false positive (as it should, so it doesn't brake for things like adjacent parked cars). But if I tap the brake to disengage adaptive cruise control and then turn it back on, it will lock on to the stopped car up ahead.
The problem is not whether the object is moving relative to the radar. It is whether the object is moving relative to all of the stationary objects behind it that might confuse the radar.
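To put rough numbers on that: from a moving car, a world-stationary obstacle and the world-stationary clutter around it produce essentially the same Doppler shift, so Doppler alone can't separate them. A back-of-the-envelope calculation (77 GHz is a common automotive radar band, assumed here just for illustration):

```python
# Back-of-the-envelope Doppler numbers for a car-mounted radar; the 77 GHz
# carrier is assumed purely for illustration.
C = 3.0e8                    # speed of light, m/s
F_CARRIER = 77e9             # radar carrier frequency, Hz
WAVELENGTH = C / F_CARRIER   # ~3.9 mm

def doppler_shift_hz(closing_speed_mps: float) -> float:
    # Two-way Doppler shift for a target closing at the given speed.
    return 2.0 * closing_speed_mps / WAVELENGTH

ego_speed = 30.0             # car doing ~108 km/h

barrier = doppler_shift_hz(ego_speed)    # stopped crash barrier dead ahead
roadside = doppler_shift_hz(ego_speed)   # stationary signs, bridges, ground
print(barrier, roadside)                 # both ~15.4 kHz -- indistinguishable
```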
It is basically a ROC-curve/precision-recall issue. Radar has terrible lateral resolution and even worse, near-nonexistent, elevation measurement. Potholes, manhole covers, Coke cans and ground clutter all look alike, and all of them can in fact be detected as having a relative velocity equal to the negative of my car's own velocity. You want to stop for only very few of all those stationary objects, otherwise you won't drive at all. The problem is that you can't classify the relevant ones with radar. Which is why the camera helps, but (for false-positive suppression and a high availability of the autopilot) obviously only if it positively classifies a car's rear.
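In other words, for stationary returns the decision ends up gated on the camera. A hedged sketch of what such a gate might look like (the classifier label and thresholds are hypothetical placeholders, not how any production system is actually written):

```python
# Sketch of the fusion gate described above: a stationary radar return only
# triggers braking if the camera positively classifies a vehicle rear there.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    range_m: float
    range_rate_mps: float
    camera_label: Optional[str]   # e.g. "car_rear", "sign", or None if unclassified

def should_brake_for(det: Detection, ego_speed_mps: float) -> bool:
    moving_in_world = abs(det.range_rate_mps + ego_speed_mps) > 1.0
    if moving_in_world:
        return True               # moving targets are trusted on radar alone
    # Stationary targets: high false-positive risk, so require camera confirmation.
    return det.camera_label == "car_rear"
```

The design choice is exactly the precision-recall trade-off above: moving targets are trusted on radar alone because their false-positive rate is low, while stationary ones need a second, independent vote.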
No; since you need movement to create the along-track aperture, by the time you had synthesized the aperture you would already have moved through it, running over the very thing you are trying to detect.