(Added in an edit, to explain someone else's comment and reference:) ... it doesn't beg the question, it raises it. But putting that to one side ...
They're not all the same URL, and when a submission is old enough it's no longer tested against. Minor variations in the URL will also thwart the dup detector.
Several suggestions have been made in the past about improving the dup detector, but it's unlikely they'll be implemented.
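For illustration, those minor variations are exactly what a canonicalization pass would have to smooth over before exact matching works. A rough sketch in Python (not the actual detector; the tracking-parameter list is invented for the example):

    # Sketch of URL canonicalization for dup detection. Not the real
    # detector; TRACKING_PARAMS is an invented example list.
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref"}

    def canonicalize(url):
        parts = urlsplit(url)
        # Treat http/https as equivalent and hosts as case-insensitive.
        scheme = "https" if parts.scheme in ("http", "https") else parts.scheme
        host = parts.netloc.lower().removeprefix("www.")
        # Drop trailing slashes and known tracking parameters.
        path = parts.path.rstrip("/")
        query = urlencode(sorted((k, v) for k, v in parse_qsl(parts.query)
                                 if k not in TRACKING_PARAMS))
        return urlunsplit((scheme, host, path, query, ""))

    # All three variations collapse to https://example.com/post
    print(canonicalize("http://www.example.com/post/"))
    print(canonicalize("https://example.com/post?utm_source=tw"))
    print(canonicalize("https://EXAMPLE.com/post"))

Even this much only catches the easy cases; mirrors and shorteners would still slip through.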
Not sure if this has been suggested, but what about a list of "similar articles" below yours (based on a search of past submissions by title, content, etc.) so that you can verify yours is unique?
It's an interesting idea, and could be useful. I suspect that reliably finding past submissions that match your intended one would be difficult, though. Would be cool.
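For a sense of what even a naive version might look like (purely illustrative; the titles, tokenizer, and threshold are all invented), plain token overlap on titles already surfaces the obvious near-matches:

    # Toy "similar submissions" lookup: rank past titles by token
    # overlap (Jaccard) with a new title. Invented data and threshold.
    def tokens(title):
        return set(title.lower().split())

    def similar(new_title, past_titles, threshold=0.3):
        new = tokens(new_title)
        scored = []
        for t in past_titles:
            old = tokens(t)
            overlap = len(new & old) / len(new | old)
            if overlap >= threshold:
                scored.append((overlap, t))
        return [t for _, t in sorted(scored, reverse=True)]

    past = ["Improving the dup detector",
            "Show HN: my weekend project",
            "Why the dup detector misses reposts"]
    print(similar("Ideas for improving the dup detector", past))

The hard part is ranking well enough that the list stays short and relevant rather than noise.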
It also begs the question: why don't comments pointing out duplicates on duplicate submissions automatically get de-duped?
The fun aside, it boils down to a matter of ROI on effort. It is not trivial to detect duplication on URLs beyond exact matches. If the community can deal with it in a non-automated way, it is not worth the trouble of handling all the threads about false positives and similar issues.
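To make the false-positive trouble concrete (a hypothetical example, not the site's actual behavior): any rule aggressive enough to catch real reposts will also merge URLs where the variation is the content:

    # Hypothetical over-aggressive rule: dropping the query string
    # catches reposts like /story?ref=rss, but also wrongly merges
    # pages whose query string IS the content.
    from urllib.parse import urlsplit

    def too_aggressive(url):
        parts = urlsplit(url)
        return parts.netloc.lower() + parts.path.rstrip("/")

    a = too_aggressive("https://www.youtube.com/watch?v=abc123")
    b = too_aggressive("https://www.youtube.com/watch?v=xyz789")
    print(a == b)  # True: two different videos flagged as dups

Every miss like that turns into a meta-thread, which is the cost any automated approach has to beat.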