Yes, most human drivers go above the speed limit. I agree that it can be unsafe, but wouldn't a self-driving car be safer going above the speed limit than a human? I feel like it should be OK for it to go 5-10 miles over, especially if that matches the flow of traffic.
I don't know if it's actually safer, but I feel like programming your self-driving car to regularly break the speed limit would be a hard sell from a regulatory perspective. It's different in "self-driving" cars where a driver can still choose the vehicle's speed. In this case, Waymo is programming the car to pick a speed on its own, and making that programming deliberately break the law seems like it could be a problem.
A better solution would probably be more reasonable speed limits and more consistent enforcement of those limits, but now I'm just engaging in wishful thinking.