This must be infuriating for webdevs: some of their users report an error because they're in the rollout percentage, while the devs who have to fix things aren't in it and so can't reproduce the error.
The browser dev tools warn you about it (with links to how to fix it), and I think if you've got the change enabled they also tell you which cookies have been blocked. I don't mean that to sound holier-than-thou; even with those messages I spent a couple of hours last week debugging the exact problem you mention on an internal tool.
This is why web developers and testers should test pre-release browser versions. Better to find out that a code change in Chrome Canary or Firefox Nightly broke your website 4-8 weeks before the new browser ships than after it ships. If the breakage is a browser bug, you still have a chance that Google or Mozilla can fix the regression before it affects your users.
There's an argument for sending pertinent A/B study information in a request header of some sort, for exactly this reason. It's no longer enough to just look at the UA string.
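To illustrate the idea: no such header exists today, but if browsers did expose active experiments in something like a hypothetical "X-Browser-Experiments" header (name and comma-separated format are my invention here), server-side parsing would be trivial, and you could attach the list to error reports so support could tell which cohort a user was in:

```typescript
// Sketch only: "X-Browser-Experiments" is a hypothetical header a browser
// might send, e.g. "X-Browser-Experiments: ThirdPartyCookiePhaseout, NewTLS".
// Parse it into a list of experiment names for logging / error reports.
function parseExperimentsHeader(value: string | null): string[] {
  if (!value) return []; // header absent: user is in no tracked cohort
  return value
    .split(",")
    .map((name) => name.trim())
    .filter((name) => name.length > 0);
}

// Example: tag an error report with the cohorts the browser claims to be in,
// so "works for all developers" bugs become diagnosable from logs.
function tagErrorReport(headerValue: string | null, message: string): string {
  const experiments = parseExperimentsHeader(headerValue);
  return `${message} [experiments: ${experiments.join(", ") || "none"}]`;
}
```

That way the server logs would show the cohort even when the dev trying to reproduce the bug isn't in it.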
Even worse when it's retail customer-facing and works fine for all developers. This broke our chat widget and is probably the explanation for our declining chat numbers. Nobody connected the dots until some developers got included in the rollout. Then all hell broke loose, because the business realized we'd been serving a broken chat feature to some users for months.