Advanced Audio Master visibly reduces C1 errors in recorded media, to almost negligible levels. While C1 errors can in theory be corrected without any degradation, the actual result comes down to the CD player's capabilities (and quality).
CDRInfo's tests back in the day showed dramatic improvements in C1 levels, see [0]. Considering some of the lower-end CD players from leading manufacturers didn't even have 16-bit audio decoding and used late-stage oversampling, reducing C1 errors was (and is) a big deal in recorded media.
As I said in my earlier comment, this mode led to clearly audible improvements on my older, low-end Sony CD player. I don't know how it will fare on my new Yamaha player, given the technology improvements since.
Sure, the C1 rates are lower here for one kind of media in one actual comparison, but C1 errors below a certain rate are entirely correctable. Meaning, you're not going to lose data when it gets reconstructed. You are not losing any fidelity with a C1 rate in the 20s to 40s; every bit decoded is played back exactly as originally encoded. Otherwise, reading data off a CD would produce significant corruption every time you installed software, and every time you opened a photo off a photo CD you'd get all kinds of JPEG corruption.
Assuming you have a properly functioning CD player, these errors have zero impact on the quality of a CD being played. If you hear a noticeable difference in audio quality between a CD with a max error rate of 24 versus 32 C1 errors, you've got a tremendously faulty CD player that is complete trash.
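To make the "entirely correctable" point concrete: error-correcting codes don't approximate the damaged data, they reconstruct it bit-for-bit. The CD's CIRC stages use Reed-Solomon codes, which are more involved, so here's a toy sketch with a Hamming(7,4) code instead (my stand-in, not the actual CD code) showing that a corrupted read still decodes to exactly the original bits:

```python
# Toy illustration (NOT CIRC itself): a Hamming(7,4) code corrects a
# single bit error *exactly* -- the decoded data is identical to the
# original, not an approximation. Same principle as C1 correction.

def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Locate and flip a single-bit error, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 * 1 + s2 * 2 + s3 * 4
    if syndrome:                      # nonzero syndrome points at the bad bit
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
corrupted = hamming74_encode(data)
corrupted[3] ^= 1                     # simulate a read error on one bit
assert hamming74_decode(corrupted) == data   # recovered bit-for-bit
```

As long as the error count stays within what the code can handle, which is exactly the "below a certain rate" situation for C1, there is simply no residual error left to hear.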
It's funny too, because this table shows the Mitsubishi media performing about the same or better at every speed while recording significantly faster. The 1x speed with AudioMaster on for Plasmon is the worst result, ignoring the run where they burned 16x media at 44x speed. This table is also hard to compare directly, because they show different speeds for the different modes (1x/4x/8x for AM, 4x/16x/44x for regular), so the only fair comparison is at 4x. Even then, 16x with AM off had a lower average error rate than 8x with it on!
Don't get me wrong, burning with a lower error rate from the get-go is good; it implies the burn will likely hold up better over time as the disc picks up scratches and other imperfections. But arguing that a disc averaging 2.1 C1s versus 1.1 is going to sound noticeably different is absurd. And once again, even then this showed an improvement in only one of the two media tested. Maybe it's better on more media, maybe it's worse, maybe there was just something odd with their burns and this is largely noise in the overall burn results from this drive.
I'm still not convinced AM actually did anything but reduce your recording time. Your link doesn't say anything about actually increasing audio fidelity in the slightest, just that in one straight comparison the C1 errors were lower. Practically every result in that table, other than the 16x media written at 44x speed, is already well within "negligible levels." Being below 220 is "negligible", and all of these burns (aside from the one using AM at 1x speed!) are well below that.
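For context on why 220 is the usual "negligible" line: if I have the Red Book numbers right (75 sectors/s, 98 C1 frames per sector, max allowed BLER of 220 C1 errors/s averaged over 10 seconds), even a disc sitting at the spec ceiling has errors in only a few percent of frames, all within what the C1 decoder handles. A quick back-of-envelope:

```python
# Rough back-of-envelope on the 220 threshold. Assumes the usual Red
# Book figures: 75 sectors/s x 98 C1 frames/sector, and a maximum
# allowed BLER (C1 errors per second) of 220 for a spec-compliant disc.
frames_per_second = 75 * 98   # 7350 C1 frames read every second
max_bler = 220                # Red Book ceiling for an in-spec disc

worst_case_fraction = max_bler / frames_per_second
print(f"{worst_case_fraction:.1%} of frames affected at the limit")
```

So a disc at the absolute limit still has roughly 97% of its frames error-free, and the errored ones get corrected anyway; the burns in this table are one to two orders of magnitude below that limit.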
[0]: https://www.cdrinfo.com/d7/content/yamaha-crw-f1e-cd-rw?page...