Something I found surprising about this is that Linux distros are pulling and packaging pre-built binaries from upstream projects. I'd have expected them to build from source.
# Remember: we cannot leverage autotools in this ebuild in order
# to avoid circular deps with autotools
Namely, to unpack autoconf-2.72e.tar.xz from gnu.org you need xz-utils. And this is just the shortest circle. It is not very common, but xz-utils was one of the rare cases where regenerating the autohell files was considered an unnecessary complication (it backfired).
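To make the circle concrete, a minimal sketch (just plain xz/tar invocations, not the actual ebuild logic):
# Minimal sketch of the shortest circle (file name taken from the comment
# above, not from the ebuild itself): unpacking autoconf's own release
# tarball already needs a working xz.
xz -d autoconf-2.72e.tar.xz    # needs xz-utils before autoconf is even built...
tar -xf autoconf-2.72e.tar     # ...so the autotools files cannot be regenerated first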
Unfortunately, those GitHub links are no longer valid, so we randos can't use them to learn what went wrong here. Hopefully GH will reverse this decision once the dust settles.
GitHub should not just reverse course and make the repo public and archived "as is": there are many rolling distributions (from Gentoo to LFS), submodule pullers, CI systems, and unaware users that may pull and install the latest backdoored commit of an archived project.
However, if you want exact copies of the backdoored tarballs, they are still available on every mirror, e.g. in http://gentoo.mirror.root.lu/distfiles/9f/ . For a project of this profile, artifacts are checksummed and mirrored across the world by many people, and there is nothing wrong with that.
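And anyone fetching from such a mirror can still check what they got against the digests a distro records; a rough sketch (the file name is illustrative, and the real digests live in the package's Manifest):
# Rough sketch: hash a mirrored distfile and compare the output by hand
# against the BLAKE2B/SHA512 entries Gentoo records in the package Manifest.
sha512sum xz-5.6.1.tar.gz
b2sum xz-5.6.1.tar.gz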
The gist of it is: the "good" one is the auto-generated "Source code" archive that GitHub creates from the tagged git tree. The "bad" one is a manually generated and uploaded source tarball, which can contain whatever the uploader wants.
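If it helps to see the difference concretely, a rough sketch of comparing the two kinds of archives (file names are illustrative, and per the comment above the GitHub copies themselves are currently unavailable):
# Rough sketch: list GitHub's auto-generated "Source code" archive (the
# tagged git tree) next to the maintainer-uploaded tarball and diff the
# listings. Release tarballs legitimately add generated autotools files,
# which is exactly where a modified build-to-host.m4 can hide.
tar -tzf v5.6.1.tar.gz | sort > generated-from-tag.txt
tar -tzf xz-5.6.1.tar.gz | sort > uploaded-release.txt
diff generated-from-tag.txt uploaded-release.txt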
Sorry, by "they" I mean "Debian and Fedora", which (when including derivatives) include most Linux systems which use a Linux distro in the standard sense.