JXL has the Guetzli lossless JPEG compressor integrated into the standard, so it produces reversible, completely standards-compliant JXL images that are 15-20% smaller.
Reversible in the sense that you can still convert the image back to the original JPEG - a file that is bit-exact to the input JPEG (it takes care of all the metadata too - it has to).
Also, if you decide to forgo reversibility, you can get a bit more out of it, as JXL is actually a superset of JPEG: it can read the JPEG stream and convert it to JXL without complete recompression - it just uses the more efficient structure of JXL and its much more efficient entropy coding (ANS vs. Huffman). The additional savings compared to the reversible mode aren't big, however.
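The ANS-vs-Huffman gap can be sketched with a toy calculation (my own illustration, not libjxl code; the symbol probabilities are invented): Huffman must spend a whole number of bits per symbol, while ANS, like arithmetic coding, can get arbitrarily close to the Shannon entropy, which matters a lot for skewed distributions.

```python
import heapq
import math

def huffman_lengths(probs):
    """Build a Huffman tree; return code length in bits per symbol."""
    heap = [(p, i, {sym: 0}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**a, **b}.items()}  # one level deeper
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# A heavily skewed distribution (values invented for illustration).
probs = {"zero": 0.90, "one": 0.05, "two": 0.03, "big": 0.02}

lengths = huffman_lengths(probs)
huffman_bits = sum(probs[s] * lengths[s] for s in probs)       # 1.15 bits/symbol
entropy_bits = -sum(p * math.log2(p) for p in probs.values())  # ~0.62 bits/symbol

print(f"Huffman: {huffman_bits:.2f} bits/symbol, entropy: {entropy_bits:.2f}")
```

Here Huffman is almost 2x away from the entropy limit because it can't give the dominant symbol less than one bit; an ANS coder can, which is roughly where the extra savings in this mode come from.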
The lossless thingy is Brunsli. In the last meters of standardization, Brunsli in JPEG XL was replaced with "Brunsli 2.0", a more natural formalism within the JPEG XL format, allowing for a smaller spec and decoder as well as parallel decoding.
Guetzli is a slow, high-quality JPEG encoder. One can use jpegli for that need nowadays - it's about 1000x faster...
xHE-AAC from 2016 (also known as USAC), yes. The older HE-AAC from 2003 and HE-AACv2 are not. The codecs have similar names, but they are different and were released at different times.
Note that AAC (presumably they mean "Main Profile" rather than AAC-LC) has effectively the same efficiency as Opus. HE-AAC and HE-AACv2 have a higher efficiency than both Opus and AAC, and work great at lower bitrates in comparison to AAC.
This chart just roughly outlines (according to the feeling of the Opus developers at the time) what to expect from Opus - a wide range of useful bitrates. It isn't anything that was actually measured, or something you can draw conclusions from. I mean, those nice smooth curves and the lack of any detail about the codecs used should give it away.
According to public (double-blind) listening tests performed by the HydrogenAudio community, Opus does win over the best HE-AAC codecs available at the time the tests were performed - at both 64 kbps and 96 kbps [1] (Multiformat Tests).
Camera manufacturers are old, spineless companies. In all these years they have done nothing for digital image formats, even though that is one of the most important things in a camera. They weren't even able to come up with, or attempt, a standard RAW format. All of that came from Adobe (DNG), which the camera manufacturers have happily ignored in favor of their own proprietary solutions to this day.
Well, RED is mostly suing over their RAW video compression patents, which are just dumb and should never have been granted in the first place (and AFAIK Nikon is currently battling to invalidate them). But this is also the manufacturers' own problem - they have put almost no R&D into the software side of digital photography and videography, like formats and processing. They have nice cameras that output 12-bit and higher images - they should be the first ones requesting and defining a new image format for consumption that can handle that.
The DSLR market has pretty much stopped (rarely is any new DSLR released) as everyone has shifted to mirrorless cameras. Camera manufacturers mostly added HEIF (Canon, Sony, Fuji) as the non-RAW image format, because they already have HEVC for video.
A camera with lossy/lossless 12-bit or 14-bit JPEG XL would definitely be interesting for many photographers. Not everyone wants to be forced into complete post-processing with RAW. JPEG is too limited (8-bit only), and HEIF isn't much better while not having much support (especially on the web) because of the patent situation.
Also good to know: jpegli (a traditional JPEG codec within libjxl) allows 16-bit input and output for '8-bit JPEGs' and can deliver ~12 bits of dynamic range for the slowest gradients and ~10.5 bits for typical photographs.
Jpegli improves existing JPEG images by about 8% by decoding more precisely.
Jpegli encodes JPEG images about 30% more efficiently by using JPEG XL's adaptive quantization (borrowing Guetzli's variable dead-zone trick) and the XYB colorspace from JPEG XL.
Together, the decoding and encoding improvements give about 35% savings when using jpegli. Also, you get a traditional JPEG that works with HDR.
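Assuming the two gains compound multiplicatively (my reading of the numbers, not something stated explicitly above), the arithmetic works out like this:

```python
decode_gain = 0.08   # ~8% from more precise decoding
encode_gain = 0.30   # ~30% from adaptive quantization + XYB

# Savings after both improvements, relative to a baseline JPEG pipeline:
combined = 1 - (1 - decode_gain) * (1 - encode_gain)

print(f"combined savings: {combined:.1%}")  # -> 35.6%, i.e. "about 35%"
```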
Jpegli is predicted to be production-ready around April, but it can be benchmarked already.
In what way does Google control WebP? They have frozen the format and nothing will be added to it. WebP2 was abandoned. Google is also involved with JPEG XL - in fact, some developers who worked on WebP now work on JPEG XL.
If anyone controls anything, it is the Chromium/AOM/AVIF/AV1 team, which decides what multimedia formats go into Chrome/Chromium - thus controlling what is used on the web while ignoring what the web community has to say about it.
No. The VP8 and VP9 bitstreams are frozen, and they stayed frozen from the moment they decided to freeze them (they made sure of that with automated tests in libvpx that computed checksums of the output); otherwise old VP8/VP9 files would stop playing. The difference is only in the approach: with VP8 and VP9 the "reference" decoder is the specification, which many people dislike for various reasons (for example, there were some errors in the implementation, which everyone then had to replicate because of that).
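The freeze mechanism is easy to sketch (my own illustration of the idea, not actual libvpx test code; frame contents are invented): decode a fixed set of test vectors, hash the output, and fail the build if the digest ever drifts from the pinned golden value.

```python
import hashlib

def output_digest(frames):
    """Hash the decoder's output frames; any bitstream-visible change flips it."""
    h = hashlib.md5()
    for frame in frames:
        h.update(frame)
    return h.hexdigest()

# Pretend decoded frames for one test vector (contents invented).
reference_frames = [b"frame-0-pixels", b"frame-1-pixels"]

# The golden digest is computed once when the bitstream is frozen,
# then checked into the test suite.
golden = output_digest(reference_frames)

# A conformant decoder must reproduce it exactly...
assert output_digest(reference_frames) == golden
# ...and any behavioral change, however small, breaks the test:
assert output_digest([b"frame-0-pixels", b"frame-1-PIXELS"]) != golden
```

AFAIK libvpx's conformance test vectors ship with per-frame MD5s (and vpxdec has an --md5 mode), so the real thing works on the same principle.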
The first link you posted about VP8 just says they'll create an experimental branch (back in 2010) where incompatible bitstream changes can go. Guess what eventually happened with those changes? VP9.
The second link is to the unfinished VP9 specification; with VP9 they still had the "reference decoder code is the specification" approach. As VP9 started to gain traction, the demand for a specification grew, so they started writing one post hoc, but as you can see, they never finished it. Still, this doesn't mean the bitstream isn't frozen.
Apple probably held off on adoption so that the HEVC patent licensing situation could improve a bit (at least until most companies with patents joined some pool or started offering licenses).
If AV1 had been released sooner, it wouldn't even have been as good as HEVC, which would be bad. VP9 is already that free-but-not-as-good-as-HEVC codec. They would also have needed to release AV2 within 1 or 2 years (as the better-than-HEVC codec), which is just too soon after AV1 and would raise a big question of why AV1 was released at all...