How do we know that we've hit a wall with lossless compression? As a non-expert, I've been really impressed with the likes of zstd and lz4, which are relatively new.
I should clarify: there's still room for improvement in the speed of lossless compression/decompression, and that's where most of the progress happens nowadays. But there's little room left in compression ratio, because modern algorithms already produce output that's statistically close to random, which means they're close to the entropy limit of the input and a better algorithm has almost no remaining redundancy to exploit.