
Tricks for packing data are still relevant today, though obviously hardware has changed, so the tricks are different. One important concept on today's machines is that memory latency is huge relative to the cost of executing other instructions (e.g., arithmetic, logic). In some senses, such packing has become even more important as time has gone on. There was a point in history when memory operations and ALU operations were about the same speed, so it made sense to keep the results of even small computations in memory and retrieve them later (e.g., a lookup table, possibly pre-computed). Nowadays it is often faster to re-compute values from scratch when needed rather than going to memory, even for biggish computations.
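To make that concrete, here's a minimal sketch in C (function names are my own, purely illustrative): the lookup-table version is the old-school approach of spending memory to save ALU work, while the polynomial version keeps everything in registers and never touches memory at all.

    #include <math.h>

    #define TABLE_SIZE 4096
    #define TWO_PI 6.2831853f

    static float sine_table[TABLE_SIZE]; /* precomputed once at startup */

    void init_sine_table(void) {
        for (int i = 0; i < TABLE_SIZE; i++)
            sine_table[i] = sinf(i * (TWO_PI / TABLE_SIZE));
    }

    /* Old-school: trade memory for ALU work -- one table read per call.
       Assumes x >= 0; the mask wraps the index into the table. */
    float sin_lookup(float x) {
        int idx = (int)(x * (TABLE_SIZE / TWO_PI)) & (TABLE_SIZE - 1);
        return sine_table[idx];
    }

    /* Modern preference: recompute. A short Taylor polynomial (fine for
       |x| <= pi/2) stays in registers and can't cache-miss. */
    float sin_compute(float x) {
        float x2 = x * x;
        return x * (1.0f - x2 * (1.0f/6.0f - x2 * (1.0f/120.0f)));
    }

On a machine from the LUT era, sin_lookup was the clear win; today, once sine_table falls out of L1, the handful of multiplies in sin_compute is usually cheaper than the potential cache miss.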

In today's world, the GPU largely deals with textures, but the same rule holds — it can make sense to compute values rather than doing texture fetches. Anyway, I'd say your instinct is right — developers are still using lots of tricks to get games to where they are today. There is still a lot of competitive pressure to provide more, more, more and squeeze the most out of the hardware.
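Same idea in shader-land, sketched here as plain C for brevity (a real version would be GLSL/HLSL, and the names are invented): a falloff curve can come from a baked 1-D texture or from a couple of multiplies, and on modern GPUs the arithmetic is often the cheaper of the two.

    /* Option A: fetch a baked falloff curve -- costs a texture read. */
    float falloff_fetch(const float curve_tex[256], float u) {
        int texel = (int)(u * 255.0f); /* u assumed in [0, 1] */
        return curve_tex[texel];
    }

    /* Option B: evaluate the curve directly (smoothstep) -- a few ALU
       ops, no bandwidth, no cache pressure, no texture unit needed. */
    float falloff_compute(float u) {
        return u * u * (3.0f - 2.0f * u);
    }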

All that said, the 100GB download is often going to be audio/video. Then probably textures. There is certainly standard compression going on for a lot of that. But not the sort of manual "these texels go here, those texels go there" fiddling with data as described in this article. Though UV unwrapping is still an art and a science (:



