Demo of megatextures running on n64 hardware (github.com/lambertjamesd)
182 points by zdw 9 months ago | 53 comments



These appear to be videos of this project in action:

"How I implemented MegaTextures on real Nintendo 64 hardware" - https://www.youtube.com/watch?v=Sf036fO-ZUk

"Megatextures Tech Demo - N64brew Summer Game Jam 2023 Submission" - https://www.youtube.com/watch?v=plh9OGel-lM


TL;DW on the video: It's a texture atlas + mipmapping + streaming only parts of the texture atlas into RAM depending on where the view frustum intersects + discarding the z-buffer for a little more memory. Does not require the expansion pak. Impressive because a lot of N64 devs at the time didn't do these now-common things, and yet the N64 remains capable of them at a playable framerate.
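
As a rough illustration of the streaming part — a minimal sketch in C, not the author's actual code; the tile size, cache layout, and the dma_read() cart-DMA helper (libdragon has a similar call) are all assumptions:

    /* Sketch: stream atlas tiles from cartridge ROM into a small
       RAM-resident cache, evicting least-recently-used tiles. */
    #include <stdint.h>

    #define TILE_DIM    32                        /* ~32x32 texels per tile */
    #define TILE_BYTES  (TILE_DIM * TILE_DIM * 2) /* 16-bit texels */
    #define CACHE_SLOTS 64

    typedef struct {
        int32_t  key;        /* packed (mip, x, y) + 1; 0 = empty slot */
        uint32_t last_used;
        uint8_t  texels[TILE_BYTES];
    } TileSlot;

    static TileSlot cache[CACHE_SLOTS];
    static uint32_t frame;   /* bumped once per rendered frame */

    extern void dma_read(void *ram, uint32_t cart_addr, uint32_t len); /* assumed */

    /* Make one tile resident in RAM; called for every tile the
       view frustum touches. */
    uint8_t *fetch_tile(int32_t key, uint32_t cart_addr)
    {
        int lru = 0;
        for (int i = 0; i < CACHE_SLOTS; i++) {
            if (cache[i].key == key) {            /* already streamed in */
                cache[i].last_used = frame;
                return cache[i].texels;
            }
            if (cache[i].last_used < cache[lru].last_used)
                lru = i;
        }
        dma_read(cache[lru].texels, cart_addr, TILE_BYTES);  /* miss: hit the cart */
        cache[lru].key = key;
        cache[lru].last_used = frame;
        return cache[lru].texels;
    }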


Plus the video was using the N64’s high-res mode that most games didn’t use.

Presumably using the more standard resolution would let you use smaller textures and get a higher frame rate.

I hope someone makes something really cool with this some day. It’s always amazing to see what people are capable of pushing old hardware to do, given the additional knowledge we’ve gained and without the constraints of commercial timelines.


There's one developer who has spent a few years optimizing Mario 64's performance while also fixing various bugs and interactions. He also makes videos showcasing other modern N64 games.

https://youtube.com/@KazeN64


     Impressive because a lot of N64 devs at the 
     time didn't do these now-common things
Absolutely, but it's also worth noting that these techniques might not have been viable in the context of a game (or at least, most games) anyway.

The author himself addresses this at the end of one of the videos and concludes that it might be a stretch unless a game was designed specifically around this rendering strategy.

He skips the N64's hardware Z-buffer (to conserve bandwidth) and DIY's it, which works for the demo because the demo room has exceedingly simple geometry. I don't know that this approach would work once you involve player models with 500-700 polys as seen here: https://www.copetti.org/writings/consoles/nintendo-64/

The transition between texture detail levels is also a bit jarring: it's an amazing demo, but the popping might be too distracting or frustrating for a game.

On the positive side, the demo runs in the N64's hires mode, which could be avoided to free up some (a lot of?) perf. Also the author admits that the highest detail level of the textures is not something that would necessarily be needed for a game.

One also wonders if a hybrid approach would work: "megatextures" and DIY z-buffering for the scenery, trad rendering for the characters. I don't know if that is possible.


Bypassing the Z-buffer was one thing late-era commercial games actually did to speed things up: http://gliden64.blogspot.com/2019/02/hle-implementation-of-b...


> unless a game was designed specifically around this rendering strategy

This used to happen all the time.

An N64 game with this level of graphics would have sold like crazy back in the day, even if the game wasn't that great.


The N64 lifecycle was only about six years. Homebrewers have had nearly thirty years plus better tools to come up with new techniques.


He also states in the video that the textures take up 40 MB of the 64 MB an N64 cart can realistically hold.


The author noted that this approach might be workable for a "real" game with smaller textures -- the biggest levels of detail for textures in the demo are 1024x1024. (And, I don't think the author mentioned this explicitly, but the demo is running in hires mode which could be dropped for a game)

His approach might be feasible with smaller textures and still represent a visual upgrade over the typical N64 "blurry mess".


> typical N64 "blurry mess"

At least N64 had correctly-projected textures.

PlayStation had sharp, pixelated textures that were incorrectly mapped onto triangles in 2D screen space (the affine interpolation didn't account for depth).

So N64 was blurry but correct. PlayStation more aptly deserves the "mess" descriptor.
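
For anyone curious what "didn't account for depth" means concretely, here's an illustrative C sketch (not actual console code) of the two interpolation schemes along a screen-space edge:

    /* Affine (PS1-style): interpolate u,v linearly in screen space.
       Perspective-correct (N64-style): interpolate u/w, v/w, 1/w and
       divide back per pixel. 't' is the screen-space parameter
       between two vertices. */
    typedef struct { float u, v, w; } Vert;

    void uv_affine(const Vert *a, const Vert *b, float t, float *u, float *v)
    {
        *u = a->u + t * (b->u - a->u);    /* cheap, but warps with depth */
        *v = a->v + t * (b->v - a->v);
    }

    void uv_perspective(const Vert *a, const Vert *b, float t, float *u, float *v)
    {
        float iw = (1.0f/a->w) + t * ((1.0f/b->w) - (1.0f/a->w));
        float uw = (a->u/a->w) + t * ((b->u/b->w) - (a->u/a->w));
        float vw = (a->v/a->w) + t * ((b->v/b->w) - (a->v/a->w));
        *u = uw / iw;                     /* divide restores true u,v */
        *v = vw / iw;
    }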


For the cartridge space limit, what about procedurally generated textures?


The N64's games had tiny textures and a very small draw distance. The GameCube came too late, and the PS1 and PS2 steamrolled Nintendo everywhere except handhelds, where Pokémon kept them on top. Later the Wii turned the tables, which was basically the Game Boy philosophy applied to consoles: beef up an off-the-shelf G3 PowerMac architecture and add revolutionary controllers (GameCube -> Wii).


Makes you wonder what modern hardware is truly capable of, if only we had the sacred knowledge.


Intro algorithms courses discuss how much of computing's advancement isn't just raw speed but how we solve problems. Many problems have large inputs, so if you have a slow algorithm it really doesn't matter whether you have a 10x or even 1000x faster processor.
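
To put rough, illustrative numbers on that: with n = 10^6 inputs, an O(n^2) algorithm needs on the order of 10^12 steps, while an O(n log n) one needs roughly 2x10^7 — a factor of about 50,000, which dwarfs even that 1000x faster processor.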

Which brings up an interesting question: if we didn't keep adding complexity, how flashy could we make old hardware? What could we make it do that it never could back in the day? This feels like one of those things.



Also look up 8088 MPH and its sequel Area 5150!


Just what I hoped someone would comment, amazing.


Yeah this is definitely one of those things. Another one I've seen was 'demaking' games like Doom to run on older hardware, like the Amiga. The demoscene and the like are really interesting to follow for those advances.


> 'demaking' games like Doom to run on older hardware, like the Amiga

Come on, brother. You can't just say something like that and not drop a link.


If the N64 could be stretched this far with some ingenuity, I wonder how far one could get on hardware that's old, but somewhat modern like the PS3 or the PC hardware contemporary to it.


I do think there are a few things that make the N64 kinda "special" for going back and squeezing more performance out of it. The 3D hardware is remarkably modern in some ways, but there are a couple very thorny resource limitations that you have to work within, and the system complexity is higher than you would probably like.

The main resource limitations you deal with are the main RAM bandwidth and the TMEM size. The complexity comes from the division of work between the CPU and a coprocessor called the RSP, which is basically a stripped-down MIPS CPU core with a SIMD unit and some scratch RAM. You can come up with cool ways to use the RSP, but if you eat up more of the main RAM bandwidth, you'll hurt performance.

The demo here is focused on working around the TMEM size limitation, but it looks like it also reduces the RAM bandwidth use by drawing in Z order rather than using the Z buffer... which is an overall solid approach for improving graphics performance on the N64.
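
A toy sketch of the "draw in Z order instead of Z-buffering" idea (illustrative names, not the demo's code): sort opaque draws by depth and paint back-to-front, so no depth value is read or written per pixel.

    #include <stdlib.h>

    typedef struct { float depth; void *mesh; } DrawCall;

    extern void submit_mesh(void *mesh);   /* hypothetical renderer call */

    static int by_depth_desc(const void *pa, const void *pb)
    {
        float a = ((const DrawCall *)pa)->depth;
        float b = ((const DrawCall *)pb)->depth;
        return (a < b) - (a > b);          /* farthest first */
    }

    void draw_scene(DrawCall *calls, int n)
    {
        qsort(calls, n, sizeof *calls, by_depth_desc);
        for (int i = 0; i < n; i++)
            submit_mesh(calls[i].mesh);    /* painter's algorithm: no Z reads/writes */
    }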

The PS3 has some striking similarities to the N64, in that both consoles have coprocessors with SIMD units that operate on scratch RAM. Both consoles have a reputation for being difficult to program for. The PS3 takes things a bit farther in that the Cell SPEs can only access main memory through DMA. I can only imagine how hard it was to effectively use the SPEs.
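
The canonical workaround on the SPEs was double-buffered DMA: process one chunk of local store while the next streams in. A sketch of the pattern with hypothetical dma_get()/dma_wait() helpers (the real MFC intrinsics differ), assuming total is a multiple of the chunk size:

    #include <stddef.h>

    #define CHUNK 4096
    static char buf[2][CHUNK];             /* two local-store buffers */

    extern void dma_get(void *local, unsigned long ea, size_t n, int tag); /* hypothetical */
    extern void dma_wait(int tag);                                         /* hypothetical */
    extern void process(char *data, size_t n);

    void stream_chunks(unsigned long ea, size_t total)
    {
        int cur = 0;
        dma_get(buf[cur], ea, CHUNK, cur); /* kick off the first transfer */
        for (size_t off = 0; off < total; off += CHUNK) {
            int next = cur ^ 1;
            if (off + CHUNK < total)       /* prefetch while we work */
                dma_get(buf[next], ea + off + CHUNK, CHUNK, next);
            dma_wait(cur);                 /* block until this chunk lands */
            process(buf[cur], CHUNK);
            cur = next;
        }
    }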


The framebuffer tricks with the N64 and PS2 gave the developers (and later the emulator creators) lots of headaches.


I heard that The Last Of Us on PS3 really pushed the hardware to its limits, and seeing that game's visuals, I believe it. I think Naughty Dog put an enormous amount of time and effort into building and optimizing a game engine that could make thorough use of PS3's cell architecture, being very smart about the timing and order of various tasks to ensure maximum parallel processing during a generation where everything else was still running on one or two cores.


Additionally, Naughty Dog has famously been pushing/punishing gaming systems for a long time. Even games like Crash Bandicoot (1996, PlayStation 1) and Jak and Daxter (2001, PlayStation 2) really went up against the hardware's limitations, and lots of hacks were incorporated to get the games to run well.

Uncharted and The Last of Us just continue what is tradition at this point :)


There's an interesting documentary on the development of Crash Bandicoot on YouTube, https://www.youtube.com/watch?v=izxXGuVL21o


It would have been impressive in 2005, but by 2013-2014 TLOU wasn't that special any more. Average computers were much stronger than consoles by then.


Sure, but we're talking about impressive technical achievements on dated hardware, which is a different conversation than whether it bested modern PC hardware on the year of release.


What's funny is, I'd heard before that the N64 could stream textures from ROM, but I can't think of any games that were known to do so. Someone mentioned it in this thread in 2021:

https://retrocomputing.stackexchange.com/questions/17566/why...

I guess the big deal is that this time it's optimized for absolutely as much resolution as possible, even at the cost of poly count


I think Turok did it. They had to store the textures uncompressed, which came at the cost of cartridge space.


This is the same guy who is making Portal 64


Yep. Very impressive.

At the end of the demo video for this he mentions he probably won’t use this for Portal. But he may adapt a version/use a somewhat similar technique just for the Ratman graffiti.


I'm so relieved and happy to see people who are able to pull off such feats of engineering and openly share how they did it. It really warms my heart.


No description on how this is achieved? I figure it must be creative, since the N64 has a scant 4 KB of texture cache. The textures must be streamed directly from ROM somehow, but in the real world this didn't work, so N64 games were mostly Gouraud shaded with textures used sparsely.


Roughly:

1) 40 MB of textures,
2) split into mipmaps,
3) split into ~32x32 tiles,
4) streamed in according to the camera (one way to pick the level is sketched below),
5) with the lowest-resolution layers always loaded to handle whipping the camera around quickly,
6) and the Z-buffer off to save memory bandwidth.
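
For item 4, one plausible way to pick a mip level from camera distance (illustrative constants, not necessarily what the demo does):

    #include <math.h>

    /* Each mip level halves resolution, so the level grows with
       log2 of distance. Clamp to the always-resident lowest-res level. */
    int pick_mip(float dist, float base_dist, int max_level)
    {
        int level = (int)floorf(log2f(dist / base_dist));
        if (level < 0) level = 0;
        if (level > max_level) level = max_level;
        return level;
    }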

The video is worth watching too.


watch the "how" video linked in the other comment


> Gouraud shaded

I thought it was Phong shaded?


by running SUPER SLOW code in an emulator. There is a reason every single N64 game was either sparsely textured or looked like a blurry mess: 4 KB of texture cache.


Check the link, the video, or the title. This is all running on real hardware.


The smooth-running video https://www.youtube.com/watch?v=plh9OGel-lM does 60 fps on an emulator. The real-hardware video shows 15-20 fps in a small room, and the author himself confirms it's unfeasible for anything more complicated (~8 minute mark).


15-20fps is a pretty normal framerate for many games on the real Nintendo 64 hardware, especially the later ones from Rare.


The PAL versions of Zelda ran at 15-20 FPS even in emulators on a PC. Yet they were pretty playable.


It probably didn't look as bad on a CRT, since LCDs and OLEDs have terrible motion resolution. See https://www.youtube.com/watch?v=z4xgLUdQhKA


I emulated it back in the day on a CRT monitor, true.


It's not a texture cache, it's 4 KB of texture memory that needs to be manually babysat, both in an emulator and on hardware.


It might be the same situation as with the Atari Jaguar, where a silicon bug turned the texture cache into a fixed buffer you have to load manually.


AFAIK, the memory inside the Tom chip was never a cache. The bug you're probably thinking of is one that makes it difficult to run code on Tom's RISC core directly from main RAM. IIRC, jump instructions are pretty bugged when not running from internal RAM, but the homebrew community came up with a workaround. It's still not acting as a cache in that case though. It just fetches instructions from RAM as it goes.


It's not.


I’m always in awe of folks who have not only the understanding but the dedication to pull projects like this off. I consider myself a good engineer and have written some graphics stuff in the past but nothing like this. This is awesome.


Wouldn't this technique work better on the PS1 which has a lot more storage capacity? Or is the slow CD drive an issue?


The most you can accomplish on a PS1 with CD streaming is, roughly, a Crash Bandicoot level[0]. Texture data is relatively bandwidth-heavy, and a single-speed drive can only manage 150 KB/s. So the optimal use case for "showing as much graphical data as possible per second of gameplay" biases towards using load-once textures, then streaming in more geometry with a camera designed to only go forwards: then everything matches the linear access patterns of the CD.

[0] https://all-things-andy-gavin.com/2011/02/04/making-crash-ba...
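
For a rough sense of scale: at 150 KB/s, reading the ~40 MB of textures in this demo even once would take about 40960 / 150 ≈ 273 seconds of pure linear reading, before any seek penalties — which is why CD streaming biases so hard towards linear, load-once layouts.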


Just a small correction, PS1 had a 2x CD drive (but still supported 1x speeds of course)[1], but the point still stands.

[1]: https://www.psdevwiki.com/ps1/index.php/CD_drive


No, you would need to seek and load from the CD while moving around in the level. The N64 can do that almost instantaneously from the cartridge.



