Hacker News
Architecture of the Nintendo DS (copetti.org)
351 points by Polylactic_acid on Aug 18, 2020 | 103 comments



Fun fact! The Nintendo DS audio chip has a distinctive buzz that's audible in all audio rips uploaded to YouTube, and you can even hear it on this page! It comes from... a bug in the popular DeSmuME emulator. Yes, it turns out that the people who designed the "2sf" format used for unofficial music rips simply used stripped-down ROMs and code extracted from a version of DeSmuME. Community tooling later allowed you to batch-convert them to MP3, and the rest is history. Line-out rips recorded from the audio jack of the device don't have this distinctive buzz. Extra fun when some official soundtrack releases contain the buzz :)

There are some other fun facts about the DS (its 3D hardware is scanline-based, not framebuffer-based! Which means you can't do post-processing on it without some hacky tricks... can you tell it was from the team that made the PPU, rather than a modern GPU or even an SGI-derived one?), but that one is my favorite.

Edit: minor correction, the N64 supported AA, which is another reason jagged polygons are less common.


Interesting! Do you have any examples or further reading links about the buzz? I skimmed through the audio samples on the page but I couldn't quite hear it.


It's probably most noticeable in Mario Kart DS's bassline. I'll see about getting a comparison with what it should sound like later today.


You say in the article that the 3D engine can only display to one screen at a time, but some games definitely play 3D content on both screens (like Legend of Zelda Phantom Hourglass). How do you think these games are accomplishing this?


I didn't write the article, and I'm not much of an expert on DS graphics (it's too surreal for me). But you can copy 3D content to 2D images and then display it using the 2D engine, I believe, and then use 3D on the other screen through the direct method. Pretty much everything on the DS is a bizarre special case, and Nintendo's in-house staff is very good at exploiting every last trick in the book.


The article mentions that the Main 2D Engine had a mode that could write to a framebuffer that was the combined area of both screens, and could pull from the 3D engine into that full framebuffer layer.

Also, you'll notice in the New Super Mario Bros example that many of the sprites were pre-rendered 3D and the 2D engine was just pulling bitmaps at that point.

So a bunch of options with all sorts of interesting trade-offs and it would be interesting to see breakdowns of specific games and how they used the trade-offs.


From what I remember playing with the homebrew devkit, there is a hardware flag saying which screen the 3D engine's output goes to. So you flip that flag every frame and get 30 fps per screen.
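
If memory serves, libnds exposes that flag via lcdSwap() (a wrapper around POWCNT1's swap bit, I believe), so the naive flip-every-frame version looks roughly like this (untested sketch; real dual-screen 3D games also use the display-capture unit so the "off" screen keeps showing the previously captured frame instead of going stale):

    // Hedged sketch, assuming current libnds: lcdSwap() toggles the POWCNT1
    // "display swap" bit, so the main engine (the only one with a 3D layer)
    // drives the other physical LCD on the next frame.
    #include <nds.h>

    int main(void) {
        videoSetMode(MODE_0_3D);   // main 2D engine with the 3D layer enabled
        glInit();

        while (1) {
            // ... set up the camera and submit geometry for whichever
            //     screen is "current" this frame ...
            glFlush(0);            // hand the display list to the rendering engine
            swiWaitForVBlank();
            lcdSwap();             // next frame's 3D output goes to the other LCD
        }
    }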


> Edit: minor correction, the N64 supported AA

So did the DS: http://melonds.kuribo64.net/comments.php?id=32


> To begin with, antialiasing is applied to polygon edges based on their slopes, in order to make the edges look smoother. There is no antialiasing applied inside polygons or at intersections between two polygons

Oh my. You really can tell this GPU was done by a 2D PPU team.


And look at the way it tries to figure out a "foreground" and "background" color by doing some sort of bizarre approximation of depth peeling. What a glorious, and terrifying, hack.


The GBA and NDS are surprisingly pleasant to develop for, especially compared to smartphones. My first C programs were written for these.

I still can’t believe what that thing was capable of with a homebrew cartridge. I had dev tools, emulators for old desktop computers, web browsers and all kinds of stuff on a machine with just 4 MB of RAM (and this was before smartphones, so being able to have that in your pocket was still niche).

I thought smartphones would be like homebrew except projects would be abandoned less often because people could make money with them. I don’t think I’ve ever been so disappointed in my life.


GBA is a great platform for CS pedagogy too. There are a number of university-level computer architecture classes that use it. And I think Nintendo could be amenable to licensing their IP for more curricula ;)
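
Part of why it works so well for teaching is that the whole machine is just memory-mapped hardware with no OS in the way. A hedged "hello pixel" sketch (addresses per the publicly documented GBA memory map; builds with devkitARM or any bare-metal ARM toolchain):

    #include <stdint.h>

    // Display control register and VRAM, per the public GBA memory map.
    #define REG_DISPCNT (*(volatile uint16_t *)0x04000000)
    #define VRAM        ((volatile uint16_t *)0x06000000)  // 240x160, 15-bit BGR

    int main(void) {
        REG_DISPCNT = 0x0403;            // video mode 3 (bitmap) + enable BG2
        VRAM[80 * 240 + 120] = 0x001F;   // plot a red pixel near the middle
        for (;;) ;                       // nothing to return to, so spin
    }

No drivers or window system between you and the hardware, a single fixed resolution, and fully documented timing - it's hard to find a friendlier target for teaching interrupts, DMA and fixed-point math.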

NDS development has always been something I wanted to explore. This article provides a nice excuse to get started.

Just to put things in perspective: the DS could run a full 3D platformer like Super Mario 64 DS back in 2005 using only a 32-bit ARM9 at 67 MHz (with an ARM7 at 33 MHz as a coprocessor). The new PS5's specs: 8-core Zen 2 at 3.5 GHz, 10 TFLOPS GPU, 16 GB GDDR6.


The original DS had some impressive full 3D capabilities, between Mario 64, Mario Kart DS, and some beautifully rendered scenes like the Sky Arrow Bridge in Pokémon Black: https://www.youtube.com/watch?v=IE16BT9RFEc


I remember having my socks knocked off the first time I walked down Sky Arrow Bridge! I always wondered if it was truly 3D or if they were using some Doom-style hack, but watching the video again I think it must have been true 3D. Also, it's amazing how much higher-fidelity it looked in my memory.


3D DS games look terrible on larger screens, because there's not much anti-aliasing going on in the original hardware.

Luckily if you play on something like DraStic, you can add AA yourself which looks much better upscaled.


The PSX had a similar clock speed but a less powerful MIPS CPU and less RAM.


I would love to run a university course like this, but I am too afraid of getting the Nintendo banhammer halfway through the year.


What banhammer? Homebrew is completely legal in the US and they can’t just come and take your hardware or devtools away (provided you use the open-source ones).


One does not simply avoid Nintendo’s lawyers by following the law :P


Don't the flashcarts count as breaking DRM, and thus run afoul of the DMCA?


Usually the students would run their programs on an emulator instead of the original hardware (this also means they don't need to be provided with a GBA each).


Well, there is not just Android & iOS - there are other smartphone OSes (such as Sailfish OS, which I use on my daily device) & with the PinePhone we can hopefully see a mobile Linux distro explosion, like what happened on PCs in the past.

Basically, I think modern mobile hardware is fine, it's just the software from Google & Apple that makes it often unusable & out of user control.

Thankfully there are people working on that & I'm sure they are looking for contributors. :)


It's nice to have a fixed resolution to develop for instead of any and all resolutions.


I totally understand why they decided they didn't actually want to die on this hill, but it has always seemed to me that the introduction of multiple screen widths was the moment when the average UI quality in iPhone apps took a turn for the worse.


I have this 4-year-old Android smartphone which is slow for no good reason. I tried everything: removing/disabling apps, resetting, etc.

The problem is that apps are just too big. Tinder is unusable.

I thought I could root it and install a custom Android, but I realize now that it's not as easy as installing Debian on a desktop.

If you combine capitalism with Wirth's law, you end up with expensive hardware that you cannot use. It's the same problem with JavaScript and web page size.

Everything is bloated.


Not everything; at least early on, iOS apps were great. But over time, on the one hand, developers started adding more and more features, or switched to web tech or some hybrid technology, convinced that it would save them time and money (it didn't, and the user experience is the worse for it). On the other hand, I believe Apple itself added more and more cruft to the system.

One thing my colleague (who was a bit more hardcore than me) was convinced of is that hand-coding interface code was faster than using Interface Builder. I want to believe it; I can imagine that the IB interface was just an XML file that had to be read, parsed and have its layout built up, whereas hand-coded layouts are very dumb code that compiles easily and executes quickly. But at the same time, I would have expected Interface Builder files to be converted to Obj-C code and compiled as normal.

Anyway, Apple had years of a head start over Android in terms of performance and user experience (= perceived speed), and their technology decisions were one of the reasons.

And yet, thinking about it, it wasn't even as fast as it could be; because of how Objective-C works, method calls are somewhat dynamic, so on every call the runtime has to look up which implementation to invoke. Later on (with Swift) they managed to add an optimization that could omit this lookup if they detected the target could not change.

I'm rambling a bit and speaking from memory here btw, take this comment with a grain of salt. I'm no computer scientist, just a developer.


Objective-C message sends are ridiculously fast these days, like less than a handful of nanoseconds on average (1-2 on the fast path). There are ways of making it go even faster, but Apple has chosen to sully the language by taking away some of its dynamism instead :( It’s kind of sad to think about how much the original iPhone could do and how much code goes into the average app these days in comparison. You’d guess it was assets or something but no, if you look at major apps these days they have all sorts of framework dependencies with multi-megabyte binaries. It’s so sad to see…

(And Interface Builder files get compiled to a format that is then read out at runtime and deserializes the right things.)


There are a handful of natively written apps on iOS that have maintained the discipline to avoid bloat. One that I always admire is Overcast for podcasts. It's 5.9 MB and takes < 5 seconds to download and install.

It always makes me slightly resentful or sad about apps that are collections of web views but somehow 100+ MB.


> I have this 4y old Android smartphone which is slow for no good reason. I tried everything, removing/disabling apps, reset etc.

Does it use eMMC storage, or is it new enough to be NVMe? Any pre-2016-ish smartphone is built with basically the guts of an SD card (or MultiMediaCard rather) for internal storage and is a ticking time-bomb with a limited number of write cycles. They'll get slower and slower and then eventually stop working completely. My NoteⅡ and then later my NoteⅣ both succumbed to this. I never owned one but I remember the Nexus 7 being especially notorious for dying this way. You can replace those chips with the right equipment but I've never tried.


Why would eMMC NAND flash memory be less reliable than NVMe NAND flash?


The bus itself doesn't affect reliability, obviously, but the storage chips they use on modern phones are much nicer, more expensive parts :)


> The problem is that apps are just too big.

At least in part, I expect this is because of the AppCompat library that Google recommends using. If you pull in the whole thing, as tutorials show, it adds a huge number of images (icons and such) that don't get trimmed during compilation (which they're supposed to be, if unused) - these make up something like 95% of the size of an app I'd recently been working on. I'd only noticed it because this was a rework of one I'd made ~10 years ago, before AppCompat existed.

I can't help but wonder if some of the "optimizations" done by Google when you turn over your keys and let them compile the APK are just removing this extra bloat.


A new out-of-box “Empty Activity” app project in Android Studio compiles to a 1.59 MB APK (if implemented in Java; it goes up to a 2.26 MB APK with Kotlin). Rip out the dependencies on AndroidX, and the APK size plummets to 103 KB. (The APK for the de-AppCompat'd Kotlin version is, correspondingly, 709 KB.) 1.5 MB of overhead – fifteenfold – for a more convenient layout[0] and the Material theme. Mind-numbingly wasteful. In a big enough app, that overhead will be overshadowed by the size of the meaningful bits, but how many apps really get that big? How many games include AppCompat and then draw their own UI anyway?

[0]: https://developer.android.com/training/constraint-layout


Is the battery easily replaced? I feel like at least some part of the slowdown I've encountered is deterioration of the battery: since the battery can't deliver the same peak power as before, the device feels slower.


Whether installing an AOSP-derived Android distribution is easy or not mostly depends on whether the device is locked in some way or other. If it is, you'll need to unlock it, which - again depending on the device - can be as easy as flipping a switch or as hard as trying to get a code from a vendor which doesn't really want to give out that code, or which might not even exist any more. Installing a distribution is no harder than installing Debian, and keeping it up to date is easy as it supports OTA updates. Once you're over the first hurdle - unlocking - it should be easy enough.

As an alternative, you could root the device and disable (or remove) the bloat in the stock distribution; this often goes a long way towards getting a more usable device. This is what I did on those few devices which I've used on the stock distribution - remove as much bloat as possible, remove all social media cruft, etc. This can lead to surprising results, like the battery lasting twice as long, giving new life to a tired old device.


How exactly do you install a custom android?


I use LineageOS, which has step-by-step instructions on how to prepare your phone/install the OS. A list of supported devices is at https://wiki.lineageos.org/devices/, with various guides on each device-specific page.


Try buying an iPhone.


I didn't realize there was much of a homebrew DS scene. What is the state of it?


Weirdly, there is a fresh homebrew scene for the DSi which lets you install custom firmware and run games off the built-in SD card slot.


You can play Rogue on the GBA, and NetHack/ScummVM ports exist for the NDS.


You say that like it's an impressive feat for the GBA, but Rogue was originally developed for a computer that had less than 1/3 of the clock speed of the GBA's CPU. :)


But the GBA is pretty tiny on storage and limited on input ;).

Yes, I know you could play Rogue on an 8 MHz m68k Mac perfectly, but bear in mind GBA cartridges and RAM are really tiny.

But hey. On input, I played NetHack on a PSP underclocked to 25-50 MHz; the battery lasted a week and more.

Now I am trying to test which older machines could run NetHack 3.6.6. I guess a 2 MB m68k Mac would be enough.

No, using telnet/ssh is cheating.


> web browsers and all kinds of stuff on a machine with just 4mb of ram

PDAs had less RAM and they still had browsers and, in some cases, even emulators.


I have always wondered if it's possible to run Gameboy Advance games in DS Mode (and thus access the DS mode hardware) and patch the GBA Link Cable serial comms to instead use the Nintendo DS's Wi-Fi connection.

Conventional wisdom says it's impossible to run in AGB Compatibility Mode and retain DS functionality, but I still wonder if it could be done.

Also, given the ancient cryptography, I wonder if it's possible to crack DS Download Play and introduce a new "Wireless Multiboot" exploit that uses real DS hardware, similar to the early exploratory attempts using "NDS WifiMe" before flashcards became commonplace. The hardware only has ~256 kilobytes of RAM (and doesn't even use a standard Wi-Fi stack if I recall correctly), but it would be SUPER COOL from the perspective of getting hardware to do something never thought possible.

One use-case I imagine for this hack is playing Zelda Four Swords Adventures on a Wii / Wii U (via "Nintendont"), with the game patched to use Wi-Fi for the communication to the Game Boy Advance controllers rather than a Link Cable. The players wouldn't need to have a flashcard each, but could boot up the payload using DS Download Play on any DS variant (including DSi and 3DS).


> I have always wondered if it's possible to run Gameboy Advance games in DS Mode (and thus access the DS mode hardware)

It is! GBARunner2 is a DS homebrew hypervisor that can run (a lot of) GBA games in DS mode [1]. It can e.g. be used to run GBA games on the Nintendo DSi, which lacked the GBA slot and AGB mode, or run GBA games directly from a DS flashcard.

> and patch the GBA Link Cable serial comms to instead use the Nintendo DS's Wi-Fi connection.

It isn't - practically, at least. There have been attempts, and it works as a proof of concept with special builds of GBARunner2 [2]. Sadly, the game logic in almost all GBA games does not appear to accept the huge added latency of wireless connections, compared to the almost nonexistent latency of a serial cable (see the sketch below the links). This is also a problem for GBA emulators like mGBA, which only support game linking on the same computer and not e.g. over the network [3]. Games made only for the Link Cable (and not the niche Wireless Adapter) just can't deal with network latency.

[1]: https://wiki.gbatemp.net/wiki/GBARunner2 and https://github.com/Gericom/GBARunner2

[2]: https://wiki.gbatemp.net/wiki/GBARunner2/Link

[3]: https://github.com/mgba-emu/mgba/blob/0.8/README.md
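
To make the latency point concrete, here's a purely illustrative sketch (not taken from GBARunner2 or any actual game) of the kind of tight polling loop GBA link code tends to use; the SIOCNT address and busy bit are from the public GBATEK docs, the rest is hypothetical:

    #include <stdint.h>

    // Multiplayer serial control register, per GBATEK.
    #define REG_SIOCNT (*(volatile uint16_t *)0x04000128)
    #define SIO_BUSY   0x0080   // start/busy bit, set while a transfer is in flight

    // Hypothetical helper: spin until the transfer completes or a small budget
    // runs out. Budgets like this assume a cable that answers within
    // microseconds; a Wi-Fi round trip of tens of milliseconds blows straight
    // past them, so the game concludes the link is dead.
    static int wait_for_link(uint32_t budget) {
        while (REG_SIOCNT & SIO_BUSY) {
            if (budget-- == 0)
                return -1;      // partner "too slow" -> give up on the link
        }
        return 0;
    }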


Very cool, thanks for the links!

Sounds like the easiest way to solve the problem you describe and play multiplayer Zelda Four Swords Adventures over the internet (without modifying the code itself) is for the "server" (GameCube game) and the client (the Game Boy Advance console) to be physically close by, and then stream the (emulated) GBA and (emulated) GameCube video output framebuffers across the internet using something like Steam Link, so that the ~100 ms of latency would not cause the Link Cable to time out.


Yeah this is going to be very hard to accomplish unfortunately. I'm guessing the only possible option to do this over the internet at the moment is using Dolphin and VBA-M on the same machine and some complex streaming setup. Dolphin supports linking to the VBA-M GBA emulator, and while this link setup is prone to bugs there are reports it works with Four Swords Adventures [1].

[1]: https://wiki.dolphin-emu.org/index.php/The_Legend_of_Zelda:_...


It's 100% possible to play with all the emulators running on the same computer. I worked my way through most of the game with multiple players. There was a little bit of input lag but not too bad (and the game is pretty forgiving). If you have a couple friends who are into LoZ games I definitely recommend it.

We also just embraced screen-peeking as everything was up on the same screen for everyone to see.

https://i.imgur.com/pnhyEjS.png

It was definitely a bit of a pain to set up each time. Something like a batch script that automates setting the emulators to a known-good config, configuring the controls, opening up the emulators, and starting the game would be possible and would make it a lot easier.


Yeah, I've played through about 20% of Four Swords Adventures with a PC running Dolphin and 3 emulated VBA-M instances, a TV, 3 monitors, and 3 game controllers. Once up and running it's a great experience, but it requires a lot of planning and setup (just like running on real hardware).

It's a shame so few people get to play that kind of game. It's a very social experience and exactly what the Wii U was intended for.


Ooh, very cool! I remember using a very experimental emulator at some point a couple years back and it was sadly too broken to be usable (likely resource constraints?). Will have to dig up the old DS Lite and a flashcart to try this one out.


Just be aware that it sometimes requires some per game tinkering with the four different GBARunner "flavors" to get things to work, and some games have annoying problems regardless of the flavor you pick. Overall it works well enough that it's a very convenient thing to have on your DS though :)


GameCube BBA emulation via the Wii's stock networking was recently added to Nintendont:

https://github.com/FIX94/Nintendont/issues/144


It works for the Wii U too. Note the "(v)Wii": vWii is the Wii inside the Wii U.


> Considering the new developments in the ARM world, why did Nintendo ultimately choose an awfully slow ARM9 combined by an even slower ARM7, instead of a faster ARM9 (or even a StrongARM)?

Nintendo's core philosophy for hardware has always been to use old hardware and develop one really unique feature with it, especially for its handhelds. The N64 is an exception to this. They also tend to hold on to something from the previous generation, leading to some pretty strange architectures. They are fun to read about though, and there's something about this formula that seems to hold up over time better than the other consoles.

The Switch, using an Nvidia SoC, is a real departure from this, since it's using very standard hardware and its main new feature is a kind of convergence of the Wii U's components. It's a bit sad in a way, but at least the Switch gets a lot more ports than previous generations.


It's become known as a "Blue Ocean" strategy - effectively, find a part of the market where you are not competing directly with other players on their terms (i.e. CPU and graphics power).

https://www.blueoceanstrategy.com/teaching-materials/nintend...


You also can't compete on speed anymore. Both Sony and Microsoft are buying the best parts available from AMD for the price, so the only way to win this battle is to buy more expensive parts or to design your own that beat what AMD can make.


It’s a remnant of Gunpei Yokoi’s philosophy: lateral thinking with withered technology. https://en.m.wikipedia.org/wiki/Gunpei_Yokoi

I find it refreshing that Nintendo has such strong opinions about these types of things, even when it’s to their detriment.

This philosophy is a part of why their products always feel like “toys” and inspire all the feelings that come along with that.


I would argue that the Gamecube was pretty state of the art when it came out and in some ways ahead of its competition at the time.


Nintendo has been state of the art with the Nintendo 64 and GameCube and probably the Game Boy Advance in the same era. It was an era where - with clever engineering - you could be on the cutting edge and still sell your consoles for cheap. With Nintendo, it's always about cost control.

They learned that being on the cutting edge does not help them sell more hardware or games, as the N64 and especially the GameCube sold below expectations. They have not been on the cutting edge since then - nor were they really before. The Switch is pretty great because although it was not really on the cutting edge, it had relatively up-to-date hardware at launch "for Nintendo" - especially compared to its 3DS and Wii U predecessors.


I remember there being distinct periods where:

- Genesis was out but we were still mostly playing NES games.

- Playstation was out but we were still mostly playing SNES games.

- PSP was out but everyone stuck to GBA.

- I was on hiatus from video games when the Wii happened, but I was aware that most of my friends had one, even if you wouldn't know it, because the PS and Xbox were more popular with (the minority of) people who want to talk about video games all the time.

My guess is that Nintendo's strategy has a few key parts: more casual gamers don't actually care about having the best graphics (corollary: they won't spend an extra $100 on better graphics); you need something that feels new & novel to attract the attention of those gamers and motivate them to buy; and technical constraints foster the kind of creativity that does that - perhaps even more so than clever hardware gimmicks.


The Famicom and SFC were both the most powerful consoles on the market in most regards when they came out, too. I'd say that rather than the N64 being an exception, this strategy only really solidified after the back-to-back quasi-failures of the N64 and the GC.


Hm, but couldn't you say the hardware in the SNES fits the description above? The 65816 was essentially last-generation 8-bit hardware with some modifications for 16-bit operation and 24-bit address support. They went this direction while almost everyone else was going 68000. Remember, this is the era of the Amiga and the X68000, and many arcade games were going 68000-based.

They then bolted that to some pretty good video hardware but again it wasn't the kind of bitmap addressable blitter supported hardware that was out there on more advanced systems. It was still sprite/tile based stuff, with specific hardware support for common character-movement/scrolling/sprite/tile management, basically a more advanced version of what early 80s gaming systems had done.


The SNES was built to be a dedicated 2D gaming powerhouse, and in 1990 nothing came close to it. Yes, it was basically a more advanced version of what early 80s gaming systems had done - but it was the most advanced version of what they'd done. Its direct competition was the PC Engine and the Mega Drive, and it smoked both of them.

Comparing it to arcade machines and computers isn't really fair - totally different price points and market segments. But outside of high-end workstations nothing could spit out the kind of visuals the SNES could in 1990. The Amiga 500 was, what, like $500 in 1990? At least double the price? And that could display a mere 32 12-bit colours on-screen, two background layers, no alpha, no scaling or rotation. PCs were starting to come with VGA cards standard, but you'd need to spend at least 10x as much on one to get a CPU that could do much with it. And that's not even getting into the sound hardware!

The 65816 was an unorthodox choice to be sure, but you're doing it a bit of a disservice: it's still a fully 16-bit CPU internally, and even at ~1/2 the MHz the performance delta between it and the 68k is generally overstated (because the 68k is such a cycle hog), especially for the kind of calculations 2D games tend to do.

Anyway - weaker CPU + powerful video hardware (that gets outmoded by home computers within the next two to three years) has been, with few exceptions, the model for high-end game consoles since, well, at least the SNES. It's true of the PS4/XB1, it was true of the Xbox 360 (and the PS3, sort of, the whole Cell thing is complicated), it was true of the GC... The CPU is rarely the bottleneck for games, so as a rule console designers don't get too spendy there. Exceptions might be the N64 and the Xbox, although even they had CPUs that could best be described as mid-range for their release dates.


The SNES took a several-years-old, cheap CPU used in early personal computers, and added some good 2D graphics capabilities. I wouldn't call it a powerhouse, but it was powerful for specific use cases/types of games.

Comparing it to the PCE and MD is interesting; the differences in capability seem roughly in line with the different release dates. Being first to market doesn't work in this industry, and many consoles were skipped by most consumers. These days MS and Sony are synchronized in terms of release dates, and Nintendo does its own thing, releasing whenever they feel like it.

The GC is kind of interesting: it's kind of halfway between the high-performance target of the N64 and the philosophy of older hardware + something unique.

For what it's worth SNES is my all-time favorite.


> the 65816 was essentially last-generation 8-bit hardware

I really wanted to say you're wrong ...

but if you look at a 65816 pinout, it does only have 8 data lines and 16 address lines.

Furthermore, each instruction opcode is only 8 bits wide.

And if you do look at this SNES schematic https://wiki.superfamicom.org/uploads/snes_schematic_color.p... I'm seeing 8 data lines and 24 address lines.

Never thought of the 65816 as a supercharged 8-bit CPU but looking at it, that's really not far from the mark.

> bitmap addressable blitter supported hardware that was out there on more advanced systems.

In 1991 when the SNES hit the States you had the Genesis, which was also sprite-based, and I think the TurboGrafx-16, which was also sprite-based. All of the arcade games of the time had those insane CPS-2-like sprite engines. Maybe the CD-i was out by then, but I don't think that thing had any graphics acceleration.


I've done plenty of '816 programming and it is basically just a 6502 with a couple of 16-bit registers and ALU (accessible only by switching register modes, quite awkward) and a 24-bit address bus bolted on with a bank register. The program counter, etc. wraps at 64k boundaries, the stack can only exist in the bottom 64k, etc. It's really a supercharged 8-bit CPU, yes. They probably made the right choice with it, though; it makes some sense for a game CPU because interrupt responsiveness and cycle efficiency are primary there, and the 6502 series is king for that.


And it was the worst-selling of the Big 3 consoles that generation.


I’m going to make a crazy assertion: since the Wii was just a souped-up GC, the GC platform was the best-selling platform over two generations of consoles.


If you're reading this you may be interested in Wiimmfi [1], which is a re-implementation of the Nintendo Wi-Fi Connection (WFC) online gaming service that was discontinued on May 20th, 2014. It was the online service used to play games like Mario Kart DS online.

Wiimmfi is similar to Xbox Live Project Insignia [2], and to a much lesser extent XLink Kai tunneling for local System Link over the internet. What a wonderful age we live in!

[1] https://wiimmfi.de/

[2] https://www.youtube.com/watch?v=5DVKwTtJtS8


Connecting to Wiimmfi on the DS is surprisingly simple. Thanks to a mistake in the SSL certificate verification code on the DS [1], all that is needed is specifying a custom DNS server (164.132.44.106 [2]). The hardest part is finding an open or WEP hotspot, as most DS games aren't compatible with newer Wi-Fi security types [3].

[1]: https://github.com/KaeruTeam/nds-constraint

[2]: https://www.youtube.com/watch?v=Sh0AZR-tKwM

[3]: https://wii.guide/wiimmfi.html


Wiimmfi is closed-source software, however. If you actually want to look at how the internals worked, the independent altwfc project [1] may be much more interesting.

[1] https://github.com/barronwaffles/dwc_network_server_emulator...


This was really cool to read, but I’m disappointed that the section on the touchscreen didn’t answer something I’ve long wondered about the Nintendo DS: what was the extent of the touch screen’s multi-touch capability?

My first encounter with the DS’s multi-touch capability was in a homebrew drawing program, Colors! (https://www.gamebrew.org/wiki/Colors%21): I noticed that if I touched the screen with my finger at the same time as the stylus, the brush drew at the midpoint between my finger and the stylus. My second and last encounter with multi-touch was in the published game Hotel Dusk: Room 215. It had a puzzle where two switches were displayed on the touch screen, and the solution was to put two fingers on the touch screen and pull the switches at the same time. I tried dragging with just the stylus at the midpoint between the switches, and that didn’t activate it: I truly had to put both fingers on the screen for it to work.

My question: why wasn’t this capability of the Nintendo DS more widely used or advertised? Was there some technical limitation on the multi-touch feature that made it only work under certain conditions? Did it require some private API (that was somehow figured out by a third-party game developer and approved by Nintendo nonetheless)? Or was the only reason that most players didn’t have two styli, and developers didn’t want to encourage players to put their dirty fingers on the touch screen (which was, admittedly, prone to scratching)?


That's actually a super interesting question and something that should not be impossible to work out. The touchscreen is connected to the mobo with a 4-pin connector. I don't see an IC on it, so I assume it must be analogue signals coming out. I might see if I can work out a way to read data off it and see if there is an observable difference for multitouch.
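
On the software side you can already watch the panel's behaviour from the ARM9 without probing the connector - a hedged sketch, assuming the current libnds touchRead() API with its z1/z2 pressure fields (on a resistive panel, two contacts should show up as X/Y collapsing toward a weighted midpoint while the pressure channels shift):

    #include <nds.h>
    #include <stdio.h>

    int main(void) {
        consoleDemoInit();                 // text console on the sub screen

        while (1) {
            swiWaitForVBlank();
            scanKeys();
            if (keysHeld() & KEY_TOUCH) {  // pen (or finger) is down
                touchPosition t;
                touchRead(&t);             // sampled via the touch controller on the ARM7 side
                iprintf("\x1b[2J x=%3d y=%3d z1=%4d z2=%4d\n",
                        t.px, t.py, t.z1, t.z2);
            }
        }
    }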


Wow this blog has an entire article series in this format with more than a dozen write-ups. Incredible that I haven't found this site sooner, what a gem!


Note that this article is one in a series: https://www.copetti.org/projects/consoles/


One of my first hardware projects was homebrew related for the DS: http://kraln.com/passme/


I got one of those! Kudos for making them - it got me started on a lot of things (including reflashing a CPLD with a parallel cable, come to think of it, although I can't remember why). I ended up flashing my DS's firmware later on, but I think I still have my PassMe floating around - it's a fun little memento :)


The amount of detail going into all the write-ups on this page is insane, great work!


Hopefully the author receives ample support through his Patreon link!


username checks out...


(For the record, I'm not the author of the blog post)


It's interesting to have game formats for which emulation does not capture the magic of the hardware. When the last DS dies how will people play these games properly?


Clone hardware is becoming more popular. There have been Super Nintendo and Sega Genesis / Mega Drive clone consoles for decades, but recently there have been products like the Hyperkin Ultra Retron (which uses emulation) and innumerable Game Boy clones (usually using emulation), but most interestingly also FPGA- and ASIC-based solutions.

Check out the MiSTer [1] [2], an FPGA-based system where the community is trying to create faithful reproductions of the original hardware in Hardware Description Languages (HDLs) like Verilog. This FPGA-based approach is NOT emulation in software, but a reproduction of the electrical logic of the actual silicon, like the CPU and DSP chips.

I'll mention that flashcards exist, and so do reproduction cartridges (they could be called "bootlegs", but the original cartridges are no longer manufactured, so if advertised clearly, nobody has much issue with reproductions).

That's not even getting into the world of Android-based emulation (and the myriad of phone holders + custom controller type products).

In some ways the future of video game preservation is bright (but in other ways we're still settling for good enough instead of perfect reproduction)

[1] https://www.youtube.com/watch?v=e5yPbzD-W-I

[2] https://www.youtube.com/watch?v=dibLXWdX5-M


New NES, SNES and N64 games are a thing now too!


This is especially true for the NES due to tools like NESMaker http://makenesgames.com/

There are so many new NES games now it's impossible to stay on top of it all.



I remember seeing this video on that topic, about the clever compression tricks used to cram the sprites and map layouts into 40K.

https://www.youtube.com/watch?v=ZWQ0591PAxM


And stuff like the SA-1 port of Gradius III, fixing the crippling slowdown of the original SNES version by using an enhancement chip from later SNES games.

https://arstechnica.com/gaming/2019/05/28-years-later-hacker...


It's still possible to get working original Game Boys. Since the DS was massively popular, you will still easily be able to source a working device. I did see a product recently that actually replicates the hardware of multiple game consoles using an FPGA. I guess something like that could be used for retro DS games in the future.


Foldable smartphones are going to be Nintendo DS emulator machines.


Emulators, as the grandparent said, do not capture the magic (or convenience, or affordability, or energy efficiency) of consoles like this though. They are an expensive, imperfect approximation.

I do hope there are conservationists out there who have a way to "freeze" hardware like this in time, shielding it from any kind of wear, corrosion, and, idk, embrittlement of materials over time.


A portrait touch screen split top/bottom?


Some games even require you to close the screen! I suppose that's one of those emulation quirks you will just have to live with.


Yup, that particular gimmick in Phantom Hourglass got me for a couple seconds before I figured it out. While not at the same level as the traditional console games, the DS Legend of Zelda games have their own charm :)


That gimmick didn’t even last through the next console generation. Neither the original 2DS nor the Wii U has a folding screen, so you have to suspend/wake the system to get past that part.


Maybe we will see a hardware project from the Analogue people or something else in the future :)


Yeah, the Nintendo DS is one of those consoles which is mostly a joy to play emulated on a modern Android phone.

Many games are very enjoyable with modern touch controls -- Inazuma Eleven, for instance.


Ace Attorney, too. I think on iOS you can buy the games remastered to work natively, but on Android you have no option but to emulate


One thing you'll notice is that games run through a good emulator often run even better than the "native" mobile ports.

This has been true for a lot of Square Enix classics.


This is a bit tangential, but the format of the article and the visuals are really pleasant. It's a great combination of font, color, line heights, text widths, the use of tabbed widgets for specific sections, etc. It's just a great example of an article on a web site that takes advantage of the medium (being a web site that can be interactive) without overdoing it.

Also, another tangent: the game showcased in multiple photos there, Hotel Dusk: Room 215, was one of my favorite DS games. I would've thought no one had heard of it, but here it is in an article about DS architecture. :)


He uses the dated phrase "master-slave" in his post. He should change it to "leader-follower" /s.

But quite honestly, I think that "leader-follower", politics aside, is plainly a better term to describe these kinds of configurations.



