It seems a little like cheating to put modern microcontrollers inside the cartridge that do all the heavy lifting. At what point are you basically just using the Game Boy's screen connected to a much more powerful external computer? It would be like me creating a new "SNES game" with a Raspberry Pi stuffed inside the cartridge, with GPIOs mapped to the SNES cartridge interface, and saying "there oughta be a full 3D SNES game", but at that point the SNES is doing very little processing compared to the Pi's SoC, which is doing 100% of the 3D rendering.
All that said, this project is still pretty sweet and I enjoyed the writeup.
I get what you're saying, but it's also worth mentioning that your full 3D SNES game example is pretty much exactly what games like Star Fox[0] did back in the early '90s. The Super FX chip is inside the SNES cartridge. From the Wikipedia page:
> The Super FX was so much more powerful than the SNES's standard processor that the development team joked that the SNES was just a box to hold the chip.
Really interesting to read about, but there was a whole selection of other chips that various SNES games took advantage of: [1]
And really, from its very launch the base SNES couldn't handle most of its library - even Pilotwings, a day 1 launch title, needed an extra chip to handle its enhanced mode 7 background effects.
The most interesting one being the SA-1 chip, imho. It's the exact same processor as in the SNES, except clocked three times faster: 10.74 MHz instead of 3.58 MHz. This means you can hack games to have their original code run on the SA-1 instead of the original CPU, which is exactly what Vitor Vilela did for numerous games to remove their massive slowdowns.
> It seems a little like cheating to put modern microcontrollers inside the cartridge that do all the heavy lifting.
As a sometimes hobbyist homebrewer, instinctively I am inclined to agree. But I'll argue the counter-case.
The Game Boy (like consoles such as the NES and SNES) was intended from the beginning to be expanded with additional hardware in the cartridge, in anticipation of new technology. There were commercial GB games in Japan that included an infrared transceiver, for example. Same with rumble motors, battery-backed save RAM, and so on. While extra cartridge hardware was never more than marginal on the Game Boy, it was fairly prominent on the NES/Famicom and especially the SNES.
And it was used to make full 3D games for the SNES. Much of Star Fox is drawn with 3D polygons, and would have been impossible without a co-processor to do the 3D rendering. So it includes the Super FX chip on-board, which is a fast coprocessor with vector facilities. In some Super FX games, the console's baseline hardware was basically relegated to being a glorified framebuffer displaying the coprocessor's rendered output.
And in some cases, no amount of extra cartridge hardware will circumvent the limitations of the console, which makes it a legitimate programming and game design challenge, IMO. You just can't display a full-screen bitmap on a Game Boy without tricks. There simply isn't enough onboard video RAM to hold a distinct 8x8 tile for every tile position on screen. No coprocessor will work around that. It might give you an infinite source of tiles rendering a 3D scene, for example, but you still have to shuffle them in and out of VRAM, which is a major bottleneck for complex Game Boy designs aiming for full-screen, full-framerate effects.
Actually, you can display one full-screen bitmap. VRAM can hold 384 tiles, and the screen needs 360. It's gonna be difficult to update that, though: there's no VRAM left over for double buffering, you can only access VRAM during VBlank and HBlank, and there's no DMA into VRAM (on the GBC it's possible, since it has a DMA for that and double the VRAM).
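For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch in C. The DMG figures it assumes (160x144 LCD, 8 KiB of VRAM, two 32x32 background maps, 16 bytes per 2bpp tile) are the commonly cited ones, not something taken from this thread:

    /* Sanity check: tiles needed for a full-screen bitmap vs. tile slots in DMG VRAM. */
    #include <stdio.h>

    int main(void) {
        const int screen_w = 160, screen_h = 144;  /* LCD resolution in pixels          */
        const int tile_px = 8;                     /* tiles are 8x8 pixels              */
        const int tiles_on_screen = (screen_w / tile_px) * (screen_h / tile_px);

        const int vram_bytes = 8 * 1024;           /* 0x8000-0x9FFF                     */
        const int tilemap_bytes = 2 * 32 * 32;     /* two 32x32 background tile maps    */
        const int bytes_per_tile = 16;             /* 8x8 pixels at 2 bits per pixel    */
        const int tile_slots = (vram_bytes - tilemap_bytes) / bytes_per_tile;

        printf("tiles needed for a full-screen bitmap: %d\n", tiles_on_screen); /* 360 */
        printf("tile slots available in VRAM:          %d\n", tile_slots);      /* 384 */
        return 0;
    }

So 384 slots against 360 needed: a single static full-screen image fits, with only 24 tiles to spare for anything else.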
The SNES example may be working against you here, since there were lots of commercially-released SNES games that included coprocessors in the cartridge to do heavy lifting the console wasn't otherwise capable of. At least one of those games actually used an ARM processor in the cartridge.
"I thought using loops was cheating, so I programmed my own using samples. I then thought using samples was cheating, so I recorded real drums. I then thought that programming it was cheating, so I learned to play drums for real. I then thought using bought drums was cheating, so I learned to make my own. I then thought using pre-made skins was cheating, so I killed a goat and skinned it. I then thought that that was cheating too, so I grew my own goat from a baby goat. I also think that is cheating, but I’m not sure where to go from here. I haven’t made any music lately, what with the goat farming and all."
That's pretty common for cartridge-based games. Want your Game Boy to support an RTC? Add an RTC chip to the cartridge. Need more RAM or persistent memory? Slap more in the cartridge. This is no different.
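To make that concrete, here's roughly what reading a cartridge RTC looks like from the Game Boy side with an MBC3-style mapper, in GBDK-flavoured C. The register numbers are from memory of the usual MBC3 documentation, so treat this as a sketch rather than a reference:

    #include <stdint.h>

    /* On MBC3 carts the RTC sits behind mapper writes; the clock registers
       get banked into the external-RAM window at 0xA000. */
    #define CART_RAM_ENABLE  ((volatile uint8_t *)0x0000)  /* write 0x0A to enable RAM/RTC (any addr in 0x0000-0x1FFF) */
    #define CART_BANK_SELECT ((volatile uint8_t *)0x4000)  /* RAM bank number or RTC register number */
    #define CART_RTC_LATCH   ((volatile uint8_t *)0x6000)  /* write 0 then 1 to latch the clock      */
    #define CART_RAM_WINDOW  ((volatile uint8_t *)0xA000)  /* the selected register shows up here    */

    uint8_t read_rtc_seconds(void) {
        *CART_RAM_ENABLE  = 0x0A;  /* enable RAM/RTC access                                */
        *CART_RTC_LATCH   = 0x00;  /* latch the current time so it reads consistently...   */
        *CART_RTC_LATCH   = 0x01;
        *CART_BANK_SELECT = 0x08;  /* 0x08 = RTC seconds (0x09 minutes, 0x0A hours, ...)   */
        return *CART_RAM_WINDOW;   /* read it back through the RAM window                  */
    }

From the game's point of view it's just a few magic writes to ROM addresses; all the actual timekeeping lives on the cartridge, much like the far beefier chip in this project's cartridge.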
I think the general vibe from the OP comment is that a lot of the fun with low-spec and retro hardware is working within restrictions and squeezing new and novel uses out of old hardware. But once you get to the stage of putting a much, much more powerful computer in the cartridge, you come to the realization that you can do basically anything at this point and treat the Game Boy as a dumb display/input device. Now it's just regular embedded programming.
Not to detract from the project, it is very cool. But the excitement kinda dwindles when you realize that you now have limitless power and ability. It becomes just regular programming rather than a puzzle.
For some perspective, even the oldest network-enabled add-ons sold for the GB/GBC did the same exact thing (such as the official Mobile Game Boy Adapter). It's more like, without doing it this way, it's not even possible.
I don't see the complaint as "it's wrong to use powerful chips in a cartridge". As has been pointed out, even contemporary cartridges did things like that.
I see the 'complaint' as being that there's a creative/artistic appeal to seeing what can be squeezed out of limited/constrained resources.
Or, at worst, does it make sense to use an old console like the Game Boy if you're going to use such a fancy cartridge?
I'm reminded of the MythBusters episode where they tried to make a cannon from a wooden log. They resorted to using modern power tools. "I'm not doing anything they wouldn't have done if they had access to these tools!".
You have protocol offload engines on modern NICs; what's wrong with basically a "WiFi Offload Engine"?
> It would be like me creating a new "SNES game" with a Raspberry Pi stuffed inside the cartridge, with GPIOs mapped to the SNES cartridge interface, and saying "there oughta be a full 3D SNES game", but at that point the SNES is doing very little processing compared to the Pi's SoC, which is doing 100% of the 3D rendering.
And that's the point of these ESP modules in the first place. They were originally intended as SDIO WiFi modules to give plug-and-play WiFi support to various embedded systems.
The programmability wasn't the initial selling point. They were very much designed originally to be a shrink-wrap add-on to another embedded system.
Most little single-purpose microcontroller-based products not made by Broadcom have an SDK for writing code for the device too; it just normally doesn't go anywhere. Espressif just sort of won the lottery and an ecosystem formed around them, but that wasn't their initial market, as it would have been foolish to bet the company on that.
I didn't say they were sold as programmable/hackable. They were already used in many simple smart devices like bulbs and power controllers without any other microcontrollers. It was pretty obvious that the ESP was doing all the work.
Use cases where they are the only processor (and thus necessitate use-case-specific programmability like you're saying) came later. That wasn't the initially planned market.
It came before they were hackable. I have been following it for years. It was fun seeing cheap Chinese smart devices being exposed: each WiFi-enabled one would have an ESP8266 and nothing else aside from some SMD capacitors and resistors. I’m not saying they were sold as hackable. I’m saying they were easily identified as the only microcontroller required for the smart devices.
I'm including Chinese system integrators here; they just figured out that these devices were hackable prior to the western maker community. I have it on good authority that those weren't Espressif's original target market either.
Not surprised, there are some very capable chips that aren’t fully utilized. I think a lot of smart watches have a certain Nordic chip too. Know of any more ESP-like chips?
A microcontroller with way more compute and peripherals than it needs for its task, with either no or shoddy encryption on its flash, allowing you to write your own code or binary patch the existing code with a little elbow grease? That's most microcontrollers out there, with the exception of chips by Broadcom (no flash, and the patch RAM is already basically full fixing bugs in their crappy code ROM) or Nordic (because of the ubiquitous use of per-device encrypted flash). Specific devices I've worked on in that capacity are all tied up in NDAs with my employers, though.
But what makes ESPs special is the community. Because of all of the public work put into them, it's an order of magnitude easier to manipulate them than poring over a disassembly. You'd know about chips like that if they existed. Bunnie tried to get that kind of community around the MT6260 chips, but it didn't really go anywhere.
Thank you. I think these hacks will be seen as a historical curiosity with the influx of RISC-V chips that will be more open; Espressif is moving to it, and hackable IoT chips are utilizing it more too.
I think it could go either way. I get where you're coming from, but think it's equally likely that it'll go in the opposite direction. The underlying economic reasons why the internals of these chips aren't publicly exposed have a good chance of being exacerbated by the in-progress democratization of fabless chip design. More chips will be designed to fill a specific niche and not be publicly documented; that's because public documentation is a huge schlep that's not work towards their niche or value add. Yeah, there'll be more RISC-V, but most cores today are something supported by GCC as it is; that's not the impediment to understanding the chip, but instead everything custom around the CPU core is. And the openness of RISC-V could lead to fragmentation (but we have yet to see that, and I'll admit that's mainly an ARM propaganda talking point at the moment).
Good points. They’re entitled to their research and development not being cloned. I think that fracturing is likely with RISC-V in the short term but a natural hegemony will form under some protocols, hopefully they don’t suck and can scale.
The first ESP modules were originally marketed and sold as simple serial TTL-to-WiFi bridges, but it didn't take hardware hackers long to discover they were hiding a lot of functionality underneath. It took some doing, but eventually Espressif opened up their toolchain and _that's_ when they became "powerful Arduino with WiFi."
They still are simple TTL-to-WiFi bridges, aren't they? They just have more processing power; it needed to be stronger than an Arduino to run a web server and handle WiFi. I don't think the functionality was hidden so much as underutilized, like the Linksys routers.
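For anyone who never used them in bridge mode: "TTL-to-WiFi bridge" really just means the stock AT-command firmware hanging off a UART. Something along these lines, sketched from memory of the classic ESP8266 AT firmware (the port name, baud rate, and the crude sleep-instead-of-waiting-for-"OK" are all placeholders):

    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <termios.h>
    #include <unistd.h>

    /* Send one AT command line to the module's UART. */
    static void send_cmd(int fd, const char *cmd) {
        write(fd, cmd, strlen(cmd));
        write(fd, "\r\n", 2);   /* AT commands are CRLF-terminated             */
        sleep(1);               /* real code would read until "OK" or "ERROR"  */
    }

    int main(void) {
        int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);  /* hypothetical serial port */
        if (fd < 0) { perror("open"); return 1; }

        struct termios tio;
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);
        cfsetispeed(&tio, B115200);   /* the usual default baud rate */
        cfsetospeed(&tio, B115200);
        tcsetattr(fd, TCSANOW, &tio);

        send_cmd(fd, "AT");                                      /* is anyone there?     */
        send_cmd(fd, "AT+CWMODE=1");                             /* station mode         */
        send_cmd(fd, "AT+CWJAP=\"myssid\",\"mypassword\"");      /* join an access point */
        send_cmd(fd, "AT+CIPSTART=\"TCP\",\"example.com\",80");  /* open a TCP socket    */

        close(fd);
        return 0;
    }

The host never sees 802.11 at all; it just shovels text over a serial line, which is why even a tiny host CPU (or a Game Boy) can sit on the other end.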
I agree with you, but I understand the critique others raised. The problem is standardization: if you need special hardware to do things, you get fragmentation. For the people who rely on the other cartridges, those games are all standardized. The SNES had optional networking https://en.wikipedia.org/wiki/Satellaview, and the PS2 had an optional expansion bay https://en.wikipedia.org/wiki/PlayStation_2_Expansion_Bay; keeping it optional meant games couldn't all rely on it, and making it a requirement would have prevented much adoption.
What's the point of developing on a device that needs lots of other parts that most people won't have? It's like selling a game system with no controllers, where each controller can be used for only a few games (like eyefi).
This is why I don't see M.2 SSDs enhancing online gameplay through faster loading. You always end up waiting for the slowest-loading machine and network before the game starts.
To be pedantic, the Famicom had audio bypass, but the NES really didn't (unless you hack around with the expansion connector, which wasn't used for anything Nintendo-approved in the US).
Yeah, the cartridge pinout is slightly different. The Famicom has a 60-pin connector and the NES has a 72-pin connector, but ten pins go to the expansion port and four pins are used for the license chip, so two of the original pins were lost, and those two pins were the audio out/in. Castlevania (and some other games) has significantly different audio in Japan vs the rest of the world.
There are a few different ways that people modify their NES if they want to use cartridges with audio hardware, using the (otherwise unused) expansion port pins.
Yeah, I've seen some YouTube vid about Famicom audio chips, and it was insanely high fidelity for a tiny video game console. Must have been weird to play such games as a kid back in the day.
Many people here are mentioning the SNES, but beyond classic SNES games like Star Fox, Stunt Race FX, and Yoshi's Island using co-processor chips, SEGA also produced what they called the SVP chip for Virtua Racing on the Genesis.
Then we got the SEGA 32X, which was literally dumping a giant chain of extra chips onto the Genesis in order to do things like primitive 3D and scaling.
Beyond the SNES as well, the NES also had a habit of packing small bits of extra hardware into its carts.
Mode 7 is a feature of the SNES's basic video hardware, not a cartridge add-on.
Most cartridges are just ROM, perhaps some battery-backed RAM, and if needed some bank switching / address decoding logic to glue things together. Nothing particularly smart. Certainly no extra processors running the show. All the smarts were in the console.
There are exceptions, like the SNES games with coprocessors, but not all of them are full CPUs, and even then there are 1500+ SNES titles in total and fewer than 100 with extra chips in them [1]. The mapper chips in NES games often did a bit more than just bank switching, but they weren't in control either.
Cue someone mentioning the MB Microvision...
[1] Based on Wikipedia, and I hope I roughly counted the number of entries in the coprocessor game table correctly.
Mode 7 wasn't a cartridge feature, it was a background layer that could be rotated and scaled in the SNES hardware. You may be thinking of the FX chip (mentioned by others in this thread).
If you want to take that cheating to the logical extreme, check out this project made by a friend. It's explicit in the title that it's just streaming, but pretty fun nonetheless!
My favorite example of an expansion chip is a 32-bit ARM V7 at 21.47 MHz on a cartridge for the SNES, which itself had a 16-bit CPU that is an enhancement of the 6502, at a nominal 3.58 MHz. All for a game of shogi, a Japanese variant of chess: https://twitter.com/Foone/status/1177652135841779713
Gonna pile on the bandwagon here and add, it shouldn't be a "central computer" that has all the cool shit and you attach peripherals to it. We should be merging equally powerful computers together, it should be a symbiotic relationship between different parts that get better over time. Two computers are better than one
Yes, it's a completely logical thing to do. It just makes the project a little mundane and back to the practical computing of the real world. With a modern microcontroller you can do anything you want without limits. The gameboy becomes mostly irrelevant.
It seems like everyone in tech does this. Someone has a mind-blowing feat of engineering, but leaves out the part where they cheated and arguably lied about it, because "it's just some minor detail that doesn't really matter"... For some reason they think these are bragging points that put them above all other engineers.
"I built a calculator completely from scratch by myself with no help" (by importing calculator.*)
"I invented my own cloud microservice in a language I wrote completely myself" (It's hello world split in 2 files on google drive and the language is just JavaScript but it you added a "framework" (it's single method) that already exists in the wild but your version is slower and worse)
"I built my own computer" (by buying a prebuilt computer but swapping out the ram, or buying an essentially prebuilt computer but it comes disassembled)
I’m a billionaire with 22in, my dinner with Bezos went well, I had to ignore 22 messages from hot models wanting to fuck to post this, I could have been working for 12k an hour but I decided your post was worth replying to even though satellite Internet costs on my huge yacht costs a lot more than you make in a day.
I’m going to fly on my private jet now, feel free to contact me about how you can invest and fall for my crypto margin trading that ends badly for you.