The NES Homebrew Scene (tedium.co)
323 points by shortformblog on April 12, 2018 | 132 comments



A while ago I wanted to try some NES ROM hacking, but I couldn't find any tools for it that were really convenient - I ended up typing 6502 assembly into an online assembler, copying the output into HxD, and other processes like that. So I wrote something more to my liking that was more like an IDE.

I couldn't think of a good name, but I was modding Kid Icarus at the time, and it's in .NET, so I just called it IcarusNet. It was primarily for ROM hacking, but as far as I know, it should be suitable for writing something from scratch in conjunction with tile editing programs and maybe other things you'd need.

It's not nearly as advanced as NESMaker seems to be (very impressive from the looks of it), but maybe it's a middle ground for someone who wants to write a game closer to the metal but also wants something more graphical with the same "one click assemble/run" mindset.

Comes with instructions for a sample project to make an edit to a freely available ROM in less than 5 minutes.

https://github.com/ldyeax/IcarusNet

Since then I've heard that the license might have some problems for software, so I'll change it if the need arises.


As someone who simply wanted a NES assembler, I've found cc65 [0] to be great, for example wrapped as a docker image in [1].

[0]: https://github.com/cc65/cc65 [1]: https://hub.docker.com/r/philhsmith/cc65/


Nowadays, it's impossible to do modern programming without relying on millions of lines of other people's code. It's frameworks and libraries all the way down.

On the NES however, every line of code is yours. There's no such thing as libraries, operating systems, or frameworks. They don't exist. It's refreshing.

One of the big problems of modern programming is how easy it is to add complexity. As Charles Moore once said:

>"Simplify the problem you've got or rather don't complexify it. I've done it myself, it's fun to do. You have a boring problem and hiding behind it is a much more interesting problem. So you code the more interesting problem and the one you've got is a subset of it and it falls out trivial. But of course you wrote ten times as much code as you needed to solve the problem that you actually had."

It's easy to add complexity on modern systems, but it's hard on the NES. The programming is too clunky; it's too much work to do anything but simple things. For that reason, it's far more common to simplify the problem and reduce features rather than add them. It's a wonderful thing.


I agree with you in general, but I think it's worth pointing out that programming the NES's PPU is actually very similar to programming with a 2D graphics library today.

NES games never wrote their own scrolling logic, or their own graphics logic. They wrote tile values into a background table and then told the NES how much to scroll the playing field, and the NES rendered and scrolled everything automagically.

NES games never drew their own sprites. They just said "Hey NES, write the sprite at this x and y location" and the NES did it.

Audio worked the same way.

Honestly, coding for the NES would be similar to working with a simple 2D graphics library. So I don't know if it would be accurate to say "every line of code is yours" on the NES. It's more just that the "libraries" used were implemented in hardware instead of software.
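
To make that concrete, "write the sprite at this x and y location" really is about one store per byte. A minimal ca65-style sketch, assuming the common convention of a shadow copy of sprite memory at $0200 (the tile number and coordinates here are made up):

    ; each OAM entry is 4 bytes: Y, tile index, attributes, X
    OAM_SHADOW = $0200       ; conventional shadow page for sprite memory

            lda #64
            sta OAM_SHADOW+0 ; byte 0: Y position
            lda #$32
            sta OAM_SHADOW+1 ; byte 1: tile index
            lda #%00000000
            sta OAM_SHADOW+2 ; byte 2: attributes (palette, flip, priority)
            lda #80
            sta OAM_SHADOW+3 ; byte 3: X position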


> NES games never drew their own sprites. They just said "Hey NES, write the sprite at this x and y location"

Yeah, but the hardware is so limited, with weird corner cases, that writing a bitblt might be easier. For example, some games have to reconfigure their sprites at the beginning of every horizontal scan line to overcome the hardware limit of eight sprites per scanline.


Which games? To my knowledge this is at best exceptionally rare because there are only ~114 CPU cycles in a scan line. The reports I've heard are that if you do any OAM changes mid-frame, you get something like four lines of glitches. More common is to just do a full OAM DMA during vertical blank, at least from the ROMs I've seen.
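
That full-OAM-DMA approach is short enough to show. A sketch of an NMI (vblank) handler, again assuming a shadow OAM page at $0200 (a real handler would also save and restore the registers it touches):

    OAMADDR = $2003
    OAMDMA  = $4014

    nmi:    lda #$00
            sta OAMADDR      ; start the copy at sprite 0
            lda #$02         ; high byte of the $0200 shadow page
            sta OAMDMA       ; hardware copies all 256 bytes (~513 cycles)
            rti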


I can't find the original link where I read about this, it may have been more theory than about specific games. These links seem to be relevant: http://wiki.nesdev.com/w/index.php/Sprite_overflow_games http://forums.nesdev.com/viewtopic.php?t=16299


This is only sort of true: while there are hardware-supported things, the way in which you work with them is more finicky than most APIs, and they get you relatively less close to a finished renderer.

For example, concepts like a scroll register don't typically appear in an abstracted API. They're a hardware detail that happens to assist in implementing a smooth scroll by reducing the necessary computation. But they aren't a "smooth scrolling API" because it works on the tilemap data only - you can also have sprites scroll. That necessitates a software implementation specifically for the task of sprite rendering. Witness the numerous NES games that either let sprites disappear when they hit negative X coordinates or put up a convenient black bar to hide it.
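
Concretely, the "scroll register" is just two writes during vblank, and everything beyond the tilemap is your problem. A sketch, where scroll_x and scroll_y are hypothetical variables your game logic maintains:

    PPUSTATUS = $2002
    PPUSCROLL = $2005

            bit PPUSTATUS    ; reading $2002 resets the PPU's write latch
            lda scroll_x
            sta PPUSCROLL    ; first write: X scroll
            lda scroll_y
            sta PPUSCROLL    ; second write: Y scroll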


> On the NES however, every line of code is yours. There's no such thing as libraries, operating systems, or frameworks. They don't exist. It's refreshing.

I would say that there’s still “other people’s code” that you rely on—that code is just reified into ASICs you drive with control messages over IO ports, rather than being OS system-service daemons you drive with control messages over IPC.

Also keep in mind that, despite the lack of a forced OS or BIOS on the system itself, Nintendo’s SDKs included snippets or code samples (or sometimes complete small libraries as modifiable assembler source), and developers shared others around when they came up with them to solve a particular problem.

(Later on, in SNES/N64/GameCube era, Nintendo effectively handed you a whole library exokernel worth of precompiled, static-linkable libraries for you to use, too. The NES was too small for that, though.)


Not necessarily; there are hundreds of embedded devices that can be programmed just like in the old days.

Arduino, ESP8266, ESP32, <pick your own flavour>.

And since we are in the context of games and Nintendo, something like Arduboy.

https://arduboy.com


> Arduino, ESP8266, ESP32, <pick your own flavour>.

Both the Arduino libs and Espressif SDK are fairly big chunks of code.

Yes, you can program AVRs without libs too, but you did specifically say Arduino. It is also my understanding that ESPs are both a bit obscure and fairly complex beasts to be managed without an SDK.

While there are still some nice cases of simplicity in the embedded world, the complexity and layers of libraries and frameworks model is encroaching there too.


Here's a sneak peek of something I'm working on

http://fingswotidun.com/avr/AvrAsm/Testbed/

It's pretty much a project to scratch the itch being described in this thread. It gives you an in-browser, 8-bit AVR-based fantasy console and assembler.


It looks quite cool.


The core of the esp-idf is like a wrapper plus a tiny bit of logic over the registers; there is a ton of extra stuff packed in there, but you don't need to use it if you don't want to, and it is pretty clearly separated in the source directory. I really like it because it gets out of my way when I don't want it, and when I do want it there is usually something there to help.


Even worse than that, people still use legacy devices like 8051 MCUs in modern products. I'm currently working on an assembly codebase for a SiLabs part.

Like... Please move to an ARM MCU with a C codebase.


An ARM might be overkill (and a bit expensive) for something that can be done with an 8051. A better choice might be something like an Atmel AVR or Microchip PIC (16 or 32, probably). Those are dirt-cheap, and specifically designed for microcontroller applications with features like on-board ADCs, comparators, timers, GPIOs, etc.


I hear this argument a lot in the EE community. Yet, the cheapest 8051 part in single quantities on DigiKey: $0.43. Cheapest ARM Cortex-M0 part on DigiKey: $0.47. I'll take the ease of development and features on ARM for $0.05. You can develop without proprietary development environments. I do my development with a basic gcc+make setup.

Of course you can pick up old parts which fell off a truck for cheaper but I'd rather not do that when I depend on these parts for many years to come.


The cheaper 8051s aren't on DigiKey (they are usually integrated into something like the custom MCU used for a pregnancy test or toothbrush), and $0.05 is very worth it if you are making 1,000,000 of them.


8051 is a complete overkill for a pregnancy test or a toothbrush.

That's why those applications use less known 4-bit processors.


There is C available for an 8051. The Keil 8051 compiler (now owned by ARM) was pretty good and it had a nice GUI environment. The 8051 is pretty good for certain tasks but when you push it beyond the limits, your effort switches from solving the core problem to managing memory and micro optimization to make things work. Meanwhile with an ARM MCU you don't have to worry about a lot of things. I once spent a couple days optimizing a 32-bit divide for an 8051 to squeeze a couple % improvement in product performance.


You can program the 8051 in C using tinycc. It functions great as an assembler as well. You can structure the parts that must be in assembly as C functions using inline assembly. Very convenient!

Using C does come at the cost of some code compactness, but you can selectively optimize as the need arises.


Well, if it is an ARM MCU, Basic, Pascal, Java, C++, Oberon, Ada and Rust are also possible options. :)


Steve Woz in his autobiography said a similar thing about growing up in the Apple II era vs the kids he teaches today; there’s a lot more you can do today, but what you miss is the insight and power that comes from understanding the system as a whole, and being able to leverage all of its parts.


> On the NES however, every line of code is yours.

I once stuck my nose into Game Boy Advance homebrew and was happy to find it can be the same there. Every piece of the hardware is memory mapped. That means, given a reference of the mapped addresses and struct definitions, you can do whatever you want with the machine without any magic, black-box external code.


GBA was the very last of the "old-fashioned" consoles, and as such one of the best targets for homebrew.


I also adore old programs for their simplicity. However, I've come to terms with "unnecessary" complexity as a necessary cost of growth and abstraction.

Each part of a bacterium, relative to its mass, may be more cost-effective than that of a mammal. I'd much rather be the latter.


It's true insofar as you also count in the evolutionary, worse-is-better nature of software development. I.e., there's a lot more complexity than is necessary for the level of abstraction and capabilities you want, but that complexity was accrued as people developed and iterated on every layer, mostly leaving each one be as it got good enough and moving on to something else.


Funny, because at least over in the Linux world I feel the complexity has come as people started hunting for "perfection".

Where before you had a GUI part talk to a CLI part via pipes, now you have them talking to each other via an XML-based bus mediated by multiple daemons (one for the bus, one to uphold a fine-grained security layer, again using XML and JSON).


Especially now that every damned language comes with their own custom package manager, with node.js as the posterboy:

https://medium.com/@caspervonb/the-internet-is-at-the-mercy-...


Blame Perl :)


The graphic on there is brilliant.


They couldn't be bothered to cite the image, apparently, but it's from a webcomic called Gunshow.


Yes it is from Gunshow, and that isn't the full comic. It's even better when you see the whole thing: http://gunshowcomic.com/648


Those two panels are well-known as a meme image, over probably the past 4 years. Memes tend to transcend their original source; that's probably why the image wasn't properly cited.


Because clearly nobody working on the web as an artist deserves to get credit for their work once it passes a certain threshold of popularity.

That said, when K C Green ran a Kickstarter for a plush This Is Fine dog it took in more than ten times what he was asking for; how much of that went into his bank account as profit, I dunno.


I'm not trying to say whether they deserve credit. I'm just trying to describe something observed about how people treat images that reach the level of "meme".


"it's a meme" isn't an excuse for not giving credit.

If KC Green made an image macro and posted it on Reddit it would be one thing. But they didn't, they made a comic and put it on their own site.

I can't speak for KC, but it's just bad taste not to cite things that are flat-out copyrighted by somebody else.


Describing reality isn't making an excuse.


Somewhat related:

I'm always thoroughly impressed by how complicated these topics are and how knowledgeable the people who work on them are.

I don't even use the product, but I love reading the Dolphin blog [0] because they write about fairly complicated topics in a way that's pretty approachable, and it always makes me appreciate how much I don't know about these topics.

While I feel like it would be difficult-but-not-impossible to be a contributor to say, the linux kernel (this is probably me grossly oversimplifying the process), I think I could spend the next year diving into graphics programming and basic hardware topics and still not contribute in any meaningful way to a project like Dolphin.

Articles like this and the Dolphin blog definitely make me realize how much I don't know.

0: https://dolphin-emu.org/blog/


The difference is domain knowledge; it takes time. You might feel that way about the kernel because the high-level concept of resource management is pervasive in programming. So you might feel you can easily dive into it, but it will take you equally long to, say, port the kernel to a new architecture or write drivers for a new platform.


I recently started work on an NES project, and wrote the first part of it up: https://www.moria.us/blog/2018/03/nes-development

It's a lot of work. I've managed to get data from Tiled into my ROM image, and I'm working on figuring out an architecture for scrolling and displaying a status bar.

Implementing scrolling and a status bar sounds simple but is actually a total pain. You only have two screens of tiles to work with, which can be arranged horizontally or vertically, and anything more complicated than scrolling in a single direction (including status bars) requires adjusting the scrolling registers between scanlines, and making sure that your status bar is in a different part of the tilemap (called "name tables") than your world.

A common way to accomplish this is to lay out your graphics memory so a certain address line turns on and off once every scanline, and then put a chip (called a "mapper") on the game cartridge which wires that address line to a counter, which drives a CPU interrupt line. Simpler games like Super Mario Bros. don't need this extra chip, but then you only get a single status bar plus scrolling in one direction. Games that scroll in both directions often display graphical glitches on the edge of the screen, like Super Mario Bros. 3 (look at the right edge of the screen).
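
On the MMC3, the best-known mapper with this kind of scanline counter, arming it looks roughly like the following (a sketch: the register addresses are MMC3's, the scanline count is made up, and the cartridge's IRQ vector is assumed to point at irq):

    MMC3_IRQ_LATCH   = $C000 ; scanlines to count down
    MMC3_IRQ_RELOAD  = $C001 ; any write reloads the counter
    MMC3_IRQ_DISABLE = $E000 ; any write disables and acknowledges the IRQ
    MMC3_IRQ_ENABLE  = $E001 ; any write enables the IRQ

            lda #191
            sta MMC3_IRQ_LATCH
            sta MMC3_IRQ_RELOAD
            sta MMC3_IRQ_ENABLE
            cli              ; let the CPU take IRQs

    irq:    pha
            sta MMC3_IRQ_DISABLE ; acknowledge (any value works)
            ; ...rewrite the scroll/address registers here for the status bar...
            pla
            rti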

This is unnecessary on newer consoles, and less painful even on the Sega Master System.


But programming with restrictions like that will make you a better programmer. Nothing brings out creativity better than a limiting environment.


> Nothing brings out creativity better than a limiting environment.

This is so true as a life principle!

I once read a book about the development of laser-guided bombs [1], and it talked about how Texas Instruments made the first viable prototype [2] on a shoestring budget even though they were competing against a large government contractor with a huge budget.

Because of the budgetary constraints, TI couldn't afford multi-million-dollar wind tunnels for testing, so they built scale models of the bomb and dropped them into swimming pools, performing mathematical transforms on the results using the drag coefficients of water vs. air. They came up with a bunch of other innovations as well, such as the shuttlecock seeker head, and made a superior product that vastly outperformed the high-budget contractor.

[1] https://www.amazon.com/Weapons-Choice-Development-Precision-... [2] https://en.wikipedia.org/wiki/Paveway


I'd find this completely awesome if it weren't for the tools being aimed at killing people. I'm not sure I enjoy advances in that area.


My crypto professor made a point that stuck with me: There are many many problems that are, from an engineering standpoint, really cool and fun to solve, but can only really be used to harm or kill people. Building MIRVs or cluster bombs probably scratches the itches of a lot of aerospace engineers, but you're still using your design chops to build things that end lives.


Well, the alternative to PGMs is a lot more collateral damage. Want to take out that bridge? Carpet bomb or dive bomb and hope not too many civilians/pilots die. Want to take out that factory? Carpet bomb or dive bomb and hope not too many civilians/pilots die.

With PGMs you can take out a bridge with pinpoint accuracy in the middle of the night and nobody dies at all (directly, at least).


It's just as easy to say it is a tool designed to save lives. Ultimately it's a tool; it's how you use it that's important.


Just as easy maybe, but fundamentally incorrect. Its mechanism of action is killing people -- choosing to believe the claim that killing those people will save lives alters neither the purpose nor MO of the tool.


The goal of the mechanism is to kill fewer people than other available methods.

Tools are, simply put, devices that increase a user's leverage.

We live in a world full of tools capable of leveraging against life. We also live in a world with individuals and groups who - for whatever reasons - choose to use that leverage.

We can't get rid of tools like this. They are very straightforward inventions, especially as technical knowledge increases. The simplest nuclear fission reactor is a bomb. That does not make nuclear fission fundamentally worthless, and whether you agree that knowledge about it is a "good" or not, that knowledge is not going to simply disappear.

> choosing to believe the claim that killing those people will save lives alters neither the purpose nor MO of the tool.

How a tool is used does not change what it is or what it does. It does, however, define the purpose. If the purpose of killing someone is to prevent that individual from taking another life - or several other lives - then the purpose of the tool used is to save lives. The goal for perfecting a tool that is to be used for that purpose is to minimize the amount of lives taken, and damage done. That means a "better" bomb takes fewer lives, saving more.

If you choose to believe that an individual taking action against the life of another has not forfeited his/her own right to live, then you might consider this tool to have no legitimate purpose. Frankly, I disagree, and hope you will reconsider.


Honestly, I found after growing up programming with lots of restrictions, I developed lots of bad habits. I slowly developed a closed mind for "newfangled libraries," IDEs, and similar. I didn't learn to value computationally expensive things like internal consistency checks or logging. Eventually, and in part thanks to reading HN, I stepped out of that mindset of scarcity. Instead of trying to do things with limited resources, I now find it more worthwhile to find what resources exist and leverage them.

Also, maybe creativity isn't the same as being a good programmer. Creativity may get you out of a bind or into a new area, but I think being a good programmer is founded on many mundane characteristics, such as consistency and ability to conform to others.


“The enemy of art is the absence of limitations” (attributed to Orson Welles)


The goal of an artist is to push boundaries, overcome restrictions. Absence of limitations is the goal of pretty much everything. That's what would be called perfection. Setting artificial restrictions is a way to learn by reduction of a problem by simplification, and setting boundaries requires creativity, too, but in principle, freedom is not the enemy.

In another reading, the absence of limitations would itself be limiting, which would be a paradox.


I liked this video explanation of NES scrolling ("The Nintendo Entertainment System's Loading Seam" by Retro Game Mechanics Explained) https://www.youtube.com/watch?v=wfrNnwJrujw


Wow! I just watched his video on the Pac-Man kill screen, and it finally made sense. I've read multiple posts about it before, but having it all presented in video format with animations made it click.


My own personal story with the NES didn't last that long, because I was a kid without money, terrible at platformers, and my parents wouldn't buy me a lot of games. You can only play Super Mario Bros so many times before you're tired of not making it past the 2nd level. So I eventually switched to computers (Apple II vintage), and never looked back to consoles.

This past year, however, I decided to take up a little retro project based on the NES. It's best described as a "video game music player alarm clock", using some authentic hardware. It uses the CPU from an actual NES to faithfully synthesize music from actual NES games, while building the surrounding system out of modern components.

First prototype in action: https://www.youtube.com/watch?v=izMFPKmD5ZU

Blog posts detailing the project: http://hecgeek.blogspot.com/2018/02/nestronic-1.html http://hecgeek.blogspot.com/2018/03/nestronic-2.html


I was a bit turned off by the way assembly is described as "tricky" and "tedious"... All programming languages can be tricky and tedious. Assembly (especially on the 6502) is conceptually very simple and, while it may not be trivial to translate higher-level concepts into its simplicity, as long as what you want can be readily expressed in it, it's trivially easy.

You will, of course, need to know what bits to set at what addresses to make the NES auxiliary chips work and do what you expect them to do, and that may represent some work, but that complexity is from the platform, not the language. You can have a library (or a set of macros) that deals with it written in any language you can compile for the platform.
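
For instance, ca65's macro facility lets you capture the "what bits at what address" knowledge once; a sketch (the macro name is mine):

    PPUSTATUS = $2002
    PPUADDR   = $2006

    .macro set_ppu_addr addr
            bit PPUSTATUS    ; reset the PPU's two-write address latch
            lda #>addr       ; high byte first
            sta PPUADDR
            lda #<addr
            sta PPUADDR
    .endmacro

    ; usage: point the PPU at the first background name table
            set_ppu_addr $2000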


>Assembly (especially on the 6502) is conceptually very simple and, while it may not be trivial to translate higher-level concepts into its simplicity, as long as what you want can be readily expressed in it, it's trivially easy.

That's precisely why I would qualify ASM as "tricky" and "tedious". You're bogged down in tiny details that nowadays a compiler would probably solve as well as (or even better than) you do. If you want to express "I want to make a function that adds two integers" then yeah, it's trivial. If instead it's "I want to iterate through the leaves of a binary tree and compute its standard deviation" then have fun; good luck.

"This is a multiplication by 5, it's pretty expensive, probably better off shifting by two and adding once. Oh but then I need an intermediary register, do I have one available?"

"I guess I could inline this bit of assembly here instead of making a function call. Oh but wait now I need to re-do my register allocation to match."

"I need to reserve a bit more stack space to bank this register, let's modify my function's prelude and prologue to match."

Ain't got no time for that.
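
(For the record, the multiply-by-5 trick is x*5 = (x<<2) + x, and on the 6502 the "intermediary register" ends up being memory; val here is a hypothetical zero-page variable:)

            lda val          ; x
            asl a            ; x*2
            asl a            ; x*4
            clc
            adc val          ; x*4 + x = x*5, assuming it fits in 8 bits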

I think writing assembly is a skill any coder should have, especially if you usually deal with close-to-the-metal languages like C, C++ or Rust for instance. I think having a good grasp of assembly will make you a better coder, if only because it'll let you read the assembly output of your compiler to figure out what gets optimized and how (and also maybe what you could change to speed things up). There are also a few situations where writing assembly is the right solution.

That being said you'd have to pay me a lot to write a full application in assembly, regardless of how many layers of macros and pre-processing steps I'm allowed to use (unless gcc is allowable as a pre-processing step...)


You've captured exactly what it feels like -- this is the constant tension among asm developers. I love really high performance code, but I also like accomplishing more than one trivial thing per hour. It's hard to even remember the time before I knew C, when coding in assembly was all there was. When it was "normal." But I do remember the first time I wrote for the M68000 and found multiply and divide instructions. "What? I don't have to write my own divide routine?" Tears of joy!

I will typically build something out in a high level language, profile to see where the time is going, and take a closer look at the algorithm first to see if it's sensible or some different approach should be taken. If it's an appropriate algo, then it's time to look at the implementation to see where we can shave cycles.

For the most part, even if you're talking to hardware that requires exceedingly precise timing (interface/bus protocols, certain chips), C will probably do the job. (It's no coincidence it was referred to as "universal assembly language".) Only where one is absolutely starved for resources (as the NES, and many 8-bit systems were/are) is ASM necessary.

"high level when you can, assembly when you must"


> "high level when you can, assembly when you must"

I was already following that on MS-DOS with its 640 KB, using the Basic, Pascal, C and C++ compilers from Borland, plus Clipper.

Although I did spend one year still focused on using TASM for everything and seeing how much I could cram into a COM file.


It's the difference between being a plumber and being someone who makes custom jewelry. Both are artisans of a kind, the one is doing production and trying to 'get the job done' the other is making one-offs that will have a vastly inefficient time:product ratio where a lot of the value created will be in the eye of the beholder. Both are valid paths.


It's more like someone making custom jewelry using modern tools vs. somebody making custom jewelry only using methods available in ancient Rome. You might not see any obvious difference looking at the result, but once you know how they're made, one is definitely more impressive than the other. I'm also sure that one is more "tricky" and "tedious" than the other, which is what I was addressing.

If you code something for artistic or "competitive" reasons then of course it makes complete sense. Like people making 4KB demos, speedrunners or people folding thousands of paper cranes. There are no invalid paths if you're an artist.

On the other hand if you consider it from a practical engineering perspective then there are few use cases where I'd go with ASM nowadays, well optimized C code will be easier to write, probably nearly as fast, way easier to modify and maintain and much more portable. Some paths are wildly superior to others if you're an engineer.


Modern CPUs are insanely complicated, with thousands of different instructions, countless layers of cruft and bewildering performance variations across architecture generations - what's optimal in one gen can be a worst case scenario two generations down the road. For any modern CPU I'd never attack a problem ASM first and, most probably, never even touch it.

But we are talking about the 6502 inside the NES, running at less than 2 MHz, with one accumulator and two index registers, a couple status flags and an 8-bit stack pointer within a 256-byte stack. It's a simple machine, for simpler problems.

And yet, guys like Paul Lutus were doing real-time 3D wireframes in FORTH, with fast 8-bit scaled trigonometric functions (no floats involved). It was pure badass of a magnitude not heard of since.

My own contribution was a windowing library for the Apple II that fit in 1024 bytes and ran self-modifying code to display overlapping windows.


> "I want to iterate through the leaves of a binary tree and compute its standard deviation"

Assuming this is a recurring problem, I'd just use two libraries - a binary tree one and a floating point one (I cut my teeth on the Apple II+, so I could count on FP routines already in place in the ROM). If a tree library is not available, writing a tree walker is not a particularly difficult task.

Libraries, subroutines and macros were a huge helper then.

> I need an intermediary register, do I have one available?

If we are talking 6502, the answer is "no". You had page zero, which was almost as good as having 256 8-bit registers.
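
The classic illustration is indirect indexed addressing: any pair of zero-page bytes can serve as a 16-bit pointer register. A sketch, with ptr at an arbitrary zero-page location:

    ptr = $10                ; a zero-page byte pair used as a pointer

            lda #<$0400      ; point ptr at $0400
            sta ptr
            lda #>$0400
            sta ptr+1
            ldy #0
            lda (ptr),y      ; load through the zero-page "register"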


> I think writing assembly is a skill any coder should have, especially if you usually deal with close-to-the-metal languages like C, C++ or Rust for instance.

It also applies to using languages like Java and C#.

Occasional devs might not be aware of it, but there are ways to see what the JIT/AOT compilers generate.


Part of the tedium is only having a few registers so programming in assembly begins to resemble solving the towers of Hanoi puzzle. But, it's certainly something every programmer should give a try at some point, if for no other reason than just a mental exercise.


Have you done http://nand2tetris.org ? It involves (virtually) building up a CPU that has two registers, which I'm guessing is the minimum possible.


You'll need some internal state (one register?) and a program counter. If instructions are long enough, you can encode the address of the next instruction in the instruction itself (sort of a perverse VLIW where everything ends with a jump to a given address) and ditch the program counter. You can make tests destructive, so your status bits end up overwriting the register - it would force some "interesting" programming techniques, but it seems doable.

But remember: there is always a lot of state flying around that's not exposed to the programmer through architectural registers.


You could make every operation of the form (op src dst nxt). Something like https://codegolf.stackexchange.com/questions/11880/build-a-w..., where results are always written to RAM and 'register' indexes are just addresses. You'd still need some transient/architectural registers to hold the current instruction/memory address/ALU input-output on its way through the CPU, though.


> resemble solving the towers of Hanoi puzzle

That's the relaxing part.


> Part of the tedium is only having a few registers so programming in assembly begins to resemble solving the towers of Hanoi puzzle.

Solving the towers of Hanoi puzzle can be done completely mechanically (exercise: derive the really simple algorithm). Programming in such a constrained environment is something where you have to think for yourself, since I am not aware of any mechanical solution.


A mechanical solution would be using other languages like C


Using C you cannot get the advantages that 6502 assembly provides. On the other hand, to stay with the towers of Hanoi: one can show that the simple mechanical algorithm one can derive for it will solve the problem in the minimum possible number of moves.


So, what the C compiler does.


What always scares me away from assembly every time I try to learn it is that principle that makes code more dangerous or rigid or difficult to evolve, the more manual work you have to do in it. Like how it’s harder to refactor C code than Python because of how the implementation and algorithms get closer to each other the lower you get. And since assembly gives you almost nothing, changing an algorithm seems to mean massive changes. Which implies heavy up front design work. Which is contrary to the kind of exploration and experimentation I like to have in my side projects, especially video games, which is presumably the main type of program for NES.


Assemblers with good macro languages help a lot.

For example using something like MASM or TASM back in the MS-DOS days was like having your own high level language.

So you end up creating a kind of DSL with those macros.


Whenever I see people talk about languages (and CPU assembly is a language; each CPU arch has its own dialect), what it comes down to seems to be how "verbose" the programmer has to be about things.

Meaning that if they have to care about memory structures, or even variable types (just look at how popular JS and Python are), they see it as a bothersome language to work with.

End result though is software that balloons to multiple gigs and maxes out a gigahertz CPU just by being launched and sitting idle.


It is a little misleading isn't it?

I miss the old assembler. It could just be that I have a poor memory, but with 8/16-bit assembly, I could remember almost everything about my toolset and focus on the problem.

With modern APIs and languages--even modern assembly languages--I spend more time googling than anything else.


I remember the 16-bit "real mode" assembly of Intel's 286 CPU very well.

> focus on the problem

Think harder, you might remember that you weren't solving many problems either.


Not true. Programmed micros for practical things--many of which are still in service. Modest hardware forced us to focus on the task rather than presentation.

Modern hardware raised the stakes and forced us to spend more time (in some cases, the majority of the time) on packaging and presentation--even where it didn't really matter that much after the sale.

I'm not talking iPhones, where a human interacts with the device constantly. I'm talking control systems, where after the sale, human interaction is simple and rare.


> I remember the 16-bit "real mode" assembly of Intel's 286 CPU very well.

Those CPUs had already started being complicated enough to make ASM programming less attractive. IIRC, on the 286 the string copy instructions were faster than the DMA transfers people used on earlier PCs and, because of that, if you wanted optimal performance, you'd need to check which CPU the code was running on and branch to the best code path accordingly.


I'd blame it more on 286 being trash than actual complexity. I remember doing MIPS ASM in the early 90s, and it was a joy.


When I learned MIPS at university using SPIM, 75 MHz Pentiums were still fresh and I knew Z80 and x86 assembly quite well.

The way some instructions were neither macros nor actual CPU instructions always felt strange to me.


> if you wanted to have optimal performance, you'd need to check which CPU it was running and branch accordingly to the best code path.

The Intel C Compiler still does that. It compiles multiple versions of your code, each optimized for different processor versions.


The appeal of retro console/computer dev is essentially an exercise in constrained programming. Systems of that era had extremely constrained specifications, and making the most of them is an art in itself.


It's really amazing how people keep finding ways to make old hardware do new tricks. Overdrive 2 is my favorite example - it's a Sega Megadrive demo that was released 30 years after the Megadrive launched, but still managed to pull off multiple effects that had never been seen on the system before. It abuses the hardware in such unique ways that no emulator can run it properly.

http://www.pouet.net/prod.php?which=69648 / https://www.youtube.com/watch?v=gWVmPtr9O0g


That is amazing. My family had one when I was a kid and I loved it, but what they did there is insane. Way more than I would have thought possible, and I hold that console in high regard.


Sometimes I wish we had more constraints because I fear we've all forgotten how to write fast and lean code in favor of semi-fast but very bloated code.

I remember back when I first used Windows 95, opening programs was very snappy and I was impressed. I imagined Windows of the future being even snappier, but somehow the opposite has happened. Now every application takes a long time to boot up, check for updates, etc just because they use orders of magnitude more resources. I long for apps with instantaneous startup times, no loading screens, and small footprints


I've often thought it would be useful to encourage developers to make at least one project for an old computer or games console before they make anything more modern, just so they learn about the importance of constraints and whatnot. Maybe after that they might stop trying to throw resources at problems to make them go away and start thinking about how their code actually works.

I also believe game designers/developers should work on at least one retro game or mod for similar reasons. It teaches them how to actually design levels without thinking about the artistic side, and (thanks to some brutal moderation requirements on the sites hosting this stuff) forces them to get better at game design rather than just walk through Steam's/the app store's non-processes.


This came up in the RachelByTheBay thread the other day[1] about how she got started... My computer progression started with a Vic-20 (3kB of RAM), followed by an XT (640kB), followed by a Pentium 133 (16MB of RAM). While the Vic and the XT really taught me a lot about how to write lean & mean code, the Pentium really acts like my baseline for what a resource-constrained GUI system can actually accomplish. I could surf the web and talk in multiple IRC rooms quite comfortably (woo mIRC!), probably with ICQ running as well. The fact that Slack is currently idling with an RSS of 1.3GB is bloody offensive in that lens. 81x the memory footprint of the entire machine I used to do all my web browsing on.

I really like your idea of encouraging people to try getting code running on older machines. I really understand that someone who didn't cut their teeth on a 3kB RAM machine won't have the same appreciation for constrained memory computing, and I think your suggestion is a great way to introduce them to it.

Thanks for the thought!

[1]http://rachelbythebay.com/w/2018/04/04/learn/


On a related note, video games today seem to take forever to load. Plus you have to go through several menus to customize your experience before playing. For the NES Mario, you simply power on, start.


We haven't forgotten, we choose to put our time and energy into features rather than performance because that's what end users care more about (at least, it's what they vote with their feet for).


Some things have been lost, though, and I'm not even referring to the usual claim about constraints facilitating creativity. Games of the 8- and 16-bit console generations, as well as all arcade games up until that point, were written with CRT monitors in mind and often used 'racing the beam' techniques. This resulted in games with extremely low latency, something you never find in a game of any sort today.

The difference can be felt when comparing a game like Super Mario Bros running on an emulator with a typical LCD screen to running on original hardware with a CRT monitor. The emulated version is so sluggish it feels like you're playing underwater!


> Games of the 8- and 16-bit console generations, as well as all arcade games up until that point, were written with CRT monitors in mind and often used 'racing the beam' techniques. This resulted in games with extremely low latency, something you never find in a game of any sort today.

The techniques haven't been lost - if anything there are probably more people who know how to do those things today than there were in the past.

Typical LCD monitors are undeniably worse in many ways than the CRTs they replaced, but again, most customers turned out to prefer the convenience of the LCD on the whole. Nowadays with the rise of 144Hz monitors those rare users who care a lot about gaming latency do have an option.


This article [1] has been posted on HN before, but not everybody has read it, so I thought I'd mention it here. If you scroll down there's a paragraph where Dan talks about 144 Hz gaming monitors. Shocker of shockers (not surprising to me, actually): the Apple ][ has far lower latency than a modern PC, even with one of these monitors. When actually measured with a high speed camera, a CRT refreshing at 60 Hz still has lower average latency than a 144 Hz gaming LCD.

I would expect the NES to perform similarly to the Apple, since it has the same CPU (6502) and also uses a CRT display.

[1] https://danluu.com/input-lag/


"tragedy of the commons"

I as a freelance developer sometimes make releases whose sole purpose is to refactor/slim down/increase performance with no new features. Yeah, some of my users are disappointed I didn't prioritize X feature they wanted, but it helps mitigate tragedy of the commons a little.

Individually the users are well-meaning with their feature wants/needs, but collectively all the different feature requests can turn your app into a bloated monstrosity if you aren't careful.


Right, I remember advanced 3D software fitting on a floppy. But now a simple Android app can be over 50 MB.

No you don't need a video on your login screen.


Define "bloated code".


It's a dying art because now even the cheapest SoCs are capable of running full Linux with a big stack like Node. The Pi Zero is $5 and capable of all these things.

Perhaps the last refuge is battery-powered microcontrollers.


Not necessarily, when you need to have something that will last 5 years on battery power and not cost more than $1 per piece, a Pi Zero won't do.

But even then, if they have up to 512 KB, it's like having the same capacity as the Amiga, Atari, or PC (MS-DOS) on a needle-sized CPU.

For younger generations, you can check on YouTube what we managed to do with 512 KB on those systems, even when using higher level languages.


Depending on the market you can save millions in production picking a cheaper, less powerful mcu. Sure it takes some extra work to squeeze in more features in less flash and every cycle counts, but on runs of millions of devices, this is easily paid back by the extra profits.


I literally just wrote my first ROM, one that displays a green box on a black background, working on my RetroPie machine last night.
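
(A first ROM like that mostly boils down to palette writes. A rough 6502 sketch, assuming rendering is off or you're in vblank; $0F and $2A are black and a green in the NES palette:)

    PPUADDR = $2006
    PPUDATA = $2007

            lda #$3F         ; palette RAM lives at PPU address $3F00
            sta PPUADDR
            lda #$00
            sta PPUADDR
            lda #$0F         ; universal background color: black
            sta PPUDATA
            lda #$2A         ; first background palette entry: green
            sta PPUDATA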

The NES homebrew scene is pretty cool, and the tools and languages have come a long way. One interesting avenue that might be worth exploring is to use code generation and CSPs to generate and shrink code.

It's a lot of fun. It's why I like the PICO8. Constraints are interesting.

I just wish I still had my old BASIC code for games I made on my Amiga back in the day...


> Constraints are interesting.

Constraints are also helpful when working towards a sense of mastery.

Look at a modern operating system (or console) for that matter. So much capability, so much complexity. You'll never learn it all, never be able to master it.

Heck, just truly understanding the machine code of a modern processor, and knowing everything it does and can do is quite a daunting task.

These older 8-bit systems (NES, C64, etc.) in contrast are much easier to completely understand. If you are willing to live within the limitations, you can create something really awesome, and be able to leverage the full extent of the system's capabilities.

That provides enormous satisfaction to some.


Just understanding the Intel instruction set is basically impossible for a mere mortal. How thick is that reference manual now?


Slightly related, but for GameBoy Homebrew, I've been working on a Javascript implementation of GBDK, a C library for GameBoy dev, and am using Emscripten to build a React debug tool allowing you to check out the state of the different graphics buffers and registers and print to the browser console. Bit early days but it's really helping speed up a game I'm working on.

https://www.gbdkjs.com/

Got an example project at

https://www.gbdkjs.com/examples/shooter/web/


It's also a matter of seeing what you can achieve on aging hardware. Can you do better than a team of professionals could in the 80s? Could you make something that pushes the console to its limits? Those types of questions are very inviting for both homebrew and ROM hack developers alike. Or perhaps before them, for those in the demoscene.


Love the NES and love retro gaming, but the article completely contradicts itself. It's not just driven by nostalgia, but...

It says something about the legacy of the NES that most homebrewers continue to make games for the system for reasons other than nostalgia. It’s mostly because they love the NES.

“I’ve always wanted to create NES games, ever since I was a kid. I remember sitting around drawing maps on paper, daydreaming of what my designs would be like to play,” explained Nathan Tolbert, a relatively new figure to the NES homebrew scene who’s already built a couple of titles for the annual NESDEV competition.

So...nostalgia.

One creator, Antione Fantys, wants to bring back some of the elements from retro games that don’t seem to exist in the modern generation. Because the NES is the most iconic of these systems, he enjoys programming with it, citing the limitations of the system as a means to creating the most accurate representation of a retro game as possible.

More nostalgia.

The article pretty much proves these homebrews are the product of nostalgia. Not that there is anything wrong with that.


I used to do a lot of dev against the NES. 6502 for the NES was my first language, wrote my own assembler, built games, cartridges, etc. etc. Hell, I even have a PCB, minus the PRG and CHR ROMs, from an old Ice Hockey cartridge on my desk in front of me as I write this.

Never had an NES growing up, never played one until I got into NES dev. I was turned onto it by a classmate. We're both too young to have had the NES growing up, so it simply couldn't be a nostalgia play. There's a lot of folks who get into it because it's surprisingly approachable, and it scratches the itch of "make a game." TFA just had a bad quote...


Nostalgia doesn't have to be purely about one's childhood interests, although it frequently is. In your case, your interest still seems a product of nostalgia.

I don't really understand why people view interests based on nostalgia as a weakness. Being nostalgic doesn't have to mean that life isn't good or that things were better once upon a time.

Like what you like, do what you do, and enjoy what you enjoy without apology!


If we want to get pedantic: "Nostalgia: a wistful or excessively sentimental yearning for return to or of some past period or irrecoverable condition", where there seems to be an emphasis on the 'wistful', 'excessive' and 'irrecoverable' parts of the definition, which in turn can carry a negative connotation.

However, I agree with you, trying to capture past enjoyments and in cases like this, continue to make new memories and fun times, go for it I say!


> Nostalgia: a sentimental longing or wistful affection for the past, typically for a period or place with happy personal associations.

Nothing about what I did, nor wrote was driven by a longing for the days of yore, mine or otherwise. I enjoyed learning to write software for the 6502, it was a simple instruction set, fairly clear, concise, and simple.

I haven't written 6502 in a number of years, and gave my NES collection to a friend last year. That I have an old PCB on my desk, yes, is nostalgia.


Sherry Turkle [1], a psychologist who wrote about humans and their interactions with computers, wrote about people who use computers as a way to satisfy a need for total control. This manifests itself in building/making or using a low-level means to accomplish something that could have been done with a higher-level or easier means. These users prefer the lower-level means over the easier way because the latter would remove some control or transparency of operation from the person.

So, nostalgia is a big part of the NES appeal, but there may be other factors as well.

[1] https://en.wikipedia.org/wiki/Sherry_Turkle


"The NES Homebrew Scene Is Driven by Nostalgia"

Hey, why isn't anyone reading our article?


"You Won't Believe These NES Homebrew Games!"

"Crazy! The NES Homebrew Scene Thrives Like You Wouldn't Believe!"

"When You Read About These NEW Homebrew Games, You'll Never Game the Same Again!"

"I Played NEW Homebrew Games - What Happened Next Blew My Mind!"

Now they'll read it.


https://twitter.com/clearvus streams at 9pm EST on Mondays and Thursdays on Twitch; he has been working through making an NES game from scratch. He is off this week, but people who are interested in seeing this type of work should definitely check out his series; he deserves way more viewers! I've learned a lot from it.

https://www.twitch.tv/clearvus

recordings are posted on youtube: https://www.youtube.com/watch?v=XwGj1ciSAtw


Man, I envy the homebrew scenes in the US so much. In Japan there are a few people who're doing homebrew, but because of the constant intimidation by Nintendo and marginalization by society ("they're technically criminals"), they're not as open or organized as in the US. I think we're killing our own talents.


I haven't read the article yet but, judging by the title, I think this is relevant:

https://www.kickstarter.com/projects/1316851183/nesmaker-mak...


The existence of phenomena like this and the demoscene made me realize that there are levels of programming skill that I will never be able to reach. Kind of disheartening, but still neat to watch other people do it.


I felt like that for a long time, and eventually it prompted me to build an Apple II emulator, and implement the compiler from the Coursera Compilers course.

Simple versions of these things are not as insurmountable as you think: just put your foot on the first step :-)


Very nice! One thing I always loved about Nintendo games is that you can usually just pop them in and start playing. No dragged out tutorials, setups, etc.


The article namedrops Shovel Knight. I highly recommend the game. It's one of the best games I've ever played, even putting nostalgia aside.


I've had a lot of fun trying homebrew games on my modded NES Classic. If you're a fan of the original Metroid, you should definitely try Rogue Dawn. It plays like a true sequel to NES metroid and it adds features from later Metroid games.


Are there any uniquely impressive homebrew NES games worth playing?


Can anyone recommend any particularly enjoyable homebrew games? I can see a few already listed in comments, but it would be nice to gather up a short list.


Blade Buster, a fast-paced modern time-attack shmup; very pretty. Don't forget to hit select to increase speed.

My top score in 2min is 500,000+.


That's great (and difficult)! My initial best is 100,000.

Having recently discovered OpenEmu, I'm trying to track down good homebrew/freeware games.


I got to the "and these are the reasons why" section before I really had to give up. The reason you remember the NES over every other video game system of the era is that Nintendo were monopolizing bastards at the time, and if you (as a developer) made games for the NES to be released in the USA, you were essentially forfeiting the right to release those games for other systems in the USA.

Honestly, I'm glad Nintendo got knocked down a peg on their last few systems. I'm just not happy about what they taught their competitors.


The best games age well. People still play pac man. People still play Mario. That said, the aesthetic is absolutely nostalgic.


The aesthetic serves a purpose though. Hi-fi 3D game assets take orders of magnitude more work to create than low-fi 2D ones. I agree that nostalgia and familiarity are what allows gamers to accept NES-style graphics as a valid form of low-fi 2D art. The creator of Stardew Valley mentioned he wouldn't have been able to finish the game had he had to create 3D assets that looked as good to people as the current 2D art.

Having people accept NES-style pixel art is a boon to indie devs. Games like Super Meat Boy have higher-res 2D graphics but it's still a similar aesthetic. Low-fi poly is also a retro art style. However, I've noticed that low-fi poly games are still rendered in high resolution and use more advanced lighting techniques, so they still require more work than low-fi 2D.


My favorite genre of games are still platformers. 20xx (A Megaman X roguelike) is my favorite game I've played this year.


Fascinating that these scenes are still thriving, just recently saw "8-Bit Guy"'s postmortem video on writing and shipping a game for C64.


Of course it isn't just driven by nostalgia. It also requires a constant supply of coffee, for one thing.


Ah nostalgia. Everyone here should look up the original meaning of nostalgia though to really understand this. It's so apt yet so crippling.



