
The 2600 is totally different from the 8-bit computers, just so folks know. Programming an Atari 800 computer was a walk in the park compared to the living hell on earth that was the 2600.

[I wrote a few Atari 800 game cartridges. I watched one of my cow-orkers mentally deteriorate as he attempted to write a simple 2600 game over maybe four months -- not kidding. He gave up and was a happier person for it].




At APh (the consulting firm that designed the Mattel Intellivision, wrote its system software, and most of the early games for it) the biggest insult you could give to a programmer was to tell them their Intellivision game looked like an Atari game.

This was NOT a slam against the Atari programmers. It was a reference to the graphics hardware limitations of the 2600, which made it ridiculously difficult to do things that were easy on Intellivision.

Not that Intellivision was a walk in the park...the mode most games used for the colors of things that were not sprites was demented, but at least the hardware sprites and their collision detection, along with the system software that automatically handled animating a sprite and moving it at a fixed velocity, saved us from many of the nightmares that 2600 programmers had to endure.

The 2600 had sprites, but they had limitations. I don't recall with absolute certainty what they were. (When Mattel decided to do Atari games and some of that work was contracted to APh, I was not involved -- I was working on games for what would have been the next-generation Intellivision if the industry had not collapsed and laid me off -- so my 2600 details are mostly what I remember Hal Finney and the others involved in the Atari games telling me.) My recollection is that there were limits on how many sprites could occupy overlapping vertical positions, and that getting their horizontal position right involved software timing, so you had to do things like make sure different code paths took the same number of clock cycles. That is not fun.
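To make that concrete: balancing code paths means padding the shorter side of a branch so the register write lands on the same CPU cycle either way. A rough illustrative sketch (plain 6502; "Flag" is a made-up zero-page variable, GRP0 is the TIA's player 0 graphics register from the standard vcs.h names, and this is not taken from any of the games mentioned above):

            lda Flag        ; 3 cycles (zero page)
            beq Off         ; 2 cycles if not taken, 3 if taken
            lda #$FF        ; 2 cycles
            bne Store       ; 3 cycles, branch always taken
    Off     lda #$00        ; 2 cycles
            nop             ; 2 cycles of padding so both paths total 7 cycles
    Store   sta GRP0        ; the TIA write hits the same cycle either way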


> The 2600 had sprites, but they had limitations.

Calling them "sprites" is perhaps a bit of an exaggeration. They were colored boxes. :) Turning them into any sort of recognizable graphics required developers to "race the beam," updating registers constantly to render more objects per line than the hardware directly supported.


> The 2600 had sprites, but they had limitations.

Saying that the 2600 had sprites is like saying "Roman galleys had engines..." -- definitely no fun at all :-)


I'm finishing up this book - https://en.wikipedia.org/wiki/Racing_the_Beam - which goes into a lot more detail - really interesting reading :)


If you enjoyed that, you might also enjoy It's Behind You (http://bizzley.imbahost.com/). It talks about porting R-Type (side-scrolling space shooter) to the ZX Spectrum... and similar fun around squeezing down high-end (for the time) arcade games to run on hardware that shouldn't really support it - a great read!


I survived making an Atari 2600 game! https://github.com/boomlinde/jupitersumo

The tricky part is writing the graphics kernel (which is responsible for updating the one-dimensional screen registers as the beam advances vertically). It needs to be quick, and it's probably not healthy for the programmer to try to incorporate any other game logic into this part.
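The core of a kernel is a loop that waits for each scanline and then races to update the display registers before the beam reaches them. A stripped-down sketch of the shape of it (not the actual code from the repo; register names are the usual vcs.h ones, the table name is made up):

            ldx #192            ; 192 visible scanlines on NTSC
    Line    sta WSYNC           ; halt the CPU until the next scanline starts
            lda PFData,x        ; fetch this line's playfield byte (hypothetical table)
            sta PF1             ; write it before the beam reaches the playfield
            dex
            bne Line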

After the screen has been drawn you have a few lines free for arbitrary code on every frame. You still have to keep track of how many lines have passed to maintain the correct screen frequency and initiate the vertical sync at the right moment, though. There are no interrupts to deal with this, but there is a timer that you can set up and then poll once your game code is finished, so you don't have to think about lines while doing the game logic.
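In practice that looks roughly like this (again the usual vcs.h names; 35 is a common choice for about 30 lines of overscan, since a scanline is 76 CPU cycles and TIM64T ticks every 64 cycles -- treat the numbers as a sketch, not gospel):

            lda #35
            sta TIM64T          ; start the 64-cycle interval timer
            ; ... collisions, scoring, input handling, etc. ...
    Wait    lda INTIM           ; poll the timer instead of counting lines
            bne Wait
            sta WSYNC           ; land on a clean line boundary
            ; then strobe VSYNC for three lines and begin the next frame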

Then there's the sprites... They can be nudged sideways by up to 8 pixels at a time, but can only be placed at an absolute horizontal location by waiting for the beam to get there and writing to a register that resets an internal sprite counter. There is no vertical positioning; you just enable/disable the drawing of the sprite, or clear its pattern, at the right line.
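The usual approach to the horizontal part is a divide-by-15 delay loop: each pass burns 5 CPU cycles, which is 15 color clocks, so the loop count picks a coarse position, the RESP0 strobe latches it, and HMP0/HMOVE nudge the remaining -8..+7 pixels. A rough sketch with the standard vcs.h names ("XPos" is a made-up variable, and the fine-offset math is elided):

            lda XPos            ; desired horizontal position
            sta WSYNC           ; sync to the left edge of a fresh scanline
            sec
    Coarse  sbc #15             ; each pass is 5 CPU cycles = 15 color clocks
            bcs Coarse
            sta RESP0           ; reset player 0 wherever the beam happens to be
            ; turn the remainder into a fine offset, store it in HMP0, then:
            sta WSYNC
            sta HMOVE           ; nudge by the remaining -8..+7 pixels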

In my game, I saved a lot of time in the kernel by never explicitly clearing the sprites to produce 2D objects. Instead, as the screen draws lines, the kernel decrements an 8-bit counter which is used to index a table of 256 sprite lines. "Moving" the sprite vertically, then, is just a matter of changing the initial value of the counter.
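Concretely, the idea boils down to something like this (a simplified sketch rather than the actual code in the repo; "SpriteTab" and "SpriteY" are made-up names):

            ; SpriteTab is a page-aligned 256-byte table, mostly zero, with
            ; the sprite's bitmap stored somewhere in the middle
            ldy SpriteY         ; the 8-bit counter; its start value is the vertical position
            ldx #192            ; visible scanlines
    Line    sta WSYNC
            lda SpriteTab,y     ; zero on most lines, bitmap rows where the sprite sits
            sta GRP0            ; so the sprite never needs an explicit clear
            dey                 ; the counter walks through all 256 entries
            dex
            bne Line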


Interesting. How do you build the code? A quick Google for "acme atari assembler" pointed me to: https://sourceforge.net/projects/acme-crossass/

Is this correct?


There are tons of 6502 assemblers out there. It's a pretty easy processor to work with. I wrote maybe half a dozen assemblers back in the day (each one better than the last). On a lark, I wrote one in Python a couple of years ago; it's on github (https://github.com/landondyer/kasm).


Yes, that's it.


While the experience of programming them is obviously different, the 400/800 hardware seems to largely be a modestly expanded/fixed version of the 2600 hardware with a coprocessor stuck in front of it to offload all the "racing the beam" crap. Of course, I could be misinterpreting something.


No, they really are totally different machines. Look at the schematics. About the only real similarity is that both used a 65xx family microprocessor.

Besides the very extensive hardware differences, to the best of my recollection there was no overlap at all in software/firmware.


What I meant was the graphics hardware rather than the systems in general, which I could have been clearer about. Looking at the registers for the 2600's TIA and 400/800/etc. CTIA/GTIA, they seem far too similar (while simultaneously being considerably different from other graphics hardware of the time) to write off as coincidence.


You have to look beyond the registers, to the whole system; those registers are largely useless without the system RAM to make them work.


Jay Miner did both, then went on to the Amiga. I would imagine the similarity is his learning and experience showing.


> modestly expanded/fixed

Really totally different. You're going from a machine that has 128 bytes of RAM to one with two orders of magnitude more (in the base configuration). You can use bitmap graphics. There are interrupts and they work. There is a ton of graphics hardware with a rich set of display modes, including characters and bitmaps. You can attach disks, modems and other devices (okay, the serial bus is slow, but there's an I/O system that makes it easy to use) and do development on the actual hardware. BASIC is built-in, but there are a bunch of other languages available if you don't like it. The sound is . . . better.

The platforms are orders of magnitude different in practically every dimension (except physical size :-) )


As I understand it, programming the 2600 is mainly occupied with the task of squirting the right pixels onto the screen at the right times, which involves very tight timing constraints.

Although that hasn't stopped the demoscene from doing some pretty amazing things with the hardware:

https://www.youtube.com/watch?v=hrhJ9wDNWm4

https://www.youtube.com/watch?v=GZSlzdJ3yR8


Admittedly, I'm not very well versed in computing from the late '70s and early '80s. My own personal experience of 8-bit machines is with recent PIC and Atmel micros.

I just figured that a completely self-contained emulator and games would be handy.



