Let Chris Crawford teach you to program Atari's 8-bit computers [video] (gamasutra.com)
151 points by msie on April 26, 2016 | 39 comments



Here is an example of a contemporary demo on the Atari 8-bits:

https://www.youtube.com/watch?v=eY_YsoR11d8

You might notice that some scenes show more colors onscreen than contemporaries like the C64 - this is all because of the display list interrupt techniques Crawford talks about, combined with clever overlaying of the player/missile sprite system. (The odd name for these hardware sprites comes from the legacy of the 2600's hardware, which itself was built around the needs of the pack-in games: two players, two missiles, one ball.)
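
For the curious, a display list interrupt is just an NMI that ANTIC raises on a display-list line that has its interrupt bit set. A minimal sketch of the idea in 6502 assembly (addresses are the standard hardware/OS locations; the colour value is purely illustrative):

    ; split the playfield colour mid-screen with a DLI
    ; $0200/$0201 = VDSLST (DLI vector), $D40E = NMIEN,
    ; $D40A = WSYNC, $D018 = COLPF2 (hardware colour register)
    setup:  lda #<dli
            sta $0200       ; point the DLI vector at our handler
            lda #>dli
            sta $0201
            lda #$C0
            sta $D40E       ; NMIEN: enable VBI + DLI
            rts

    dli:    pha             ; the OS saves no registers for you
            lda #$34        ; new colour for the lower part of the screen
            sta $D40A       ; WSYNC: stall to the end of the scan line...
            sta $D018       ; ...so the colour change lands cleanly
            pla
            rti

To make it fire, you set bit 7 ($80) on the mode byte of the display-list line where you want the split, e.g. $82 instead of $02 for a text line.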

The A8 series had a limited lifespan in the USA for a number of reasons: poor documentation, platform fragmentation, and price competition. It shipped with very little documentation on how it worked, leaving developers starved for resources for several years (Chris Crawford himself contributed to the unofficial bible, "De Re Atari"). The platform fragmented between the original 8k-upgradeable-to-48k 400/800 models and the newer XL/XE models, which shipped with 64k standard, plus a change partway through the 400 and 800's life to an upgraded graphics chip. Although most old software would run on the new machines, the C64 was more straightforward to target and to program, and it led the way in price-cutting the home computer market, which gave it a huge lead in market share. The later life of the A8 played out entirely in European markets.

In terms of what the two machines can do when pushed to the limit, though, there's plenty of room for comparison.


For an amazing example of what some Atari 8-bit programmers were doing at the time, check out "Alternate Reality: The City", a Datasoft game by Philip Price released in 1985. He used some really novel techniques to create a game that was visually and musically impressive for the time and the architecture. Here's how he described some of them in an interview on AtariAge.com a couple of years ago:

> 1. Knowledge of the names of registers was from magazine articles. I was self-taught for the most part. On the tricks with the hardware, that was just me thinking up what cool things I could do with the functionality of the hardware. Cycle counting to determine the position of the television electron beam, using a sliding window and emitting to the front and back of the current visual screen, using interrupts to change the color registers on each scan line (combined with cycle counting and moving the player missiles to be at two places at once on the scan line), the 3D stuff, security, encryption, texture mapping, music driver, all self-taught (no books).
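
The "two places at once" trick he mentions comes down to rewriting a player's horizontal-position register after the beam has already drawn the first copy on the current line. A rough sketch of the idea (positions are illustrative, and the NOPs stand in for a real cycle-counted delay, which has to account for ANTIC's DMA stealing):

    ; inside a DLI: show player 0 twice on one scan line
    ; $D000 = HPOSP0 (player 0 horizontal position), $D40A = WSYNC
    dli2:   pha
            lda #$40
            sta $D000       ; left-hand copy
            sta $D40A       ; WSYNC: resume at the start of the line
            nop             ; burn cycles until the beam has passed
            nop             ;   the left copy (placeholder delay,
            nop             ;   not an exact count)
            lda #$B0
            sta $D000       ; the same player reappears further right
            pla
            rti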


> You might notice that some scenes show more colors onscreen than contemporaries like the C64 - this is all because of the display list interrupt techniques Crawford talks about, combined with clever overlaying of the player/missile sprite system.

Similar techniques are heavily used on the C64 too. See e.g.

http://studiostyle.sk/dmagic/gallery/gfxmodes.htm

E.g. here's a bunch of IFLI images:

http://c64pixels.com/main.php?g2_view=keyalbum.KeywordAlbum&...

And here's a bunch using sprite layers:

http://c64pixels.com/main.php?g2_view=keyalbum.KeywordAlbum&...

As with many of these platforms, it's almost impossible to extract graphics from "newer" games or demos accurately without actually running them in a cycle-accurate emulator, and in many cases you have to capture at least two frames and interpolate...


One of the most impressive things I saw for the 8-bit Atari is Chris Hutt's 2011 conversion of Space Harrier. It's probably the best home conversion of the game this side of the Sharp X68000.

https://www.youtube.com/watch?v=y1Oi3zgpGu8

Also, the graphics chips in the Atari 2600 and 800 were designed by Jay Miner, famous as the "father of the Amiga". I'm no Atari expert, but I wonder if there is an evolutionary link between the display list techniques of the Atari 800 and the Amiga's Copper Lists? Long ago I read an interview with Archer Maclean, where he described the '800 as a "little Amiga". I wonder if anyone could expand on that?


I would love it if someone could do something like this for modern GPUs, complete with pipeline stalls as a costumed CPU waits to hear back from a costumed GPU. It seems like there's an unreasonable amount of mystery around how GPUs work, at least in popular understanding, compared to our knowledge of arcane corners of obsolete computers.


Isn't there a lot of mystery because the GPU companies keep things very secret? It's been my understanding that GPUs are very much black boxes.


Most popular desktop GPUs have been pretty well reverse engineered (or implemented from documentation) in Mesa, so you really can go and look at the nitty gritty details, and write assembly language if you so like.

The VC4 is also a fun target, because in addition to being supported in Mesa, it is documented very well [1] and ships in a very popular product (the Raspberry Pi).

[1] http://www.broadcom.com/docs/support/videocore/VideoCoreIV-A...


Here's a dump of public knowledge about various GPUs:

http://renderingpipeline.com/graphics-literature/low-level-g...


This series of articles doesn't have any costumes, but it's still a good read: https://fgiesen.wordpress.com/2011/07/09/a-trip-through-the-...


Orchestrated to music like the Sorcerer's Apprentice [1]!

[1] No link necessary.


If anyone wants to try out programming the A8, here is a little info:

Best windows emulator: Altirra

Best cross platform emulators: Atari800 and Atari++

Or if you have an FPGA board (MIST,MCC216 etc) then you can try my core available at http://www.scrameta.net

Or you could buy a real 800XL!


I'd recommend Altirra even if you're on Linux, it works well through WINE.


I remember the good old days when I used a ZX Spectrum (actually it was a 100%-compatible Soviet clone).

I dreamed of creating a game. That's why I actually started programming.

I wanted to create animation.

When I started with BASIC, I noticed that the BASIC interpreter was way too slow to update video memory (even for a fraction of the screen).

Then I discovered the world of compiled languages with Pascal. It was much more fun than BASIC but still too slow to update video memory.

Then I moved to assembly language (actually, I only had a list of opcodes and had to put them into memory directly). It was much more fun than Pascal and the speed was great, but still not enough for animation if you wrote a straightforward implementation.

Then I discovered that if you put performance-critical code into lower-memory addresses it runs much faster.

Unfortunately, I didn't finish my research because I had no access to the internet or books.


> Then I discovered that if you put performance-critical code into lower-memory addresses it runs much faster.

Do you mean higher memory addresses?

Code in the lower 16KB of RAM (addresses 16-32KB) has contended memory access with the ULA, which causes the CPU to pause while the TV raster line is drawn.

Code in the upper 32K does not have this issue.


Chris is awesome. He created the Game Developers Conference and an early journal on game development. He encouraged collaboration and sharing in an industry that was very secretive.


Well, let's see what I can do to help this effort...

Here is a link to an Atari 2600 emulator with every game. There's also a loader to load in any game or ROM, which would allow near-instantaneous game and application loading via this tool:

http://ipfs.io/ipfs/QmacAqRVhJX9eS7YJX1vY3ifFKF9CduDqPEgaCUS...

It's also served via something called IPFS, the InterPlanetary File System, which got a bit of press earlier on with the left-pad situation and package managers, regarding npmjs. Effectively, it's a worldwide Git repo backed by a self-certifying file system, BitTorrent-style transmission between nodes, and a Kademlia DHT for host discovery.

(Edit: Really? I offer something that is relevant, at least orthogonally. And yet, -1's. Can't the ones that downvote provide a reason, even if from a throwaway account? And it's not even a controversial opinion...)


The 2600 is totally different from the 8-bit computers, just so folks know. Programming an Atari 800 computer was a walk in the park compared to the living hell on earth that was the 2600.

[I wrote a few Atari 800 game cartridges. I watched one of my cow-orkers mentally deteriorate as he attempted to write a simple 2600 game over maybe four months -- not kidding. He gave up and was a happier person for it].


At APh (the consulting firm that designed the Mattel Intellivision and wrote its system software and most of the early games for it), the biggest insult you could give to a programmer was to tell them their Intellivision game looked like an Atari game.

This was NOT a slam against the Atari programmers. It was a reference to the graphics hardware limitations of the 2600, which made it ridiculously difficult to do things that were easy on Intellivision.

Not that Intellivision was a walk in the park... the mode most games used for the colors of things that were not sprites was demented, but at least the hardware sprites and their collision detection, along with the system software that automatically handled animating a sprite and moving it at a fixed velocity, saved us from many nightmares that 2600 programmers had to endure.

The 2600 had sprites, but they had limitations. I don't recall with absolute certainty what they were. (When Mattel decided to do Atari games and some of that work was contracted to APh, I was not involved -- I was working on games for what would have been the next-generation Intellivision if the industry had not collapsed and laid me off -- so my 2600 details are mostly what I remember Hal Finney and the others involved in the Atari games telling me.)

My recollection is that there were limits on how many sprites could occupy overlapping vertical positions, and that getting their horizontal position right involved software timing, so you had to do things like make sure different code paths took the same number of clock cycles. That is not fun.
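
For anyone wondering what "the same number of clock cycles" looks like in practice, here's a toy 6502 sketch (flag and tmp are made-up zero-page locations; COLUP0 is a real TIA colour register; the branch is assumed not to cross a page boundary):

    ; both routes through this branch cost 10 cycles, so the
    ; code stays in lockstep with the beam either way
            lda flag        ; 3 cycles (zero-page read)
            beq off         ; 2 cycles if not taken, 3 if taken
            lda #$0E        ; 2   "on" path:
            sta COLUP0      ; 3     2 + 2 + 3 + 3 = 10 cycles
            jmp done        ; 3
    off:    nop             ; 2   "off" path:
            nop             ; 2     3 + 2 + 2 + 3 = 10 cycles
            bit tmp         ; 3   (dummy zero-page read as a 3-cycle pad)
    done:                   ; the beam is at the same spot either way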


> The 2600 had sprites, but they had limitations.

Calling them "sprites" is perhaps a bit of an exaggeration. They were colored boxes. :) Turning them into any sort of recognizable graphics required developers to "race the beam", updating registers constantly to render more objects per line than the hardware directly supported.


> The 2600 had sprites, but they had limitations.

Saying that the 2600 had sprites is like saying "Roman galleys had engines..." -- definitely no fun at all :-)


I'm finishing up this book - https://en.wikipedia.org/wiki/Racing_the_Beam - which goes into a lot more detail - really interesting reading :)


If you enjoyed that, you might also enjoy It's Behind You (http://bizzley.imbahost.com/). It talks about porting R-Type (side-scrolling space shooter) to the ZX Spectrum... and similar fun around squeezing down high-end (for the time) arcade games to run on hardware that shouldn't really support it - a great read!


I survived making an Atari 2600 game! https://github.com/boomlinde/jupitersumo

The tricky part is writing the graphics kernel (which is responsible for updating the one-dimensional screen registers as the beam advances vertically). It needs to be quick, and it's probably not healthy for the programmer to try to incorporate any other game logic into this part.

After the screen has been drawn you have a few lines free for arbitrary code on every frame. You still have to keep track of how many lines have passed to maintain the correct screen frequency and initiate the vertical sync at the right moment, though. There are no interrupts to deal with this, but there is a timer that you can set up and then poll once your game code is finished, so you don't have to think about lines while doing the game logic.
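
Concretely, that's the RIOT chip's timer. A sketch of the usual pattern (the count of 35 is approximate, sized so the timer spans the overscan region; GameLogic is just a stand-in name):

    ; TIM64T = $0296 (start timer, 64-cycle ticks), INTIM = $0284
            lda #35
            sta TIM64T      ; ~35 * 64 = 2240 cycles, about 30 lines
            jsr GameLogic   ; game code whose length varies per frame
    wait:   lda INTIM
            bne wait        ; spin until the timer runs out...
            ; ...then begin vertical sync for the next frame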

Then there are the sprites... They can be nudged side to side in increments of up to 8 pixels, but can only be placed at an absolute horizontal location by waiting for the beam to get there and writing to a register that resets an internal sprite counter. There is no vertical positioning; you just enable/disable the drawing of the sprite, or clear its pattern, at the right line.
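
The standard way to do that absolute positioning is a divide-by-15 spin loop followed by a fine adjustment. Roughly the well-known routine (a sketch, not my exact code; enter with the desired X in A; RESP0, HMP0, HMOVE and WSYNC are the real TIA registers):

    PosP0:  sta WSYNC       ; sync to the start of a line
            sec
    div15:  sbc #15         ; each loop pass is 5 CPU cycles =
            bcs div15       ;   15 colour clocks of beam travel
            eor #7          ; turn the remainder into a fine offset
            asl
            asl
            asl
            asl
            sta RESP0       ; strobe: coarse position = beam position
            sta HMP0        ; fine motion, -8..+7 pixels
            sta WSYNC
            sta HMOVE       ; apply the fine motion on the next line
            rts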

In my game, I could save a lot of time in the kernel by never explicitly clearing the sprites to produce 2D objects. Instead, as the screen draws lines, it decrements an 8-bit counter which is used to index a table of 256 sprite lines. "Moving" the sprite vertically, then, is just a matter of changing the initial value of the counter.
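
In sketch form, that looks something like this (a minimal reconstruction of the idea, not my actual kernel; GRP0 is player 0's graphics register, and SpriteTab is a page-aligned 256-byte table that is zero everywhere except the sprite's rows):

    ; vertical position = the counter's starting value, nothing else
            ldy sprite_y    ; per-frame starting value of the counter
    line:   sta WSYNC       ; hold until the next scan line begins
            dey             ; counter runs down once per line (mod 256)
            lda SpriteTab,y ; blank almost everywhere, so no explicit
            sta GRP0        ;   clearing of the sprite is ever needed
            ; ...rest of the per-line kernel work...
            cpy #0          ; illustrative exit test for the visible area
            bne line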


Interesting. How do you build the code? A quick Google for "acme atari assembler" pointed me to: https://sourceforge.net/projects/acme-crossass/

Is this correct?


There are tons of 6502 assemblers out there. It's a pretty easy processor to work with. I wrote maybe half a dozen assemblers back in the day (each one better than the last). On a lark, I wrote one in Python a couple of years ago; it's on github (https://github.com/landondyer/kasm).


Yes, that's it.


While the experience of programming them is obviously different, the 400/800 hardware seems to largely be a modestly expanded/fixed version of the 2600 hardware with a coprocessor stuck in front of it to offload all the "racing the beam" crap. Of course, I could be misinterpreting something.


No, they really are totally different machines. Look at the schematics. About the only real similarity is that both used a 65xx family microprocessor.

Besides the very extensive hardware differences, to the best of my recollection there was no overlap at all in software/firmware.


What I meant was the graphics hardware rather than the systems in general, which I could have been clearer about. Looking at the registers for the 2600's TIA and 400/800/etc. CTIA/GTIA, they seem far too similar (while simultaneously being considerably different from other graphics hardware of the time) to write off as coincidence.


You have to look beyond the registers, to the whole system; those registers are largely useless without the system RAM to make them work.


Jay Miner did both, then went on to the Amiga. I would imagine the similarity is his learning and experience showing.


> modestly expanded/fixed

Really totally different. You're going from a machine that has 128 bytes of RAM to one with two orders of magnitude more (in the base configuration). You can use bitmap graphics. There are interrupts and they work. There is a ton of graphics hardware with a rich set of display modes, including characters and bitmaps. You can attach disks, modems and other devices (okay, the serial bus is slow, but there's an I/O system that makes it easy to use) and do development on the actual hardware. BASIC is built-in, but there are a bunch of other languages available if you don't like it. The sound is . . . better.

The platforms are orders of magnitude different in practically every dimension (except physical size :-) )


As I understand it, programming the 2600 is mainly occupied with the task of squirting the right pixels onto the screen at the right times, which involves very tight timing constraints.

Although that hasn't stopped the demoscene from doing some pretty amazing things with the hardware:

https://www.youtube.com/watch?v=hrhJ9wDNWm4

https://www.youtube.com/watch?v=GZSlzdJ3yR8


Admittedly, I'm not very well versed in computing from the late 70s and early 80s. My own personal experience of 8-bit machines is with the recent PIC and Atmel micros.

I just figured that a completely self-contained emulator and games would be handy.


I loved my Atari 800XL, and while I had a few computers before it, this is the one I cut my teeth on learning how to program. Its hardware was really interesting and forced you to be quite inventive to get the most out of it.


I started with the Atari 400. It was quite fun to program, and I was a bit disappointed when the school finally got Apple IIes and I couldn't figure out why they were "better" than an Atari 800. They certainly weren't as fun to program. I gotta admit that I do miss 6502 assembly.


The 8-bits predated me a bit; I started with the 1040ST, which was an incredible machine for what it was able to do.



Talking of Chris Crawford, he has an interesting Patreon: https://www.patreon.com/ChrisCrawford



