The DooM-chip: no CPU, no opcodes, no instruction counter (twitter.com/sylefeb)
320 points by MrBuddyCasino on May 9, 2020 | 101 comments



There's an old saying when judging a new architecture: can it run Doom?

This question arises since pretty much anything can these days. https://www.vice.com/en_us/article/qkjv9x/a-catalogue-of-all...

Now we have a new answer: IT CAN ONLY RUN DOOM


There is no instruction counter, only DOOM!


If only mankind were capable of making a machine that can run Crysis.


That's why we are developing quantum computers.


But will they run Crysis? You will know only after opening them...


Is it Doom-complete?


for $39.95


IT IS DOOM


I bet that 2000 years from now Doom will belong to the canon like the Iliad: one of the timeless classical epics, among the oldest extant works from the early silicon era, with Doomguy mentioned alongside Gilgamesh and Odysseus.


Why? DOOM was a technical achievement, sure, but it has little in the way of story; it didn't even attempt the whole "trying to tell a story without words" thing common in video games of today. It's certainly not an epic.

If they hadn't released the source code for it in 1997, who would actually be thinking of DOOM today in terms of anything but "Wow, that had pretty cool tech!"?

I could kind of understand the story appreciation for DOOM 3, because it at least had a story, but DOOM?


Doom had an immediate and huge impact in its time; even before the source code got released, the game had a hacker-friendly vibe to it. For instance DEU (the first editor I used) was released in 1994 (https://doom.fandom.com/wiki/Doom_Editing_Utilities). The 'unofficial specs', still invaluable today if you want to hack Doom, were released in 1994: http://www.gamers.org/dhs/helpdocs/dmsp1666.html

Magazines would bundle floppies (and then CDs) full of levels, with new textures and recorded games. Total conversions were made. Tools appeared to directly manipulate the executable and achieve various effects. As far as I remember no other game before had clustered such a large and active community around it? (which is not to say that no other great and amazing games existed before, so many gems lie in the past!)

And then there was the networked play. I spent an afternoon with a friend soldering a cable to play 'null-modem', and we got it working around ~6pm. At 5am the next day we were still playing, me on a luxurious 486 DX2-66, my friend in a tiny window on a 386 DX-33 MHz, both with red eyes. This was an experience quite like no other at the time.

The gameplay was simplistic but huge fun. The immersion was intense. The tech was stellar. But I am obviously biased by nostalgia ;)


I was in college at UCSD when the 14-Mb shareware DOOM was released. I finally found it (after frantic searching) on a public FTP server somewhere in Australia. I started the download, then went to dinner. After eating, I came back to both the completed file and a salty email from the Aussie sysadmin admonishing me for saturating their connection. The next few days (and my GPA for that quarter) were a bit of a blur. Good times. :D


Basically, Minecraft is doom for today’s youth (minus the technological prowess)


“Over the centuries, mankind has tried many ways of combating the forces of evil... prayer, fasting, good works and so on. Up until Doom, no one seemed to have thought about the double-barrel shotgun. Eat leaden death, demon...”

― Terry Pratchett


Lol!

For the lazy, I just looked this up, and it does seem to be a real quote :)


Most games except adventures and text adventures have a worse script (literature-wise) than any average episode of The X-Files/The Outer Limits and so on.

Heck, Half Life looks like a -mediocre- episode of The Outer Limits.


The more I've aged, the more I've realized this. Most video games are horribly written and unbelievably cheesy without trying to be. The beauty of a game like Doom is that it knows it: it doesn't try to be anything profound.


What games would you quote as counter-examples of this? I can think of a few, among which "Life is strange", and "Bioshock Infinite". Probably more, but it's hard to recall at 1am (I've heard good things about spec ops: the line).

That said, story is but one element of a video game. Most are praised for their gameplay, but some are for their aesthetics (Hollow Knight, Bastion), music (Transistor), or other criteria. Thus, a game that excels at none can sometimes come out on top... But so can games that have all but stopped trying to score on some aspects (Dwarf Fortress, or Doom).


I think (the original) Deus Ex is an interesting example here, because it doesn't fit easily into any box.

Normally I agree with the GP, and prefer the joyously silly/unashamedly dumb approach over the 99.9% of more 'serious' games that are tediously mediocre at best. And on paper Deus Ex should have been pretty annoying: it is a slightly janky mix of lowbrow conspiracy fiction and quasi-highbrow philosophising. But somehow -- obviously partly because of the still-fresh-20-years later gameplay, but also because of something I can't quite pin down in the synergy between gameplay and story -- it works amazingly well.

It really nails the feeling of existing in and shaping an exciting world, and I think every component is crucial, from the (blocky) environments to the (ludicrous) characters to the (somewhat awkward) mechanics, and of course the 'every conspiracy theory is true' plot. Obviously it has a sense of humour too, but I don't think it would have worked if it were constantly taking the piss out of itself.


Naughty Dog's stuff (The Last of Us and Uncharted) is above the rest. Quantic Dream's Detroit: Become Human borders on an interactive branching novel and is also good (as, most likely, are their two previous games, Beyond: Two Souls and Heavy Rain; you can get all 3 on PC now, btw). I also like Nier: Automata a lot; I'd say the best thing about that game is the story design.


I made an account just to post this:

SOMA, SOMA, SOMA. The best story I have ever seen in any videogame, ever.


Most graphical adventures are on par. Like The Dig, Broken Sword, The Longest Journey...


A perfect counter-example is Nier: Automata.


Like DN3D, but on the graphics side. Not 3D like Unreal, but the environment feels more realistic and interactive than the first.


In my opinion, the LucasArts adventure game "Indiana Jones and the Fate of Atlantis" had a great script. It's still the kind of genre that could be considered cheesy, but I enjoyed this game as much as or more than the movies.


I’ll suggest not for story, just for overall surprise. Doom was a big leap over what came before it. Doom 3 wasn’t as surprising, even with more story.

I’d turn it around and instead of shooting the idea down, play the game and suggest what games you think will make the history books and get talked about in a thousand years. Even if it’s just some sort of computer or game history class in college, what games made before today will make it through the sieve of history and why?

I’d humbly suggest that story isn’t a very strong reason for the majority of the best games ever made; games are good for other reasons, including but not limited to visuals and graphics, immersion, interaction, engagement, sound, mood, viral play, pushing boundaries on limited hardware, etc., etc.


> I’ll suggest not for story,

> I’d turn it around and instead of shooting the idea down,

You just shot down the idea in five words.

> play the game

I have. This is why I know it's not on the level of Charlotte's Web in terms of story, let alone the Epic of Gilgamesh.

> I’d humbly suggest that story isn’t a very strong reason for the majority of the best games ever made

Exactly.

> games are good for other reasons, including but not limited to visuals and graphics, immersion, interaction, engagement, sound, mood, viral play, pushing boundaries on limited hardware, etc., etc.

Literally none of this is relevant to what the poster proposed.

Architects don't try and claim that the Colosseum is part of canon. It's not part of canon. It's stone arranged in a particular way. It's impressive architecturally; it's part of architectural history. It's not part of canon.


I'm not sure if the original suggestion was 100% serious, but to the extent that it was, I really don't think it was about story. (If I'm right, maybe the word 'epic' was a confusing choice.) The idea is that Doom:gaming::Gilgamesh:literature.


> Literally none of this is relevant to what the poster proposed. Architects don't try and claim that the Colosseum is part of canon. It's not part of canon.

I don’t speak for the GP comment, but FWIW I think you are misunderstanding the original comment and mine. The Colosseum absolutely is part of the architectural canon.


Awesome works of art with little to no story:

- The Lovers, painting by Magritte

- Nighthawks, painting by Edward Hopper

- Girl with a Pearl Earring, painting by Johannes Vermeer

- Space Invaders, arcade game by Tomohiro Nishikado

- Pacman, arcade game by Toru Iwatani

- Tetris, PC game by Alexey Pajitnov

- Doom, PC game by John Carmack and John Romero


Doom's story is of id, not Doomguy.


> Doom's story is of id

Freudian doom.


> If they wouldn't have released source code for it in 1997, who would actually be thinking of DOOM today in terms of anything but "Wow, that had pretty cool tech!"?

I mean, no, but that event is the point. I don't think the parent is asserting that DOOM the game will be remembered in the classical canon. It's the codebase that belongs there!


Yes, also considering that it's a fairly complex game that is available on practically every end-user OS: https://en.wikipedia.org/wiki/List_of_Doom_source_ports


I once heard a game critic say that he had never played DOOM. DOOM's impact on the medium is so significant that this statement is roughly equivalent to a film critic saying they never saw Citizen Kane or Star Wars. DOOM set the bar for what games could be in a way that's difficult to appreciate from the perspective of inhabiting the medium it helped define.

Historically story in games has been, as Carmack himself once put it, "like story in a porn movie". It can be good or bad, and that can affect the end product's quality, but it isn't what people show up for.


Exactly, which is why it's ridiculous to imply it'd be on the level of the Iliad or similar.


I distinctly recall the "Beowulf" story in my English class consisting of "there was a monster so then our baddest dude went out and killed it to death with a sword". Turns out having a good story isn't all that relevant to historical prestige.


More likely it will be remembered like "L'Arrivée d'un train en gare de La Ciotat"

In other words, only by intellectual gaming buffs.

https://en.m.wikipedia.org/wiki/L%27Arriv%C3%A9e_d%27un_trai...


Interestingly the AI remastered version of that video has quite a few views. https://youtu.be/3RYNThid23g

Might an AI capable of remastering DOS games have a similar effect?


I think you're onto something, but I think Doom is closer to the Mona Lisa than the Iliad. It's a work of art that has proven its ability to attract people to it.

Not sure what the Iliad of games is. Maybe there isn't one. Maybe Doomguy and Mario are closer to mythos, like Hercules and other demigods.


The Iliad was the crowning achievement of a literary society with hundreds of years of tradition. Doom was a simple and early innovator. At best, it will be remembered in the way that some fairy tales are: the beginning of a trope, or a rite of passage. Even that I'm doubtful of. I suspect it will be remembered like checkers and the game of Ur: historical curiosities of early games, but not a cultural achievement. I'm not sure what it says about us that we think Doom has the same level of literary merit as the Iliad. It's a fun game, but it has neither the depth of chess or go, nor the narrative achievement of even a sitcom or cheap dimestore novel.


Nobody claimed Doom has the same literary merit as the Iliad; but it’s a fact that it has a lot of cultural merit.

The Wright brothers built an airplane that was a simple and early innovation, yet will always be remembered. Henry Ford built a simple and early car that will always be remembered. There’s evidence that simple and early innovations are lasting and culturally important.

Who knows if it’ll be remembered in a millennium? No one here, and maybe it will or maybe it won’t, but it already stands above most computer games ever made as an important milestone; its place in video game history is pretty solid. No reason to doubt a legacy is possible.


I vividly remember when Doom swamped the large Oracle facility where I worked. For literally 3 days productivity nose-dived, as it seemed the entire company was playing this game, which had graphics that seemed absolutely impossible on the hardware of the day. To write Doom off as being akin to a "cheap dimestore novel" feels a little like some elitist attitude where creativity doesn't count unless it's in crusty old print.


Funny, I thought the same of Xena back in the 90's. What about Kratos or Spiderman?


Spiderman is another decent bet. He obviously has some long-term staying power.


Interesting, so it's somewhat like the original pong chip or the AY-3-8500. No OS, just inputs drive video output.

http://www.pong-story.com/gi.htm

http://www.pong-story.com/gi8500.htm


Some of the ultra-cheap "battery portable handheld games" (https://en.wikipedia.org/wiki/Handheld_electronic_game ) which are still produced today may have such ASICs, although I suspect a lot of them are actually a 4 or 8-bit CPU with a mask ROM.


Next: analog implementation of Doom?


Not a finished one but this concept is cool. Doom on a Vectrex. https://www.youtube.com/watch?v=VuVnoqFF3II


Sprite_tm! His hacks are always tasteful.


The Vectrex had a 6809...


Oscilloscope doom? I can dig it. Kinda like I dig this [1]

[1] https://www.youtube.com/watch?v=kPUdhm2VE-o


I like this guy. I thought this was going to be a more introductory video: https://www.youtube.com/watch?v=rtR63-ecUNo (holy c*, he published a lot of these since 2014, they're pretty good :) )

This has been done with quake: https://www.youtube.com/watch?v=aMli33ornEU


That is cool but I think what the OP meant by analog DOOM was actual analog circuits driving a TV PAL/NTSC signal or oscilloscope output.


I guess you could synthesize an HDL implementation into discrete logic?


This contributes to the timelessness of Doom.

Doom pretty much has been ported to everything with a CPU, like Linux. Heck, you can even search for JSDoom and get a few results. There's even a RISC-V port (RISC-V emulator included: https://github.com/lcq2/risc-666)

It will be interesting if it progresses past simply drawing the level, e.g. if you can actually play it.

Then Doom won't even need a CPU to exist.


Meh. I can play Minizork and Curses (version 3) on a:

- Monochrome GameBoy and compatibles.

- PDAs, nearly all of them.

- That intelligent pen which parsed everything you wrote.

- KA10 with Tops20 with dfrotz.

- GBA/NDS/PSP/Android/iOS... anything portable.

- Everything SIMH emulates, or nearly everything.

- All of the 8/16/32 bit minicomputers.

- Any OS with a gopher client. Even Nethack/Telnet may work with a dumb wrapper.

- A PostScript printer with a crafted input.

- Over telnet/ssh/irc... name a text protocol and you will be able to play it.


How much more efficient is this than running it in software?

I'm asking because, if this thing can render orders-of-magnitude bigger/more detailed worlds than a PC, and it's basically copy-protected because it's "in hardware", this should be the wet dream of the industry.


Much more efficient (some 0.x watts) and faster - but not very easy to develop and even harder to deploy :)

And you will get in trouble with the amount of code (gates needed) for porting a present-day Doom.


No, it will not outperform your GPU for rendering; that is already specialized hardware.


It honestly depends on what you are rendering. If your application is just Pong, it is feasible to implement a rendering pipeline that is virtually instantaneous. GPUs have overhead for even the most trivial application. With flip-flops or other discrete circuitry, you are talking about frame latency on the order of signal propagation through arbitrary copper wires or semiconductors. If you insist on digital output, some buffering will be required, but in the analog output case you are probably looking at a handful of microseconds of latency between input and output.


It’s specialized for a very general problem, though. If he made his own graphics logic in there as well, he could maybe make it faster.


The available FPGA hardware is just not able to compete with the sheer hardware power (GHz, RAM, gates) of today's GPUs - but I would agree for smaller problems.


FPGAs can achieve very compelling performance when programmed properly, but usually not for compute bound workloads. Most interesting problems are memory bound these days, and that's where FPGAs shine, since you can design your hardware to be as wide as your problem. Or alternatively if you have many parallel sub-problems, you can make your hardware as narrow as the sub-problems and replicate it many times.

You can usually move data on and off chip very quickly also, since high-end modern FPGAs have many hundreds of pins. Generally these get connected to hard logic like fiber networking or PCIe.

An FPGA soft core is never going to beat an ASIC if both were designed well. But they can beat general purpose ASICs (e.g. GPUs) for certain classes of problems, mostly those where you can exploit the massive memory bandwidth of the FPGA.

I think that for rendering computer graphics, you really just want a big fat pile of FPUs. FPGAs will usually have a number of hard logic DSP blocks on-die, but nowhere near as many as a GPU. If rendering 3D graphics is the problem you want to solve, you probably really want a GPU.


Further down the author says it runs at 60fps, without much headroom. So quite some way behind modern GPUs - the latest doom game can run at 200+ FPS and is vastly more complicated.

That, combined with the cost of the hardware in comparison to a digital copy, makes me think this is unlikely to be particularly useful to the games industry. It's incredibly impressive though!


(Author here) Thanks - and I agree. That being said, this is running at 100 MHz on (what I believe is) a relatively low-end FPGA. Also my implementation remains simplistic; many things to optimize. But this is a very good and deep question: how can we compare these things? What is the right metric? (FLOPS per watt? err, no, no floating point involved ;) ). I am wondering, and this seems quite a difficult/subtle question.

I love GPUs, I spent many years working with them (still do!), and these are beautiful pieces of hardware and engineering (as modern CPUs are). They have evolved beyond our craziest dreams since the NVidia register combiners (https://www.khronos.org/registry/OpenGL/extensions/NV/NV_reg...). The performance we get nowadays is absolutely mind-boggling (I often think we don't fully realize how powerful they actually are).

Can we dream of some sort of mixed platform, where we could 'burn-in' very specific functions into FPGA type hardware that would seamlessly interact with our modern GPUs/CPUs? Is it already happening?


One could probably go for something like the product of the number of gates involved and the clock frequency to quantify the efficiency of an implementation. Then you can either have a very simple processor with only a few gates running at a high clock frequency to get all the computations done, or a very large parallel implementation with many gates but requiring only a lower clock frequency.

One must obviously only count the gates actually used (for example, not the floating point units of a processor if they sit idle), and account for idle time if a frame is completed faster than the frame time. Counting gates might also be somewhat tricky, for example on an FPGA, where multiplexers and memory are used to build look-up tables that then implement gates: one could either count the actual gates in the FPGA, because those are the gates really in use, or count the gates of the design as if it were implemented in an ASIC. On the other hand, the difference is probably just a small constant factor and might not really matter that much.

In the end, power consumption should capture this pretty well, as it scales with the number of actually switching transistors and the clock frequency. One would still have to account for differences in technology and especially supply voltage, which goes quadratically into the power consumption.
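
For a concrete feel of those two metrics, here is a rough back-of-the-envelope sketch in Python; all gate counts, capacitances and voltages below are made-up placeholders, not measurements of the DooM-chip or of any real GPU/FPGA.

    # Back-of-the-envelope comparison using the metrics discussed above:
    # (active gates x clock frequency) as a proxy for logic work, and the
    # classic CMOS dynamic power model P ~ alpha * C * V^2 * f.
    # All figures are illustrative placeholders, not measurements.

    def gate_activity(active_gates, clock_hz):
        """Proxy for 'logic work per second': active gates times clock rate."""
        return active_gates * clock_hz

    def dynamic_power(alpha, capacitance_f, supply_v, clock_hz):
        """Dynamic power: activity factor * capacitance * voltage^2 * frequency."""
        return alpha * capacitance_f * supply_v**2 * clock_hz

    # A wide, highly parallel design at a low clock...
    wide_design = gate_activity(active_gates=200_000, clock_hz=100e6)
    # ...versus a tiny sequential core that has to clock much faster.
    narrow_core = gate_activity(active_gates=20_000, clock_hz=1e9)
    print(wide_design, narrow_core)  # comparable amounts of "gate work"

    # Supply voltage enters quadratically: a 1.5 V part burns ~2.25x more
    # than a 1.0 V part at the same alpha, C and f.
    print(dynamic_power(0.1, 1e-9, 1.5, 100e6) / dynamic_power(0.1, 1e-9, 1.0, 100e6))  # 2.25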


That's way more expensive than the margins they're used to for distribution.

And the Xbone not getting hacked in its lifetime proved that they can have their cake and eat it too.


Nice work. However, a state machine "state" could be seen as an instruction counter.


Author here - thanks! You are absolutely correct: there are several state machines (VGA, SDRAM controller, framebuffer, texturing, divider, renderer, etc.), each with an active state index. The renderer is the biggest one, with 69 states.

I agree the state index can be seen as an instruction counter, albeit into very specialized instructions: no two instructions in a given module are the same; each uniquely implements a precise subpart of the algorithm. Also, the states decide the flow and select the next state; there is no list of instructions you could program or re-arrange. So I wanted to capture the idea that the algorithm is completely embedded into the circuit itself, which is not capable of doing anything else.

There is definitely a very interesting trade-off between a general instruction set and an extremely specialized state machine like here - combining both seems promising?
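
As a purely software analogy of that trade-off (illustrative Python, not how the hardware or the author's language is actually written): a conventional machine steps an instruction counter through a re-arrangeable list of instructions, while a specialized state machine bakes both the operation and its successor into each state.

    # Illustrative analogy only: instruction counter vs. hard-wired state machine.

    def run_with_instruction_counter(program, x):
        """Conventional machine: a counter indexes a list of interchangeable
        instructions that could be re-ordered or replaced."""
        pc = 0
        while pc < len(program):
            op, arg = program[pc]
            if op == "add":
                x += arg
            elif op == "mul":
                x *= arg
            pc += 1  # the counter, not the instructions, drives the flow
        return x

    def run_specialized_fsm(x):
        """Specialized state machine: each state performs one unique step of a
        fixed algorithm and hard-codes its successor; nothing is re-arrangeable."""
        state = "scale"
        while state != "done":
            if state == "scale":
                x *= 3
                state = "offset"  # the successor is part of the state itself
            elif state == "offset":
                x += 1
                state = "done"
        return x

    print(run_with_instruction_counter([("mul", 3), ("add", 1)], 5))  # 16
    print(run_specialized_fsm(5))                                     # 16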


The question is: how far will you go? A fully playable level?


Unsure. I definitely would like to have fully working environments that can be explored interactively (with proper BSP collisions, doors/lifts, blinking lights, etc.). Everything is stored in BRAM; I am unsure the sprites would fit (maybe a few, or at lower res, but I did not look into that yet). When this gets released I hope it will also be reused/expanded.

My initial objective was more about creating some non-trivial hardware using the language I am working on, as well as learning how to implement and optimize algorithms in the context of FPGA. Due to my background in graphics / game programming, revisiting the Doom renderer was a perfect test case! Now that I have this first prototype I want to optimize and fine tune it some more, before adding too many features -- especially as this is meant to serve as a tutorial/example for the language.


Until it takes control input it's more of an electronic sculpture resembling Doom.


True - I only (partially) re-implemented the render loop, and this is far from the complete game. A game is always much more than its core technical components.

Adding a keyboard/joystick input is high on my todo list. In terms of moving around, this really should be just a question of wiring it to the board: the renderer takes a generic x,y,z + angle viewpoint, as did the original engine. However, this also means checking for collisions with the BSP scene, which is fun to implement (a nice trick in a BSP is to shift the line equations and check with a point, as opposed to checking with a disc of some radius).
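
For anyone curious about that trick, a minimal 2D sketch in Python (illustrative only, not the DooM-chip code): pushing the splitting line back by the collision radius along its unit normal turns the disc-vs-line test into a plain point-vs-line sign test.

    # Disc-vs-line collision as a point-vs-shifted-line test (illustrative).
    import math

    def side_of_shifted_line(a, b, c, px, py, radius):
        """Return >= 0 if a disc of the given radius at (px, py) lies entirely
        on the front side of the line a*x + b*y + c = 0; equivalently, test the
        point against the line shifted back by `radius` along its unit normal."""
        norm = math.hypot(a, b)              # normalize so distances are in map units
        dist = (a * px + b * py + c) / norm  # signed distance of the point
        return dist - radius                 # same sign test as point vs. shifted line

    # Example: line x = 0 (a=1, b=0, c=0), player at x=3 with radius 16.
    # Result is negative: the disc crosses the line, so a collision response is needed.
    print(side_of_shifted_line(1, 0, 0, 3, 0, 16))  # -13.0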

Side note: I instrumented chocolate-doom (fantastic port) to output the path shown in the video. Initially I was loading a demo lump, but I realized that these contain only the inputs, and I could not easily reproduce the exact way the game responds to them (for example, progressive acceleration and of course collisions).

Next up on my list are correct blinking lights, working doors/lifts and sprites (things + enemies). But I also want to optimize it, and to release the language I used to make this. So quite a huge todo; we'll see how it goes. In any case all of that will be made available so everyone can join the fun!


Great stuff, cool project. Could you share a few thoughts on your language and transpilation approach?


Happy to - but in full this will be better explained on release on the github README, with code examples.

Also, I am no language expert, and not an FPGA expert either (I have been learning for ~1 year). I shape this for my own use, hoping it will be useful for others, but I wouldn't pretend nor expect to be achieving something particularly new or interesting at large. Nevertheless, I am using it to build increasingly complex hardware, the doom-chip being the most advanced so far. Each time, the language is extended and fine-tuned, so it is rooted in practice.

=> The following is an excerpt from the documentation introduction currently being written:

My goal is to make it possible to write algorithms for FPGAs in the same way we write them for processors: defining sequences of operations, subroutines that can be called, and using control flow statements such as while/break. At the same time, the language remains low level. Everything an algorithm does is precisely timed, you can fully exploit the parallelism and niceties of FPGA architectures, and clock domains are exposed.

My approach is reminiscent of high performance programming in the late 90s (in the demo scene in particular): the then considered high-level C language was commonly interfaced with time-critical ASM routines. This enabled a best-of-both-worlds situation, with C being used for the overall program flow and ASM used only on carefully optimized hardware dependent routines.

The language aims to do the same, providing a thin, programmer-friendly layer on top of Verilog while allowing low-level Verilog modules to be called whenever needed. It favors and exposes parallelism, so as to fully utilize the FPGA architecture.

The main design principles are:

- Prioritize combinational over sequential execution. Parallelism comes first!

- Clearly defined rules regarding clock cycle consumption.

- Explicit clock domains and reset signals.

- Inter-operates easily with Verilog, allowing to import and reuse existing modules.

- Familiar C-like syntax (but this is not C! different constructs for parallelism, pipelining, etc.).

- Powerful LUA-based pre-processor.


Hmm… I wonder what would it take to create an FPGA hardware version of the PICO-8 fantasy console…

Baremetal projects are truly fascinating -- I cannot wait to read your documentation. There's also a hardware Z-Machine: https://hackaday.com/2014/11/29/the-zork-virtual-machine-imp...

And this dev is working on FPGA Another World: https://github.com/felipesanches/AnotherWorld_FPGA


Thanks, looking forward to reading more. I am a hardware guy, but I see using higher-level languages to generate HDL as being a very powerful approach.


Amazing project! Next step: custom silicon ASIC, the Doom arcade machine


I'd like to talk to you about ray tracing. Have an algo that would be incredibly fast in hardware. I think I may have tried integer math at one time.



>> The AR350 ray-tracing chips process 66 million ray-triangle intersections per second per core.

I was doing 1.5 million rays per second against scenes with 100,000 polygons back then on an AMD64.

Acceleration structures are everything. Traversing them quickly and having dynamic updates is hard.


This is the future of hardware. Specialized chips baked for a focused application.


The future sure sounds a lot like the past, given that these days everything, even toasters, has a CPU in it.


http://www.cap-lore.com/Hardware/Wheel.html

We’re currently rolling back around to a new era of lots of specialized chips.


IDK; even in operations the size of the cloud providers, executing some of the same things billions of times a second, there are still very few specialized chips. TPU is an example, and I think they all use some for network packet filtering and such. But you don't see anything for DB acceleration or Paxos acceleration or maps acceleration or anything like that. They've had years to figure this out, so if they're not finding it worth it at hypercompetitive cloud scale, then I don't see the revolution going too awfully far anytime soon.


Alibaba has been pushing the envelope pretty heavily in this area. I've heard of them using their FPGA instances for memcachedb and for video transcoding.


Toasters are moving back to ASICs too:

https://www.diodes.com/assets/Datasheets/PT8A2514A.pdf


You could make toasters by recycling used MacBook Pros!


lol meanwhile my 20 year old off brand toaster continues to toast just fine with 0 chips. As does my 10 year old dial based toaster oven


There are Super Nintendo games like Star Fox that had a second accelerator chip on board. But it would still be cool for the game to be the computer and exist in custom silicon. Development would be super slow of course but the result would literally be the best it could be.

Wonder if the advantage would be worth it. Is anyone trying?


I read somewhere that the chip used in Star Fox (SuperFX) is less an "accelerator" and more something that "becomes" the main chip, because it hogs the main bus, so the SNES's own CPU can't really do a lot. OTOH the SuperFX is not really a "game in silicon" but just a better CPU that has built-in fixed-point arithmetic, including multiply/divide. All this is just from memory, so please take it with a grain of salt.


The SuperFX doesn't really become the main CPU, and has some practical caveats that keep it from doing so. Part of that is that the thing's instruction memory is only 512 bytes.

There are other accelerator chips that do take a more general approach though, like the SA1.


To be fair, the SuperFX was the foundation of an actual honest-to-god 32 bit embedded CPU architecture named ARC (Argonaut RISC Core), produced in the billions, because it was apparently quite efficient.

All the more impressive considering the guys making it had no prior experience in CPU design.


Contrary to popular belief, the SuperFX and ARC cores are about as different as can be. RISC back then was a buzzword applied fast and loose; today we wouldn't call the SuperFX a RISC chip if not for the decades of describing it as such. ARC is very much a RISC design, though.

Totally agreed on how impressive it was though, despite the differences in nomenclature.


Future? Most high end T&M gear will have an ASIC in it.

FPGAs are already very, very prevalent. They aren't too common for hobbyists because, although the transistor density in an FPGA has increased, the pricing hasn't really followed suit a la regular CPUs - that, and the tooling is often comically 1990s.


So JavaScript processors?


This feels like a breath of fresh air: too many FPGA applications today resort to implementing a simple CPU (a.k.a. PSM - “programmable state machine”), and the required logic is implemented in its “machine code.”


I think you'd be surprised how many classic hardware designs take that approach as well. There's nothing wrong with that approach.


Even with OpenCL, I find FPGAs extremely complicated to program.

It doesn't help that Intel is charging $5,000 for licenses to compile for them, for some things.


So, like the old 'Brick Game' resembling a Tetris machine.

BTW, on portability, the Z-Machine has been ported even to "intelligent" pens. There is even a GameBoy, Amiga, Atari and C64 port.

Sorry Doom lovers, but your knowledge of portability is really low. The Z-Machine and the zillions of Infocom games/homebrew can be run nearly everywhere. No display? Hook up a printer/serial device.

Have a wifi/internet client? Write a dumb gopher client and set up a dumb server on a VPS/RPi. You could even set up a Z-machine-playing bot to play it via IRC.

Also, with a custom dfrotz and a FIFO file, you may even be able to play it over Morse, with some software that decodes the Morse input from radio, sends the commands to the interpreter, and sends the game output back.



