From Nand to Tetris (2017) (nand2tetris.org)
667 points by mikpanko 9 months ago | 174 comments



I first attempted this 7 years ago, after I graduated college. I got through the first 2 chapters. It was extremely rewarding. But regrettably, I abandoned it for other side projects.

I picked it up again 3 months ago, and it’s been a blast. I’m on chapter 8, having completed the logic gates, ALU, CPU, assembler, and half of the virtual machine.

Every chapter is overwhelming — in a good way. I keep thinking to myself, “how the hell is this going to work?” And when it does, it’s so satisfying.

As a side project, purely for educational purposes, it’s the most rewarding thing I’ve done. I’ve learned so much. It’s a damn good project. And I’m damn proud of myself for sticking with it.

Highly recommended.


I've done this too, and it also took me multiple (3!) tries to get through the entire thing. I interleaved it last fall/winter with Ben Eater's amazing series on building an 8-bit computer on breadboards. I bought everything over several months from China, then built the subsystems as the parts arrived over the winter. You should do that next! Aside from being (maybe even more) rewarding, it looks damn impressive to see all the flashing LEDs and understand what's happening.


Yes! Ben Eater’s content accompanies Nand to Tetris so well.

I did the 6502 project in which you assemble a computer system on breadboards (wiring together the clock, CPU, RAM, etc.).

It helped solidify many of the concepts from nand2tetris. For some reason doing it all physically with real chips and wires made it all a bit more memorable.

I’d love to try his other breadboard project in which you assemble all the inner workings of a CPU on breadboards as well — I think this is what you were referring to.


>It helped solidify many of the concepts from nand2tetris. For some reason doing it all physically with real chips and wires made it all a bit more memorable.

Hang on a second, does nand2tetris not involve real chips and wires?


No, nand2tetris is done in virtual environments.

This is largely about accessibility. If it’s tied to hardware, fewer people can do it. Additionally, real hardware means there’s additional opportunity for error due to faults in the hardware or environmental variables.

That said, Nand2Tetris could, in theory, be done with a bunch of 74xx NAND chips on a breadboard.


The Nand2Tetris computer is a bad fit for 74xx chips and breadboards. It's 16-bit, for one thing, which implies a lot of chips and a lot of wires on a lot of breadboards, and the usual current problems that come with that. Better have a good power source and an oscilloscope nearby. Also, the virtual design sweeps a lot of important timing issues under the rug, so you need some EE chops to translate it correctly to real hardware.

I know of a few people who managed to get a working Nand2Tetris computer on breadboards, but it took a lot more time and money than they thought it would.

On FPGA, though, it works nicely.


Thanks for the recommendation. I'll definitely look into it!


This and books like Code give me such deep respect for the people who originally figured these things out.


I've read Code and most of Computer Systems: A Programmer's Perspective, but doing something by hand would still have been better!!


How did you learn the prerequisite solid state physics knowledge in order to fully understand the background behind the first two chapters?

e.g. The actual mechanism by which an integrated circuit transforms input into useful output. I've never been able to find a solid explanation on how even the simplest real world integrated circuit, such as a TI SN7400, actually does that.


You're making a category error, I think. This book/course doesn't cover physics. It doesn't even cover signal stuff, like how fast the voltage levels stabilize, or even voltages at all.

It's not "silicon wafer to Tetris" or "pile of protons and neutrons to Tetris" or "transistors to Tetris". You start with nand gates (and later you also get flip-flops for free).
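
To make that concrete, here's a minimal sketch in Python (the course uses its own HDL, but the idea is identical): the first project derives the other elementary gates from NAND alone.

    def nand(a, b):
        return int(not (a and b))

    def not_(a):
        return nand(a, a)

    def and_(a, b):
        return not_(nand(a, b))

    def or_(a, b):
        # De Morgan: a OR b == NAND(NOT a, NOT b)
        return nand(not_(a), not_(b))

    def xor_(a, b):
        return or_(and_(a, not_(b)), and_(not_(a), b))

    # truth-table check
    assert [xor_(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]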

This course would work equally well on nand gates made of carved wooden gears, or nand gates made with fluidic logic, or nand gates made by encoding the whole thing into O-gauge hobby railroads.

If that's the level of explanation you seek, this book is incredible.


I vaguely remember someone trying to go the other direction and teach "Tetris to Quake", but I can't find evidence that the course ever existed, and I might have confused it with this article:

http://archive.gamedev.net/archive/reference/articles/articl...

I'd also be interested in anything that extends the stack from where nand2tetris left off, because, while I loved it[1], it felt unsatisfying that you can't actually compile to usable binaries for the hardware -- your executable usually can't fit in memory, and it doesn't teach you how swapping (or whatever) would get you to that point. It also doesn't cover more common hardware/OS issues like interrupts, or having to juggle multiple running programs.

[1] most interesting technical project I've ever completed, with the possible exception of Microcorruption.com.


A “silicon to nand gate” would be a sweet complementary course. However, it would probably be more difficult to make hands-on.


Followed by a prequel "Sand to ingot"


> It's not "silicon wafer to Tetris"...

That is a good point. I had just assumed that information was available somewhere online, but it doesn't seem likely.


Lower-level teaching resources definitely exist! Here are my favorites:

- The Zero to ASIC course (and Tiny Tapeout) [1] explains transistor circuits and teaches you an open source software stack---and you get a chip physically manufactured! You could make the Nand to Tetris computer in actual silicon (if you can get enough transistors).

- To learn how things are manufactured on silicon wafers, get textbooks on microfabrication. A decent starting point is [2]. There's also a good video series [3] for a quick overview.

- To understand how a single transistor or diode works, get textbooks on "semiconductor devices". A good starting point is the free online [4].

[1] https://www.zerotoasiccourse.com/ https://tinytapeout.com/

[2] "Introduction to Microelectronic Fabrication" by Jaeger

[3] https://siliconrun.com/our-films/silicon-run-1/

[4] "Modern Semiconductor Devices for Integrated Circuits" by Chenming Hu, https://www.chu.berkeley.edu/modern-semiconductor-devices-fo...


Thanks for the links, but notably there's a large gap between 'a single transistor or diode' and even the simplest real world microchip, such as the SN7400.

Everything before that stage, down to mining ore out of the ground, is understandable.

And everything after that stage is also understandable, at least to the level of an Intel 386 processor.

The gap is where I believe there are no resources online.


Something like this will at least get you conversant. You only need the first few chapters, and can skip BJTs.

https://archive.org/details/microelectronicc00jaeg/page/n11/...


https://nandgame.com/ has a cmos level towards the end. It doesn’t explain the transistor-level physics but does let you play with things.

The art of electronics book might be a good place to start if you want to learn the physical/electrical layer.


I have only faint memories of my beginner's course on this topic at university, and absolutely no knowledge.

Somehow I remember the word MOSFET.

I think the wikipedia articles about logic gates should provide all necessary cross references.

"Fully understand" is an elusive term though. How do you fully understand the solid-state physics of logic gates if you don't fully understand all of physics, chemistry, maybe even quantum mechanics...

Not meaning to be dismissive though! I love to try to fully understand things and hate having to accept black-box logic. But I also have to admit that I gave up on this approach for many things a long time ago.

Skimming the course summary, it sounds as if this "Hardware Description Language" might mark the boundary of what this course is about.

Makes sense, it's not "from physics to NAND", it's "from NAND to Tetris" :)


The term "hardware description language" still gives me nightmares 5 years after my only experience working with them. Was working on my master's in an interdisciplinary CS/MIS/CompEng program for Cybersecurity and needed an elective. "Fundamentals of Computer Architecture" sounded kinda cool.

I walk in on the first day not realizing that, while I had done my undergrad in MIS (fun fact: this is a business degree), literally every other person in the course was either in the last semester of a CompEng undergrad or a grad student who already had a BS in CompEng (this school combined some undergrad/grad lectures).

Suddenly I hear the teacher say something like "grad students will also need to use an HDL and design a processor compatible with the basic MIPS instruction set." I started with "what's HDL mean?" The teacher responds, "If that's a real question, then it means: Hurry and Drop this Lecture." Day 1 and I already had the wrong questions for the wrong reasons.

That was a really bad 3.5 months... But it's also proof that if you hate literally everything hard enough, then it is absolutely possible to pull a 100 day "zero to MIPS HDL prototyping" speedrun.


As frustrating as it is to black-box certain domains of knowledge, it’s an incredibly useful mental shortcut when used judiciously.


It's also just plain necessary as human knowledge gets more and more complex. The more time you spend on learning, the less time you have to actually make use of that knowledge. Ars longa, vita brevis.


Aside from his basic 8-bit CPU, Ben Eater goes into how transistors work too: https://www.youtube.com/watch?v=DXvAlwMAxiA . Once you've got transistors, his videos walk you through making gates. Once you've got gates, he switches to the 74xx series and builds a CPU.


The course explicitly states that it’s not a physics, or even really an electronics, course. It doesn’t go into the gritty details of how all this stuff works in the real world, just how once it does work you can string it all together to build a computer and then a virtual machine to run on it.


The book is a great example of how we do pretty much everything with computers today: abstraction. You can definitely learn how a transistor works but this book/course explicitly starts with "you've got a NAND chip - GO!"


Ben Eater does have a handful of early videos on his YouTube page that gave me a much better understanding of what's happening down at the physical and particle level. But at the end of the day, to succeed with this material you just need to understand the transistor's idealized digital function, not its physical properties.


btw the sn7400 is already a fairly advanced ic; something like an uln2003 is closer to 'the simplest real world integrated circuit'

probably the best place to start for this, in a top-down sequence, is the art of electronics by horowitz and hill. it explains how transistors and diodes act in §1.6, §2, and §3, initially explaining transistors with the simplified 'current amplifier' model (which already goes beyond the 'transistor switch' model you're thinking of), then quantitatively with the ebers–moll model; they're focused on how to use this information to design working circuits from discrete components you can put together on a circuit board. camenzind's designing analog chips (available free online) goes into how to use this information to design actual chips (not only does nand2tetris not get into things like metastability and noise margin, the authors seem to be confused about what a chip even is, thinking you can make a chip by assembling other chips)

but the ebers–moll model is still not solid-state physics knowledge. so far the best overview i've found of that is madou's 'fundamentals of microfabrication and nanotechnology' which has a couple of chapters about solid-state physics, going into the historical development of quantum mechanics. but it's not really a quantum mechanics textbook; it's just an overview that shows where quantum-mechanical knowledge fits into understanding solid-state physics

'the feynman lectures on physics' is the best quantum mechanics textbook i've found so far, but because my knowledge of quantum mechanics is even more minimal, please don't trust my recommendation on this

hope this helps. good luck in your learning voyage!


> btw the sn7400 is already a fairly advanced ic; something like an uln2003 is closer to 'the simplest real world integrated circuit'

If the goal is to explain how logic is implemented in general, skipping bipolar transistors and TTL and jumping directly to MOS may be easier. The behavior of a FET is fairly easy to explain, especially if you don't care about the ohmic region (which you usually don't in logic ICs), and it's straightforward to step from there to a practical implementation of a simple gate like an unbuffered NAND -- the latter of which can be trivially assembled on a breadboard with as little as two FETs and a resistor for an NMOS implementation.
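
As a behavioral sketch of that gate (a toy Python model, ignoring all the analog detail): the resistor pulls the output high unless both series transistors conduct.

    def nmos_nand(a, b):
        # Two NFETs in series between the output node and ground:
        # both must conduct (both inputs high) to pull the output low.
        pull_down = a and b
        return 0 if pull_down else 1  # otherwise the resistor pulls high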


> especially if you don't care about the ohmic region (which you usually don't in logic ICs),

you have to care about the ohmic region to be confident you've safely steered clear of it; at least one fet moves through the ohmic region every time a mos gate's output transitions

rtl is the bipolar equivalent of nmos (see the analog simulation at http://tinyurl.com/ylnljbgz) but you do need base resistors if you're going to try to drive its inputs with voltage sources instead of the outputs of other rtl gates. but you can omit them when the inputs are connected to rtl outputs http://tinyurl.com/ywja8z28

the flip side of that is that, though you need a base resistor to provide a constant logic high to an rtl gate, you can provide a low just by leaving the input open, you don't even need a wire like you do for nmos

bipolar logic is also a lot harder for students to blow up if your lab power supply has a current limit on its output


Some older parts are specifically made to make creating complex PCB-level CMOS gates easy, e.g. https://www.ti.com/lit/ds/symlink/cd4007ub.pdf


I think pretty much every sophomore level microelectronics book starts at basic semiconductor physics, works that into pn junctions, then transistors, then amplifiers, then gates and sequential elements.

A typical choice is Sedra & Smith https://learninglink.oup.com/access/sedra8e

But there is no shortage of choices.


sedra & smith is widely considered an excellent recommendation, but on skimming it, it seems that it does not cover solid-state physics at all or even mention any quantum effects; madou does spend a few chapters on solid-state physics, and of course feynman covers elementary quantum mechanics quite comprehensively. sedra & smith does go into a lot more depth on some aspects of chip design than horowitz & hill or even camenzind. it gives the ebers–moll equation in table 6.2, just not by name, and describes the inner workings of transistors in considerably more detail than the other books

horowitz & hill, camenzind, and feynman are much better written than sedra & smith or madou. the quality of the writing in sedra & smith in particular is quite poor; it contradicts itself every few pages, and often says things that require great effort to interpret as a correct statement, at least in the 7th edition i'm looking at

horowitz & hill also have much nicer schematics than sedra & smith or, especially, camenzind


We did the whole project as part of a class; it was one of the best classes I took in my CompSci program. It was excellent.


I loved this course and would strongly recommend it to anyone who works with computers and hasn't taken low-level CS classes.

This course does an exceptional job of giving you an intuitive understanding of some critical parts of how a computer works that affect higher-level programs and programming languages (for me the biggest aha! was grokking the difference between the stack and the heap). It is also incredibly fun to appreciate how magical it is that these complicated machines we use are just built from these simple circuits that you keep building up and building up through the course. Finally, the teachers did a fantastic job of simplifying what could be multiple semesters' worth of material to quickly give you the gist of things like assembly languages, without oversimplifying.

Really can't recommend this enough if you have the time to do it. FWIW, the first part of the course (from NAND gates to building a CPU) is very fun and fairly easy. The second part (going from a computer to a full operating system) is quite a bit more work.


Are the online course and the book the same material? Trying to figure out if I need both or either.


The two online courses at Coursera are based on the First Edition of "The Elements of Computing Systems: Building a Modern Computer from First Principles": https://www.amazon.com/Elements-Computing-Systems-Building-P...

The most recent edition of this book is the Second Edition: https://www.amazon.com/dp/0262539802/

To quote from the Preface on the difference between the two editions:

"The Second Edition

Although Nand to Tetris was always structured around two themes, the second edition makes this structure explicit: The book is now divided into two distinct and standalone parts, Part I: Hardware and Part II: Software. Each part consists of six chapters and six projects and begins with a newly written introduction that sets the stage for the part’s chapters. Importantly, the two parts are independent of each other. Thus, the new book structure lends itself well to quarter-long as well as semester-long courses.

In addition to the two new introduction chapters, the second edition features four new appendices. Following the requests of many learners, these new appendices give focused presentations of various technical topics that, in the first edition, were scattered across the chapters. Another new appendix provides a formal proof that any Boolean function can be built from Nand operators, adding a theoretical perspective to the applied hardware construction projects. Many new sections, figures, and examples were added.

All the chapters and project materials were rewritten with an emphasis on separating abstraction from implementation—a major theme in Nand to Tetris. We took special care to add examples and sections that address the thousands of questions that were posted over the years in Nand to Tetris Q&A forums."


I'd recommend watching the course material and reading each chapter. It depends on a little luck which material clicks and how quickly, for me it varied from chapter to chapter whether or not I could get by with just the videos.


I'm currently taking this course without the book and it works great. I might buy the book later, but for me it's important to have the imposed weekly deadlines in order to not just drift away and do something else instead.


I did the course without the book. I haven’t actually looked through the book, but the Coursera class was definitely enough on its own.


I first tried the course many years back and didn't complete it (not necessarily only because it was in course form). Last year I bought the 2nd edition book and found it more accessible, as a do-it-on-your-own-time type of project, and as a reference while implementing the parts.


I highly recommend this course/book. It's extremely rewarding once you go through it.


I found nand2tetris a very natural progression for me after finishing Ben Eater's great 8bit computer series (https://eater.net/8bit/). It just makes you grok so many basic concepts of computer design which are quite often glossed over. A perfect project to start over the holidays!


Seconding the Ben Eater recommendation. I love his videos. Two of my favorites are the introduction of loops and the introduction of interrupts. Seeing the code he wrote re-written using these to be so much more elegant was very satisfying.


For me the 'aha' moment was the microcode section of the 8-bit computer project. Just the physical understanding of how it can take a different number of clock cycles to execute an 'instruction' - and how (in this architecture) every instruction cycle has to start with the memory address switching to the current program counter, and the data from that address being loaded into a register.
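
In rough pseudo-microcode (a Python sketch of my own; the register names are illustrative, not Ben's exact control signals), those first steps of every instruction cycle look like:

    def fetch(cpu):
        cpu.mar = cpu.pc           # T0: memory address register <- program counter
        cpu.ir = cpu.ram[cpu.mar]  # T1: instruction register <- RAM[address]
        cpu.pc += 1                # then advance the PC for the next fetch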

I haven't gone as far as actually building a breadboard CPU, but I was able to apply what Ben teaches to build several functional microprocessors in https://www.falstad.com/circuit/circuitjs.html, starting with basically reimplementing the breadboard model he built out... but then, after watching the 6502 series, going back and being able to see and even implement some of the things that implies about the internals of a 6502 processor.


Did Ben Eater's 6502 computer this past summer. Can also recommend.

Guess I'll have to look into NAND to Tetris now.


I did nand2tetris during my time at Recurse, and I spent a lot of time watching Ben Eater those days. They paired nicely.


Here is a fun Nand game inspired by the course: https://nandgame.com/


Here’s an even more fun one with overwhelmingly positive Steam reviews! https://store.steampowered.com/app/1444480/Turing_Complete/


I played both of them. Turing Complete has a more non-linear progression, by which I mean you can choose which level you want to complete first.

There is also a soundtrack and a little bit of a storyline. There is also a GOG version of the game.

I feel that I learned almost no new skills this semester, but I remember asking my circuit design lecturer questions about this game (in particular, about the possibility of constructing a full adder with fewer elements). That was fun.

You can stick to nandgame if you don't want to pay; you lose almost nothing. But playing Turing Complete is handier because it keeps your progress on your computer and/or your Steam account. So if you clear cookies every time you restart your browser (when using ungoogled-chromium, for example), you'd be better off playing Turing Complete.

Also, an internet friend advised me to play The Incredible Proof Machine: https://incredible.pm/

It is a similar game, but the main theme is proofs rather than digital circuits. I haven't played it yet, unfortunately.


I've completed both nand2tetris and Turing Complete. A note about differences:

- TC only goes up to the level of writing programs in assembly, while nand2tetris has you build layers of abstraction on top of that so you can program in a Java-like language. In fact, TC doesn't give you "assembly code" at all; you just implement binary instructions, and they leave it to you to decide your own mnemonics for each byte of the 4-byte instructions (on the second computer you implement) -- see the sketch after this list.

- TC lets you make more of the decisions yourself regarding architecture, like how to address memory.
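
To illustrate the first point, here's a hypothetical Python sketch (these mnemonics and byte values are invented, not the game's): once you pick an encoding, "assembling" is just a table lookup.

    # Invented opcode table -- in Turing Complete the byte values mean
    # whatever your own architecture decodes them as.
    MNEMONICS = {"COPY": 0x40, "ADD": 0x01, "JZ": 0x90}

    def assemble(program):
        return bytes(MNEMONICS[op] for op in program)

    print(assemble(["COPY", "ADD", "JZ"]).hex())  # 400190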

One thing I didn't like about TC is that, when doing the projects to build the (second) computer, it doesn't regression test. When you implement new opcodes, you can be credited for completing the level, even though you broke the previous ones you implemented, and you might not realize it until several projects later, when your computer doesn't work.


Turing Complete is a great game. Finishing it was a major sense of accomplishment for me, particularly since I rarely complete games (though Zachlikes are a common exception).

This is from my Steam review (one of two ever, I liked it so much):

> Man, what a trip. I’ve played the NAND game and MHRD, among others, but never quite got to the “build a computer” endgame. I did with this one. At this point I’ve built a 256-byte RAM system that does six math instructions, six conditional jumps, and subroutine calls using its own stack—then developed my own assembly language for it which I'm now using in the endgame to solve more traditional programming puzzles.

> And I designed and built the entire architecture for it myself. The initial levels are mostly "one solution only" problems that each result in you designing a component. But by the time you're at midgame, all the decisions are yours as long as the output is correct. Previous problems tend to hint towards partial solutions of later problems, but very little is given to you outright. That gives you an incredible sense of accomplishment for what you put together.

That said, unless they improved the bit where wiring tends to merge together and virtually short out during edits, it can be a little frustrating once things get very complex. It's not enough for me to not recommend it, but there was a point where I felt like I was having to master the tricky interface quirks as much as or more than the logic. Shenzhen IO did that part much better.


Here is another game, on Steam, that is nearer to the first part of "From Nand to Tetris":

MHRD

https://store.steampowered.com/app/576030/MHRD/


While we're recommending programming games, I think Virtual Circuit Board is a bit underappreciated. It's a pure sandbox, not a game, so you have to bring the ideas yourself, but it hits a really sweet spot somewhere between Minecraft redstone and Logisim.

You just draw traces and add gates on a grid, with a few special IO components like the Nand2Tetris course. But there's some kind of compilation of the circuit, and simulation can be blazing fast, way faster than other programming games/programming friendly sandbox games.


> Virtual Circuit Board is a bit underappreciated.

For the convenience of the reader:

- Website: https://www.virtualcircuitboard.com/

- Steam: https://store.steampowered.com/app/1885690/Virtual_Circuit_B...


There is an emulator on Steam called CRUMB.

Rather than being schematic, it is almost too real.

The 3D lab is equipped with a breadboard, power supplies, signal generators, various passive and active components, and chips such as the 555 timer.

I attempted to use it to complete a lab from my university, where I had to construct a trigger from a chip with 3 NAND gates.
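
(For reference, the classic latch is just two cross-coupled NANDs. Here's a Python sketch that iterates the feedback loop to a fixed point -- the lab chip in question may be wired differently; this is only the two-gate SR latch with active-low inputs.)

    def sr_latch(s_n, r_n, q=1, q_n=0):
        for _ in range(4):              # let the feedback settle
            q = int(not (s_n and q_n))  # Q  = NAND(/S, /Q)
            q_n = int(not (r_n and q))  # /Q = NAND(/R, Q)
        return q, q_n

    print(sr_latch(0, 1))  # set:   (1, 0)
    print(sr_latch(1, 0))  # reset: (0, 1)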

But at this point, getting a membership at the nearest hackerspace may be a decent option.

> But there's some kind of compilation of the circuit

Realistic enough. Back then you could compile your PCB project in Altium Designer, but now this button would be called validate.

Sorry if I wrote too much in this discussion thread.


s/trigger/flip-flop/

I forget that in English it is more often called a latch, and that there are also flip-flops -- two different types of what are called "triggers" in the East.


Definitely recommend Turing Complete. I've been playing it off and on over the last few weeks, just finished the first working CPU. It includes small hints and solutions for most levels, so if you get stuck, you'll always have a way forward. The interesting thing is, for a lot of levels, you can just google the component you're trying to build and use actual logic diagrams as guidance.

The game makes the whole topic a bit easier to grok with its visuals and the ability to step through your circuits when they're running. So, great fun. But beware, if you've been bitten by Factorio addiction, you might be in danger of missing a lot of sleep :)

Also, as some other comments mentioned, I highly recommend the Zachtronics games. Exapunks is amazing. But they're quite different; they're more like puzzle games about programming.


It's a bit different, but I also recommend both TIS-100 and Shenzhen IO. Both of them involve writing in an assembly-esque language, and it's very fun to learn!


I gave up early on TIS-100, but that was before I did nand2tetris. I did nand2tetris before Shenzhen IO and really enjoyed it.

Yet to finish Shenzhen IO, but I blame it on me stupidly doing premature optimisation.


Both TIS and Shenzhen are cases where I'd say make something ugly that works, then go back and fix it hours later when you have new tools and experience.

E.g., the very first puzzle gets an easier solution once you learn about a hidden command.

There's satisfaction in leaving a solution, coming back days later, and going "whoah, this better solution is so obvious" -- which is easy to miss when you're stuck on it for hours.


I can also recommend the Digital Design and Computer Architecture lectures from ETH Zürich if you're trying to understand computers at a lower level:

https://www.youtube.com/playlist?list=PL5Q2soXY2Zi-EImKxYYY1...


this is a gold mine! I've taken an introductory course on computer architecture; unfortunately we didn't get far, and definitely didn't touch on anything about out-of-order execution or the similar magic CPUs do nowadays. this playlist will be my holidays theme

thanks for sharing!


Great memories of this book, and of the period of my life I picked it up in. I was at coffee with my girlfriend saying I'd read you could make all the logic gates using only NANDs, so we got curious and wrote NAND on a bunch of napkins to see if we could figure them all out.

We did, and it was great fun. For each gate we figured out with NANDs, we would write the name of the new gate on a napkin.

We took the napkins and the joy of it home, and after a few days we started combining those gates up as well, trying to eventually figure out an entire ALU. And so along came multiplexors and a bunch of other fascinating stuff.

Eventually we got stumped, but I'd heard about this book and we decided to order it. The rest of the chapters really did help us get an understanding of the low level that had always been mysterious to me as someone with software experience only.

Can't say I ever did make it all the way through in terms of building everything, though. Once it got more into the low software levels I felt I already had enough of an understanding that the work didn't seem so appealing, but the read was still fantastic.


There is also this repo from George Hotz, very interesting! https://github.com/geohot/fromthetransistor


I wasn't sure what I was looking at at first, since there's no material, just a rough outline of a hypothetical course. The initial commit makes it a little clearer:

> Wrote this a few years ago, wanted to put it online. Hiring is hard, a lot of modern CS education is really bad, and it's so hard to find people who understand the modern computer stack from first principles. Maybe if I ever get 12 free weeks again I'll offer this as a play at home course. I want to play too.


It’s funny he says a lot of modern CS education is bad

I did Computer Engineering rather than CS for undergrad and we covered like 80% of the topics in that list

Had multiple courses in Verilog/OS and worked a lot with microcontrollers/FPGAs. Building a CPU in Verilog then writing an assembler/compiler was definitely the highlight.

Was a hard program but I felt like I had a really good understanding of the full stack coming out of it.

Maybe he just isn’t familiar with CE?


Where did you do your undergrad? My son is not having much success finding a college to shoot for as a goal. I think a curriculum like you describe would at least show him some options that exist.


Gatech, UPenn, and U Colorado Boulder all have great CompE programs at both the undergrad and graduate levels.


Georgia Tech has (when I was there at least) a good CMPE program.


Seems to me that CE covers sections 1, 2, 3, 7, and a bit of 5, and CS covers 4, 5, and 6. A traditional CS education should teach 3, even though doing 3 is absolutely not the job of CS grads.


I wonder if the prequel "sand2nand" would be possible as a DIY project?



> Native oxide is stripped off the wafer with a quick dilute HF dip and then they are extensively cleaned in Piranha solution (H2SO4:H2O2), RCA 1 (H2O:NH3:H2O2), RCA 2 (H2O:HCL:H2O2), followed by another dilute HF dip.

Fascinating project, but I'm not going to try this one at home


Oh goodness, this is the kind of thing I've been trying to find for years now: some method of making a computer from scratch, as in from raw materials.

Will have to look into your progress, thanks.


The Great Tinkertoy Computer [1] is made from raw materials, although it's a special-purpose rather than a general-purpose computer, and it had computer-aided design, although back in the day that was via a PDP-10.

[1] https://www.computerhistory.org/collections/catalog/X39.81


Just to clarify, not my project! :)


Are you the same Sam? Either way, nice project wow!


Different Sam! :)


different sam, still has a lot of awesome projects


I took this course on a lark when I was working as a data analyst. It inspired me to change careers. Absolutely excellent course. Definitely do it if you've got the time.


Congratulations! Did you switch to a lower level programming career path?


I'm making apps for a local company now. Still valuable info though. I use the text parsing knowledge I gained from this course quite a bit.


That's pretty good. I'm glad that you made the switch you wanted.


Does this still use a custom hardware description language?

The curriculum here is very solid - my only critique is that it uses a custom HDL instead of Verilog, VHDL, or SystemVerilog.

It wouldn’t have been a huge stretch to do that, and make the skills taught that much more real as a result. Without the practical aspect of exposure to real HDLs, it seems more like a toy than a tool.


I don't think there's anything to be gained by switching to VHDL or Verilog. Their custom HDL is a fine HDL for the uses of the course. There's so much commonality that they're effectively interchangeable syntactically. If the idea was to use VHDL or Verilog such that one could actually make the CPU on silicon or an FPGA, then that opens up a whole can of worms that is well beyond the scope of the course. I do think it would be awesome if they had a sequel that did this, but I think it doesn't make sense in the context of the existing course.


Knowing Verilog, I must disagree. Although it would be more beneficial if the book focused on hardware, its primary goal is to teach something other than hardware-specific definitions. HDL serves as a bridge to illustrate communication with predefined logic gates.

The book should include a mention or an appendix to clarify its real-world applications. By the end of chapter five, I was dissatisfied, feeling a lack of control over the crucial aspects of the hardware I had just built. However, a book can only fill in so many of the missing pieces.


I found Chapter 5 to be quite challenging, largely because it took a while to understand the combinatorial way in which any of the chip parts would get new inputs and so on, especially coming from a traditional sequential programming background.

What specifically did you feel like you were lacking from the language?


This was my experience with the course as well. It was fun but not useful.

Learning Verilog later really opened the world of digital design for me and has let me build actually useful things. It's probably a non-goal for this course to turn people into digital designers, though; it seems more aimed at people with a passing interest in HW as opposed to those who really want to go deep.


This was an amazing course and is one of the most rewarding computer science courses I've taken! I loved that there was nothing left to "magic" and it was the first time I felt like I understood the "full stack" of the Java-like code I was writing right down to the transistors.

Self-plug for a full-blown minesweeper game I made for the final project: https://github.com/billmei/nand2minesweeper It's a complete game with a tutorial, custom RNG, and unit tests, using their hardware simulator.


How many hours/weeks would you estimate it took you to complete the course?


Part 1 was significantly easier than Part 2; Part 1 (the hardware part) only took 2 weekends for me, but every assignment in Part 2 took several weekends each. Granted, this was also because I was learning about operating systems and compilers at the same time; if you're already familiar you could consider skipping Part 2 altogether, or otherwise you might have an easier time than I did.


What I remember from this course is that the first few homeworks can be done in a few hours in one evening, and then you suddenly spend your weekend building a compiler.


So, here's the question I want to ask someone who has dealt with this course:

How much of the topic of compilers is covered in this course? Did you build an optimizing compiler that creates binaries from a [relatively] high-level language such as C? Or did you just create an assembler for a specific architecture?


Without too much experience in writing compilers beyond this course, I'd say that the course focuses on grounding your conceptual/intuitive knowledge of how compilers work rather than any serious exposition into modern-day production-grade compilers.

You write the assembler for the course's own hardware architecture called Hack, then a compiler backend that converts a stack-based VM intermediate representation (IR) to assembly, and finally a compiler frontend that does syntax analysis and code generation to translate the book's high-level language Jack to the IR.
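
For a flavor of the middle stage, here's roughly what the chapter 7 project has you emit for the VM command "add" (a Python sketch; SP is the stack pointer, kept at RAM[0] in the standard Hack memory map):

    def translate_add():
        # Pop y off the stack, then add it into x in place,
        # leaving the sum as the new top of stack.
        return [
            "@SP",     # A = address of SP
            "AM=M-1",  # SP--, and A now points at y (the old top)
            "D=M",     # D = y
            "A=A-1",   # A now points at x
            "M=D+M",   # x = x + y
        ]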


It mimics Java, so your compiler compiles to a bytecode IR. The compiler as described by the book is a bare-bones recursive descent compiler that doesn't do any AST analysis. The compiler is for a toy language called Jack (fake Java) that was intentionally designed to be easy to write a compiler for. LL(1) grammar, I believe. They don't really talk about error handling or more complex compiler topics. No optimization methods are covered.
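
A minimal sketch of what "bare-bones recursive descent" means here (a toy Python parser for a grammar far simpler than Jack's; the names are mine):

    # expr := term (('+' | '-') term)*
    # term := NUMBER
    def parse_expr(tokens):
        node = parse_term(tokens)
        while tokens and tokens[0] in ("+", "-"):
            op = tokens.pop(0)
            node = (op, node, parse_term(tokens))  # build tuples; no later AST passes
        return node

    def parse_term(tokens):
        return int(tokens.pop(0))

    print(parse_expr(["1", "+", "2", "-", "3"]))  # ('-', ('+', 1, 2), 3)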

N2Tetris is like doing the 20% to get 80% of every layer of abstraction. It really scopes down each project so you can cover every layer between logic gates and a high-level language with built-in OS APIs. I think it's a fantastic course, but if you're looking to learn specifically about compilers I'm not sure it'll meet your needs.


I highly recommend you play the game “Turing complete” alongside this course. Will give you an additional visual way of looking at the problems.


My initial interest in the computer's underlying operations was piqued when I read Code by Charles Petzold. I found myself wanting a more hands-on experience and decided to try this. I loved it and learned so much.

I completed the first two projects a few months ago. Hoping to come back to it to complete the rest!


I completed both parts of this on Coursera. It can be a very challenging course at times, but it's thoroughly rewarding. One of the key takeaways was a working mental model of how the hardware and low-level software that underpin your code actually work.


I wish there was something like that for computability theory.

Presumably for historical reasons, professors of theoretical computer science love to talk about abstract machines like finite-state machines, pushdown automata, Turing machines, etc. -- not about logical circuits.

But arguably, logical gates are much more conceptually primitive than most of those automata above! They are basically an implementation of propositional logic, albeit potentially with a dimension of time and delay. And they are nonetheless somewhat close to how actual computers work. So why do they ignore them as a model of computation?

My guess is that they don't talk about them because they only awkwardly harmonize with the classical models of computation (FSMs, TMs and the like) and the neat hierarchy of computability they place them in (what languages in the Chomsky hierarchy they recognize).

For example, Turing machines have two kinds of states: Objects called "states", and the states of the tape cells. The former are finite and the latter are infinite. Pushdown automata make a similar distinction into two types of states. Logical circuits, on the other hand, don't distinguish two different kinds of states in such a way. It's all just circuits.

The classical abstract machines have other problems as well: Arguably, among abstract machines, a real CPU is most similar to a universal Turing machine, because it can execute arbitrary programs. But according to the theory of computation, CPUs are merely equivalent to finite-state machines! Because they lack an equivalent of an infinite tape. Infinity is a strange detail to emphasize here, as logical circuits are merely "potentially infinite" in the same way finite-state machines are merely potentially infinite. But that doesn't make them similar to each other. The relevant difference seems to be that circuits with delay allow for "memory" (like flip-flops), while finite-state automata don't.

I would like to see a course or book on theoretical computer science that tackles this issue: "From NAND to Turing Machines".


> But arguably, logical gates are much more conceptually primitive than most of those automata above! They are basically an implementation of propositional logic, albeit potentially with a dimension of time and delay. And they are nonetheless somewhat close to how actual computers work. So why do they ignore them as a model of computation?

Simple: logic circuits for propositional logic are not Turing-complete, and in a lecture about theoretical computer science one wants to teach more sophisticated, mathematical models of computation.

On the other hand, in basically every beginner lecture about computer engineering (in German: "technische Informatik"), they will teach you how the transition function of a finite-state machine can be implemented via combinational logic (https://en.wikipedia.org/wiki/Combinational_logic), which can be transformed into logic gates for implementing a Mealy machine (https://en.wikipedia.org/wiki/Mealy_machine) or Moore machine (https://en.wikipedia.org/wiki/Moore_machine) in hardware.
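
A sketch of that decomposition in Python (a Moore machine for input parity; the names are my own): the next-state function is pure combinational logic, and the state variable plays the role of the register.

    def next_state(state, bit):  # combinational next-state logic
        return state ^ bit

    def output(state):           # Moore machine: output depends on state alone
        return state

    state = 0                    # the state register, reset to 0
    for bit in [1, 0, 1, 1]:     # one loop iteration per clock edge
        state = next_state(state, bit)
    print(output(state))         # 1 -> an odd number of 1s was seen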


> Simple: logic circuits for propositional logic are not Turing-complete, and in a lecture about theoretical computer science one wants to teach such more sophisticated, mathematical models of computation.

Logical circuits with delay (sequential [1] rather than combinational logic) are indeed not Turing complete -- but in the same uninteresting sense that a CPU is not Turing complete: In a conceptually irrelevant way. As I argued, Turing machines are precisely not "much more sophisticated" than CPUs or the circuits they are made out of. Turing machines merely have infinite rather than potentially infinite memory.

You could modify the concept of a Turing machine such that its tape is potentially rather than actually infinite, and it would change nothing of substance. Yet this machine would suddenly only be considered equivalent to a finite-state machine.

Apart from that, a perhaps theoretically interesting point about combinational logic (circuits without time/delay) and finite state machines would be that combinational logic is apparently computationally weaker than finite-state machines. Wikipedia doesn't say anything about that explicitly, but the "classes of automata" image [2] they include in the article does suggest it. But that, assuming it's true, should be taught in theoretical computer science class, not in computer engineering (if the latter mentions it at all).

But, much more importantly, they -- the theoretical computer science professors, not the engineering guys -- should explain the theoretical relation between circuits with delay (sequential logic) and various automata, including Turing machines.

The likely fact is that they ignore circuits because they don't fit well into their supposedly neat picture of abstract machines. In fact, they cite their reliance on infinity as an important difference between the models of computation in question.

[1] https://en.wikipedia.org/wiki/Sequential_logic

[2] https://en.wikipedia.org/wiki/File:Automata_theory.svg


as you say, the chomsky hierarchy is surely an influence, but i don't think that's the main thing

i think it's probably because a turing machine can do universal computation and can be described completely in a couple of paragraphs, while you need at least several hundred logic gates to describe something that can do universal computation (if you give it infinite memory), and then you still don't have a convincing argument that what it does is universal in any interesting sense

until you bring in turing machines, anyway

you can write a turing machine in about one line of c; the description of a computer in terms of nands doesn't give you that
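
for concreteness, a turing machine stepper in a few lines of python (mine, and not quite a one-liner, but close in spirit):

    # rules map (state, symbol) -> (new state, symbol to write, head move)
    def run(rules, tape, state="start", head=0):
        tape = dict(enumerate(tape))  # sparse tape, blank = "_"
        while state != "halt":
            state, write, move = rules[(state, tape.get(head, "_"))]
            tape[head] = write
            head += move
        return "".join(tape[i] for i in sorted(tape))

    # example: flip bits until the first blank
    rules = {
        ("start", "0"): ("start", "1", 1),
        ("start", "1"): ("start", "0", 1),
        ("start", "_"): ("halt", "_", 0),
    }
    print(run(rules, "0110"))  # 1001_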


That seems to be a somewhat strange comparison. Logical circuits are arguably not much more complex than finite-state machines, and theoretical computer science professors don't have an aversion to those. So it doesn't seem like they find logical circuits too primitive. They even seem to strive for primitivity, as Turing machines are much more conceptually primitive than things like register machines or actual CPUs.

Moreover, a logical circuit itself is actually easier to describe than a Turing machine. Both logical circuits with delay and Turing machines can do "universal computation" for any reasonable sense of the term (demanding an infinite element is arguably not reasonable).

And to give a "convincing argument" that they are universal is quite difficult even for Turing machines. What would such an argument even be? In practice, it's more the fact that there are no known counterexamples which causes us to believe in the Church-Turing thesis (that any problem that we would intuitively consider "computable" is also computable with a Turing machine or logical circuit, and vice versa), rather than any one (positive) "convincing argument".


a nand gate is not too primitive, of course, but building an actual cpu out of nand gates involves several hundred of them, and that circuit is harder to describe than a universal turing machine

turing's original paper gave a convincing argument that turing machines were universal; he said that they could do anything a mathematician could do, since the mathematician can only hold a finite number of symbols from a finite set in his memory at once, and can only fit a finite set of symbols from a finite set on a page in his field of view. so the squares on the turing-machine tape were originally pages in a notebook

it seems like you would benefit more from exposing yourself to things you find strange; you could start by reading turing's strange paper https://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf

strangeness doesn't guarantee insightfulness, but it's a necessary precondition to it; if you continue to dismiss all strange arguments you will dismiss everything from which you could possibly learn anything, preserving your ignorance like a precious jewel


> a nand gate is not too primitive, of course, but building an actual cpu out of nand gates involves several hundred of them, and that circuit is harder to describe than a universal turing machine

Building a CPU with a Turing machine would also be quite complex. Probably less complex, but only because logical circuits are more primitive.

> turing's original paper gave a convincing argument that turing machines were universal; he said that they could do anything a mathematician could do, since the mathematician can only hold a finite number of symbols from a finite set in his memory at once, and can only fit a finite set of symbols from a finite set on a page in his field of view. so the squares on the turing-machine tape were originally pages in a notebook

Yet according to mainstream theoretical computer science (professors), you didn't mention the allegedly relevant property of Turing machines: That the tape is infinitely long. Otherwise it is considered to be equivalent to a mere finite-state machine. Which is, of course, absurd. I think the main difference between FSMs and Turing machines (or logic circuits with delay) is that the latter allow for some sort of "memory", which is (I would guess) a special type of state that finite-state machines don't support.


mostly your comment is wrong from beginning to end

it is important for novices reading this thread to know not to treat it as useful information


You might get more mileage out of Abstract State Machines. They operate on structures instead of characters. Quite a few tools exist for them, too.


not sure what you mean but i'm glad to hear from you again


Maybe something like this?

https://computationbook.com/


Judging from the overview, they seem to implement classical abstract machines using Ruby. Implementing something relatively simple (an abstract machine) with something complex (Ruby) is quite unilluminating. I was asking for someone who implements them using logical circuits, while also explaining how their different "power" (in terms of the languages they recognize in the Chomsky hierarchy) is reflected in their logical circuits.


The book is great - so, so cool seeing the layer upon layer of logic (and abstractions) that build up.


Cannot recommend this course enough


My favorite thing about the book is that each section builds on things, but you don't have to go through the whole thing to get there. So, for example, if you did a lot of hardware in the past but never got around to writing a compiler, you could in theory start there. Or if you did a lot of software before but never actually built an ALU, etc, you can get a lot out of just doing the hardware sections.


This project helped me get my first job after college.


Reminded of the Code Golf challenge to build a working game of Tetris in Conway's Game of Life: https://codegolf.stackexchange.com/questions/11880/build-a-w...


If I'm into web development, how practically useful is learning low level stuff like this, factoring in opportunity cost?


I see this question as analogous to 'how useful is building a car from scratch?'

The person building a cat from scratch will undoubtedly understand more than others, and may be able to fix that home-brew car. They may, with some difficulty, be able to apply that understanding to other cars.

But, you definitely don't need to build a car from scratch to be a race car driver.


building a cat from scratch

That would be an interesting project.


> building a cat from scratch

> That would be an interesting project.

Here is the source code of the OpenBSD implementation of cat:

> https://github.com/openbsd/src/blob/master/bin/cat/cat.c

and here of the GNU coreutils implementation:

> https://github.com/coreutils/coreutils/blob/master/src/cat.c

Thus: I don't think building a cat from scratch or creating a tutorial about that topic is particularly hard (even though the HN audience would likely be interested in it). :-)


Building a shell from scratch (or some pre-made starting point) seems to be an exercise in a lot of operating systems courses and also in Tanenbaum's book on Operating Systems.

There is this guide, divided in parts: https://github.com/tokenrove/build-your-own-shell

I think you're right, and implementing coreutils is a nice exercise in systems programming. Maybe it can even be used to create some automated tasks with tests on Codewars.

Once upon a time I implemented expr(1), but it was too simple, without the regex part.


I'm 99% sure that kjs3 was joking about building the kind of cat that goes "meow", not a program. ;)


100%. And considering the OP mistyped "car" as "cat", I'm really, really hoping aleph_minus_one was making a different kind of attempt at humor, but around here you never know.


You should then alias "cat" to print "meow" instead of concatenating files.

;-) ;-) ;-)


It depends whether you have any opportunities at the moment that are directly attributable to a singular focus on "web development". That's not a productive or healthy way to look at things imo—though it might seem like it if you're relatively new—but if your prospects are good right now, then this won't necessarily help you keep your current job or get another one. It will help you reason about code, logic, and computers in a way you wouldn't otherwise, but you'll have a hard time if you're not enthusiastic about learning for the novelty of it, because it will eventually be quite challenging and you'll need to allocate significant amounts of time.

I say that it's not a productive way to look at it because intensity and longevity of singular professional focus only last as long as your intellectual and physical infrastructure around it, which means doing things that are somewhere between a short and long distance away from your core subject: sleeping, learning fundamentals, algorithms, design, hiking, etc. (Web Dev, as with anything, can get just as intensely unstimulating as it can stimulating.)

Also, learning how your language kind of gets compiled is just fascinating.


To elaborate on this slightly: when I eventually burnt out from just grinding frontend and working a stressful frontend job, I spoke to someone much more experienced than me in medicine, who suggested that basically anything can be interesting if you keep going deeper. That's stuck with me in the 6 years since, and I don't worry as much about drilling web stuff now.


In my opinion, learning anything in your field is useful because it helps you form a better mental representation of what you're doing. Will building electronics directly help with mundane web dev? Maybe not. But if you think it's cool, do it! Have fun! Broaden your horizons!


If opportunity cost is how you decide such things, I'm guessing this is not for you. There's undoubtedly some new JS framework that's about to be The Next Big Thing.


As a backend web developer, you can learn about the difference between the pre-fork and polling models in web servers, which is interconnected with c10k:

http://www.kegel.com/c10k.html

This will give you ability to reason about the web server configuration you want to use.

But both fork(2) and epoll(7) [kqueue(2), iocp] would stay at low level relatively of place where you operate.
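
For a taste of the polling model, here's a minimal sketch using Python's standard selectors module (which wraps epoll/kqueue) -- a toy echo server, not production code:

    import selectors, socket

    sel = selectors.DefaultSelector()
    srv = socket.socket()
    srv.bind(("127.0.0.1", 8080))
    srv.listen()
    srv.setblocking(False)
    sel.register(srv, selectors.EVENT_READ)

    while True:
        for key, _ in sel.select():  # one process multiplexing many sockets
            if key.fileobj is srv:   # the listening socket is readable
                conn, _ = srv.accept()
                conn.setblocking(False)
                sel.register(conn, selectors.EVENT_READ)
            else:                    # a client sent data (or closed)
                data = key.fileobj.recv(4096)
                if data:
                    key.fileobj.sendall(data)
                else:
                    sel.unregister(key.fileobj)
                    key.fileobj.close()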

Don't know what to say about frontend though, but there are probably some new points of view on JS you can get by implementing an interpreter for it, in courses like Crafting Interpreters.


Understanding intuitively how compilers work, and how the various translators to/from VMs and machine code work, is, I think, pretty useful. Maybe not in day-to-day web dev work, but it will help you around the edges.


opportunity cost is like 100 hours, which is pretty small

usefulness for web development is that probably right now there are things you think are 'too hard' or even inconceivable

after those 100 hours nothing will be inconceivable


Nothing is going to beat the SAP-1 computer and original textbook by Malvino for this experience.


Is the book you're referring to Digital Computer Electronics by Albert Paul Malvino?


Yes, that’s the one. I built one of those about 7 years ago.


Thank you! You built it in hardware?


Yes, there is a component list at the back of the book that lists everything you need. The whole book is about building the simple-as-possible computer while explaining each part of it. It was a fantastic experience.

Pretty sure that’s the same one that Ben Eater built based off of this book.


Oh, nice. Thanks for the reference. I'll check it out.


Similar: "Show HN: Tetris, but the blocks are ARM instructions that execute in the browser" https://news.ycombinator.com/item?id=37086102


This is a great course, well put together, and very fun.

I did the second part in Haskell too! I wasn't sure what their container environment was like, so I stuck with only the `base` library, and it worked out well enough.


I intend to pick back up from where I left off about a year and a half ago, I believe chapter 6. Super interesting project where I learned a lot about using an HDL, computer architecture, and gates in general.


I graduated from this and I cannot recommend it highly enough.

If you don't have a computer science or EE background, it completely de-mystifies the fundamentals of your machine in a deeply engaging way.


Are the estimated hours realistic? 43 hours for Part 1 and 89 for Part 2.


It'll vary a lot by your background and how much you play around along the way.

If you're pretty familiar with most of the concepts and stick to the requirements, you can do the whole thing in 40-50h. OTOH, if you're encountering new concepts and really want to internalize them and/or if you take fun detours then I'd plan on 100-150h or more.


It also varies by how professionally you build your solutions. If you start wanting to test things, give proper errors, etc., it will start increasing the time needed by quite a lot.


Much like any software, it depends on how quickly you get the hang of it, how many blocks of good time you can allocate, and what your existing exposure is to digital logic and circuits.

I'm on the final chapter of the hardware section after probably 5 months; the last 2 months haven't been nearly as focused, and it showed immediately in my progress. Overall, my success rate looks like a bell curve: confusing and slow at first, then not confusing and more productive, and then the CPU took me like a month of periodic attempts. Now that I'm at the assembler, I feel like this is the easier part of the sequential programming involved in the latter half of the course, and it'll ramp up significantly from here.

89 seems plausible if you get lucky, already have good combinational-logic exposure, and can really dedicate a decent portion of most days to it, but I'd call that a generous minimum.


I only did part 1 so far, but it felt like less than that. Whether that was because of experience, or because it was fun enough that it felt shorter, I don't know!


Sounds about right from my experience. I did this in a 10-week academic quarter spending ~10-15 hours a week on the projects. Might be a bit more time if you have little background or lack someone to explain things to you if/when you get stuck. The compiler and OS are by far the most arduous parts.


I did this over a summer and really recommend it!

The pacing was just right for me


I want sand2tetris next.


Sand2NAND is the only thing missing.

I'm reminded, though, of the late Dave Gingery's series of books, Shop From Scrap [1], where he tries to take you from building a foundry up to machine tools:

[1] https://gingerybookstore.com/MetalWorkingShopFromScrapSeries...


Still waiting for Quantum Mechanics to Tetris


Loved this course; seriously one of my all-time favorites, end to end. I have yet to find a course with the same fluid journey.


Related. Others?

Ask HN: What books or courses do you know similar to "From Nand to Tetris"? - https://news.ycombinator.com/item?id=36853931 - July 2023 (29 comments)

From Nand to Tetris (Building a Modern Computer From First Principles) - https://news.ycombinator.com/item?id=27030046 - May 2021 (1 comment)

The Elements of Computing Systems, Second Edition - https://news.ycombinator.com/item?id=26036790 - Feb 2021 (97 comments)

July 2021: 2nd edition of The Elements of Computing Systems - https://news.ycombinator.com/item?id=25329763 - Dec 2020 (1 comment)

From Nand to Tetris: Building a Modern Computer from First Principles - https://news.ycombinator.com/item?id=18519883 - Nov 2018 (47 comments)

Build a Modern Computer from First Principles: Nand to Tetris Part II - https://news.ycombinator.com/item?id=14526344 - June 2017 (35 comments)

Build a Modern Computer from First Principles: From Nand to Tetris - https://news.ycombinator.com/item?id=13209452 - Dec 2016 (17 comments)

Building a Modern Computer from First Principles - https://news.ycombinator.com/item?id=12333508 - Aug 2016 (26 comments)

Nand2Tetris: Building a Modern Computer from First Principles - https://news.ycombinator.com/item?id=10369795 - Oct 2015 (1 comment)

VHDL implementation of Hack computer from “Nand to Tetris” - https://news.ycombinator.com/item?id=10021275 - Aug 2015 (4 comments)

Nand to Tetris 1 - https://news.ycombinator.com/item?id=9593114 - May 2015 (24 comments)

From NAND to Tetris: The Elements of Computing Systems [repost] - https://news.ycombinator.com/item?id=6963338 - Dec 2013 (3 comments)

Building a Modern Computer from First Principles - https://news.ycombinator.com/item?id=5888705 - June 2013 (83 comments)

Building a Modern Computer from First Principles - https://news.ycombinator.com/item?id=4643836 - Oct 2012 (1 comment)

Online course: Build your own simulated computer, assembler, lang, OS, & game - https://news.ycombinator.com/item?id=2928973 - Aug 2011 (29 comments)

Elements of Computing Systems - Building a Modern Computer from 1st Principles - https://news.ycombinator.com/item?id=2273098 - Feb 2011 (2 comments)

The Elements of Computing Systems ("From NAND to Tetris") - https://news.ycombinator.com/item?id=1834864 - Oct 2010 (2 comments)

From Nand to Tetris in 12 steps - https://news.ycombinator.com/item?id=399141 - Dec 2008 (3 comments)

Building a Modern Computer from First Principles - https://news.ycombinator.com/item?id=205322 - May 2008 (9 comments)


I loved and completed this course, just wish I could actually be working on this in my day job!


Cannot recommend it enough.


For those who have completed this: do you recommend the book or the MOOC?


I found the book more accessible, whether you just want to flip through it or have it open while completing some of the projects. I'm at the last chapter of the book now, whereas a few years back, doing the course, I only completed 3 projects. But of course, if you've completed many other MOOCs and like that format, go for it. Just start and see where it takes you.


Very cool article


[flagged]


Think more along the lines of von Neumann machines, rather than a M2 Macbook Pro. The 40s rather than right now. "Modern" can denote a specific era in the past, or the current time, depending on the context in which the word is used.


>"Modern" can denote a specific era in the past

No, it can't. Modern can not refer to a previous era.


Definition of "modern art": "Modern art includes artistic work produced during the period extending roughly from the 1860s to the 1970s".

via https://en.wikipedia.org/wiki/Modern_art

"Modern" can refer to a previous era, and in fact it does!


This is a set phrase. When the phrase was coined, "modern" did refer to the current era. "Modern computer" is not a set phrase coined back in the 40s.



>The modern era is the period of human history that succeeds the Middle Ages (which ended around 1500 AD) up to the present.

This was coined around 1500 AD to refer to the current era. If the present is the modern era, then that backs up my position that "modern" refers to the current time.


Your position was that it can only possibly refer to the current time which is clearly inaccurate as shown many times throughout this pedantithread. The fact that it sometimes or even often refers to that doesn't make it any less inaccurate.


But this is just a matter of how thinly one chooses to partition the axis of time. You cannot have as many eras as points in time; therefore, eras are intervals of time. So today and the 1940s can well belong to the same era, especially given that human history extends thousands of years into the past.


In the context of computing, due to how fast things have progressed, the interval of time is relatively short.


This is incredibly incorrect. If you are going to bore Hacker News with pedantry, at least bother to look up that the "modern" era ended somewhere around 1945.

So yes, it refers to a specific era in the past.


The modern era was the current era when that term was coined. The usage of modern in modern computers does not refer to the modern era. Here is a link to an example of Intel using the term "Modern CPU Architecture."

https://www.intel.com/content/www/us/en/architecture-and-tec...

Here Intel is talking about CPUs from the present. It isn't talking about CPUs from the 1940s.


Quite a hill to die on. If I were to exhibit the same kind of pedantry you are, where a compound noun can have exactly one meaning regardless of context and specialized terms are disallowed, then I'd point out that computer != CPU so your example is irrelevant.

Folks here are genuinely trying to help. "Modern computer" clearly has a special meaning in the context of building a computer from scratch; the term is used in contrast to calculators or specialized number-crunching machines, both of which are computers and are of interest to tinkerers.


>where a compound noun can have exactly one meaning

I am not trying to assert that. In computing, when you set out to design or build something "modern", that has one expected meaning.

>then I'd point out that computer != CPU

The book focuses on the CPU part of the computer, but the way memory, graphics, and input are handled isn't modern either.

>the term is used in contrast to calculators or specialized number-crunching machines

Those have CPUs in them too and most would be more advanced than the CPU from this book.


I think you're looking for the term "contemporary", at least in the context of art and design.


Man, there's not only postmodern, there's post-contemporary. Where do you even go from there?

https://en.wikipedia.org/wiki/Post-contemporary


That's it. We're nearing the end of time. I'm sorry to be the bearer of bad news, but this is it. I had a good time, thanks everyone! Good bye.



