Logic.ly – Digital logic simulator for teaching logic gates and digital circuits (logic.ly)
203 points by peter_d_sherman on June 10, 2020 | 23 comments



In a similar vein, NandGame (http://nandgame.com/) is an online game where the challenge is to progressively build a microprocessor, starting from simple logic gates.

Show HN thread from a couple of years ago: https://news.ycombinator.com/item?id=17508151


I really liked https://asteriskman7.github.io/dldtg/

It's a cookie-clicker-type game mixed with an almost VHDL-like circuit design language.


(Disclosure: I wrote the simulation engine that powers CircuitLab, a mixed-mode [i.e. analog+digital combined] circuit simulator, which extends the digital-only logic simulation concept discussed here.)

If you find it fun to play with digital logic, you should first look at combinational logic, and then add some registers to get clocked logic with memory. Once you've got that down, I'd recommend two next steps:

1. Look at some simple circuits that combine analog and digital. For example, [1] is a digital 4-bit counter made up of four half-adders. (It's called a half adder because it only adds two bits A+B, producing a value 0, 1, or 2. In contrast, a full adder adds three bits A+B+C, producing a value 0, 1, 2, or 3.) The four output bits of the 4-bit counter are connected to a simple op-amp and resistor based digital-to-analog converter (DAC). This helps you understand that we have lots of ways of representing signals, and in engineering practice we combine these modalities to useful effect. (A small adder sketch in code follows this list.)

2. Instead of merely using software to model circuits, understand that circuits are a great way to model all sorts of real-world problems. You may have heard of "analog computers" [2], which predate digital ones and were used for modeling all sorts of engineering problems. You can now use a circuit simulator to model mechanical systems, thermal systems, or even the spread of COVID-19. I've done the latter here [3], using a few capacitors and current sources to implement the differential equations of an epidemic. Because you can define current sources algebraically (i.e. the current can be proportional to other currents and voltages in the circuit), you can easily take advantage of the simulator's underlying ability to simulate arbitrary systems of differential equations. I've explained the COVID-19 model here [4], as previously discussed on HN. (A sketch of the epidemic equations appears after the links below.)
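
A minimal sketch of the adder logic from point 1 above, in Python rather than as a circuit (the function names are hypothetical; gate delays and the op-amp DAC are ignored). A half adder is an XOR plus an AND, a full adder chains two half adders, and rippling the carry through four full adders adds 4-bit values:

    def half_adder(a, b):
        """Add two bits; result is 0, 1, or 2, encoded as (sum, carry)."""
        return a ^ b, a & b          # sum = XOR, carry = AND

    def full_adder(a, b, c):
        """Add three bits; result is 0..3, encoded as (sum, carry)."""
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, c)
        return s2, c1 | c2           # two half adders plus an OR make a full adder

    def add_4bit(x, y):
        """Ripple-carry addition of two 4-bit values given as bit lists (LSB first)."""
        carry, out = 0, []
        for a, b in zip(x, y):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        return out, carry

    # e.g. 0b0101 + 0b0011 = 0b1000
    print(add_4bit([1, 0, 1, 0], [1, 1, 0, 0]))   # ([0, 0, 0, 1], 0)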

[1] https://www.circuitlab.com/editor/53xa3r/

[2] https://en.wikipedia.org/wiki/Analog_computer

[3] https://www.circuitlab.com/editor/zubfhu8p3q3v/

[4] https://www.circuitlab.com/blog/2020/05/28/surprising-covid-...
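
And a rough sketch of the idea in point 2, assuming the standard SIR epidemic equations as a stand-in for the model in [3] (the linked circuit may differ in detail). A circuit simulator integrates dV/dt = I/C for each capacitor; here the same integration is done explicitly with a plain Euler step in Python:

    def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, dt=0.1, days=160):
        """Integrate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I."""
        s, i, r = s0, i0, 0.0
        history = []
        for step in range(int(days / dt)):
            ds = -beta * s * i             # "current source" draining S
            di = beta * s * i - gamma * i  # net "current" into I
            dr = gamma * i                 # recoveries accumulate in R
            s, i, r = s + ds * dt, i + di * dt, r + dr * dt
            history.append((step * dt, s, i, r))
        return history

    # Find when infections peak in this illustrative run
    peak_t, _, peak_i, _ = max(simulate_sir(), key=lambda row: row[2])
    print(f"infections peak near day {peak_t:.0f} at {peak_i:.1%} of the population")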


If this sort of thing looks fun but you'd want more of a game, then one unique gaming company (Zachtronics) builds games where you create "fantasy circuits".

In TIS-100 you write assembly code for an unusual 12-node interconnected machine.

In Shenzhen I/O you write code to interface with pretend hardware, again with highly limited programming space but considerably more accessible than TIS-100: https://store.steampowered.com/app/504210/SHENZHEN_IO/

An interesting twist is that most of the game manual is pretend hardware datasheets.

Hey, Zachtronics should do Microservices: The Game next.


Logic.ly or similar are great resources to use in conjunction with Charles Petzold's Code and the Nand to Tetris book. Together they pretty much enable you to learn by doing, from the simplest circuit imaginable (power, switch, and lightbulb) to building a fully working computer. I know because I used them in that way and found it doable, whereas a lot of the computer/programming topics discussed here are way over my head.


Brainbox (http://www.brainbox-demo.de/home/) is another similar logic simulator. It is open source and the code is well-written.


Back in high school I used a similar program to teach myself digital logic, Logisim [0]. Although it's no longer maintained by its original author, it does still run, and I don't remember any outstanding trouble with it.

I think the main benefit compared to Logic.ly, aside from being FOSS, is that Logisim has a number of different simulated input/output devices (button, joystick, keyboard, LED, 7-seg, hex digit, LED matrix, and tty), which is useful if you want to try and build a toy CPU or other such things. I recall it also had pretty good support for modules.

That said, it looks like Logic.ly is maybe targeting more of an intro level of coursework, where stuff like TTYs and keyboards wouldn't be relevant.

0 - http://www.cburch.com/logisim/


Another cool digital logic simulator is http://boolr.me/

I played with it while reading the excellent book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.


If you are interested in an offline FOSS tool, I would suggest Logisim.


Why not put the wires on a grid? Looks gross compared to every other circuit tool that puts the wires on a neat grid.

My personal favorite digital logic simulator is Logisim

http://www.cburch.com/logisim/


I used logic.ly for a class last semester; we used it for about a month and built up to a very basic CPU. I found it pretty easy to pick up and fun to work in. Wouldn't be surprised if a lot of online computer arch classes use this in the coming online semester.


Cool. My favourite class in university went from the low level up: transistors to gates to assembly. We used Logisim and programmed our logic to an FPGA board. The final logic lab was building the circuit for a processor with two instructions of your choice, plus registers.


I love it.

How is this implemented?

Is there a metamodel behind it? Is the editor completely self-made? etc.


(Disclosure: I wrote the simulation engine behind CircuitLab, a mixed-mode [i.e. analog+digital combined] circuit simulator.)

Writing software to do digital logic simulation is quite simple. Everything is event-driven: if a signal "A" changes from 0->1 at time t_0, and this signal is an input to some gate "X", then at time t_0 + t_p (propagation delay) the output of that gate will update with the new value, as defined by that particular gate/register/whatever. So events can originate from (1) external signals (such as buttons/switches), (2) clocks, or (3) an earlier gate updating its output after a propagation delay. You can then just propagate changes throughout the digital circuit, looking at what's connected to the signal that just changed state.
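
As a toy illustration of that event-driven scheme (a sketch only, not CircuitLab's actual code): a priority queue holds (time, signal, value) events, and each gate re-evaluates when one of its inputs changes, scheduling its new output after a propagation delay t_p.

    import heapq

    class Simulator:
        def __init__(self):
            self.values = {}       # signal name -> current value (0/1)
            self.listeners = {}    # signal name -> gates it feeds
            self.queue = []        # (time, signal, value) event heap

        def add_gate(self, fn, inputs, output, t_p):
            gate = (fn, inputs, output, t_p)
            for sig in inputs:
                self.listeners.setdefault(sig, []).append(gate)

        def schedule(self, time, signal, value):
            heapq.heappush(self.queue, (time, signal, value))

        def run(self):
            while self.queue:
                time, signal, value = heapq.heappop(self.queue)
                if self.values.get(signal) == value:
                    continue                      # no change, nothing to propagate
                self.values[signal] = value
                for fn, inputs, output, t_p in self.listeners.get(signal, []):
                    new = fn(*(self.values.get(s, 0) for s in inputs))
                    self.schedule(time + t_p, output, new)   # output updates after delay
            return self.values

    sim = Simulator()
    sim.add_gate(lambda a, b: a & b, ["A", "B"], "X", t_p=2)   # a single AND gate
    sim.schedule(0, "A", 1)
    sim.schedule(0, "B", 1)
    print(sim.run())   # {'A': 1, 'B': 1, 'X': 1}  -- X settles at t = 2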

It gets a bit more tricky once you combine analog and digital simulations simultaneously. :)


What do you do if the propagation delays are unknown or not constant (e.g., due to manufacturing process variations)?


Real digital logic is usually clocked at a rate that is slow enough for propagation and other effects to settle, and/or, for chip design, parts are speed-binned after manufacture.

The principle is the same - you drop the clock rate until the circuit works reliably. You can usually do some ballpark estimates while designing - e.g. a small board full of TTL will usually be happy clocked at 10% of the theoretical maximum, it may work at 25%, 50% is optimistic unless the circuit is trivial, and 100% is only possible for a single gate or two if the board design is also fast enough.
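
As a back-of-the-envelope sketch of that "drop the clock until it settles" rule: the maximum clock rate is set by the slowest (critical) path through the combinational logic, plus register overhead, derated by a margin. The numbers below are made-up illustrative values, not datasheet figures.

    def max_clock_hz(gate_delays_ns, setup_ns=2.0, clk_to_q_ns=1.5, margin=0.5):
        """Worst-case path delay -> maximum reliable clock frequency.

        margin is the derating factor (0.5 ~= "run at 50% of the theoretical max").
        """
        critical_path_ns = sum(gate_delays_ns)  # gates in series on the slowest path
        period_ns = (critical_path_ns + setup_ns + clk_to_q_ns) / margin
        return 1e9 / period_ns

    # e.g. five gates of ~10 ns each on the critical path (TTL-ish numbers)
    print(f"{max_clock_hz([10, 10, 10, 10, 10]) / 1e6:.1f} MHz")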

Occasionally you'll see non-clocked designs, but they're much harder to model at speed because everything turns into a monostable, and each monostable will have a time tolerance. This is a bad thing from a design and reliability POV.

It's the same principle at the pro level, with the difference that you have multi-layer boards and much faster clock rates. The board routing uses some tricks to minimise propagation, switching transients, and transmission line reflections, but mostly there's incredibly advanced engineering and modelling involved in making a small-ish board that works reliably at GHz rates - certainly more complex than trying to guesstimate propagation using a very simple delay model.


Thank you for your answer. Since non-clocked designs are hard to model and clocked designs at high rates are also hard to model, is there ever a crossover point in hardness (e.g., when they exceed a certain number of components or physical size), or are clocked designs always considered asymptotically easier? On a related note, does the competitive advantage of one hardware manufacturer over another reduce primarily to a matter of which one has better circuit simulation and routing software?


I have just a basic understanding, but for example in a serial-to-digital converter they will use a base clock and phase-delayed clock signals to capture multiple samples in one base clock cycle instead of just cranking up the base clock frequency.
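
A tiny sketch of that phase-delayed-clock idea (illustrative only, with made-up numbers): N clocks at the base frequency, each offset by 1/N of a period, give N evenly spaced sample instants per base cycle without raising the base rate.

    def sample_times(base_freq_hz, n_phases, n_cycles):
        """Sample instants from n_phases clocks offset by 1/n_phases of a period."""
        period = 1.0 / base_freq_hz
        return [cycle * period + phase * period / n_phases
                for cycle in range(n_cycles)
                for phase in range(n_phases)]

    # 4 phase-shifted 100 MHz clocks -> a sample every 2.5 ns (400 MS/s effective)
    print(sample_times(100e6, n_phases=4, n_cycles=2))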



This makes me think of the Von Neumann bottleneck. How much do you think our CPU designs are affecting the high level languages of today?


Is the web version feature-complete? What's included in the desktop version that is not available in the online demo?


Simply awesome!


this is awesome




