Hacker News
The Analog Thing: an open source, educational, low-cost modern analog computer (the-analog-thing.org)
272 points by uticus on June 2, 2023 | 130 comments



A very popular analog computer patchable with 4mm banana jacks was the Comdyna GP-6. I went down a rabbit hole of reverse engineering it from photos and schematics years ago in order to build a version that would fit in a 3U panel for Eurorack modular synth interfacing. Never finished it, but the total BOM was... well, not much. The panel was going to cost more than the board+opamps+passives. I wanted to buy a real Comdyna but I remember the cheapest I found one for sale was ~$700 and I thought that was expensive. Not so much when you compare it to this unit which is housed in just open PCBs and a much smaller format.

If you are a student and want to learn a bit more about how operational amplifiers, resistors, and capacitors alone can solve differential equations... you can get an educational license for Matlab for nothing and Simulink can emulate all the analog circuitry you want and more.

Another approach might be using the Eurorack simulation software VCV Rack. If there isn't a set of modules you can install that do basic gain, integrators, multipliers, comparators, etc., then it would be very easy for someone to write one.


> If there isn't a set of modules you can install that do basic gain, integrators, multipliers, comparators, etc., then it would be very easy for someone to write one.

VCV Rack cannot do Zero Delay Feedback because each cable adds one sample of delay. You can do it inside your module, of course.
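To illustrate the point, here's a rough Python sketch (not VCV Rack code; the filter and coefficient are invented for illustration) of the difference between feedback that arrives one sample late and a zero-delay-feedback formulation that solves the implicit equation inside the module:

```python
def naive_lowpass(xs, g):
    """One-pole lowpass where the feedback value is one sample old."""
    y, out = 0.0, []
    for x in xs:
        y = y + g * (x - y)          # uses the previous output sample
        out.append(y)
    return out

def zdf_lowpass(xs, g):
    """Same filter, but the feedback equation is solved instantaneously."""
    y, out = 0.0, []
    for x in xs:
        y = (y + g * x) / (1.0 + g)  # solve y_new = y + g*(x - y_new)
        out.append(y)
    return out

step = [1.0] * 10
print(naive_lowpass(step, 0.5)[:3])  # [0.5, 0.75, 0.875]
print(zdf_lowpass(step, 0.5)[:3])
```

Patching the same loop through cables in VCV Rack forces the "naive" behavior; inside one module you can solve the feedback algebraically, which is what ZDF designs do.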


Simulink is of course the modern replacement for an analog computer and does a lot more. For open source there is Scilab xcos and underneath it OpenModelica.

It would be crude and inefficient but you also could use spice and rig up op amps just like a physical analog computer if you just want to learn or if spice is your comfort zone. Or if you want to add additional analog circuitry or spice models. It’s the 21st century so it will be fast enough computationally for small to medium sized models.


Might I suggest the LMN-3 - soup to nuts the most positive project I see and gaining traction and affordability: https://youtu.be/h5UmPTttN1s



I've had an eBay watchlist for the Comdyna GP-6 for a few years now. Seems like I'm not the only one in this hole. :-)


Question, since you appear familiar with math simulation: could you do what you're describing with Octave instead of Matlab?


I'm not familiar with Matlab or Octave :) but I do know the discrete and continuous mathematics involved here, and since the https://octave.org/ homepage says "the Octave syntax is largely compatible with Matlab", the answer is yes.

Just to ELI5 for folks: the cool thing about analog computers (or digital approximations or simulations) is that since they are actually integrating (storing charge in a capacitor as the voltage varies) or actually differentiating (current through a capacitor varies with the voltage's rate of change), they can calculate things that would be very complex to do on paper.

Or more plainly: if the rate at which water flows out of the bathtub drain depends on the water pressure above it, which depends on the depth of the water in the tub, you can study all the math involved (what you did in calculus), or you can just put water in the tub and start your stopwatch (analog computer). Electrons flowing through or stored in a capacitor behave just like water in a tub.
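The bathtub can also be put into code. A minimal Python sketch, assuming Torricelli's law (dh/dt = -k*sqrt(h)) with a made-up drain constant:

```python
import math

# Torricelli's law (assumed model): outflow rate depends on depth,
# dh/dt = -k * sqrt(h). Forward-Euler time stepping below.

def drain(h0, k=0.1, dt=0.01):
    """Time for the tub to (numerically) empty from depth h0."""
    h, t = h0, 0.0
    while h > 1e-6:
        h = max(0.0, h + dt * (-k * math.sqrt(h)))
        t += dt
    return t

# The analytic answer is 2*sqrt(h0)/k = 20.0 seconds for h0=1, k=0.1
print(drain(1.0))
```

The analog computer "computes" the same answer just by letting the water drain; the digital version has to step through time.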

This is basically all that's involved inside a music synthesizer. The complexity comes from stacking together a variety of different concepts once you understand that the concepts are stackable (like when they taught you that arithmetic is commutative, associative, distributive, reflexive, etc.).


Yes.

Easier with Julia, though, because the ODE and PDE support is better and you have better / simpler notation. Performance will be better as well, but for simple problems you can't tell.


For circuits a lot of them are represented by differential-algebraic equations (DAEs) and require modeling tools in order to handle the high differential index of the systems. This is the reason why they are typically handled via acausal modeling systems which can do index reduction. For Julia, this is the ModelingToolkit portion of the SciML ecosystem (https://docs.sciml.ai/ModelingToolkit/stable/), and some modeling tools like https://github.com/ModiaSim/Modia.jl and OpenModelica front-ends https://github.com/OpenModelica/OMJulia.jl.


Yes, but simulink's graphical interface makes simple circuit simulation a bit easier for rapid iteration, testing, etc.


I'm wondering if Modelica might be useful for this. It's an open source modeling language, though I've never used it. It's basically writing models with differential equations for each of the components and then connecting them together to form a system.

https://en.wikipedia.org/wiki/Modelica


The creators of Modelica, Hilding Elmqvist and Martin Otter, work in Julia these days so there's been a boom to the development of acausal modeling tooling in Julia. They built https://github.com/ModiaSim/Modia.jl and then we collaborated with them on some ideas to build ModelingToolkit.jl (https://docs.sciml.ai/ModelingToolkit/dev/). The latest versions and experiments in OpenModelica are now targeting ModelingToolkit as a compilation backend, which gives a pathway for legacy Modelica code to be used with the MTK ecosystem. The OpenModelica community has always been a very small group in comparison to the Dymola compiler team and so the open part of Modelica never really caught on, but we hope to help them piggyback off of our compiler to get it there. We'll have a workshop on this collaboration in Linköping next February if you're interested in the details.


I have used https://openmodelica.org/. It's another one of those "invest substantial time understanding what they are doing, and you will likely be rewarded" tools. Simple things are simple, but I found it took longer than I would have liked to bang out e.g. a PID algo with a particular motor and load. If you want to try this, buy the book and follow it to the end. Then keep learning as you build more and more sophisticated simulators.


I'd argue that acausal modeling is much easier for circuit design. And there's quite a bit of research, including surveys of experts, that backs it up as well. I recently put out a video exactly on this topic that summarizes some of the research and shows some examples to this end (https://www.youtube.com/watch?v=ZYkojUozeC4). Acausal modeling definitely isn't something that is taught in most (American) engineering schools, but once you get the brain flip it pretty clearly can improve productivity, no matter which acausal modeling language is used.


Responding at the bottom of the thread here, and also just to say Hi to Chris above.

So in my top level I mentioned Simulink as a nice path for students to learn modeling circuits, and in a way then could use this path to simulate what an Analog computer such as that in the original link is running.

My early forays into numerical simulation and linear systems modeling were thru Simulink, and I've built models of some very big systems in Simulink and used them in the design process of various things- typically vehicles, automotive systems; modeling powertrains, electro-hydraulic controls, EV drive cycle range simulation, etc. That was all many years ago.

I then went down the path of learning a bit of OpenModelica and as I was intrigued by the acausal approach and the open source aspect. Like anything the learning curve was steep (steeper than simulink) and I never got past a lot of clunkiness.

Then I discovered Julia a few years ago and even though I've moved on to other work, I've followed how people using Julia are modeling and solving closely, and what Chris says about how building acausal systems can be easier, and the benefits seen in ModelingToolkit.jl combined with the full Julia ecosystem all rings so true. I'm really excited to eventually see a full on acausal, block-based, yet easily extendable Simulink competitor sprout from Julia devs because as it does it will be the killer app for the language for a lot of engineers, IMO.

Funny enough, now Mathworks implements an acausal modeling approach in the Simulink GUI domain with their 'SimScape' add-on, but I have not gotten a chance to use it.

Anyway, yeah I need to get a project going to utilize some of the latest happenings in Julia modeling, and maybe find a way to contribute. Cheers to you and the other devs, Chris!


Matlab has Simulink, but I don't think there is anything equivalent for Octave. However, Scilab has Xcos. In general, if you don't need total compatibility with Matlab, you should take a look at Scilab, because it is much more polished than Octave.


Very familiar-looking to anyone who has been following the modular synthesizer renaissance. Modular synths are basically analog computers, just not well-optimized for precision in most cases. All the basic building blocks of THAT are available as modules from a number of boutique manufacturers.


Yes but the emphasis is different. Modular synths typically come with oscillators; analog computers typically don't. But if you want an oscillator you can build one out of integrators in a feedback configuration.

Modular synths typically come with filters; analog computers typically don't. Analog computers do come with integrators which could be considered very weird lowpass filters. Likewise a differentiator can be considered a weird and always imperfect highpass filter.
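As a sketch of the "oscillator from integrators in feedback" idea, here's a hypothetical Python loop: the harmonic oscillator x'' = -w^2*x patched as two integrators feeding each other, which is the classic analog-computer setup:

```python
def oscillate(w=1.0, dt=0.001, steps=6283):
    """Two integrators in a feedback loop: x'' = -w^2 * x."""
    x, v = 1.0, 0.0             # start at x=1 with zero velocity
    for _ in range(steps):
        v += dt * (-w * w * x)  # integrator 1: integrates -w^2 * x into v
        x += dt * v             # integrator 2: integrates v into x
    return x

# After steps*dt ~ 2*pi (one full period) x should be back near 1.0
print(oscillate())
```

With w=1 the loop traces a sine wave; an analog computer does the same thing continuously with two op-amp integrators and an inverter.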

Both typically come with adders, but a modular synth will call this a mixer. Both typically come with multipliers but a modular synth will often call a multiplier a ring modulator (and it might not multiply all four quadrants).

(Just to make things more confusing, at RF frequencies the word "mixer" typically refers to a multiplier rather than an adder. In the audio range, mixers are adders.)

Modular synths are designed for making sound; analog computers are designed for precise, much slower waveform generation. Their circuits might not even be capable of operating fast enough to make sound.


Yeah- as a long time modular synth user I practically drool over stuff like this. But for the money I can just get more synth modules.


I was looking for the modular synth comment(s). That's immediately what came to mind when I saw this. How can I integrate this with my synths.


As long as everything is voltage controlled, your voltage ranges match, and your analog computer is set to generate voltages rather than currents you should be able to use an analog computer to generate very wild LFO-type control signals. It might not work fast enough to replace your regular (audio) oscillators however.


It was so cool to realize that NIN used one for making a song. Caustic had an emulated one with a lot of features, but I don't know if there are other DAWs with similar ones...


The main ones are:

- Bitwig

- Softube Modular

- Cherry Audio Voltage Modular

- VCV Rack

- miRack

- Reaktor Blocks

- Nord Modular

- Max/MSP BEAP

Can't emphasize enough how awesome Eurorack is though


And "not well-optimized for precision" is often considered a feature.


The value of analogue computation is limited. They are not universal computers. The problem with analogue computers is that there's a limit on the number of consecutive operations they can perform and still produce a useful result. With digital computers, errors can be corrected after each operation, whereas with an analogue computer, every step introduces noise which cannot be corrected. It cannot be corrected because in an analogue system any input value is 'valid'. In a digital system, input values slightly above or below a logic level can be snapped back to the nearest level.
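A toy Python simulation of that error-accumulation argument (all parameters invented for illustration): chain many noisy operations, with and without snapping the value back to a discrete level after each step:

```python
import random

def chained(value, n_ops=1000, noise=0.01, digital=False):
    """Apply n_ops noisy identity operations to value."""
    for _ in range(n_ops):
        value = value + random.gauss(0.0, noise)  # each op adds noise
        if digital:
            value = round(value)  # snap back to the nearest discrete level
    return value

random.seed(1)
analog = chained(5.0)
random.seed(1)
digital = chained(5.0, digital=True)
print(abs(analog - 5.0), abs(digital - 5.0))  # analog drifts; digital stays at 5
```

The analog chain accumulates a random walk of error (standard deviation grows like sqrt(n_ops)), while the per-step correction keeps the digital chain exactly on its level.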


Also, making analog computer components requires high precision. With digital components, the individual elements are driven to saturation, so much less precision is needed. This means digital components can be made very small. Indeed, this is why integrated circuits work so well.


This recent article talks about advantages of analog computers and their possible uses as coprocessors to digital processors

https://www.wired.com/story/unbelievable-zombie-comeback-ana...


Analog coprocessors are the actual mission of their parent company.<1> This computer is their marketing tool!

<1> https://www.anabrid.com/


While it's true that analog computers have limitations in terms of precision and error correction compared to digital computers, they can still offer some advantages in certain applications. For example, analog computers can be faster and more energy-efficient for specific tasks, such as solving differential equations or simulating physical systems.


there’s a fallacy in here somewhere because human brain


The human brain is digital. The human genome is also digital for the same reason.


That's cute. Almost exactly the same capabilities as the tube-era Heathkit EC-1 educational analog computer [1], but much smaller.

Analog computers are no fun without an oscilloscope. Once you can see graphs, you get intuition about how the inputs affect the outputs. If you only have a meter, you have to write down data and plot.

[1] https://www.analogmuseum.org/library/heathkit_ec1_operation_...


This is my EC-1 simulating a projectile fired from a cannon, driving a plotter; a Comdyna GP-6 showing 'chaos'; and Joe explaining "Bouncing Ball" on another EC-1, complete with original 3" Heathkit 'scope as output (close-ups of it in operation at the end). 66 sec vid clip from the West Coast Vintage Computer Festival, 2021 https://drive.google.com/file/d/1gbF9RZ_UZrAIBq10mJ_SmZru1Gt...

Some "GlowFETs" in the EC-1, using only filament lighting and a long shutter https://drive.google.com/file/d/1yz1zTdI9rfBqCx4oMTaGNsTzCEH...

Inside the EC-1 showing the NE-2H neon lights plus other GlowFETs https://drive.google.com/file/d/1YPFjxrtmRunZ5u-JfgcOw3IeRiw...


According to section 6.1 of the manual, it can be connected to an oscilloscope.

https://the-analog-thing.org/THAT_First_Steps.pdf


Yeah, wish it at least had an analog meter.


They should have built in one of these $20 oscilloscopes[1] in place of the LCD meter. And a speaker. Their Analog Thing costs over US$500, after all. Then you'd have a self-contained unit good for student use.

[1] https://www.aliexpress.us/item/3256805426720735.html


The instructions say to connect to the audio in of a computer.


That's really neat. Veritasium did a couple of videos on analog computing a year ago. I'm glad there are people lowering the barrier to entry.

https://youtu.be/IgF3OX8nT0w

https://youtu.be/GVsUOuSjvcg


These are fantastic, thank you for sharing!


I’ve been toying with the idea of building a “learn analog electronics” course by having the student build a musical synthesizer one stage at a time: starting with dual tone generators that can deliver selectable frequencies and wave patterns, then moving through envelope filters, a modulator (part of which already exists as the wave selection from earlier), and all the controls to make it happen. Not sure if this has been done or if it’s a dumb idea.


Erica Synths and Moritz Klein have released an educational eurorack synth range of modules with many resources for learning:

https://www.youtube.com/@MoritzKlein0

https://www.ericasynths.lv/shop/diy-kits-1/mki-x-esedu-diy-s...


The trickiest part I see there is that you're going to need some sort of MIDI input and associated parsing to actually make the thing playable.

Or are you thinking of building a keyboard as well?

I suppose there are probably MIDI to CV modules out there one could source.


I was going to have them build a rudimentary octave (12 note keyboard segment) but that has been something I’ve been trying to sketch out as well.


A theremin input maybe? Could even be part of the build experience. No mechanical parts, which is nice, and pitch bends are fun sounds.


You could use the D/A on a Raspberry Pi or an Arduino to generate the control voltages.


Not good enough. Gotta be able to play it. That’s where the fun is.


Have you checked out Lantertronics on YouTube? He has a course "Analog Circuits for Music Synthesis" but it might be tough for beginners to follow.

https://www.youtube.com/@Lantertronics/videos


It sounds like a great idea to me. I’ve been trying to carve out time to do more electronic tinkering and learning and that sounds like a neat set of projects.


I love the idea as a learning tool, but:

1. ~USD $535 isn't exactly 'low-cost'

2. I'm confused who the audience is.

It can't be kids--this is too advanced. It can't be engineers (or student engineers)--there are better, more practical learning tools. Maybe people with an extra $500 who just enjoy fiddling around?


Nerds with too much disposable income who will buy this, spend maybe 20 minutes total and then put it away to collect dust.


My raspberry pi is looking at your post guiltily


To be fair, though, RPi's price point is well within the impulse-purchase threshold of normal people, let alone wealthy nerds.


Same here. I fear we may be the target demographic for this thing.


Mine too. At least the box (with the Pi inside) makes a good stand for a mesh node.


I could see it being used in classrooms. What level/grade? No idea.


It definitely has classroom energy, but for that to work out it would need to come with a whole curriculum, textbook, etc. And even then, I think a lot of parents would be asking why this oddball thing vs playing with regular 555 and opamp circuits on a breadboard.


> I think a lot of parents would be asking why this oddball thing vs playing with regular 555 and opamp circuits on a breadboard.

A lot? Like more than 50%? I'm very doubtful. But maybe I'm working from different demographic intuitions.

I'm not sure how many parents would know the "proper" process.


Okay, fair enough. But I know whenever schools try to adopt, say Linux or OpenOffice instead of the "industry standard" choices, there always seems to be wringing of hands about whether the children are being properly prepared for their futures.


LOL! Maybe in Santa Clara


Definitely useful once calculus is looked at. I attended a "mathematics summer school" back in 1979 at the local atomic research facility (I was at the end of year 10). It had an overall theme that looked at global oil production and consumption, modelled using exponential functions, integrals and the like. While it was focussed on FORTRAN and BASIC, they also had a massive analog computer (hooked up to an oscilloscope) which seemed to give more real-time insight as you tweaked those coefficients.


I just completed "instrumentation lab", a college physics class, where we built amplifier circuits (presumably for use with sensors in experiments, although we just used a signal generator). Seems like this would fit right in since the other portion of the class was oscilloscope training.


Very interesting device. But it costs as much as an office PC that can emulate this analog computer and do a lot more.


It's an educational board. I made a custom educational board for my kid a few years back (with a 16 bit digital CPU though) just to have something physical, with the possibility to be underclocked to extreme values and a bunch of LEDs in key points to illustrate the principle. I could have used an emulator, but something like that is 1000x better as it's bare metal and doesn't have any black magic under the hood.


When it was originally released it was a bit less expensive than the current price on it (glad I got one then).

https://web.archive.org/web/20220528131517/https://shop.anab...


A digital computer cannot emulate an analog computer: it can only simulate it to an arbitrary level of precision. That’s the whole point.


The difference between the words emulate and simulate is difficult for me to grasp. One comes from the Latin `aemulus` and the other from `similis`. One talks about imitation and the other about similarity. When people discuss the differences between these terms, they say things like one aims to be able to replace a thing, while the other aims to replicate the thing's internal state. Or, that one aims to replicate the external behaviour, and the other aims to replicate the internal state.

I somewhat discard these interpretations. My conclusion is that emulation is about making something equal to something else under some circumstance, while simulation is about approaching emulation (under some circumstance) without aiming for or achieving complete emulation (under that circumstance). Basically, the difference between becoming equal and becoming similar. This is counter to popular usage, I think, but popular usage is a bit of a mess, in my opinion.


The problem with simulation is that it might produce artifacts that a user might exploit ("hey, this is cool!") and then finds out that it only exists in the simulation, not in the real world.


The concept of perfectly accurate emulation lies at the core of formal definitions of computing such as Turing’s seminal “Turing Machine” introduced in “On Computable Numbers” way back in 1936.


I went down this rabbit hole recently and came to the conclusion that there is no real difference between those. When does something cease being an emulation and turn into a simulation?


When you can't replace the emulation target with the emulator, it stops being an emulator.


Analog computers don't have infinite precision due to the presence of noise, so digital computers can emulate that with high-enough precision arithmetic.


Yes but there are equations (like stiff differential equations) that are extremely difficult to solve accurately with a digital computer but which are trivial on an analog computer.
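A small Python sketch of what "stiff" means for a digital solver (the equation and step size are chosen purely for illustration): explicit Euler on y' = -1000y diverges unless the step is tiny, while the implicit form is stable at any step size:

```python
def explicit_euler(dt, steps, lam=-1000.0):
    """Forward Euler on y' = lam*y, starting from y=1."""
    y = 1.0
    for _ in range(steps):
        y = y + dt * lam * y       # unstable when |1 + dt*lam| > 1
    return y

def implicit_euler(dt, steps, lam=-1000.0):
    """Backward Euler: solve y_new = y + dt*lam*y_new at each step."""
    y = 1.0
    for _ in range(steps):
        y = y / (1.0 - dt * lam)   # always shrinks for lam < 0
    return y

print(explicit_euler(0.01, 100))   # blows up: growth factor is -9 per step
print(implicit_euler(0.01, 100))   # decays toward 0
```

An analog integrator has no step size at all; the circuit simply settles, which is why stiffness doesn't cost it anything.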


‘Emulation’ means something very specific. What you are speaking of is “simulation to an arbitrary degree of precision”, as I mentioned.


> 'Emulation' means something very specific

What exactly? And how does it differ from "simulation"?


Simulation is about mimicking another device or system. Emulation is about setting up a system that is logically indistinguishable from another irrespective of its implementation substrate and details thereof.

A thing is successfully ‘emulated’ when it is logically impossible to distinguish the difference between the system and its emulated counterpart.


You know, by that definition one analogue computer can't emulate another of the same model.


Exactly! For analog computers, every single ‘run’ is different!


Since, as you said in a sister reply, even one analog computer might be slightly different from another analog computer, and thus unable to emulate it, if you had the outputs of two different computers, one analog and one digital one simulating it to a high precision (higher than the noise of the analog one) how could you distinguish which was digital and which was analog?

If you can't, then this is a meaningless semantic discussion. The digital computer can emulate the analog one as well as any other analog computer can.


The point is that discrete computers can exactly and trivially emulate each other. The inability to emulate an analog computer by a digital or analog computer kind of is the whole point.


What was that application from a long time ago that had analog-style wiring for its sound system, where you had to manually (on screen) connect wires between ports... and you could flip the rack from front to back?

--

One of my best friends growing up built a ton of analog mixers IRL while working at Melekko Heavy Industries... (I helped him a tiny bit create the CAD files for the CNC for the faceplates.)


Reason?


Yep!

That's the one. Imagine if mynoise.net had a UX/UI with a Reason-style frontend?

We should escalate this as a "bug which is a missing feature", that they don't have a UX like this :-)

Stephane @ mynoise.net


Sure you can; you only need to simulate it down to somewhere around the Planck scale. And then you can go even further. Analog does not have infinite precision either.


As a concept, analog computers rely upon an assumption of continuity.


But due to limitations of physics, nothing will be continuous.

Is the amount of water in a bucket continuous? No, you can count each individual atom. So you can simulate it using large enough integers. The same principle applies everywhere.


These are assumptions. If you think the assumption of continuity is ridiculous, note that the definition of universal Turing machines requires an infinitely long tape (infinite memory), which of course conflicts with the finite memory of any actually implementable digital computer.


I am not saying you need a Turing machine; a finite one will do, since we are also dealing with a finite analog system. If the analog system is finite and has finite states that we can measure, then a finite computer will do just fine.


I’m saying that these properties are derived from equally ultimately unrealistic scenarios.

I’m honestly quite surprised that people are chiming in with their ‘opinions’ on proven mathematical facts.


> A digital computer cannot emulate an analog computer: it can only simulate it to an arbitrary level of precision. That’s the whole point.

A modern digital computer can simulate this particular analog computer beyond the noise floor. Practically speaking, that means a digital computer can perfectly emulate this system therefore it's simply a toy or perhaps for aesthetics.


That's much less cool though.


Question: One problem of current superlarge language models is their high inference cost. Wouldn't that be a perfect application for analog computers, since they are known to be power efficient? As I understand, the weights of a neural network are approximate values anyway, so it probably wouldn't be a problem when the calculations (vector multiplication or something like that?) aren't perfectly precise.

I assume this idea has long been considered and doesn't actually work for some reason. Perhaps because implementing an order of hundred billion parameters in an analog computer isn't technically feasible? That's about how many transistors are in a CPU, and those are presumably much simpler to implement than the weights ("synapses") of a neural network.

(I know there are special inference accelerator chips, like TPUs or co-processors on smartphone chipsets, but those seem to be fully digital and not actually "neuromorphic".)


This is essentially how the very first neural network, the Perceptron, was implemented in the late '50s. It even had potentiometer dials to set the weights, and they were driven by motors during training!

As for feasibility of doing analog inference today, it's an interesting idea and I'm not sure. Billions of parameters would mean billions of wires though, there may be substantial parasitic losses in building such a thing.
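As a back-of-the-envelope check on that premise, here's a hypothetical Python experiment (sizes and noise level invented): perturb every weight of a matrix-vector product by roughly 1% "analog" error and measure how far the output moves:

```python
import random

random.seed(0)
n = 256
W = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]
x = [random.gauss(0, 1) for _ in range(n)]

def matvec(W, x, noise=0.0):
    """Matrix-vector product with multiplicative per-weight noise."""
    return [sum(w * (1.0 + random.gauss(0.0, noise)) * xi
                for w, xi in zip(row, x)) for row in W]

exact = matvec(W, x)
noisy = matvec(W, x, noise=0.01)   # 1% "analog" error on every weight
rel = (max(abs(a - b) for a, b in zip(exact, noisy))
       / max(abs(a) for a in exact))
print(rel)
```

The independent per-weight errors largely average out across each row, which is part of why low-precision or noisy inference can be tolerable for neural nets.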


Simpler machine learning algorithms are being implemented in analog hardware. It costs much less power:

https://sites.dartmouth.edu/odame/research/asthma-symptom-mo...


Analog nature by itself doesn't have to mean individual wires or knobs. I wonder what the analog equivalent of an FPGA (mutable connections) would be.


Carver Mead tried back in the eighties. There’s a book - and Synaptics.


The most advanced form of analog neuromorphic computer seems to be this:

https://www.nature.com/articles/s41586-022-04992-8

It makes use of a special ReRAM ("memristor") chip. But, as I suspected, it is far too small to run large language models.


Reminds me of the old Science Fair Electronic Computer Kit Model No. 28-180 from Radio Shack...

https://www.oldcomputermuseum.com/electronic_computer.html

I had one as a kid and regret a) that I didn't figure it out more, and b) threw it away at some point.



Is 499 EUR for that BOM fair? Genuinely asking... high-end SoM modules with carrier boards and a metric ton of software can be had for half that price, and op-amps aren't that costly.


Mechanical analog computers used in WW2 era warships were quite interesting.

https://arstechnica.com/information-technology/2020/05/gears...


Very cool! Though I don’t see the educational value if it's not paired with an oscilloscope and/or loudspeaker.

This screams modular system at me. And I think it is much easier to learn what all of these functions do when combining multiple senses.


Is it actually open source? I can only find PDF schematics, not source files. The git link takes me to a closed gitlab instance.


Related:

The Analog Thing: An open source, educational, low-cost modern analog computer - https://news.ycombinator.com/item?id=28614840 - Sept 2021 (65 comments)


But if folks could also maybe come up with a way to make patch cables not destroy any sense of understandability, that'd be great.

(and maybe solve the whole "good luck 'saving' the thing you just made" problem at the same time, too. cheers)


Open source? Really? Where’s the schematic?


Schematics are available https://the-analog-thing.org/docs/dirhtml/rst/schematics/ though usually Open Source Hardware refers to releasing the original design files, including schematic and board layout, rather than just PDFs of the schematic: https://www.oshwa.org/definition/


If you are looking for documentation, try clicking the prominent "read the docs" link. Once you do so, you'll notice "THAT schematics" is the very first link after the overview.


Very cool, but it would need to be closer to $200 for me to open my wallet.


Related: Veritasium's video on analogue computing: https://youtu.be/GVsUOuSjvcg.


I knew digital was just a fad.


https://mythic.ai is an analog chip for doing neural nets.

https://youtu.be/GVsUOuSjvcg?t=898 (this is a time stamp'ed link for a video mentioned elsewhere in the comments).


Saw this a while ago. I don't know enough about math to know if this works as described...I hope it does! A tiny dedicated chip for AI inference jobs is probably the next game changer the industry needs to put AI in everyone's pocket. Too bad OpenAI is going to kill any chance at something like this with regulatory capture.


If you have an iPhone or a modern Pixel, you already have a dedicated AI accelerator in your pocket.


Actually, I have a Pixel 7 with the Tensor G2. I can't find its tera-operations-per-second figure, but Google has said [1] that it's 60% more powerful than their Tensor Coral (4 TOPS at 2W of power). That puts the G2 at about 6.4 TOPS.

The Mythic analog chip [2] is pumping 25 TOPS at 3W. For comparison, modern (digital) GPUs are doing 25-100 TOPS at 50W-100W... while costing >$1000. So, it's a big jump.

[1] https://www.xda-developers.com/google-tensor-g2-changes/

[2] https://youtu.be/GVsUOuSjvcg?t=1060


But what is the 'software' equivalent for this thing?

The physical state of all those wires? Not very portable.


Think of it in a streaming/functional way. Programming would be about signal flow, components that transform, and you get a continuous output.


You can model passive circuits with RLC ordinary differential equations (depending on the setup of the equations there's some algorithms such as Runge Kutta to solve them numerically). Afaik you can model some active components with ordinary differential equations, but I wouldn't be surprised if you had to resort to partial differential equations (More or less a "complete" electromagnetic simulation at that point).
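For instance, a series RLC circuit driven by a step voltage reduces to two first-order ODEs, which can be stepped with classic Runge-Kutta 4. This Python sketch uses made-up component values, purely for illustration:

```python
# Assumed component values, chosen only for illustration.
R, L, C, V = 1.0, 1.0, 1.0, 1.0    # ohms, henries, farads, volts

def deriv(state):
    q, i = state                    # capacitor charge, loop current
    return (i, (V - R * i - q / C) / L)  # dq/dt, di/dt from KVL

def rk4_step(state, dt):
    def shift(s, k, h):
        return (s[0] + h * k[0], s[1] + h * k[1])
    k1 = deriv(state)
    k2 = deriv(shift(state, k1, dt / 2))
    k3 = deriv(shift(state, k2, dt / 2))
    k4 = deriv(shift(state, k3, dt))
    return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

state, dt = (0.0, 0.0), 0.01
for _ in range(3000):               # 30 s of simulated time: ringing dies out
    state = rk4_step(state, dt)
print(state[0])                     # charge settles near C*V = 1.0
```

That pair of state variables (charge and current) is exactly what the two integrators in an analog patch of the same circuit would hold as voltages.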


You're onto something.

I was looking at some of the open documentation[1] and they show schematics and differential equations alongside each other.

I just keep burning out whenever I try to make sense of these things.

[1] https://the-analog-thing.org/THAT_First_Steps.pdf


polaroids


A simple analog computer would be a slide rule (or disk). The advantage is that it doesn't need batteries, but usually has very low precision (about 1.5-2ish decimal digits).

Hell, you can even print it and have a calculator with mul,div,sin,log,√...
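The slide-rule trick is just addition of logarithms: x*y = exp(log x + log y), read off to a couple of significant digits. A tiny Python sketch (the precision model here is a rough assumption, not how any particular rule behaves):

```python
import math

def slide_rule_mul(x, y, digits=2):
    """Multiply by adding log-scale lengths, then round to ~2 sig. digits."""
    result = math.exp(math.log(x) + math.log(y))
    # A physical rule only resolves a few significant digits:
    scale = 10 ** (int(math.floor(math.log10(abs(result)))) - digits + 1)
    return round(result / scale) * scale

print(slide_rule_mul(3.14, 2.72))   # 8.5 (true product is 8.5408)
```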


See the "Doing the math" section https://www.sliderulemuseum.com/


€524.16 is not affordable


I would buy one if it cost ten dollars, not five hundred dollars.


Asking as someone that is not familiar with general purpose analog computers: are there benefits over digital?


To me the appeal is that they more closely model "the Real World". Perhaps like modular synths, they invite experimentation, learning (maybe a bit like a graphing program?).


I just want to know when someone is going to get around to make a single board photonic computer...


I wonder why no one commented about Pure Data, Csound and SuperCollider as a cheap alternative…


Neat, but... this "analog" thing's built with a lot of digital circuitry


Like what? The 7-segment display and the mode selector?



