As a hardware guy, I'm kind of fascinated by how many stories I've seen on HN recently about HDLs, and equally fascinated by how software people describe their mental model and thought processes when thinking about hardware design. It's neat to see how the structure of the problem space is fundamentally different, I think, from how a hardware guy sees it - which is not to say it's any less valid, and quite possibly it's becoming more valid.
But I also think that to do hardware design in VHDL or Verilog, you need to understand the underlying structures - latches, flip-flops, decoders, muxes, etc.
As an old-school software gal, it's amazing to me how software engineers these days think, too. The sentence "ever wondered why integers in software are powers of two?" made a very powerful point for me - namely, how far software engineering these days is removed from the underlying implementation.
Mind, I don't fault the author for being surprised. It's a logical consequence of that remove. But as somebody who grew up with blinkenlights and core memory, the idea of not knowing this fact is completely foreign to me, even though I have no HW skills worth mentioning.
Part of me wonders if that means we should teach SW engs about those realities. Especially given the author's wide-eyed realization that all that code executes in parallel, all the time. It's a mental model that seems rather useful given the amount of parallelism we see and will be seeing.
I'm the author. First, I'd like to point out that I started out as a programmer for embedded devices, so I did know about the parallel nature and powers of two. BUT, I do know lots of programmers who don't. My intention was to teach those who know nothing of computer hardware and are asking these questions. Hopefully you'll be less grumpy now. (:
As I said, I don't see not knowing this as a fault - it's a product of our current environment and the fact that the gulf between HW and SW has widened. As such, it's not making me grumpy. I just thought it made an interesting point about how much we've progressed.
And good on you for pointing it out and teaching about it!
> But I also think that to do hardware design in VHDL or Verilog, you need to understand the underlying structures - latches, flip-flops, decoders, muxes, etc.
Only barely. Modern FPGAs are made up of LUTs and flip-flops, which can be abstracted as "cloud of programmable asynchronous logic surrounded by D Flip-Flops". I think if you start by explaining the abstract notion of asynchronous logic, and the notion of gating a design using D flip-flops, you can get someone up the HDL learning curve really fast without going into the depths of what boolean functions are, what a mux is, etc. Boolean functions and muxes aren't a central component of modern programmable logic anyway.
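Since we're talking to software people: here's a minimal Python sketch (not any vendor's API, names are made up) of that "cloud of programmable asynchronous logic surrounded by D flip-flops" abstraction - a LUT is just a truth table, and the flip-flop gates when downstream logic sees a new value.

```python
# Hypothetical model of an FPGA logic cell: LUT (async logic) + D flip-flop.

def make_lut(truth_table):
    """A k-input LUT is just a truth table indexed by the input bits."""
    def lut(*inputs):
        index = 0
        for bit in inputs:
            index = (index << 1) | bit
        return truth_table[index]
    return lut

class DFlipFlop:
    """Output only changes on a clock edge; that's what 'gates' the design."""
    def __init__(self):
        self.q = 0
    def clock(self, d):
        self.q = d
        return self.q

# Configure the LUT as a 2-input AND gate: output is 1 only for inputs (1, 1).
and_lut = make_lut([0, 0, 0, 1])
ff = DFlipFlop()

# Between clock edges the LUT output can ripple freely (asynchronous logic);
# the flip-flop captures it once per edge, so downstream sees a stable value.
ff.clock(and_lut(1, 1))
print(ff.q)  # 1
```

Obviously real logic cells have carry chains, clock enables, and so on, but this is the whole mental model you need to start.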
I've been thinking about writing up "30 minutes to your first HDL design" at some point. Dragging a competent C programmer up the learning curve is pretty easy, as long as you don't start off with "it's like C but...". I've trained a few SE interns to write some halfway-decent CPLD designs in just an hour or two, and consequently I think that the way it's taught at school is way too low-level for someone who's going to be working with modern programmable logic, and it needn't be so painful.
entity and_entity is
    port(
        x, y   : in  std_logic;
        result : out std_logic
    );
end and_entity;
It took a couple of minutes of staring to realize that it was a way of representing the concept of just splicing 2 wires into a 3rd.
To me, it's making an always-on connection, such that any electricity on either line "x" or "y" will induce an identical electrical current on "result". This would obviously include any electrical signals, including HIGH/LOW pulses. And they're combined, so HIGH on either "x" or "y" outputs HIGH on "result".
Because of how I read code, I see that as: whatever is in buckets "x" and "y" is combined and copied into bucket "result".
I've never thought of lines, buses, leads, wires, etc as buckets. Yet that's how I think they're described in this code. But if you think about an individual clock cycle frozen in time, some wires have HIGH pulses, and the intention is to copy the HIGH pulse from that wire/bucket.
That code fragment is really just the pin-out of a block - or to a software person, it's basically a functional interface. Naming aside, it could be the pinout of a number of things - it could be an AND gate like this, or one of the signals could be a clock and this is a register... or an entire delay line.
A really interesting mental-model barrier to watch hardware newbs overcome is to present them with designing a counter - and watching as the very first thing they want to type is a for loop, which is generally not how you'd do it, but almost reflexive to someone who has been programming for a while - particularly in an imperative style.
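To make that barrier concrete, here's a hedged Python sketch (a simulation, not HDL) contrasting the software reflex with the hardware view of a counter: in hardware there is no loop, just a register that captures count + 1 on every clock edge and wraps at its bit width.

```python
# The "software reflex": count by looping.
def count_to(n):
    count = 0
    for _ in range(n):
        count += 1
    return count

# The hardware view: a register holds the count, an adder computes count + 1
# continuously, and the register captures that value once per clock edge.
# "Time" comes from the clock, not from control flow.
class Counter:
    def __init__(self, width=4):
        self.count = 0
        self.mask = (1 << width) - 1  # a 4-bit counter wraps at 16

    def clock_edge(self):
        self.count = (self.count + 1) & self.mask

ctr = Counter(width=4)
for _ in range(20):       # this loop only stands in for 20 clock ticks
    ctr.clock_edge()
print(ctr.count)  # 20 mod 16 = 4
```

The giveaway that you're thinking in hardware: the counter's behavior is entirely "what happens on one clock edge", and the wrap-around comes from the register's width, not from an `if`.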
Either the author is not explicit enough in the article, or he makes a critical, wrongful assumption. Either way it's dangerously misleading.
Sadly, I'd lean towards the second one because of text like this:
> A major difference between software written for CPUs and VHDL is: VHDL is concurrent by default.
VHDL is a description language, not a programming language.
When writing VHDL you are not, at any point, writing software, you are laying out a map of the hardware. This map describes physicalities, and (barring relativistic considerations) physics happens everywhere at once. It is not parallel computing, it is physics.
The logic arrays do not physically transform when you program a FPGA. They get input from the program that determines their function.
The difference is essentially that with a normal CPU, the function is changed every time you load a new instruction. With FPGAs, you allocate a bunch of logic arrays to just one specific block of instructions all the time. That gives you the magic parallelism.
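A tiny Python sketch of that point (hypothetical names, just an illustration): the "hardware" - a generic truth-table lookup - never changes; only the configuration bits loaded into it do.

```python
# A fixed 2-input logic element: the config bits determine which function
# it implements. The element itself is the same physical structure either way.

def logic_element(config_bits, a, b):
    return config_bits[(a << 1) | b]

AND_CONFIG = [0, 0, 0, 1]   # configuration bitstream for AND
XOR_CONFIG = [0, 1, 1, 0]   # same element, reconfigured as XOR

print(logic_element(AND_CONFIG, 1, 1))  # 1
print(logic_element(XOR_CONFIG, 1, 1))  # 0
```

On a CPU, the "config" effectively changes every instruction; on an FPGA, thousands of these elements each keep one configuration and all evaluate at once.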
(The point is: you might argue that VHDL is just describing logic, but so is any other compiled code!)
> They get input from the program that determines their function.
Agreed, this is a program, yet this results not in software, but (configured) hardware.
> That gives you the magic parallelism.
And thus it follows that it's not magic at all, merely physics. Put an electric signal in, an electric signal gets out, and everything happens at 'once' (bounded by the speed of light and various other electronic considerations). This is what the essential difference is: hardware (however configurable, it is not software) vs software (however low-level, it is not hardware), and it is a very fundamental difference.
I'm the author. I think I made the distinction quite clear. Surely I understand that "concurrent" means several independent components working in parallel, not parallel execution of code on CPUs. Did you read the whole post?
Yes I did, I just find the wording ambiguous enough to mislead newcomers, especially at the very start. The article is good at making parallels with the software world and explaining how to map previous knowledge, but it does not make it clear at the beginning that this is not software. Even at the end, things are muddy, with statements like:
> Hardware - write concurrent code, write serial code where needed.
For you and me, who get the difference, it's easy to read between the lines and get it, but a complete newcomer might just as well end up thinking: "okay I get it, this is very low level software code which is executing concurrently by default", when really it is "this is very high level hardware which is described here, and physics makes it 'happen everywhere' at once, but I can understand what happens through a mind trick".
Mind you, I like your article, but I just feel it may make some hardware beginners get some fundamental basics wrong.
As an anecdote, I noticed the critical importance of this distinction first hand. In a software degree program, we went through a hardware course on this very subject, and the teacher incessantly restated this fact - with due reason, since later down the road each one of us understood that basically every single mistake we made came from trying to shoehorn software concepts into hardware, or from expecting software behaviour from hardware.
VHDL looks declarative and also appears discrete enough such that you could probably write a virtual machine that would create virtual machines from your VHDL.
Well, you would not be writing a virtual machine, you would be writing the hardware definition of a real machine. A virtual machine is virtual precisely because it's hardware mimicked by software, whereas there you're doing hardware directly.
On a tangent, I really hope that Minecraft helps encourage people to go into hardware. Or at least helps people understand a bit more about real world hardware and logic.
It helped me greatly. Even just learning how to build an SR-NOR latch for a redstone-operated door was awesome. I planned to try building a calculator, or even doing nand2tetris, but I don't have the time.
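For anyone who hasn't built one: here's a rough Python sketch (a simulation, not redstone or HDL) of that cross-coupled NOR SR latch, iterating the feedback loop until it settles - which is exactly what the redstone version does, one tick at a time.

```python
# Cross-coupled NOR SR latch: Q = NOR(R, /Q), /Q = NOR(S, Q).
# The outputs feed back into the inputs, so we re-evaluate until stable.

def nor(a, b):
    return 0 if (a or b) else 1

def sr_latch(s, r, q=0, nq=1):
    for _ in range(4):  # a few passes are enough for the loop to settle
        q_next = nor(r, nq)
        nq_next = nor(s, q)
        if (q_next, nq_next) == (q, nq):
            break
        q, nq = q_next, nq_next
    return q, nq

q, nq = sr_latch(s=1, r=0)               # set
print(q, nq)                             # 1 0
q, nq = sr_latch(s=0, r=0, q=q, nq=nq)   # hold: the latch remembers
print(q, nq)                             # 1 0
q, nq = sr_latch(s=0, r=1, q=q, nq=nq)   # reset
print(q, nq)                             # 0 1
```

The "memory" is nothing but that feedback loop - with S and R both low, the current state keeps re-producing itself, which is the same aha moment the redstone door gives you.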
Arduino and Launchpad helped me too. I expect to see custom Minecraft maps made of a small microprocessor -- with some way of loading the same code on both the game's microprocessor and the real-world microprocessor, and watching the code function both ingame and in the real world simultaneously. Serial logging could even be used to maintain concurrency between the microprocessors.
I highly encourage anyone interested in FPGAs to learn Verilog. There is far less boilerplate and the syntax is a lot more concise.
Mentioned elsewhere in the thread are really good ideas to internalize. HDL isn't programming. You aren't merely writing instructions for a state machine, you are describing the state machine. It is very important to learn how hardware constructs are described by HDL. The best practice is to design the hardware on paper and then describe that architecture in HDL.
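One way to feel the "describe, don't instruct" difference in plain Python (a hypothetical toy, not HDL): the state machine exists as a static description - a transition table - and the clock merely steps it.

```python
# A made-up two-state machine: a light toggles when the button is pressed.
# The machine is DATA (a description), not a sequence of instructions.
TRANSITIONS = {
    ("OFF", 1): "ON",
    ("OFF", 0): "OFF",
    ("ON", 1): "OFF",
    ("ON", 0): "ON",
}

state = "OFF"
for button in [0, 1, 0, 0, 1]:   # one button sample per clock cycle
    state = TRANSITIONS[(state, button)]
print(state)  # OFF
```

That table is essentially what you draw on paper first; the HDL version is the same table expressed as a clocked process or `case` statement, which synthesis turns into registers and logic.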
As a software guy, I'm pretty amazed that you can actually build and run your hardware directly in software with that much ease. Pretty incredible - a few more iterations of this kind of tech and I'll be coding hardware with python or something? Shouldn't be too difficult to convert some kind of scripting language into HDLs, so that functions can be turned directly into hardware by anyone?
Sounds like the massive-scale software of the future, even - possibly a huge field just waiting to take off when paired with at-home 3D circuit printing technology?
> But I also think that to do hardware design in VHDL or Verilog, you need to understand the underlying structures - latches, flip-flops, decoders, muxes, etc.
I'm enjoying this recent trend!