1x Forth by Charles Moore (1999) (ultratechnology.com)
63 points by falava on Oct 7, 2012 | 16 comments



I appreciate Charles Moore for being opinionated and for inventing and evolving Forth, but IMHO there's no way that any low-level stack-oriented language will ever be popular. The reason: it's just far too difficult for the majority of programmers to grasp the concept of stacks, or to factor their code so that it needs only a small stack. It's the single most difficult concept to deal with, and patently not superior to registers and other conventional techniques, as I found when I wrote JONESFORTH.


I don't think reverse Polish notation is all that difficult; the difficulty lies rather in the fact that Forth is best done as an actual low-level stack-oriented language, not as a general-purpose "does mostly everything at least okayish" language such as C++, Java, or Python. Few programmers work at such a low level. There's usually an operating system in the way between your program and the machine, or at least a complex kernel that does more than strictly necessary, and in general there are quite a few layers of indirection between you and the "actual" hardware. Even a lot of assembly programmers don't work low enough, unless they happen to be doing small "kernel"-less embedded systems (or ones with "kernels" they wrote themselves).

Verilog/VHDL/MyHDL programmers are probably in the same camp as Forth programmers, as long as they aren't just working on top of some big IP core they don't understand (like Java programmers writing on top of the JVM on top of the OS). Forth wants to talk to actual hardware; mainstream programming is about getting as far away from the hardware as possible and leveraging other people's code whenever possible instead of doing it yourself. (This isn't to say the mainstream view is always wrong; it's often good that programmers work far from the hardware, but for those sorts of problems I don't think Forth is really a good fit.)

This washing machine example isn't hard to understand: ": WASHER ( -- ) WASH SPIN RINSE SPIN ;" http://www.forth.com/embedded/swiftx-embedded-systems-7.html provides the rest of the implementation, which is also not hard to understand, and is at least as easy to follow as a corresponding assembly (or C) version would be. But it requires knowledge of the washing machine, which means reading data sheets and maybe even doing some manual voltage tests, and it requires knowing (or choosing) which pins (or memory addresses) on the host connect to the washing machine. There's a minimal kernel involved. And you have to know what it is you actually want to accomplish: the Forth program begins with designing the high-level behavior I quoted, then continues by implementing bottom-up from the compiler primitives to meet the design.
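For a sense of what those lower layers look like, here's a minimal sketch of how the words under WASHER might bottom out in port writes. The CONTROL-PORT address, the bit masks, and the timings are all invented for illustration; the real values are exactly what you'd pull from those data sheets, and the SwiftX page linked above gives a real version:

    HEX
    FFB0 CONSTANT CONTROL-PORT  \ hypothetical memory-mapped control register
    01   CONSTANT MOTOR         \ invented bit assignments; the real ones
    02   CONSTANT VALVE         \ come from the machine's data sheet
    DECIMAL
    : SET-BIT  ( mask -- )  CONTROL-PORT C@ OR   CONTROL-PORT C! ;
    : CLR-BIT  ( mask -- )  INVERT  CONTROL-PORT C@ AND  CONTROL-PORT C! ;
    : SECONDS  ( n -- )  0 DO  1000 MS  LOOP ;   \ crude delay
    : WASH     ( -- )  VALVE SET-BIT  30 SECONDS  VALVE CLR-BIT ;
    : SPIN     ( -- )  MOTOR SET-BIT  60 SECONDS  MOTOR CLR-BIT ;
    : RINSE    ( -- )  WASH ;   \ placeholder: a real rinse cycle differs
    : WASHER   ( -- )  WASH SPIN RINSE SPIN ;

The point is that the whole stack of abstraction fits on one screen: the "driver", the "scheduler", and the application are a dozen lines between the hardware and the high-level word.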

I think Forth will be at most as popular as HDLs are. That is, not very. And in both cases the allure of using C (whether compiled to what the embedded system's processor understands, or compiled to the primitives of an IP core implementing a whole processor) will be strong, for there are lots of C programmers and C is more-or-less useful in any domain.


The JVM is doing fine. A low-level stack-oriented language can be a target.


What does that have to do with _anything_ said by either Chuck Moore or the person to whom you replied? I get that programmers (and I'm no exception) have a mental defect where they try to show how smart they are by compulsively interjecting at every opportunity. But for that to have a chance of scoring you geek points you need to have some claim to relevancy, however pedantic and tangential.


I'm not here for the points.

I'll spell it out for you: the JVM is the most widely deployed stack machine in use today. That you don't normally program it directly through a stack-machine-like interface doesn't diminish how successful the concept has been. Yes, there are some warts and issues, but on the whole it works quite well, and it never ceases to amaze me how fast it can be made using JIT and other technologies that weren't even around when Chuck did most of his work. The stack machine has been one of the most enduring concepts in programming.

If you haven't programmed the JVM directly then I would invite you to do so; it's lots of fun, and you can pull tricks that would be hard to replicate in Java.
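To make the parallel concrete, here's one expression written as Forth, with the bytecode that javac typically emits for the same expression as a static int method shown in the comments. (A minimal sketch; MULADD is an invented name.)

    \ (a + b) * c as Forth stack code
    : MULADD  ( a b c -- n )  >R + R> * ;

    \ JVM bytecode for "return (a + b) * c;" is essentially
    \ the same stack program:
    \   iload_0  iload_1  iadd  iload_2  imul  ireturn

Same machine model, sixty years apart: push the operands, let the operators consume them.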

Now that Clojure and other source-level languages have been added to the mix, there is even more life in the ecosystem.

In short - and to make the point again - stack machines don't need to be the source language to be effective, but if you really want then you can use them that way.

I'm not sure if Factor is running reliably on the JVM yet (there were some attempts in that direction), but that would be a fantastic way to unlock stack languages' true potential: a Forth-like language on a virtual machine with enormous deployment. That could do for the Forth family of languages what Clojure is doing for the Lisp family.

edit: I simply love being downvoted for a gracious reply to a bunch of insults.


Moore's talk touched on the advantages of stack machines, one of which is portability. Hey, guess what is a particularly important and portable stack machine.

I get that programmers have a mental defect where they try to show how smart they are by compulsively insulting people at every opportunity. But for that to have a chance of scoring you geek points, your comments need to be not completely useless.


The JVM's stack-oriented bytecode is a liability, not an advantage. http://static.usenix.org/events/vee05/full_papers/p153-yunhe...


That paper does not take into account the optimizations done at the JIT level.


I have the same love/hate relationship with Factor: I'd love to use it more, the implementation is great, but the shuffling is hard to write and harder to read.


Canonical Factor prefers locals over stack shuffles precisely to avoid that mental overhead. Thus, one hopes you're left with just the 'love' part of your Factor relationship.
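A minimal sketch of the difference, using an invented word ratio that computes (a - b) / (a + b); the :: form is Factor's locals syntax:

    ! with stack shuffles
    : ratio ( a b -- c ) 2dup - -rot + / ;

    ! with locals: the inputs are named, nothing to shuffle
    :: ratio ( a b -- c ) a b - a b + / ;

The two compile to much the same thing; the second one you can still read next month.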


Simplify the problem you've got or rather don't complexify it. I've done it myself, it's fun to do. You have a boring problem and hiding behind it is a much more interesting problem. So you code the more interesting problem and the one you've got is a subset of it and it falls out trivial. But of course you wrote ten times as much code as you needed to solve the problem that you actually had.

It seems that many programmers believe that writing 10x the amount of code to solve a "simple" problem in a complex, yet "general" way is somehow a win.

I wonder if that has to do with having a better grasp of the "complex" problem, so you develop a belief that you "understand" the problem, and can therefore "solve" it?

We know from study after study that 10x the code generally means 10x the bugs -- that bugs scale linearly with text on the page.

And yet we still don't want to solve the problem at hand in the simplest possible way. We don't want to specialize our code. We don't want to write 1/10 the code that just does the present job, and does it well.

I'm as guilty as the next programmer, but I wish I understood the impulse better...


Simplifying problems is hard work - it means making firm decisions about your needs and laying bare how much of your work is truly original, vs. being regurgitated effort done thousands of times by others.

Writing a very complex solution appeals because it defers that work until later: "when I decide what I want, I'll just configure it that way." (And then you never decide.) Bonus points for "it will be the fastest/most comprehensive/most modular solution around." You reassure yourself, with one of these fallacious ideas, that "other people will use this for many years."

This is an issue that goes straight to the programmer's ego and purpose as a craftsman: who they are writing code for, what problems they are solving, what they are going to tout on their resume. They aren't going to be confident or arrogant enough to railroad all projects into minimal, ground-up, Forth-idiomatic projects. Most of the perceived brass rings revolve around short-term ideas like a buzzword technology or a new platform or some other obviously overcomplicated and unnecessary thing.

I have tremendous respect for Chuck and have definitely aimed to emulate him more and more, little by little, though I am not a Forth coder and cannot claim to have neared the style he advertises. My most reusable bits of code tell a similar tale, though: Algorithm and data structure implementations. Parsers and compilers. Formats and protocols. In between those, lots of architecture that I thought was a good idea. Most of it unnecessary.

Side note: Format and protocol support tends to introduce a tremendous amount of complexity. The sign of a good format is in how much it manages to limit this "complexiplosion" while still giving users the features they need. One of the things holding me back is my desire to stick to existing formats.


Chuck Moore's philosophy is well explained by this paragraph:

> Likewise the conditionals. In Classic Forth we used IF ELSE THEN

> And I have eliminated ELSE.

> I don't see that ELSE is as useful as the complexity it introduces would justify. You can see this in my code. I will have IF with a semicolon and then I will exit the definition at that point or continue.
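In ANS Forth spelling, the two styles look roughly like this (a minimal sketch; in Moore's own code the semicolon itself sits inside the IF where EXIT is written here):

    \ classic style: both branches inside one definition
    : SIGN  ( n -- char )  0< IF [CHAR] - ELSE [CHAR] + THEN ;

    \ Moore's style: exit the definition at the IF; no ELSE needed
    : SIGN  ( n -- char )  0< IF [CHAR] - EXIT THEN  [CHAR] + ;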

He also highlights why people cannot code well in Forth:

> There is something more than the formalism and syntax of Forth that has got to be embedded in your brain before you're going to be effective at what you do.

The average programmer cannot grasp that "thing" that makes you a good Forth programmer.


Chuck's philosophy is all about getting rid of everything that is not absolutely necessary for the task at hand, paring both the problem and the solution down to the minimum. Compare this with the instinct most programmers have to generalize every problem and abstract code into "reusable" modules, whether or not that code is ever actually reused. Chuck might say that the best way to write maintainable code is to keep the codebase small enough that it can be rewritten when the requirements change. To program in Forth you have to fight against instincts carried over from other languages.


"focus on applications instead of refining the tool"

It's true. How many times have we seen someone share their reimplementation of a FORTH-like interpreter? Far more often than we've seen people write and share FORTH applications. Why that is, I'll leave as a question for the reader.

We know FORTH works beautifully as a small interpreter. There's no need to keep rewriting it except as an educational exercise. But would anyone disagree that there is a real need for more applications to be written and shared if the brilliance of FORTH is ever to be truly realized by more than a small group of people? Not necessarily to create some sort of silly "App Store" culture, but just to show what can be done with the language. Proof of concept.


I use my Forth system for writing video games: https://github.com/JohnEarnest/Mako

It's also fun to implement features conventionally associated with high-level functional languages, such as lazy generators: https://github.com/JohnEarnest/Mako/blob/master/lib/Algorith...
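The core of the trick is small enough to sketch in plain ANS Forth. This hedged example (names invented, unrelated to the Mako library above) is the simplest form of a generator, a stateful word that yields the next value on demand:

    VARIABLE ODD-STATE   1 ODD-STATE !
    : NEXT-ODD  ( -- n )  ODD-STATE @  DUP 2 + ODD-STATE ! ;

    NEXT-ODD . NEXT-ODD . NEXT-ODD .   \ prints: 1 3 5

The Mako version linked above goes further, but the idea is the same: the "sequence" never exists in memory, only the word that produces its next element.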



