Chuck Moore, Extreme Programmer (uni.edu)
313 points by ingve on Nov 1, 2017 | 174 comments



I want to take a moment to reflect on what a great community HN has. It never ceases to amaze me.

The link is from the CS department at the University of Northern Iowa in Cedar Falls, IA. It sounds obscure, but that's my hometown. I grew up a couple blocks away from it. I sat in on classes there. I even know the author's family. That's small town Iowa, for ya.

I never thought I would see anything related to that on here. This is truly a place for all.


I work in the fast food industry, don't code, and I'm not even particularly interested in tech, but spend hours reading the articles and comments here.


Maybe you should start coding; you may actually like it too.


that's definitely the sort of thing i like about this place. i'd imagine programmers are very very heavily represented here, but it's nice that it's not to the exclusion of everyone else.


I was a network engineer and would do the same; I can't place the exact reason why, but today I am a Python programmer doing network automation. I do think HN had a major part to play in this change.


Network Engineer here too. I learned to code with Python and now do full-time PowerShell automation at work. It completely changed my career. Best decision I ever made.


Learn to program. It will be a worthwhile investment into your personal growth and self-development.


serious question: if you aren't interested in tech, what interests you about the articles and comments?


There's a lot of interesting articles here: https://hn.algolia.com/?query=%22how%20is%20this%20hn%22&sor...


"I'm from Iowa. I only work in outer space."


Economist here

I do the same. :)

I am addicted, to be very honest. I learn so much, it's worth it.


Another (former) Iowan here - fully agree how great it is that so many different places are represented here.

I ran a few meetups while in Iowa, and one of the common themes myself and the other organizers tried to help everyone feel is that these days, location really is less important than ever before.

It may help to know people face to face in the Bay Area or other major metropolitan areas for some opportunities, but now is the best time ever to find people with technological interests in your neighborhood, even if you’re not in a tech hub.

In fact, all the better if you’re not in a tech hub - you can become a local subject matter expert more easily in a growing community.


I left Cedar Falls precisely because I became the subject matter expert within a couple years of graduating--there simply weren't many people to learn from. There were people around me who were experienced in lots of other related areas which were rapidly declining in importance and frankly didn't excite me. Besides, it was really hard to find people who were as interested in learning as I was--I was mostly viewed with suspicion because I didn't put in the bare minimum and go home, and I quickly realized that wasn't the sort of team I wanted to work on. Further, there weren't any companies in the area that were doing exciting things, probably due to a dearth of talent in the area (not to knock on rural folks; assuming the distribution of talent is the same in rural areas as urban areas, the hiring pool is much larger in an area where people are more densely populated, and this is ignoring multipliers like the draw of talent to large cities and the improved rate of learning when your talent pool has improved access to other smart people).

I guess this is to say that how you view this depends on whether you're interested in being a local subject matter expert or a global subject matter expert.


Dr. Wallingford's blog is one of the few that I make sure to never miss an entry of; it's definitely cool to see him get some recognition on here.


It is always cool to see UNI and Cedar Falls on HN. Also from CF and got my MIS degree at UNI.


The internet startup[1] I work at is based out of Cedar Falls. Our product team is in Minneapolis (including myself), but the CEO/sales is in CF. I'm down there all the time! I've actually grown to like it quite a bit. Cool little town.

http://www.threadsculture.com/


Ha! I have direct connections to half of the logos in your "case studies" page. Small world.


I went to UNI for CS as well. Dr. Wallingford was dept head at the time.


Dr. Wallingford was my advisor at UNI. Pleasant surprise to see his blog here!


[flagged]


Small town Iowa is probably not very well represented on most internet forums, so the commenter was happy to see that it was being referenced... That's all. Not sure where your negativity is coming from


White middle class suburbanites aren't well represented on internet forums?

Don't mean to be negative, it's just insane how insular and specific this community is and how little recognition that is given here.


I'd imagine life experience for people living in small town Iowa is very different from that of people living in major urban centers on the coasts. You're assuming that if someone is white and not rich or below the poverty line that their life experience is equivalent to that of other white not rich/not poor people. That's a false assumption.

Diversity should be celebrated regardless of whether it's within or between different races. The OP was just celebrating a bit of diversity that they noticed and was relevant to them. They were not diminishing other forms of diversity that you may be more concerned with for whatever reason.


The idea that you're calling middle class suburban experience "diversity..." woo boy.


Again, small town Iowa is very different from elsewhere in the country. It's not what you would expect on a site focused on tech where most probably hail from coastal urban or suburban enclaves.

Of course if you refuse to believe anyone who is white could contribute to diversity of a community, well then I guess you're just racist.


>Again, small town Iowa is very different from elsewhere in the country. It's not what you would expect on a site focused on tech where most probably hail from coastal urban or suburban enclaves.

He lives outside of a town of 40k; it's not particularly small. For reference, I grew up in a town of 6k. I'm not trying to play small-town olympics here, but he's hardly from the boonies. The University of Northern Iowa is there...

>Of course if you refuse to believe anyone who is white could contribute to diversity of a community, well then I guess you're just racist.

Heh, what a way to frame my words... who would think that? Nobody would, which is why strawmen are useless rhetorical tools.

My point is that on a website dominated by suburban, white, middle class software developers, a reference to suburban Iowa is not diverse. It's insane to think that a reference to that community shows how inclusive HN is. It's extremely exclusive.


I don't think I've ever met anyone from Iowa. Most people on this site are probably from the coasts. So I would consider it an example of diversity.


What ethnic / socioeconomic background are most people here from?

You're really grabbing on to physical location to justify ignoring the fact that that person is perfectly representative of most of the folks here.


You're really grabbing on to race and socioeconomic status to justify ignoring the fact that that person is not representative of most of the folks here.

See? We can both play these games.


Race and socioeconomic status are two of the most important factors in your probability of success in the US. Ignoring them makes you look willfully ignorant of your privileged position in society. There's little / no correlation between geographic location and success, at least not relative to race and socioeconomic status...


And I'm happy to see someone from Iowa posting here. If someone from a different minority group posted here I would be happy as well. There are many ways to be diverse.


You have kind of an absurd definition of diversity.


You only care about diversity involving people from socioeconomic or racial groups associated with poverty or historical oppression.

That ignores so many other kinds of diversity in the world.

Your definition is far too exclusive and narrow minded.


What types of diversity can you have amongst racially, socially, and economically homogenous people? What are you talking about? What is your grand, open-minded idea that ignores what actual diversity is?


I'm glad you asked! Diversity in musical tastes, religious and political ideology and affiliation, food preference (because new cuisine is always great to discover), experiences as a result of living in a primarily rural vs urban setting, occupational diversity, etc etc. I could go on. You care about diversity in things that don't really matter. When I walk into a room full of people I don't really care about their income, or the color of their skin. I care about them as people, their interests, their unique life stories. You should stop obsessing so much about race and economic status. This is a free to use internet forum. No one is barred from joining based on their race or wealth.



I'm very sorry for offending you by not recognizing that "diversity in musical taste" provides for greater diversity than racial, social, and economic status. Silly me, I've been thinking about diversity all wrong. You're right, HN is very diverse- we have middle class white guys who like Metallica, and others who like show tunes. We have middle class white guys from suburban Iowa, and others from suburban Massachusetts. This is a very diverse website, I was very confused before.

>You care about diversity in things that don't really matter.

Race, social, and economic status don't matter. Musical tastes matter.

This place is so out of touch with reality. I'm glad I moved out of the bay area.


So you care about diversity in the color of someone's skin but not in what really defines them as a person? Or you think skin color defines people? And that all white people are pretty much the same? You're showing yourself to be quite racist.


Right, recognizing race is one of the most important factors in deciding your status in this country is racist. Pretending race doesn't exist isn't racist.

Again, you guys are so laughably out of touch. This is a poe's-law level discussion.


Again, it seems you're assuming that just because someone is "white" and not rich/poor they're the same as other "white", not rich/poor people. That's offensive and naive.




Cedar Falls isn't a suburb. 2 hours to Des Moines, 3 to Minneapolis, 5 to Chicago.


I take it to mean any place that isn't urban but also isn't rural.


It's pretty close to rural. 40K citizens and corn fields a mile or so from campus. Not sure what your overarching point is though--what's wrong with remarking on the improbability of a submission from a tiny university in Iowa?


I grew up in a town of 6k and we were the big one in 20 miles. 40k is suburban IMO.

Edit- and I was ridiculing the idea that the mention of a wealthy, homogenous suburb indicates how inclusive this community is. Pretty much every story here deals with things from a young, white, wealthy perspective.


To be clear, no one argued that "posts from Iowa" == "inclusive"; you're arguing against a straw man. But more importantly, it seems your beef with HN is based on the assumption that everyone from a town of 40k or more has the same experiences. This is such a distorted idea of "diversity" when you consider that it includes people from major metropolitan areas, silicon valley, red states, blue states, and all over Europe (and yes, all of those groups are well-represented on HN). The fact that you're here from a town of 6K and I'm here from a town of 2K is testament to the OP's point, and it contradicts yours.


Quote from (http://www.ultratechnology.com/1xforth.htm):

I wish I knew what to tell you that would lead you to write good Forth. I can demonstrate. I have demonstrated in the past, ad nauseam, applications where I can reduce the amount of code by 90% and in some cases 99%. It can be done, but on a case-by-case basis. The general principle still eludes me.


I'm not surprised that the general principle that helps Moore reduce his code size eludes him. Authoring something blinds you to your own style—both to your mistakes, and to your brilliances. It's why prose writers can't be their own editors: you need someone who wasn't in your head during the writing process to actually be able to see what's ended up on the page, rather than just seeing reminders that immerse you back into your own implicit context.

I would expect that, if you took five of these rewrite projects of Moore's, and compare-and-contrasted their source with the source of the originals, principles would leap out at you.

(Why don't we programmers, as a culture, read code this way? It really seems like the only way to deduce novel facts about programming for yourself. Is it just that a rewrite where both the old and new codebases are shared-source is rare?)


To clarify a little: Chuck's projects were not rewrites in the usual sense, but they can still be read that way, as derefr suggests. His famous BASIC compiler, for example, was an implementation of BASIC, something that's been done a thousand times before, but Chuck's was not a rewrite of any particular program.

Much of Chuck's code reduction comes from simply not implementing parts of the problem that most people would not think of eliminating. Other parts he will eliminate that most people would think inconceivable to eliminate.

Other "secrets" of Chuck's ways are that he uses the CLI as a front end, or, spends a great deal of time working on an algorithm and finds a simpler way that is good enough in the application he's working on today. So instead of spending time optimizing for speed or modularity he's willing to specialize his subroutines per application to optimize for concision or simplicity. Somewhat like Woz's optimization to save chips.

I bet most people who investigate Chuck's methods will feel disappointed that the methods don't seem as clever as the end product. What I think most people can take away from Chuck is skepticism about the utility of most programming technologies, which do automate tedium but tend to substitute their own architecture and complexity, for a net loss.


I'm a big Chuck fan, but I get the feeling that he isn't on a short deadline to build some standard business app like 90% of coders, where you almost have a template... sure, it has lots of code you don't need and, to paraphrase Rich Hickey, it isn't simple, but it's really easy (at least in the short run). He's doing things where he is allotted the time to really understand the problem and deliver a minimal solution. Listening to his interviews makes me sad, as they call up images of how programming could be if we'd gone about it differently. Small teams writing small and efficient software where you can grok the entire thing at once.


Chuck Moore was amazing before I even knew how to code (and was a big inspiration for me), so I'm not going to even pretend that I know how to do what he does. However my view of this is that we, as programmers, can't just look at our jobs from a technical perspective. Nobody is given the time to make great solutions. The best programmers know how to work so that they can get the time they need. This is not trivial.

You can't just wander up to management and say, "I need X to make an optimal solution". They will negotiate you down. That's their job -- to negotiate into a position of advantage. When you need protection from the chaos that surrounds the rest of the organisation, you can count your lucky stars when you have an expert negotiator on your team. But when you go up to that expert negotiator and say, "I want X", the first thing that's going to go through their minds is, "I wonder if I can get that cheaper".

So programming has to be a balancing act of constantly delivering functionality while at the same time ruthlessly refining your vision of how it's going to work. If you are delivering quickly, regularly and constantly nobody will touch you for fear of screwing it all up. But knowing how to do that while still marching inevitably closer to simpler and simpler solutions is hard. A normal person needs to practice (and get it wrong) a long time before they get good at it.

Chuck describes a lot of his techniques in his books, but it's fluency that you need -- and it's fluency while things are hitting the proverbial fan that makes all the difference. You need great judgement to look at what you need and to prioritise it appropriately. If I deliver X quickly, I'll show progress. Do I need to show progress now? How fast? Can I refactor Y while I'm here? Is all of this necessary? And so on and so on.

It's tempting to think, "If only I was given lots of time.", "If only I didn't have incompetent coworkers", "If only people would listen to my advice", "If only management could look past tomorrow"... But that's not useful. Anybody can be a great programmer if we remove every problem before they start. The measure of your worth is how many of these complicating factors you can deal with while still producing great code. That's what's rare.


This is quite old and probably been posted here: http://yosefk.com/blog/my-history-with-forth-stack-machines....

But this, in my opinion, explains well both why people are seduced by Forth and why Forth is not widely used.

EDIT: it also offers an opinion on why Chuck Moore can be successful using Forth.


You certainly can be your own editor, but you have to wait a few years between the writing and the editing.


Are the original and rewritten-by-Moore versions available for any of these projects?


Based on the Silicon Valley Forth Interest Group email list, Chuck Moore has declared that he has retired from social media, but he has not retired from Forth. He is scheduled to speak at Forth Day, November 18, 2017, as he has for years. https://svfig.github.io/ http://www.forth.org/svfig/next.html


> Moore also has a corollary called Do It Yourself!, which encourages you, in general, to write your own subroutines rather than import a canned subroutine from a library.

The other side of this coin is NIH Syndrome. I've seen that go wrong in all sorts of ways.

One of the key skills of being a great engineer in the real world is exercising good judgment whether to "build or buy".


In the real world, anecdotally, my employers have always preferred to buy 160 hours of my time rather than share 480 hours of someone else's time with 1000 other customers.

It may have been because that person is then 1/1000th as responsive to their specific needs. It also may have been because that person wasn't cleared, and paranoia requires their code to be vetted before allowing it within breathing distance of our network. Or it may have been because buying a third-party tool requires approval of the manager's manager's manager, while rolling your own is the manager's call.

I have never seen a company that makes it easy to buy a third-party license, and never a company that makes it easier to use gratis-licensed software than paid software. They're all afraid that GPL will infect the code, and that it will dance out of our source control onto the Internet like children following the Pied Piper.

And on the other side of that coin, I have also seen licensed third-party software become vastly more trouble than it was worth, especially when it came time to upgrade it to a newer version. In general, though, if a third-party tool already does exactly what you need it to do, and you never need it to do anything else, it is always better to license it, configure it correctly once, and then never touch it again (aside from security-related patches).


> I have never seen a company that makes it easy to buy a third-party license, and never a company that makes it easier to use gratis-licensed software than paid software.

Meeting anecdote with anecdote, I've never worked at a company that made paying for licensed software easier than using gratis.


The steps of learning <anything>.

Beginner: Learn the basics, be proud of getting something done that works, even though it might not be a useful result.

Experienced: Learn the systems. Figure out which system is more helpful than others. But don't worry too much before you start learning a new system. Even a "bad" system will teach you something valuable. In the end you need to learn far more than one system anyways to progress.

Expert: Learn that no system is perfect and use them freely according to context, maybe create a few yourself. See that even these "bad" systems come in handy once in a while.

Master: Don't bother to get here. Even most freaks won't get here, and of those who do, most will tell you that they had no choice but to do it and that they were just lucky that at least in this area they had talent. You're probably better off than them in general terms. If you want, learn here that <anything> is just one system. Welcome to the next abstraction level as Experienced.

(This system about learning is developed by yours truly.)


how many levels of abstraction can you name? i think that would be a useful exercise.


I think a degree of skepticism is required with NIH. I'm probably not going to roll my own crypto, or high performance statistical routines. But if something invented elsewhere "solves" a problem that really isn't rocket science, or if it's a framework - I will almost always avoid it.


out of curiosity, did you roll your own logging framework? When I read that point, logging came to mind. Should I use SLF4J, or simply print to out?


That's a wheel I have reinvented to good effect. Most logging libraries are quite dumb: if you set your threshold to info you get way too much information; if you set your threshold to exception you lose too much information from before the exception was thrown. Different messages from different threads are all mixed together, etc.

My app specific logger is aware of the context it is running in, it will store all the context but only write it on an exception or error and discard it otherwise, so I get all the information I need without the noise from the 99.9% of the time I don't need it.

I do use logging libraries, but they are for the low level stuff like file truncation, rollover, etc., and I wish there were "logging" libraries that did just those tricky little parts and let me write the 50 or so LOC that the high level interface needs.


> If you have read any of Chuck Moore's work, you know that he is an extreme programmer in the literal sense of the word: a unique philosophy that embraces small, fast, self-contained programs in a spare language and an impressive development environment

I haven't read any of his work, what is impressive about his development environment?


He designed CPUs from scratch using a bare-bones minimal Forth (ColorForth) as his preferred way of writing code. The resulting chipset is utterly fascinating: http://www.greenarraychips.com/


> from scratch

This is even understating it -- it's not like he used existing software to lay out the chip, and then ran tests using PSPICE to verify the functionality. He wrote his own chip design software and analog simulator, and designed the chip in his own environment, and created the CPU, the GA144: a working 144-core processor designed to run "ArrayForth", a parallel version of Forth that he designed and authored, just as he designed ColorForth, and just as he designed, once upon a time, the language known as Forth.

Any of the tasks above seem like it would be a reasonable accomplishment for a good programmer. Combined, they make me question whether I even qualify as a developer by this standard, and make me wonder whether I should take a year off of development and just really learn Forth.


I feel like a desire to program in Forth is more the consequence of being a crazy-in-the-good-way programmer, rather than something that aids in becoming that sort of programmer.

Moore, from my perspective, decided to bootstrap a computing architecture from as primitive and restrictive of a starting place as he could come up with. Forth makes sense as a language if you imagine yourself as a time-traveller stranded in the 1970s (or even earlier!), trying to get a "foothold" into your futuristic programming environment using the wimpy computers of the era, so that you could design and build the hardware on which would run the software that you actually wanted to write. (Which would, in turn, fix your time machine or whatever.)

Learning Forth isn't really a step on that path. Rather, designing your own minimalistic language under the constraint that it'd have to 1. run on the wimpiest hardware you can imagine, but 2. allow you to be productive in the rest of the bootstrapping process — that's the first step on the path to being Chuck Moore.


The "stranded time traveler" explanation explains a lot, actually. Including motive.


I also really liked that analogy.


>Any of the tasks above seem like it would be a reasonable accomplishment for a good programmer.

I think that's understating it. I would consider any of those accomplishments to be a monumental undertaking. Combined, it is legendary.

If there is ever any doubt as to the existence of the 100x programmer - I would think this guy is proof. Somebody who creates their own chip design software as a means to creating their own multicore processor to run their own language, faster, is a motivated genius.


Your ending summary was exactly my thought process as I read the first paragraph. I essentially went through the 5 stages of grief as I realized I am no longer a developer.


Round every circle a larger may be inscribed. Don’t be downtrodden but be inspired.


"If I have not seen further, it is because I have been standing in the footprints of giants"


You're still underselling it; Colorforth is a traditional forth: it's the OS as well as the application[1] - it comes with drivers to write your progress to floppy disk.

One of my favourite blog posts by Moore is where he builds up to generating the VGA signal needed to drive his monitor from the GA144. Sadly the code links are dead, even at archive.org, but the text remains:

https://colorforth.github.io/video.htm

https://web.archive.org/web/20160310112830/http://colorforth...

[1] https://colorforth.github.io/install.htm


I've heard of the wonders of Forth for many years -- about how it was just as good at metaprogramming as Lisp, how it was very powerful and yet extremely light-weight, about how you could run it on bare metal, and write your own OS and utilities in a fraction of the time it would take in other, more well-established languages, and so on.

I got really excited about it, and when I finally started learning it, I had very high expectations and really wanted to love it. Unfortunately, overall those expectations were not met. I did find the experience enlightening, as it was a pretty different way of programming, and I love to gain new perspectives. That said, I was disappointed to find Forth was mostly an obfuscated, write-only language.

Many in the Forth community preach that one should write only very short, well-documented "words" (ie. functions) that are clear and easy to understand. But the reality in the Forth code that I've seen was a lot of long, poorly documented, convoluted words which were a nightmare to read or debug. Granted, this was in the open source community. I've heard that the code in commercial Forths is in a much better state, but I don't have any experience with that.

One also seemed to have to write a lot of one's own code in Forth, because there just really weren't that many libraries you could lean on. I guess I prefer the more batteries included approach. I've also heard even Forth fans admit to me that writing Forth was painful and that the point of using Forth was to write yourself a higher level language as quickly as possible, so you don't have to work in Forth.

Well, I'm personally not all that interested in writing languages myself, and am quite content to use a pre-written language where all the hard and boring work was already done for me by someone else, and I can just use the nice higher-level features that already exist in these high level languages.

I guess if you're on a resource-starved system like some tiny microcontroller, using Forth over assembly might make sense. But as I usually work on relatively powerful desktops or servers, I don't really see the point of using Forth there at all. I'd just rather save myself the pain and hassle, and use any of number of existing high level languages to begin with.

As for Moore's work, I kind of see it like the work of someone building the Eiffel Tower out of toothpicks. It's certainly an impressive accomplishment. But how practical is it?

I'm also not sure how much Moore subscribes to what have become well-accepted software engineering practices such as having good documentation, clarity of code, and testing. It seems like he prefers to sacrifice all at the altar of size and performance. That's great as far as it goes -- if that's all that's important to you and you can make it work for you (and he certainly made it work for him). But looking around at the mountains of poorly documented hacky spaghetti code in the rest of the world, it just seems like following that path leads to an unmaintainable, buggy mess.. unless maybe you are Chuck Moore and can make it work anyway.


Forth is no different from other languages in that you can write good code and bad code in it. The 'Forth way' is not always intuitive, and in that sense it fails at being a language friendly to newcomers; it's a bit like being dumped on someone else's farm without power, water or sewage, with only a pile of tools and raw materials for you to use.

Taking a step down like that when coming from a modern environment is really hard. But when looked at from the other direction, as a step up from assembler and a way to become much more productive while reducing the amount of space bugs have to hide in, it makes a lot more sense.

But in the present day it's no longer very useful unless you are doing something embedded or if you want to entertain yourself with an eternal match of code golf.


Forth IS different in one way from most other languages. The syntax doesn't mark which parameters are being passed to which functions. In C, for example, you have parentheses and commas that provide that information to the reader. The marking of parameters-to-functions helps most readers read a program and determine its meaning.

In some cases it doesn't matter which parameters are going to which functions, and the reader can get a sense or enough of a sense, from the choices of names the programmer made. You can get by without knowing.

But in many cases it is vital to know which parameters are being passed to which subroutine, in order to understand an algorithm. For most people, having to puzzle and re-puzzle this out while the program is being studied is too much extra work or extra ambiguity, or, they just don't buy having to put up with it, given the ready alternatives.

Those specific markings -- parentheses around function calls -- make a big difference in readability overall.
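
A tiny illustration with two hypothetical words that share the same stack picture:

  : area    ( w h -- n )  * ;
  : perim   ( w h -- n )  + 2 * ;

  3 4 area .    ( prints 12 )
  3 4 perim .   ( prints 14 )

Nothing in "3 4 area" itself marks that area consumes both numbers; only the ( w h -- n ) comment and the reader's memory do. In C, area(3, 4) carries that grouping in the call syntax.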


I've often thought about an editor for a stack/concatenative language that makes the arity of words clearer. like when you select a word it highlights which of the previous words in the definition it's consuming.

I don't know if it would work for forth - too low level - but for something like Factor I could see it being really useful.


I’m surprised this is the only reference to Factor (http://factorcode.org).


I think the important distinction-- and the reason that I never got very far in Forth myself-- is that with Forth, you don't have the pile of tools and raw materials, but instead a forge and some ingots of various metals. You really do often end up rebuilding even things you'd normally consider simple library functions in most other languages.


> He wrote his own chip design software and analog simulator

I'm going to need a citation for that. I browsed through the website and wasn't able to find anything at all mentioning the development of a full EDA software suite. There are only a handful of companies in the world that can build and maintain a full HDL + synthesis + P&R + layout + verification toolchain...

Also, I would be interested to know what kind of process the GA144 is fabbed with.


I don't know how much of the usual stack it accounts for, or if it specifically was used for the GA144, but they're referring to OKAD, often described as being in the hundreds/thousands of lines of ColorForth: https://duckduckgo.com/?q=Moore+okad+forth&t=fpas&ia=web


Before I put on my cynical hat, I'd like to say that it's pretty impressive stuff for a one-man shop in the 90s.

-----

So it seems like OKAD is basically a layout editor, simulator, DRC checker, and GDSII compiler. The simulator uses a very simplistic transistor model that would only work at very old process nodes (>350nm?) due to the low impact of parasitics and leakage.

Not even a moderately complex digital chip from the mid-90s could be designed and verified using such a simplistic toolchain.

To provide a software analogy, Chuck wrote an assembler for a custom assembly dialect targeting e.g. x86, and then directly used that assembler to build an operating system kernel. Given the absence of "higher-level" constructs, the resulting OS will never be as complex as something like an NT kernel, especially due to the difficulty of testing and verification.

Going back to circuit design, OKAD will never be able to design a complex digital chip on a state-of-the-art process. It's really nothing more than a simplistic EDA toolchain that would only work on simplistic circuits and fabrication processes.

A modern EDA toolchain from e.g. Cadence is many orders of magnitude more complex.


>Given the absence of "higher-level" constructs, the resulting OS will never be as complex as something like an NT kernel

i wonder whether you've worked with Forth - in my view of Forth (spent 1987-89 mostly in Forth running on 8088 clone based terminal of USSR clone of IBM 360, so Forth is my second language after mandatory Pascal and the first true "computer" love really) what you said is a huge advantage and natural characteristic of Forth, not a deficiency. Everything in the Forth world is like Salvador Dali's clock face plates - flexible/pliable/flowing and you build whatever "higher-level" constructs and complexity levels you need right now and leave them behind the moment you don't need them anymore, create and destroy worlds as you go... In comparison to that the static monstrosity like NT kernel is just dead dusty world of an ancient complex tech civilization who died off under the weight of its own complex tech.

To Cyph0n below: man, as an enterprise programmer for 25 years i completely understand and agree with you. As that Forth programmer enjoying it 3 decades ago i can only say to you - you just don't get it :)


That was an analogy attempting to explain what OKAD does. The kernel represents the circuit being designed, while the assembler represents the OKAD toolchain.

The goal of the analogy was to demonstrate that OKAD is not suitable for designing large circuits, just like using assembly to write a modern kernel is not practical.


Well, presumably it’s not meant to be suitable for designing large/complex circuits, but rather to solve just the specific problems that Moore had when he wanted to make a chip. That’s an essential part of Forth philosophy: write only and exactly what you need. Forth itself is designed this way, to be the minimal useful tool you can bootstrap in a few dozen instructions on a chip with almost no resources.


> Not even a moderately complex digital chip from the mid-90s could be designed and verified using such a simplistic toolchain.

Maybe he didn't want to design a moderately complex chip to begin with? His 144 core chip is likely very simple. OKAD may have been sufficient.


>>modern EDA toolchain from e.g. Cadence is many orders of magnitude more complex.

Moore considers that a magnitude of bugs, not features.

http://yosefk.com/blog/my-history-with-forth-stack-machines....

It has some good quotes on it if you scroll down to the end.


Oh, N.B.: I just looked it up; the GA144 is fabbed at 150nm.


And?


TL;DR Not really as impressive as it was made out to be.


You're entirely missing the point made by Chuck Moore and by extension probably don't understand why his way of looking at programming and the Forth language in particular are a big deal even this long after their invention.

This kind of minimalism and stripping things down to their essence is a powerful tool to allow you to focus on what matters rather than at all those things that don't really matter. If you don't actually need 4 GHz chips and billions of transistors to get the job done then why would you?

Reliability and simplicity go hand-in-hand.


And here's why that matters. Below is the complete source of a Forth block editor. This editor is more than sufficient to write an OS with, although of course emacs or vim would be better. That is pretty awesome.

Forth is a language in which it's realistic for each programmer to have his own custom editor. That's even more awesome.

    | RetroForth Block Editor (http://www.retroforth.org)
    | * Released into the public domain *
    |
    | This is the block editor from RetroForth Release 9.2.1
    | It splits the normal 1k block into two smaller 512-byte blocks,
    | the one on the left for code, and the one on the right for
    | documentation/comments. Both are displayed side by side.
    |
    | It makes use of some features specific to RetroForth, so it
    | will not work on an ANS FORTH system without changes. 
   
   
    tib 1024 + constant <buffer>
    128 variable: <#blocks>
    <buffer>  variable: b0
    variable  current-block
   
   
    : there b0 @ ;
    : #-of-blocks <#blocks> @ ;
   
   
    : new there #-of-blocks 512 * 32 fill 0 current-block ! ; new
   
   
    : (block) @current-block : block 512 * there + ;
    : (line) 32 * (block) + ; 
   
   
    : p 2 current-block -! ;
    : n 2 current-block +! ;
    : d (line) 32 32 fill ;
    : x (block) 512 32 fill ;
    : eb (block) 512 eval ;
    : el (line) 32 eval ;
    : e 16 for 16 r - el next ;
    : s !current-block ;
    : i 0 swap : ia (line) + lnparse rot swap move ;
    : \ 1 s e ;
   
   
    loc:
     : | '| emit ; 
   
   
     : row dup 32 type 32 + ;
     : left# -16 + negate dup @base <if space then . ;
     : right# negate 32 + . ;
     : code|shadow row | swap row swap space ;
     : rows 16 for r left# code|shadow r right# cr next ;
     : x--- 2 for ." +---:---+---:---" next ;
     : --- space space space x--- | x--- cr ;
     : blocks @current-block 1+ block @current-block block ;
     here ] --- blocks rows 2drop --- ;
    ;loc is v
   
   
    : edit [[ clear v ]] { is ui } ;


> : right# negate 32 + . ;

This is an example of "ugly" Forth code. It's ok here to make the code as short as possible.

In reality I would write (educational) Forth code this way. The texts in parentheses are comments.

  ( compute Hypotenuse sqrt[a²+b²] )
  : hypo       ( a  b   )
    swap       ( b  a   )
    dup *      ( b  a²  )
    swap       ( a² b   )
    dup *      ( a² b²  )
    +          ( a²+b²  )
    sqrt       ( result )
  ;

  ( Application:  1.2 3.4 hypo )


Isn't the first swap unnecessary?


Of course. You learn quickly ;-)
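
For anyone following along, the shortened definition looks like this (same assumptions, including the same sqrt word, as the example above):

  ( hypotenuse without the redundant first swap )
  : hypo       ( a  b   )
    dup *      ( a  b²  )
    swap       ( b² a   )
    dup *      ( b² a²  )
    +          ( a²+b²  )
    sqrt       ( result )
  ;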


> This kind of minimalism and stripping things down to their essence is a powerful tool to allow you to focus on what matters rather than at all those things that don't really matter. If you don't actually need 4 GHz chips and billions of transistors to get the job done then why would you?

I agree entirely. I just wanted to point out that it's not accurate to equate what Chuck designed (i.e., OKAD) to a modern EDA toolchain and technology node.

If it works for him, then great!


Nobody equated that. He made his decades ago, so obviously nothing from those days compares to a modern EDA toolchain and associated bits and pieces.

I think you subconsciously added the 'modern' in there somewhere and then argued against that.


You might actually be right. When I read this:

> He wrote his own chip design software and analog simulator

I immediately thought of a modern toolchain. Had I known that this was done decades ago, I might have reacted differently.

Nonetheless, even decades ago, both Cadence and Synopsys likely had pretty advanced tools under development. I may be mistaken though.


Moore mentioned these in passing at the end of the link https://colorforth.github.io/1percent.html He was stressing that the solution fits the problem, not a solution which can cover/contain the problem.


Heh, I remember running Solo 2030 on Motorola-based Sun workstations almost 25 years ago...


Aren't a lot of us already doing what matters, but just not articulating it the same way?

I collaborate on a team, so we use one documentation standard and one coding style. We've stripped down our coding and commenting styles to the bare essential--one style that works across the whole team, across the lifespan of the project.

The business that pays our team pays for what customers want. Every once in a while a new guy on the team will write something no one wanted, and we end up chucking it. We're stripping down the project code to the bare essentials necessary to make money.

I think what Chuck Moore brings is not so much a plain ol' minimalism, but a nostalgia for a wild west "one man in a garage" tech scene that never really existed, at least not the way we like to imagine it did, because we live in a world driven as much by money as by love of cool new things.

Not to say he doesn't do cool things, but I could do cool things if I didn't have to answer to business constraints, my fellow team members, and so on.


> I'm going to need a citation for that.

"My VLSI tools take a chip from conception through testing. Perhaps 500 lines of source code. Cadence, Mentor Graphics do the same, more or less. With how much source/object code?

– Chuck Moore, the inventor of Forth

Source: http://yosefk.com/blog/my-history-with-forth-stack-machines....

500 lines are actually possible since Forth words are threaded. That means any newly defined word w can take advantage of all available predefined words. Following words can take advantage of w, etc. Forth words are not just simple procedures; they can also comprise new syntax structures. Forth words can be "procedures" or "macros", and both can be mixed freely.
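
A minimal sketch of both kinds of word (standard Forth; the names are invented for illustration):

  \ Ordinary words build on previously defined words:
  : square   ( n -- n² )   dup * ;
  : sum-sq   ( a b -- n )  square swap square + ;

  \ An immediate word runs at compile time and compiles code
  \ inline -- a "macro" freely mixed with ordinary words:
  : square,  ( -- )  postpone dup postpone * ; immediate
  : fourth   ( n -- n⁴ )  square, square, ;

  3 fourth .   ( prints 81 )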

Threaded code was the main reason why Forth was so successful in the first years when memory costs were a real issue.


I've worked with another Forth disciple in the late 80s.

He was on a PC, and thought that my Mac GUI was neat. So he wrote a portable GUI framework from scratch in Forth that had dialogs, buttons, pull down menus, text inputs, graphics, mini Mac clone. It was magic. I think it was way under 100K.

After that, I really wanted to become a Forth wizard. I tried, but it never clicked, my code was very clumsy, write only. I admired my coworker as an almost alien intelligence.

I think it really works well for coders spiritually close to hardware.


FWIW, I wrote a graphing menuing system with primitive scalable fonts (for Arabic) in Turbo Pascal in '89 on US taxpayer dimes for live variable twiddling graphing of demographic simulations. It was all roll our own back then on 512K PC's. I implemented a Common Lisp compiler packaged in a DLL for Windows before the net took off. Some kind of "hardware and syscalls up" exercise should be part of more CS programs. It is a great experience to engineer truly reusable (and reused) code before the mosh pits of corporate labors. Good times make for useful memories.


Charles' extreme minimalism philosophy aligns well with Forth, but it can work in many compact environments. Scheme and Lua come to mind. It is difficult at first but powerful and liberating to realize that one has full agency over their program. With other runtimes, there is no possible way to either understand the whole thing, or effectively fork it and fix what you want. Not so with Lua, Forth or a small Scheme.

I really like that Lua doesn't ship much in the standard lib, but what it does ship can be used to build the world.


The economy of Lua's design has impressed me.

You've got just one data structure, the table, and from that you can implement just about anything else... efficiently.


Forth was always fun for me because it really was fun to build up a dictionary of words that solved your problem. It practically forced coming at the problem from the bottom. I do admit that I liked some aspects of Postscript better than Forth, but both are experiences.

Thinking Forth is a pretty good starting point and in general a great programming book http://thinking-forth.sourceforge.net/


It's called factoring with subroutines, and Forth hasn't cornered the market on anything by calling them words.


No, but Forth provided one of the nicest, most immediate, and interactive environments to explore a problem. Instead of multiple files in a text editor, you build and test your app live and then dump it all to a text file. I have rarely had that experience with any other language except Smalltalk. I understand some Lisp environments act that way, but Forth was available places Lisp and Smalltalk were not.


>you build and test your app live and then dump it all to a text file

The Tcl shell is really great for doing this. Since everything can be represented as a string-type, you're able to introspect your defined procedures and dump them to disk.


Developing on the target is just so last century. Now that programmers have developed the IDE you'll have to pull them from their cold dead hands.


I don't think I've used any IDE (other than Smalltalk) that was vaguely in the class of using Forth for interactivity and building of programs from the bottom up.

What's up with mocking everyone you reply to on this thread?


I found this video of Chuck speaking in 2011 just awe-inspiring: https://www.youtube.com/watch?v=odBjuSCX8jE


I really worry for what we'll lose when he dies and no-one carries on his work. Reading about his systems is like peering into an alternate universe of engineers and technicians making the world go 'round.


Is Moore okay? His charming homepage (http://www.colorforth.com/) has been down for a while now.


Last working archive on archive.org is here: https://web.archive.org/web/20160414102635/http://colorforth...


Per another comment, he's scheduled to be at an upcoming Forth meeting: https://www.meetup.com/SV-FIG/events/243613446/


This thread worried me. I hope he's fine.


If our technological civilization collapses and we need to recreate it from scratch, Chuck Moore could do a significant part of that.


Off topic, but we can't. All the readily available sources of energy have been used. As someone else said, this is literally the last great civilization. Best not screw it up!


That's a note on whether a future civilization without access to our knowledge could attempt it, no? I imagine some good physics and engineering texts from now might help a future civilization bootstrap itself technologically, even if you have to slowly and painfully jump a few steps that would be easier with readily available oil.


I haven't been on line much these last few days. Did we max out the Sun?


Legend has it Chuck Moore had a 3D wire-frame CAD program he used to carry around as a punch-card deck in his shirt pocket.


I've heard he just types it in from memory when he needs it.


Not from memory: from scratch, every time. Adapted to the problem at hand and not carrying unnecessary baggage.


THE central problem with pure stack machines and stack languages:

The programmer knows in their heart that moves, swaps, dups, drops, etc. - any stack manipulation that doesn't involve a functional change to the data itself - is an inefficiency to be minimized, but this effort isn't in any way related to the problem at hand (writing a program to do something), so it's unwelcome mental overhead.

I like puzzles as much as the next person, but not so much when they seriously impede the solving of a bigger more serious puzzle, nor when the sub puzzle solving is an exercise in the minimization of something bad rather than the elimination of it.
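
A small example of the tax (standard Forth; the word is hypothetical): computing a*b + a*c from three stack items looks like

  : f ( a b c -- a*b+a*c )
    >r over r> *   ( a b a*c )
    rot rot *      ( a*c a*b )
    + ;

where five of the eight words in the body (>r over r> rot rot) are pure stack bookkeeping and say nothing about the arithmetic.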


Traditional languages are pushing data onto and popping data off the stack every time a function or procedure is called; they just give it another name: parameter passing. By making it explicit, Forth at least tries to reduce the need to copy things from the stack to local variables.


I don't think the stack in a traditional language is the same as the stacks in stack machines and languages. It's a lump of memory that gets allocated to a thread for stuff, and the allocation is indeed done in LIFO fashion, but I believe access to individual memory locations in the allocation is random. The name is the same, LIFO and all, but the mechanism granularity makes it quite different.


It is true that traditional languages lump local variables and return information in the same stack. But the difference is only in the way this is handled by the program. In traditional languages the programmer has no idea how parameters are passed on the stack and the compiler does everything. In Forth this is made explicit, but on the other hand there are no formal parameters to worry about (notice that Forth can use local variables if you want; it is just not the idiomatic way).


If you're programming in Forth and you find yourself doing lots of stack manipulation, you should at least consider using local variables. Once you accept that there will be times when variables make more sense, you'll be surprised at how infrequently you actually need them.

Edit: Be aware, however, that variables dramatically reduce composability (in any language, actually; most just aren't very composable to begin with). By using local variables you essentially turn a procedure into a monolithic block.
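
For instance, here is a sketch of the a*b + a*c example from above using the Forth-2012 {: ... :} locals syntax (older systems spell it LOCALS| c b a |, which binds in the reverse order):

  : f ( a b c -- a*b+a*c )
    {: a b c :}
    a b *  a c *  + ;

The shuffling disappears, at the cost just mentioned: f is now one monolithic block that is harder to factor into smaller words.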


Pure stack machines don't have local variables. They have at most two stacks and that's it.


I was under the impression we were talking about Forth, not a pure stack machine


Both. And under the hood, Forth implements a virtual stack machine on top of a non-stack machine, which is inefficient.

I guess I'm just trying to counter all the mythos and happy talk surrounding Forth and stack processing in general. In the end Forth is a highly (too?) simplistic language, a product of its time, and nothing all that special or powerful. It does a lot of stuff poorly and the code tends to be write-only. The syntax is so loose that the entire dictionary has to be searched to know you're dealing with a number. It's no mystery that we aren't all coding in it now.


I used to go to the NASA Forth User Group meetings in the early 90's to hang with compiler and language nerds. Chuck Moore is a genius problem solver. His Starting Forth is one great example of an introductory language text.


Starting Forth by Leo Brodie is an excellent book, considered one of the best programming books ever written.

https://www.forth.com/starting-forth/


Sorry I recalled the author wrong. I'll leave your correction. Thanks.


The "Do Not Speculate!" idea reminds me of the practice of delivering "the kitchen sink" when only a small program that does one thing would suffice.

It reminds me of when, e.g., a 1990s-2000s Windows user needed a small program that does one thing, and in order to get it, the download from Microsoft was an installer with hundreds of megabytes of unneeded binary files. It was not possible to download only the single program, which was no more than 1-2MB in size.

It reminds me of this quote: "The problem with object-oriented languages is they've got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding the banana and the entire jungle." - Joe Armstrong

It reminds me of bloat, of giving users things they do not need. A problem that continues to this day.

Instead of bloated, complex UEFI, I wish something like OpenBIOS/OpenFirmware/SmartFirmware/OpenBOOT would be available for today's hardware. I have played around with fgen on i386 and read some of the history. They even provided courses on how to write drivers. Besides OLPC, why did this not catch on?

IMO, Forth is the most elegant and flexible language for initializing hardware.


This is what I like about Rebol & Red...pity Rebol didn't catch on more. The full Rebol download is a single executable (no real install) and can build GUIs, and so much more in a very small amount of code.


I'm always amazed when I see things like Red, with a 1MB executable with an interpreter, compiler, GUI, networking libraries, parsing libraries, and more all built in.

Or the QNX demo disk back in the day. OS, GUI, web browser (with javascript), full clustering support, and more, on a single 1.44MB floppy.


I've heard of the QNX OS...that seems magical.


Not to mention Red can cross-compile for multiple OSes, and its parse dialect is basically generalized, readable regular expressions.

My personal theory why Rebol/Red didn't succeed in the mainstream is the steep learning curve: it's hard to discover how to do things if you haven't already done some substantial programming work in it. While the built-in help is good, you need to know what you're looking for. (Delphi had better help in this matter, with examples etc.)

By the way I wonder why Pecan's post is [dead], it doesn't seem to be violating any rule.


Open it (click the duration), and click vouch. Arc has some (necessarily-)undocumented flagging code, and it frequently backfires.

The worst-case scenario is when you open the account in question and every single comment is grey... it hurts to see. (In that case, the account itself has been flagged. Not quite the same as reddit's shadowbanning, but only technically not the same.)

It's bittersweet that the moderation here is paid: there are AFAIK only two people (but don't quote me on that). There were some hazy plans to give some users fractionally elevated capabilities at some point, but it just hasn't happened yet.


Thanks a lot for this tip! Actually I'm amazed at how small a workforce is necessary to keep this place running. I'm browsing with showdead on and rarely see flagging backfires like this one.


> didn’t succeed in the mainstream

Is that true of Red? I was playing with it the other day and it looked really promising, and I was under the impression that it’s still quite new and hasn’t reached 1.0 (my point is that it might not be at a stage where we can write it off).


> it’s still quite new and hasn’t reached 1.0

According to the main page [0] the project was unveiled 6 years ago. For comparison Rust (a random example) started 7 years ago [1] but compare results for "Rust" and "Red/Rebol" on Algolia [2] (sorry no deep links). It's clear Red is niche.

I'm not writing it off. It's definitely one of most interesting languages and runtimes but it's super hard to capture people's attention nowadays when you don't have support of large companies (Go, Rust, .NET).

[0]: http://www.red-lang.org/search?updated-max=2011-03-29T15:38:...

[1]: https://en.m.wikipedia.org/wiki/Rust_(programming_language)

[2]: https://hn.algolia.com/


Hmm, you're right. I didn't realise it was 6 years old!


Yea, sorry bout that. Red isn't 1.0 yet and I have very high hopes for it. I meant just Rebol (ancestor of Red) in my earlier post and corrected it.


Kdb (the weird line-noise functional programming language) is similar. The trial version is a 200KB binary.


Isn't Kdb the database that you can program in Q now, but which also has K the PL? I program in J, which is similar. I don't consider it line noise, but I get the drift.


[edit] yes, KDB+ is the database product. Q is one interpreter that interops with it. K is another.

I am not 100% sure, but I think Q is actually K plus a standard library of "Q words", so you can write K directly if you want.

I think the reason for Q is to offer a more familiar interface for people with SQL experience.
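
You can see the layering from the q console itself, where a k) prefix evaluates raw k (a small sketch from memory):

    q)til 5        / the q word
    0 1 2 3 4
    q)k)!5         / the k primitive underneath
    0 1 2 3 4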


The author of Kdb is Arthur Whitney. He is another extreme programmer who has had a huge influence on fintech and, I would presume, made boatloads of cash.


Just want to say that it's awesome to see Rebol/Red and K mentioned in a comment thread about Chuck Moore.

It's interesting because Rebol and K have almost totally opposite philosophies about how software should be structured. K emphasizes flat namespaces and leans heavily on its primitives and only a handful of functions. Rebol encourages writing DSLs to fit the language to the problem.

But despite different philosophies, they both allow incredible programmer productivity. I think a lot of this productivity comes from adherence to "Do not speculate!" and therefore not having a bunch of non-useful functionality or boilerplate for the end-user programmer to have to manage.


Agreed. Very different, yet similar philosophies if you squint hard enough. Aaron Hsu did an AMA on here and posted a talk on YouTube showing his GPU compiler for APL, written in APL itself. The entire thing is maybe 5-10 pages of one-liners and looks darn elegant to me. In that lengthy talk he shows that you don't need zillions of abstractions if you can keep all the code on a few screens. He uses Notepad (not even Notepad++) as an editor, too. In a way that makes sense to me: having to use a gargantuan IDE to manage your gargantuan code base seems wrong on some level. Forth users just take that sentiment one step further and include only the primitives they need in their language.


Relevant HN link:

https://news.ycombinator.com/item?id=13797797

See also the top comment by dang, with a link to the original thread.


He also helped in the development of J with Roger Hui and Ken Iverson, the creator of APL, by writing a one-page interpreter for J over an afternoon or two. I am always drawn to prototyping and writing solutions in J. I am not a web developer, but for mathematics and statistics it is amazing. I've always wondered how similar and different K and J are on a deep level, but I have always stuck with J.
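
The classic J illustration of that mathematics-friendliness is the tacit definition of the mean, written as a fork of sum, division and count:

    mean =: +/ % #     NB. sum divided by count
    mean 2 3 5 7
    4.25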


Norman Thompson has a J book I'm reading, and it is mind-melting.


Yes; see the 5th-last paragraph of http://archive.vector.org.uk/art10501320 for a concrete answer to this. (Getting to said paragraph by way of all the paragraphs before it is, of course, highly recommended.)

Yes, I'm mad that the subject of the above URL is still, to quote the article, "dark."


> It reminds me of when, e.g., a 1990s-2000s Windows user needed a small program that does one thing, and in order to get it, the download from Microsoft was an installer with hundreds of megabytes of unneeded binary files. It was not possible to download only the single program, which was no more than 1-2MB in size.

That might have had more to do with how hard publishing code to the web was back then: an internal process problem that discouraged tiny standalone exes from being put online.


It has always been possible. Sysinternals was doing it 20 years ago. The catch is you had to build with Win32 and not the framework du jour.


wxWidgets has very low overhead. I think I have done some stuff that's also 1-2MB and much easier to code than plain Win32.


I don't buy it. dpkg was released in 1994, and that was designed to tame the nest of people passing around small executables on the internet.


> I don't buy it. dpkg was released in 1994, and that was designed to tame the nest of people passing around small executables on the internet.

The problem was corporate policy. In the olden days, publishing content to MSDN was not easy. Bill Gates's rant [0] on this topic gives some insight.

Grabbing small EXEs is now a lot easier. See: https://docs.microsoft.com/en-us/sysinternals/

[0]: http://blog.seattlepi.com/microsoft/2008/06/24/full-text-an-...


I knew which rant that was going to be before I even clicked on the link. Great memo. Really refreshing. Software companies need leadership who actually dogfood the software they make and who care enough to rant when quality, usability, performance, etc. have gone to shit. Too many seem to simply have their noses in their competitors' feature checklists, so their e-mails are all "Company X has Feature Y. Why can't we have Feature Y?"


designed to facilitate, rather


"Do not speculate" or YAGNI is really difficult to do past a certain point. You have to vanquish your own FUD one piece at a time. Nobody really does that unless they are really forced to.

Ignore the occasional statements in forum threads about how Forth is so powerful.

Forth is not powerful. It is ugly, stupid and mean.

Because of this, you simplify, simplify and simplify again, up to the project's specs if you can.


> Forth is not powerful. It is ugly, stupid and mean.

That's a statement of someone who doesn't understand Forth. Of course, one can write ugly code easily. Any language with raw power (Asm, C, Forth, Ada) needs a disciplined developer to get beautiful and maintainable code. Forth code can be really beautiful if the vocabulary is well thought out.

I consider Forth the best choice for tiny IoT devices, firmware, and other essential things that can be kept simple and stupid, following the KISS principle, which works very well in many cases.
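
"Well thought out vocabulary" in practice means the top-level words read like the problem. A toy sketch (the LED register address and the availability of MS are assumptions):

    hex 4000 constant LED-PORT  decimal      \ invented register
    : led-on   ( -- )  1 LED-PORT c! ;
    : led-off  ( -- )  0 LED-PORT c! ;
    : blink    ( n -- )  0 do  led-on 100 ms  led-off 100 ms  loop ;

5 blink says what it does; the register-level details stay buried in two trivial words.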

Our vastly proliferated software security problems come from complexity that grew out of disregard for the KISS principle. Many developers didn't choose the simplest way; they chose the most convenient way. Now we realize that keeping things simple and stupid is not that bad after all.


I've written a standalone Forth system that could boot from a floppy and recompile itself. I've ported the core interpreter to C so I could use libraries. I've rewritten it using various threading techniques. Up to that point they were just toy systems, because I never did anything useful with them. Then I wrote a Forth interpreter that would play really nicely with C. That thing is now around 30k/3 KSLOC. This time I wrote various utilities I needed for my job with it. One of them is a log server that runs 24/7 and provides a web interface, and I'm considering making it send emails... So I think I do understand Forth a little bit.


Doesn't your latter post contradict your previous one? If Forth is not powerful, how did you manage to do such things with it, and why did you stay so long with Forth despite its "ugliness" and "meanness"?


"powerful" too often means "it does things I don't really need or that I don't even understand; but hey, I might use them someday" - or at least that's how it sounds to me.

"ugly, stupid and mean" is actually intended to the beginners or to those who want to try it. That's what most systems are out-of-the box, especially for the minimal ones (bigger systems like gForth might be a bit better).

That's their starting point. >R and R> are hideous, so at some point they'll want to replace them with "push" and "pop". Stack juggling looks bad, so they'll have to learn to minimize it. And they'll have to get used to RPN. There's no GC and not even heap allocation (unless ANS Forth has added it as an extension since I last checked?), so they'll have to deal with the lack of the memory management that every other high-level language has. There's neither static nor dynamic type checking, so they won't get any warning if they dereference a character. Instead, it will crash again and again without any stack trace.
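
(For what it's worth, that renaming is a two-liner in standard Forth; a sketch using POSTPONE:

    \ compile-time aliases; meant for use inside definitions
    : push  ( x -- ) ( R: -- x )  postpone >r ; immediate
    : pop   ( -- x ) ( R: x -- )  postpone r> ; immediate

    : third ( a b c -- a b c a )  push push dup pop pop rot ;

That malleability is both the charm and the trap: every system ends up speaking its owner's dialect.)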

So they'll have to deal with all this ugliness, stupidity and hostility, which they won't expect, because Forth fans often praise the language without warning that it takes a really long time to fully tame such a rabid language.

Why did I stick with Forth for so long? I made it progressively more beautiful for me; I learned to fight complexity and to factor, which indirectly prevents mistakes; and I implemented warnings in my system for mistakes that are hard to find but easy to check for (e.g. a redefinition).


> "ugly, stupid and mean" is actually intended to the beginners or to those who want to try it.

When I first encountered Forth (6502 figForth) I found it quite convenient. At that time there were only two other options: BASIC and assembler. Maybe most developers today are too pampered by all the IDEs available for other languages. It's the same attitude that keeps so many people on Windows instead of learning OS X or Linux, despite all the flaws of the Windows ecosystem.

> the lack of the memory management that every other high-level language has

As already mentioned, I consider Forth suitable for tiny systems, IoT, firmware, etc. I would never use Forth for big software. I would not use even C++ for that; in that area Ada, Rust or Nim work much better for me.


Forth is an amplifier. Bad programmers will write very bad programs, good programmers will write very good programs.


> It reminds me of this quote: "The problem with object-oriented languages is they've got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding the banana and the entire jungle." - Joe Armstrong

What Joe Armstrong (whom I admire) complains about is actually what I love about OO languages. And for me that's the true essence of what OO is and how it differs from non-OO. People keep trying to frame OO in terms of mutability (completely orthogonal) or C++/Java coding practices (C++ doesn't define OO because it has objects, any more than it defines FP because it has lambdas).

The real difference, the real paradigm shift, is data + functions vs. objects. And I maintain that objects are a more powerful and better idea than data + functions. The banana can't be peeled without the gorilla's opposable thumbs.


I'd love to see if in 50 years we're closer to Smalltalk, Haskell, or something completely different. I'd love to say something radically different, but I have a feeling static types will continue to increase in popularity. No real evidence, just a gut feeling. Thanks for posting your feeling on OO. In some ways I've always felt very restricted writing OO (even in Python), but learning Smalltalk is making me see it from an entirely different perspective.


> I'd love to see if in 50 years we're closer to Smalltalk, Haskell, or something completely different.

I guess in my mind FP and OO are orthogonal. If I write an immutable object and send it messages that are referentially transparent, am I not doing both? Would it help FP people to think of such objects as "a record of partially applied functions"? So I hope both. I'd love it if there were a statically typed, high-level, pure OO language without all the cruft of Scala or C#. Maybe I have to wait for the next programming-fashion cycle for that to happen.
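
A minimal sketch of that "record of partially applied functions" framing, in Python (all names invented):

    def make_point(x, y):
        # the "object" is a record of closures over immutable state
        return {
            "norm":      lambda: (x * x + y * y) ** 0.5,
            "translate": lambda dx, dy: make_point(x + dx, y + dy),
        }

    p = make_point(3, 4)
    print(p["norm"]())           # 5.0, the same answer every time
    q = p["translate"](1, 0)     # a "message send" that returns a new object
    print(p["norm"]())           # still 5.0; p was never mutated

Squint one way and it's an immutable object receiving referentially transparent messages; squint the other way and it's plain FP.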

> Thanks for posting your feeling on OO. In some ways I've always felt very restricted writing OO (even in Python), but learning Smalltalk is making me see it from an entirely different perspective.

I often wonder if I need to coin a better term than "OO" for what I am talking about. Alan Kay's whole schtick was that objects contained both data and procedures, but that by combining them they became something new, more than just the sum of their parts. That's the powerful idea to me. But language changes, and in 2017 "OO" just means "Java".


For me, your last sentence is not an argument for why OO is "more powerful and better". The banana can be peeled perfectly well by anyone or anything with an opposable thumb (a function). No need to limit yourself to a gorilla (an object).



