
At least right now - I do believe, fairly firmly, that repeatability and precision are requirements for a programming language.

And I'm not saying that because I think those things are inherently more valuable (they might be - I don't really know); I'm saying it because a programming language has to interact with hardware, and at least our current generation of computational hardware requires precision and repeatability or it doesn't work. I don't mean "doesn't work" as in "fails" - I mean that the literal foundation of the space is built on logic gates: small devices that produce a very specific, repeatable, and precise output for a given input.

The requirements for precision and repeatability are SO ingrained that it's genuinely hard for us to introduce real randomness into the process.
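To make that concrete - here's a minimal Go sketch (mine, not from the article) of how deep the determinism goes: the "random" numbers a program produces are a pure function of a seed, and anything like real randomness has to be imported from outside the gate-level model, e.g. the OS entropy pool via crypto/rand.

    package main

    import (
        crand "crypto/rand"
        "fmt"
        "math/rand"
    )

    func main() {
        // Two generators with the same seed produce the same
        // sequence on every run, on every conforming platform.
        a := rand.New(rand.NewSource(42))
        b := rand.New(rand.NewSource(42))
        fmt.Println(a.Intn(100) == b.Intn(100)) // always true

        // "Real" randomness has to be read in from the OS entropy
        // pool - physical noise gathered outside the program.
        buf := make([]byte, 8)
        if _, err := crand.Read(buf); err != nil {
            panic(err)
        }
        fmt.Printf("%x\n", buf)
    }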

---

So stepping back a moment - I think the issue at hand is really how we define "programming language". I see them as tools to control hardware that has fundamental requirements (at least right now) around repeatability and precision.

This uses languages that do have those properties to create another abstraction layer which no longer provides them (or which does provide them, but through a black box that hides how they're being applied, in a way that simulates not providing them - it's genuinely hard to tell from the article which it is).

It means that there's a fundamental gap between the capabilities of the two.

---

A programming language can program hardware. This is an application, written in languages that can control hardware (Go and C#), for turning stories into visual output.

It's a cool application, in much the same way that NetLogo is a nifty toy for introducing beginners to the language, concepts, and terms used in programming.

So in that sense, I think the creator has mostly hit the mark he's going for. But it's not general - it can't step outside of that space. It's more akin to a domain specific language that has to operate within the context of his specific editor/application.

So just like Photoshop is not "a programming language", I don't really see this as a language.




Respectfully, these are all requirements you are projecting onto the design and implementation of programming languages based on your preferences. Like you said, you "see them as tools to control hardware that has fundamental requirements". Your definition of programming languages is predicated on your view of languages as tools. Yes, it is important for tools to be predictable, repeatable, and precise. But not everyone uses programming languages as tools, so there is no requirement that they be tools in every application. Just because the underlying hardware is a predictable machine doesn't mean the language has to use it in predictable ways.

For instance, I could write a language that has all the trappings you would expect from a PL: a parser, compiler, syntax, semantics, code gen, etc. But the execution of language constructs depends on the time of day the program is compiled. e.g. An if statement compiled in the morning doesn't behave like an if statement compiled at night. Would that be a good tool? No. Would it be a programming language? I don't see why not. It's a programming language in every sense except for an arbitrary constraint you've placed on it based on your particular expectations.
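Such a compiler is easy to sketch, for what it's worth. A hypothetical Go fragment for the code-gen step (every name here is invented for illustration) might look like this:

    package main

    import (
        "fmt"
        "time"
    )

    // emitIf is the code-gen step for the hypothetical language above:
    // which way "if" branches is fixed by the wall clock at compile
    // time. Deterministic hardware underneath, unpredictable language
    // on top.
    func emitIf(cond, thenSrc, elseSrc string) string {
        if time.Now().Hour() < 12 {
            // Compiled in the morning: a conventional if/else.
            return fmt.Sprintf("if %s { %s } else { %s }", cond, thenSrc, elseSrc)
        }
        // Compiled at night: the branches are swapped.
        return fmt.Sprintf("if %s { %s } else { %s }", cond, elseSrc, thenSrc)
    }

    func main() {
        fmt.Println(emitIf("x > 0", "pos()", "neg()"))
    }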

> But it's not general - it can't step outside of that space. It's more akin to a domain specific language that has to operate within the context of his specific editor/application.

There is no requirement for a PL to be generally applicable. Domain specific programming languages are in fact programming languages. Moreover, I would argue that any language you call a "general" programming language is in fact a "domain specific language": you have just defined your native domain as the "general" case, and any language outside of your native domain as a "DSL".

For example, "general" languages like C are in fact specifically tailored to the domain of imperative programming on a von Neumann computer architecture. When you take C out of its target domain - e.g. into parallel programming in a non von Neumann architecture, it suddenly becomes very cumbersome to express programs. Other languages you might call "domain specific" can very easily express programs in that domain. e.g. the dataflow language Lucid. People native to those domains would call those languages "general" and C "domain specific". It's all a matter of perspective.


> For instance, I could write a language that has all the trappings you would expect from a PL: a parser, compiler, syntax, semantics, code gen, etc. But the execution of language constructs depends on the time of day the program is compiled. e.g. An if statement compiled in the morning doesn't behave like an if statement compiled at night. Would that be a good tool? No. Would it be a programming language? I don't see why not. It's a programming language in every sense except for an arbitrary constraint you've placed on it based on your particular expectations.

But isn't this still a specific and repeatable behavior?

You're defining a language feature that I have no issue with here. I agree that it doesn't seem all that useful, but it's not at all in conflict with my definition.

> For example, "general" languages like C are in fact specifically tailored to the domain of imperative programming on a von Neumann computer architecture. When you take C out of its target domain - e.g. into parallel programming in a non von Neumann architecture, it suddenly becomes very cumbersome to express programs. Other languages you might call "domain specific" can very easily express programs in that domain. e.g. the dataflow language Lucid. People native to those domains would call those languages "general" and C "domain specific". It's all a matter of perspective.

I feel like this is really the heart of the discussion - If we are to assume that a language is to eventually be expressed on hardware that has been designed from the ground up to perform boolean logic, I don't see how we avoid the requirement that the language deal with boolean logic.

Lucid is fine by me - it was literally designed to be a disciplined, mathematically pure language. That it happens to use a different architecture than a central CPU and registers has little bearing on its ability to perform maths/logic.

Basically - Is this language not just a less capable subset of a "general" language? Because the author has explicitly stated that it almost certainly won't be able to handle even simple tasks such as parsing a document, and that a basic calculator was a "maybe".

So I can certainly understand that it may not be relevant to parse a file in some contexts/cultures, but I can't help but wonder how you can possibly hope to build a framework that explicitly avoids those concepts when the whole foundation has to be built on the things you're trying to avoid. The abstraction has to leak by default, or be inherently less capable.

Now - There may be some interesting room to consider hardware that isn't based on gates (AND/OR/NOT and all their various combinations) but this isn't that.

Which brings me back around to - isn't this just making the rules into a black box? They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?


> But isn't this still a specific and repeatable behavior?

Depends, maybe it chooses the time zone to calculate night/day randomly.
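Continuing the hypothetical sketch from above, a single random draw at build time is enough to break even that repeatability:

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // Same hypothetical compiler as in the earlier sketch, except
    // "morning" is now judged in a randomly drawn zone, so even the
    // build timestamp no longer pins down what the compiler emits.
    func compileHour() int {
        offset := time.Duration(rand.Intn(25)-12) * time.Hour // UTC-12..UTC+12
        return time.Now().UTC().Add(offset).Hour()
    }

    func main() {
        fmt.Println("effective build hour:", compileHour())
    }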

> They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?

Right, and that’s okay. Languages that are handy for teaching but ultimately limiting are still programming languages. Being good at parsing files and writing calculators is not the bar for being a programming language. HTML and CSS are still programming languages even if they’re not used to write parsers. Excel is still a programming language even if it’s not used to write servers. LaTeX is still a programming language even if you can’t easily write games with it. People don’t reach for C to write web pages, or budget their finances, or publish their manuscripts. This doesn’t make C less of a programming language.

Datalog, Coq, and Agda are three languages off the top of my head that are not even Turing complete, so you’re not going to be able to express all programs in them. If not being able to express a parser in Cree# makes it not a programming language, is Datalog not a programming language?

Coq is a limited language for theorem proving. Is it not still a programming language? Actually, now that I think about it, "general purpose" languages like C are ultimately prevented by their Turing completeness from being good languages for theorem proving. So this is another area where "general" comes with caveats. In other words, Coq being "less capable" than C lets you do things in Coq that you can't do in a "general" language.
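To give that last point some flavor, here's a minimal sketch in Lean (a prover in the same family as Coq; the function and theorem are invented for illustration). Because every definition must provably terminate, the checker can verify claims about it outright - something a Turing-complete "general" language can't offer in general:

    -- In a total language, definitions like this one are required to
    -- terminate; that restriction is exactly what makes machine-checked
    -- proofs about them possible.
    def double : Nat → Nat
      | 0     => 0
      | n + 1 => double n + 2

    -- A statement the checker verifies outright, not just tests:
    theorem double_eq_two_mul (n : Nat) : double n = 2 * n := by
      induction n with
      | zero => rfl
      | succ n ih => simp only [double]; omega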



