Hacker News

One could also interpret the "failure" of compiler writers to do this as evidence that the "DSL paradigm" for programming itself is, at best, extremely limited.

One obvious problem is that the semantics of a DSL aren't extensible in that DSL, making it highly brittle. So, e.g., with parsing, as soon as you need bespoke syntax-error handling, you're leaving the world of any DSL you can create, since "bespoke" here will ultimately mean a semantic specialisation that no generic DSL will support.

At the point where your DSL enables these arbitrary specialisations, it is just the host language.




In Clojure, the adage says "When building a DSL, favor data over functions over macros." When I write a DSL I start by writing a sample of it as pure data (think of Clojure as a data-processing-oriented language with embedded JSON; to be exact, the format is called EDN). Then I write the parsing function.

What I do, though, is make sure the parser is extensible. To do so, I cast any data element that this function parses into a corresponding function, unless that element is already a function. This way I can directly write custom functions within the DSL. This is very handy when handling conditions and their composition through function combinators: I can just reuse the language's boolean logic (if, if-not, case expressions, etc.) and don't have to reimplement it within the DSL. Once the DSL data has been parsed into functions, they are composed through classical function composition available in the language.

Sometimes I also wish I had access to the DSL through macros, mostly to optimize those DSLs. The situations where I've needed this required me to affect the compiler's state in some way, so I didn't carry my investigation further on that subject, but I'm very interested in the development of Lux/luxlang since it comes with macros that are passed the compiler's state monadically.
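The data-over-functions pattern described above is language-agnostic. Here is a rough sketch of it transposed to Ruby rather than Clojure for brevity; all names (compile_rule, the :eq/:gt/:all/:any ops) are illustrative, not the commenter's actual code:

```ruby
# Rules are plain data (nested arrays); the parser turns each element
# into a lambda, unless it already is one, which gives the DSL an
# escape hatch into the host language. Combinators compose the result.

def compile_rule(rule)
  return rule if rule.respond_to?(:call)  # pass raw functions through
  op, *args = rule
  case op
  when :eq then ->(env) { env[args[0]] == args[1] }
  when :gt then ->(env) { env[args[0]] > args[1] }
  when :all
    preds = args.map { |r| compile_rule(r) }
    ->(env) { preds.all? { |p| p.call(env) } }
  when :any
    preds = args.map { |r| compile_rule(r) }
    ->(env) { preds.any? { |p| p.call(env) } }
  else
    raise ArgumentError, "unknown op #{op}"
  end
end

# A rule written as pure data, with one custom function inlined:
rule = [:all, [:gt, :age, 18],
              [:any, [:eq, :country, "FR"],
                     ->(env) { env[:vip] }]]

check = compile_rule(rule)
puts check.call(age: 30, country: "US", vip: true)   # => true
puts check.call(age: 30, country: "US", vip: false)  # => false
```

Because the parser passes callables through untouched, custom logic slots into the data without the DSL having to grow its own boolean algebra.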


The DSL paradigm is extremely limited, because there are almost no people who are good at writing programming languages that work for developers instead of the other way around.

XML didn’t fail because of some technical limitation. It failed because it tried to turn us all into language designers. At worst JSON asks you for a schema, and that was an afterthought in both cases.


The problem with XML was slow parsing and the horrible syntax of XSD schemas. Turning attributes into elements is also a pain in the ass. And for a lot of applications turning whitespace into elements is unnecessary.


Those are confounding factors. If you made any other product with the same goals but faster parsing and better schema syntax (probably by integrating the two), you'd still end up with the same mess, because defining a language means worrying about a million corner cases and not being able to fix more than half of them once they ship.

It's a hard problem. Probably the hardest.


XML didn't fail.


That’s why DSL-in-ruby is a moderately popular technique; the host language is expressive enough to host a DSL without a custom parser.
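For the sake of illustration, here is a minimal internal DSL in Ruby, using blocks and instance_eval in place of a custom parser. The names (RouteTable, draw, dispatch) are hypothetical and not tied to any particular library:

```ruby
# A toy routing DSL: the "grammar" is just Ruby method calls and
# blocks, evaluated inside an object via instance_eval.
class RouteTable
  def initialize
    @routes = {}
  end

  # DSL verb: register a handler block for a path.
  def get(path, &handler)
    @routes[[:get, path]] = handler
  end

  def dispatch(verb, path)
    handler = @routes[[verb, path]]
    handler ? handler.call : "404"
  end

  # Entry point: evaluate the user's block in the DSL's context.
  def self.draw(&block)
    table = new
    table.instance_eval(&block)
    table
  end
end

routes = RouteTable.draw do
  get("/hello") { "hi there" }
  get("/bye")   { "goodbye" }
end

puts routes.dispatch(:get, "/hello")  # prints "hi there"
puts routes.dispatch(:get, "/nope")   # prints "404"
```

The point is that no parser exists: Ruby's own syntax and semantics carry the DSL, which is exactly why such DSLs stay extensible (any Ruby expression is legal inside the block).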


DSL in Kotlin is popular as well.


Give me an xml/json schema any day over this.


I fail to see how those things are at all alike.

An xml/json schema is for validating a serialization format; a DSL-in-ruby is a programming language (to be fair, it's likely a mess, but I've definitely seen it be less mess).

When you look at business rules / requirements, it's common for 99% of them to be trivial. Of the remainder, 99% are only moderately complex.

If you have, say, fewer than 10k distinct rules, you probably don't have more than one that's really wicked.

However, if you have 100k rules, it's more likely than not that you have quite a few that are going to completely wreck your xml/json model.
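Taking the 99%/99% split above at face value, the back-of-envelope arithmetic works out as claimed (a sketch, assuming the percentages are exact):

```ruby
# 99% of rules trivial, 99% of the remainder moderate, rest "wicked".
total       = 100_000
non_trivial = (total * 0.01).round        # ~1,000 non-trivial rules
wicked      = (non_trivial * 0.01).round  # ~10 wicked rules
puts wicked  # => 10

# At 10k rules the same split leaves roughly one wicked rule,
# matching the "fewer than 10k" claim above.
puts ((10_000 * 0.01) * 0.01).round  # => 1
```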



