Dark’s new backend will be in F# (darklang.com)
347 points by nikivi on Nov 3, 2020 | 257 comments



Does anyone else see the irony here? The Dark engineers swapped their core language for another because the open source ecosystem wasn't broad enough to cover their needs. And yet they're asking their customers to make the same bet on their proprietary language and platform. I mean, just look at this FAQ[1]. There's no clear path out of vendor lock-in, no terms of service, and not even a guarantee that the platform won't disappear without notice.

- [1] https://darklang.com/language


It's interesting to contrast with Docker, who similarly pitched an exciting new development model & ecosystem. In Docker's case, it was indeed non-proprietary, built on existing standards, and has now got to the stage where you can swap Docker out for competing alternatives to avoid lock-in entirely.

On the other hand, while that model went great for Docker the technology, it has not gone well for Docker the business. I imagine Dark might see Docker as a cautionary tale of why _not_ to go down that road.


The problem for Docker wasn't that, per say. Docker itself was the free part that got everyone interested; it's just that their paid-for tier went head to head with Kubernetes and lost.

Had Kubernetes not existed they would be doing just fine economically as they would be the dominant platform.

They got beat by competition.


Disclaimer: I work for Red Hat on OpenShift so I'm biased, but this is just my opinion.

Yep I agree. I recently helped a customer to move from Docker EE to OpenShift. They were frank about the pain points they experienced and why they were moving. It really came down to competition. K8s solves the problems they were having in a better way. Now with so much momentum behind K8s, nearly everybody is re-platforming their offerings on top of K8s.


Honest question as an outsider: What did Docker introduce that wasn't better addressed by other technologies? E.g. wasn't reproducibility the main point and already properly solved by nix?

By the way, it's "per se"


Dockerfiles are far simpler than the Nix language. Likewise, sure, FreeBSD had jails first (and for a long time!), but they are more difficult to set up.

Docker allowed reproducibility to be accomplished by a person who likes the idea of it, but will shelve it if it doesn't work within an evening. This describes me, for example. I think it took off for that reason.


Nix has a multitude of drawbacks, the biggest being learnability.


god yes. Friggin' Nix is such a great idea, but such a pain to start using...


I mean you could say the same for vim, but I still learned it and am now reaping the benefits.


Yes, but most people don’t bother.


IMHO, it was the whole package: One tool that could configure and wrangle all the various kernel namespaces in order to make containers work.

They also benefited significantly from the splash they made and the tech excitement factor. I still run into people in my consulting work that are fairly new to containers, but brand recognition on Docker is through the roof. Everybody has heard of them even if they don't know what it can do for them.


As Apple have spent the 21st century demonstrating time and time again, a more usable form of a well-established idea is better.


You don't have to use the Nix lang; Dockerfiles are much easier to get started with.


Even prior to k8s, Red Hat and Cloud Foundry (amongst others) were pitching better container orchestration tooling than Docker offered (e.g. OpenShift 1 wasn't based on k8s).


At this point, Darklang just seems like an overglorified side project. I cannot see any real business creating an entire backend system out of Darklang: the learning curve, vendor lock-in, and volatility of the project are not worth the risk.


> because the open source ecosystem wasn't broad enough to cover their needs

I don't really think this matters. Dark is intending to be "batteries included" and relieve developers of the need for a huge ecosystem with multiple libraries for everything.

> And yet they're asking their customers to make the same bet on their proprietary language and platform.

I always use very popular open-source stacks.

That said, if given the choice between a tiny FOSS language and a tiny proprietary one (that might be profitable), I'll always choose the latter.

Small languages are extremely vulnerable to extinction regardless of who maintains them. But if at least one person's full-time job is to maintain a language, I feel much more confident than if 1-5 people are maintaining it part-time.


> That said, if given the choice between a tiny FOSS language and a tiny proprietary one (that might be profitable), I'll always choose the latter.

(Not to put words in parent poster's mouth, just for reader clarity)

F# is no more proprietary than Rust. F# was started by Microsoft Research but is now an open source language 'owned' by the F# Foundation and licensed under the MIT license (https://github.com/fsharp/fsharp).

The historical Visual F# compiler was developed by Microsoft (per http://research.microsoft.com/en-us/um/cambridge/projects/fs...), but AFAIK today Microsoft's small full-time F# devtools team works on contributions to the open-source compiler in addition to their work on VS tooling.


If we're going to be frank, the Venn diagram of people who build F# tooling and libraries and the people who work for Microsoft in some capacity on F# is nearly a circle. No one would ever use F# divorced from .NET, and it's kind of a red-headed stepchild there.

It's a fun language to work in, with some neat features, but C# is gobbling up every compelling advantage inexorably.


Apparently many don't have an issue with some key Haskell developers also being on Microsoft's payroll.

Also, Microsoft, alongside Facebook, is looking for ex-Mozillians to join their Rust teams. Then what?


Guido van Rossum also works there, so what's your point?


No he doesn't. He worked at Dropbox starting in 2013, but retired last year.


Yes he does, as a Distinguished Engineer.


I think C# is becoming F# a lot faster than F# is becoming C#. I think people who really learn F# are setting themselves up to be productive for a long time.


> Dark is intending to be "batteries included" and relieve developers of the need for a huge ecosystem with multiple libraries for everything.

If it's actually going to be 'batteries included' in any meaningful way, then it will have to piggy-back off of one or more of these 'huge ecosystems', which invariably leads to the ability to integrate external pieces of these systems when needed. Which just makes it a wrapper around existing 'huge ecosystems'.

There's a reason why established languages such as Java, C#, C++, JavaScript, etc. have multiple libraries for the same problem - these libraries have different features, tradeoffs, designs etc. Same reason why existing projects get forked and new projects get started. Same reason these 'huge ecosystems' have frameworks, design patterns, idioms, best practices, etc. Competition, innovation, new designs, new ways of solving existing problems is a good thing.

I don't see Dark as anything more than a very new framework, with lofty promises and zero evidence towards the idea that it will be better than existing systems in any way.

> Small languages are extremely vulnerable to extinction regardless of who maintains them. But if at least one person's full-time job is to maintain a language, I feel much more confident than if 1-5 people are maintaining it part-time.

A tiny FOSS language at least has a chance of getting some maintenance/bug fixes from the community. What happens when your proprietary language dies for lack of traction? Also, nothing prevents a person from working full-time on a FOSS language.


Proprietary doesn't mean it's anyone's full time job.


Is lock-in that big of a concern for people who are using serverless? I have no direct experience here, but my understanding was that all the major serverless platforms are built on proprietary APIs.

That said, this item from the FAQ is nice to see; I'd love to see more vendors not just making promises along these lines, but making them binding:

> if the worst should happen, we will open-source Dark ... We're committed to codifying the specifics in a legal framework


Yeah, lock-in is a problem in general for serverless if you need all the cloud vendor's extras. For example, Lambda doesn't make sense without IAM, VPC, etc., and you often want things to hook to it like SQS, API Gateway, a database, Cognito, etc.

To avoid lock-in, you can run serverless on your own framework, though, such as Kubeless or Knative, which could be deployed on-prem or in a more generic cloud deployment.


Doesn't that kind of defeat the point of "serverless", though? I mean, I guess if that's how you like to structure your code then go for it, but the primary marketing pitch I've ever heard was around it, you know, absolving developers of the need to worry about servers.

I guess if you've got a large enough environment that you have dedicated devs pushing things to Kubeless/Knative instances maintained by dedicated ops/sysadmins, then maybe that's the niche there?


It depends on what you think "serverless" means.

Is it solely about the developer experience? Or is it about the economic ownership of hardware? Or both?


I was referring specifically to lambda-style "functions as a managed service", which might be a better term than serverless. Managed instance would be serverless by the economic definition.


I agree with this criticism, but I wonder how they could monetize without locking in users. I suppose they could open source their toolchain so anyone could stand up their own Dark framework on-premise and maybe they sell support and/or a managed service and/or "enterprise features" (whatever those are in this context)? Maybe these things will come sometime after the private-beta (perhaps they're just getting a better feel for their product's value proposition before committing to opening any particular piece of it up)?


Or maybe this is why Dark is a bad business endeavor in the first place. I would never bet my company's future on something like this, and the toolchain doesn't seem sufficiently powerful to warrant the risk.

Sorry, but this is no Clojure. Instead of a proper comment syntax you define a dummy variable to a string? Ghetto.


I think it's easy to monetize: dual-license the BCL as AGPL + proprietary.


This is not about being open-source, but about being popular.

OCaml is a fine language, but it's not a wildly popular one. Had the authors started with, say, Python or JS, they wouldn't have had the problem with support from third parties. Please note how their choice was between open-source Rust and proprietary (CORRECTION: also open-source already) F#, both descendants of the ML family.

When you pick a language and start to feel you're overgrowing its ecosystem, you either migrate off of it (as in the post), or start developing it to help it move in the direction you want. In the case of OCaml, Jane Street and Facebook chose the latter route.

UPDATE: Thanks for reminding that F# has been open-sourced: https://github.com/dotnet/fsharp/



> The Dark engineers

What do you mean? The whole page talks about "I" and "me". That's going to go well...


There used to be a company and a team of engineers. Recently they laid off everyone and it’s down to just a one-person show now.


Sounds like a great time for a backend rewrite!


So, heading in a positive direction then?


This is a great article but I really wish that somewhere on the page it said what a Dark is and why I would want one.

I tried clicking the homepage and it wasn't there; I tried clicking the demo and it tried to open YouTube, which was not what I wanted to happen; then finally I found it in the documentation section. The first sentence or two from there, maybe as a subtitle or in the intro paragraph, would help I think.


I spent some time last night looking at it. Just to affirm what you're saying, there's hardly any information on their frontpage unless you go venturing into videos and blog posts.

Dark is basically a language that's designed to be serverless out of the box. They've baked their own engine into the language as well as dependencies for various cloud providers, databases (postgres was highlighted a lot), etc so that you can build serverless applications quickly while having no knowledge of operating systems, infrastructure, or really even computer science.

It reminds me of a number of the languages out there that are low-code or no-code. A regular programmer probably wouldn't even recognize some of the syntax and symbolization this language uses.

Here are some examples: https://blog.darklang.com/spin-up-a-slack-app-in-seconds-wit... Most of us have probably built a Slack app by now (or bot), so this is a good example to compare with IMO.

Dark has a mission of appealing to 1B people and they say the first step is to appeal to developers.


Did you figure out what the greyed out "v1", "v4", etc. after certain calls was for? I clicked through a few pages but couldn't figure it out.

Internal versioning as they release new methods on their own libraries? So it doesn't break old ones?


That's right, we version all stdlib functions. I talk about it a bit here: https://blog.darklang.com/how-dark-deploys-code-in-50ms/

When we add a new version of a function, all the calls to the old function are unchanged. The autocomplete shows new users the new version (and not the old version), while people who are calling the old version get shown both.


Is this what you refer to as feature flags in the FAQ?

> Q: Deployless seems dangerous! Won't I constantly break my application?

> A: ... snip ... With Dark's feature flags, you have precise control over how and when a specific feature or set of features becomes live, and for which part of your user base.


No, feature flags are used to wrap an expression and conditionally choose another expression. It's a fancy if expression. In Dark you add one by selecting `add-feature-flag` from the command palette.
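
To make that concrete, here's a rough F# analogy (this is not Dark syntax; the function and names below are made up purely for illustration):

    // Rough F# analogy, not Dark syntax: a feature flag wraps two expressions
    // and picks one based on some condition (e.g. which user is making the request).
    let featureFlag (isEnabledFor: string -> bool) (userId: string)
                    (oldExpr: unit -> 'a) (newExpr: unit -> 'a) : 'a =
        if isEnabledFor userId then newExpr () else oldExpr ()

    // Hypothetical usage: roll new pricing out only to beta users.
    let price (userId: string) =
        featureFlag (fun u -> u.StartsWith "beta-") userId
            (fun () -> 100)   // old behaviour, still live for everyone else
            (fun () -> 80)    // new behaviour, behind the flag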


The Segway of programming languages.


Oh definitely an innovative project and I think F# is such a good choice for the purpose.

Hope it works out!


To me, the most interesting part of Dark is their dedication to tightening the feedback loop in website development.

It caches recent requests to your site, and as you make changes it constantly re-runs those requests showing you the resulting live values in code. This is close to something like Excel where code changes are instantly and visually applied to the input data.

There's incredible promise here, and I believe similar fast-feedback paradigms will become the mainstream programming experience within 30-40 years. We've already had a taste of this with Jupyter Notebook's ubiquity across data science; despite its many flaws it lets people iterate just a little faster.

As for Dark itself, it feels like they put too much emphasis on language and not enough on ecosystem. I wish they had focused on a fresh new IDE experience for Django or Rails rather than additionally building an entire language and syntax from scratch.


Good point! I quickly updated the blog post with a blurb. Was not expecting this to be #1 on HN!

> Welcome HN! If this is your first time hearing about Dark, check out the website[1], our What is Dark [2] post, and How Dark deploys in 50ms [3] to understand what we're about. Thanks for checking us out!

[1] https://darklang.com [2] https://blog.darklang.com/what-is-dark/ [3] https://blog.darklang.com/how-dark-deploys-code-in-50ms/


> This is a great article but I really wish that somewhere on the page it said what a Dark is and why I would want one.

I feel exactly the same. It looks like you have to watch some videos that explain what it is and what it's for, and I don't have time. If they can't describe it in a concise paragraph then I'm not sure why I would care. Why is this ending up on the front page of HN 2 days in a row - apparently it's only because they're dropping OCaml. Ok, fine, you're dropping OCaml and moving to F# for your... whatever it is.


I’ve used it quite a bit and you make a valid point.

The appeal it had to me was watching one of the videos in which he talks about inherent versus accidental complexity.

I’d say eliminating all the extraneous accidental complexity you see in the entire process of building software is what it aims to do. That really resonated with me.

Deployment, code versioning, IDEs, infrastructure. So many of these things can be replaced by a simpler holistic toolset.

I’d say for simple use cases they’re well past the 80% mark. And it really is enjoyable building something in this paradigm.

The challenges are obviously the next 20% which will be where all the really hard edge case parts come.

I’m really looking forward to seeing it evolve.

It’s worth watching the introductory talk, even if you never intend to build something with it.

Accidental complexity is something I’m sure all engineers battle with. It’s nice seeing a genuine attempt to eliminate it. Even if it has ironically come along with some of its own accidental complexity. (E.g. not having enough libraries)

I’m hopeful that they’ll continue to battle all the right parts of accidental complexity and stay aware (as they are) of where they’re introducing it.


idk, I clicked Homepage and the blurb was right there. Looks like a language / environment to build serverless apps.


One thing that makes it hard to grok is how large the scope seems to be, plus the cognitive bias many devs have developed against new platforms due to being burned in the past. It reminds me of the 90s-era RADs that wanted to encompass everything for everyone without really having clear intent, or at least a platform company backing them to snare enough users to become a new platform. Sometimes there are gems in the rough among things like that, like Delphi, and I don't mean to dismiss any of the ideas or dreams, but IMO platform companies are going to continue to monopolize the language space (for instance, I see great irony in the fact that they are using .NET, because Microsoft basically does what they are trying to do). So that's why I bring up Delphi above: it seems this could be an opinionated framework on top of a .NET language instead of yet another language?


It looks like Dark is trying to be a new language, a backend server, a CI solution, and an IDE! Then there is a blog post that everyone churns:

> Currently, we get 10 user signups a day, and on average, 0 of them stick. We do have developers who love Dark, but not enough. https://roadmap.darklang.com/goals-of-dark-v2

I think the company should focus on just one of those things and it would still be an enormous lift given their size.

> The problems and fixes are across the product. One problem is that undo is slow, another is that you can't put a minus sign in front of an integer, another is that we need to define how namespaces will work in the package manager. The fixes range from adding tooltips in the UI, to adding a type-checker, to making the package manager public.

https://blog.darklang.com/dark-v2-roadmap/

Ayeeeee... this is like trying to carve Mount Rushmore using a spoon.


Also only one person is working on it - everyone else in the dev team seems to have been let go/fired. https://blog.darklang.com/dark-and-the-long-term/


Oh my goodness! This is something that I think you need a sizable open source community to build and maintain. This platform is not close to something you can rely on to build your new startup / business. I don't want to denigrate the efforts of the founders / devs, but this is just an insanely huge amount of effort. Entire companies are built just to provide a small subset of what Dark is trying to accomplish. The employees must be super stressed out! I would focus on maybe just the IDE and deployment. I don't see why you need an entire new language. I could see this being an advanced IDE to churn out serverless APIs plus some closed-source packages to provide magic. But this is just... woah!


Large teams/communities can build good software but I'd argue that great software is created by small teams and by massively talented and motivated individuals. Those are the contexts where people are more open to seeing beyond the incremental improvements towards new paradigms and previously unimagined possibilities.

Dark definitely is ambitious and you're entitled to be skeptical. Personally, I'm really glad this isn't just a fancy IDE on top of a conventional stack. I think we should be doing everything we can to support loonshots and crazy new ideas like this because these are the kinds of things that could change the game for all of us.


Lol


Really wasn't that funny


It reads that there are some remaining millions of dollars, which now support a single employee. Is that a correct reading?


A little over a million. While that should last 5 years, I expect to hire again in the new year, once I have an actual plan figured out.


Good luck. Better to recognise the problem and try to adjust than to just plow on.


Fair. Good luck :-)

Happy CircleCI customer.


> To put it more bluntly, we have been focusing on growth of a product that has not yet reached product-market fit.

I remember communicating with Ellen (the co-founder that has left the company) back in April (2 months before they let everyone go) when they asked for feedback. I said pretty much the gist of this blog post and received a one line answer of 'well, that's like, your opinion, maaan'.

I should feel vindicated, instead I feel sad. Dark got 3.5 million in funding - I'd be very curious to know where that money went (specifically how much the founder and co-founder have paid themselves) because it sure sounds like the founders got to 'try ocaml lol' with 3.5 million of someone else's money and then write blog posts about it that routinely show up on HackerNews, telling the rest of us of their 'lessons learned'.


I interviewed at Dark and spoke with Paul and Ellen. I was pretty impressed by how forthright they were. If you were going to score startups on how clueless they were and how unqualified to gamble with other people's money, I would put them near the bottom.


Quite a shitty accusation, isn't it?

Without knowing anything regarding the specific details, I would be pretty surprised if investors wouldn't be ready for both ups and downs.


are you accusing them of stealing money without any proof?


So no product-market fit and apparently no clear idea who the customer is. On the road to failure, unfortunately.


Wow, I remember learning about F# and thinking, "too bad it's stuck on Windows, I'll never use it". Awesome to hear that .NET Core has really changed that.

How is the incremental typechecking experience, eg in vscode? (Speed, UX)

Does F# have a prettier implementation or other fast/deterministic/opinionated formatter?


There is Ionide for VSCode. http://ionide.io

JetBrains Rider also supports it.

Fantomas can be used as a formatter. https://github.com/fsprojects/fantomas


Ionide is not quite there yet when it comes to medium/big sized projects. I switched back to Visual Studio because Ionide simply stops working after a while. Even switching branches requires you to restart VSCode. So be aware of that. Fantomas comes already bundled with Ionide and works pretty well. It has a few hiccups here and there as well.


There’s F# language server, which is not as featureful as Ionide but a lot faster.


ahh interesting. I'm actually using Rider for most of my work


We use Rider (enterprise license) to develop our F# services. Over the past year, the experience kept improving, small tweaks kept coming, and bugs were sorted out. At the moment, some of our larger shared type definition files are slow to edit (it seems Rider/.NET is trying like crazy to look over all the symbols before I've even finished typing), but overall, I had a very good experience with it.

Not aware of any linters out there. Rider might have built-in rules to enforce it, but that wouldn't hook into your CI/CD.


Roughly how large of a file and how slow to edit?

Although there are some aspects to performance that are Rider-specific, it uses the same underlying compiler to deliver tooling. So some issues might be solvable at a more core layer.

Disclaimer: I work on F# at Microsoft


Our setup is multiple solutions for each service, and most inherit from "Core" parts. This type def file is about 1000 lines (can't count the number of declarations in there).

What seems to happen consistently: I can delete a label in a type record, no lag. As soon as I finish adding a label (haven't added its type yet), all manner of slow hell breaks loose and there is a 5 second delay before anything shows up. This happens regardless of whether or not the type is used in a low (<20) or high (>100) number of places. The delay is the same in all records.


Yeah, that's a rather large type definition. We used to have some issues in the compiler processing records that are 250 or more labels large, and it can now process ones with 1000 labels or more. But it's still a pain point, especially if it's generated, since those can get large pretty easily.

The issue you're describing sounds more like a compiler/core tooling issue than a Rider issue though. Would you mind filing an issue here? https://github.com/dotnet/fsharp

We collaborate with the Rider folks quite a lot and so they'll see this if it is indeed a Rider issue.


Rider’s overeager analysis of symbols has been bugging me too, but the latest previews seem to behave a little better


> How is the incremental type checking experience, eg in vscode? (Speed, UX)

Pretty good.

> Does F# have a prettier implementation or other fast/deterministic/opinionated formatter?

It has something, but it was not working on macOS so we are not using it.

https://github.com/fsprojects/fantomas


Fantomas works on my Mac! My employer has been sponsoring work on Fantomas for a while now, and it's really improved by leaps and bounds over the last year. I'd recommend giving it another go.


If you're interested in learning more about F#, I strongly suggest joining the F# Foundation Slack. Unfortunately, they make it surprisingly difficult to join: you have to create an account on the F# Software Foundation website (https://fsharp.nationbuilder.com/forms/user_sessions/new) and then "join" the foundation (http://foundation.fsharp.org/join ; you'll see the option to join once you have an account).

There's a large community of very helpful people willing to answer questions.


Having read the previous HN post on this (leaving OCaml) and this one, I'm going to post a different perspective on this.

As a programmer, I love reading blog posts about why someone made something, how they made it, and what tools they chose to use/create to facilitate their end goals. Some of the frustrations the OP had with respect to OCaml are some I - and many others - have experienced as well. Enumerating those constructively can potentially benefit the OCaml community as well.

However, as a possible target user...

Reading blog posts like these is _scary_. It is very rare that I want to read a "hey, that service you use regularly? yeah, we decided to rebuild it from the ground up using entirely new tools 'just because'".

Will I still be able to work? Should I postpone signing up/paying? What new bugs will appear? Which features I currently rely on will change or outright disappear/cease to work? What edge cases that are currently handled will be forgotten? The list goes on...

There are often very legitimate answers to these questions, and sometimes taking a HUGE step back in order to meet demands you didn't know existed before is what's required. But those requirements, and how this step back will address them, should be clearly laid out to your users.

More importantly, work such as this postpones any current roadmap features (potential) users may be waiting for and have been previously promised. Presumably an analysis has been done on how much they'd be postponed by, and why those features will benefit greatly once done. Sharing that would also be extremely helpful in assuaging concerns.


> It actually has a much better type system, in my opinion.

I don't know how anyone can say that with a straight face about a type system that has both Option & null

(granted Rust technically runs into this, but they go out of their way so that you should never be using pointers unless you want to. Whereas in F# you'll run into this using references, & you'd expect interfacing to C# APIs wouldn't take the care that interfacing with C APIs does)

That said, F# does sound like a good fit for them, so one funny reasoning aside, dive in. I enjoyed F# when I was able to use it


I have been using F# for 5 years and have never once needed to use “null” anywhere in my code - if I were to interact with a C# library I think null-handling is an acceptable price to pay for the interop (similar to weirdness with OCaml/Haskell and C FFI). F# makes that particular separation of concerns very clean and easy to handle. Idiomatic F# is safe against null reference exceptions.

I think you’re needling a small wart in F# due to the .NET requirement, it’s hardly enough to say that F#’s type system is overall bad.


>> I think null-handling is an acceptable price to pay for the interop

Especially given how easy it is to deal with it. Option.ofObj is what we use most of the time.
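
For example, a minimal sketch of wrapping a nullable .NET API at the boundary:

    // Minimal sketch: convert a possibly-null .NET value to an Option at the boundary,
    // so the rest of the F# code never sees null.
    open System

    let tryGetEnv (name: string) : string option =
        Environment.GetEnvironmentVariable name   // .NET API, may return null
        |> Option.ofObj                           // null -> None, anything else -> Some value

    match tryGetEnv "HOME" with
    | Some home -> printfn "HOME = %s" home
    | None -> printfn "HOME is not set"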


There are like a thousand little details a type system has, and you focus on one thing, the existence of null for .NET interop, and decide this means the type system can't be overall better?


The null thing is a bad example, particularly because it's from the .NET side and C# is focusing on removing that now through nullable reference types, but I'm honestly surprised they claimed F# has a better type system.

F# lacks functors, first class modules, row polymorphism, GADTs, PPX extensions, and likely some others I'm not aware of. GADTs, row polymorphism, and PPX extensions I've wished for in particular because they solve some problems I frequently run into with MVU/Elmish architecture on the front end.

F# instead has inline functions with static constraints (similar to C++ templates), type providers, computation expressions, and units of measure.
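
For anyone who hasn't seen those, a small sketch of two of them (statically resolved type parameters via `inline`, and units of measure):

    // Statically resolved type parameters: 'inline' makes the "a (+) must exist"
    // constraint resolve at compile time, roughly in the spirit of a C++ template.
    let inline twice x = x + x
    let a = twice 21      // int
    let b = twice 1.5     // float

    // Units of measure: dimensions are checked by the compiler and erased at runtime.
    [<Measure>] type m
    [<Measure>] type s

    let distance = 100.0<m>
    let time = 9.58<s>
    let speed = distance / time     // float<m/s>
    // let oops = distance + time   // does not compile: metres and seconds don't mix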


> Whereas in F# you'll run into this using references

It depends :)

F#-defined reference types cannot be assigned a null value by default. If your reference is an F# library (or one with F# bindings, of which there are several) then this isn't something you run into. Technically this can be bypassed by giving the type an attribute called AllowNullLiteral, but it is extremely rare to see that in practice because it's so unsatisfactory to use null from a cultural perspective in F#. Once a C# programmer uses F#, something switches on in their brain and they tend to eliminate all possible ways null can creep in as thoroughly as possible. It's neat.

Another aspect where you don't see null is in initialization soundness. In F#, all values must be initialized before use. This is in sharp contrast to C# or Java where you can accidentally access something that hasn't been initialized before. While you can also technically bypass this to inject a null somewhere, it is also exceedingly rare because it's not a default behavior.

F# data types also cannot be assigned a null value, so that's also a place where null doesn't come in.

So it really leaves interop with .NET assemblies, some serialization scenarios, and .NET reflection. The first is really the only time people can still get "surprised" by a _null_ value. The rest are all technically possible, but just tend not to come up.

The reason why this all matters is it's not a binary thing about having null/non-null. Just like it's not a binary thing about having exceptions vs. not. It's about how frequently they can come about in normal programming scenarios. I can write a simple Haskell program that throws an exception at runtime; does that mean that Haskell is not safe? Of course not, because it's not usually what normal Haskell code does. The same thinking applies here. The ways that null can creep into an F# program are simply fewer than in C# or Java, and that creates a mindset where null is something F# programmers tend not to think about that much.
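
A tiny sketch of those defaults in action:

    // F#-defined types do not admit null by default.
    type User = { Name: string; Age: int }

    // let u : User = null          // compile error: 'User' does not have 'null' as a proper value

    // Opting back in is possible, but rarely seen in practice:
    [<AllowNullLiteral>]
    type LegacyHandle() =
        member val Id = 0 with get, set

    let h : LegacyHandle = null     // allowed only because of the attribute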


I worked on some Scala codebases, which have this problem. It even lets you end up with Some(null). I call it the two billion dollar mistake.


Was this because the devs didn't know about Option.apply?


> I evaluated and expected not to like F#. I actually quite like it.

It's funny because that's exactly how I felt when I learned F#. I felt pleasantly surprised and never felt like there was anything bad about the language itself.

I just wish it wasn't so deeply embedded in the .NET ecosystem.


> I just wish it wasn't so deeply embedded in the .NET ecosystem.

Kind of amusing, given the whole article is about how great it is that it's embedded in the .NET ecosystem


yes, and I agree with that.

But I also feel like the language is mainly used in a type of business environments that are already tied to .NET infrastructure anyway, and who actually appreciate the interlock.

And it could get much more traction outside of that circle, especially in the open source world.


>But I also feel like the language is mainly used in a type of business environments that are already tied to .NET infrastructure anyway, and who actually appreciate the interlock.

I think that's still somewhat true just because there is momentum carried over from the past, but it's definitely something I see changing in the industry.

.NET used to mean SQL Server and Azure in addition. These days I'm seeing .NET Core with AWS and Postgres.


You can use F# with Fable in the JavaScript ecosystem. And it's quite nice. Better than TS imo.

And now with .NET Core, it's a better cross-platform framework.

Also, you can target WASM or a native image now, no need to install it on the target env.


> native image

Got any examples of this?


> I felt pleasantly surprised and never felt like there was anything bad about the language itself.

After working full time with F# for nearly two years I can attest to this. At worst there are inconvenient things, and as you get deeper into FP you'll wish F# had certain features, but all in all it's a very sane and well designed language. I think the only thing that's caused me any pain is the lack of type classes, and after doing a lot of MVU I wish lenses were a language-level feature.


> I felt pleasantly surprised and never felt like there was anything bad about the language itself.

I mostly only miss modules/functors, and being able to use them for abstract data types.


It may seem odd to say it, but given the choice between a runtime managed by Microsoft and a runtime managed by Oracle, I'd be more comfortable with the former.


You might be interested in OCaml.


Semi-relevant: Dark is being rewritten _from_ OCaml[0].

[0] - https://blog.darklang.com/leaving-ocaml/


No, they are moving from OCaml because it "has been a little unsatisfactory". If you read their blog post, you will see it's mostly self-inflicted pain: they suddenly discovered vendors don't ship SDKs for niche languages, they wrote single-threaded code because they couldn't figure out how Lwt - probably the most popular OCaml library - works and apparently couldn't be bothered to ask, and they failed to understand how the debugger works (it's like GDB, with documentation shipped with the compiler).

I mean, F# is a very reasonable choice if you want the .NET ecosystem and will most definitely solve the SDK issue, but the whole thing certainly didn't convince me to try Dark.


Let's be fair: They're moving away from OCaml because there's a ton of stuff that's impractical about using OCaml, mostly related to the very bare ecosystem tooling that's seen maybe a few years of good light and "Things every language should have by now" features that have been coming soon for 8+ years.


Is the person I'm responding to a member of the Dark development team? I was unaware!


Don't make passive-aggressive comments. It's both obnoxious and against the site guidelines. State what you have to state.

I just thought your comment was top-level. It's unobvious on mobile.


It's funny you say this when the original article was about the company moving away from OCaml due to a number of issues.


Those issues are primarily unfamiliar tooling and lack of library support, which are indeed fixed by .NET, but could hardly be fixed by anything dissimilar.


Why is it bad that it is embedded in the .NET ecosystem?


It doesn't have the same kind of large and innovative ecosystem around it as competitors such as the JVM.


It's true the JVM has a large ecosystem, but the .NET one is still pretty substantial. I would also argue there's a fair bit of innovation in the .NET Core world :)


> I would also argue there's a fair bit of innovation in the .NET Core world :)

They need to make it more visible if there is! I think it's pretty rare to see a .NET paper. I can't remember the last time I saw one, where there are tons of major JVM research projects.


It feels like .NET focuses primarily on the library and language side of things. In C# you can write surprisingly low-level optimized code. See the introduction of Span<T> for instance: https://docs.microsoft.com/en-us/archive/msdn-magazine/2018/.... Or huge lists of improvements such as https://devblogs.microsoft.com/dotnet/performance-improvemen....

This is in contrast to the JVM, where Java is showing its age but mountains of work poured into the JVM keep it performant: OpenJ9 and HotSpot, Project Loom, Zero GC, as well as big non-Oracle investment such as the Shenandoah GC contributed by Red Hat, etc.


I can't say I know what's happening with the CLR itself, but they've shipped several major improvements in the language, server, standard library, etc. that have brought significant performance gains, which make .NET Core one of the most performant stacks on the market.


The innovation in the .NET world tends to be by companies in the .NET ecosystem, which historically have not been focused on open source or open research. Improvements in the Microsoft world tend to be private.


Yeah. This is probably my only beef with .NET ecosystem. There are a lot of great open source libraries and whatnot, but what you see comparatively little of is core systems a lot of software depends on being written in .NET Core. I'm mainly thinking about all the type of stuff you see in the Apache Foundation.

The language itself is incredible, and just keeps getting better and better with every release.

It'd be neat to see people turning to .NET when writing those kinds of systems, but maybe it's just not cool enough in the Bay Area.


the emphasis is on "deeply".

I think .NET can be quite an advantage, as stated also in the article.

but if I want to, say, set up a dev environment on Linux, it comes with a set of quite different dependencies than what other languages use.


I'm using F# and .NET Core on Linux and I actually really like it. I can use my Linux distribution's package manager to install different SDK versions side by side (at the moment, 2.1, 2.2, 3.0, 3.1) and the CLI tools automatically pick up the correct version for each solution. I find it superior to other languages like Rust/Erlang/Python where you need to use a tool that manages the SDKs outside of the distro package manager.


What do you mean? There are .debs and .rpms to install .NET Core on Linux just like any other language


Yeah... I thought it's quite simple also. Rider also works on Linux. It's been a great experience for me.


The title is not a surprise, based on the previous post. The one surprise is that he seriously considered Rust.

> "you'll never believe just how much a Garbage Collector does for you!"

Of course, that's why it's hard to see someone moving from OCaml to Rust for any reason other than wanting to use Rust.


That's a pretty weird takeaway considering the long and detailed post on OCaml's issues that motivated a full rewrite: https://blog.darklang.com/leaving-ocaml/

> One of my biggest annoyances was how often OCaml folks talk about Fancy Type System problems, instead of how to actually build products and applications. In other communities for similar languages (ReasonML, Elm, F#), people talk about building apps and solving their problems. In OCaml, it feels like people spend an awful lot of time discussing Functors.

In whatever case, it appears that the author's preference was F# > Rust > OCaml.

I don't have a problem with garbage collectors in general, but that's only one facet in deciding a language. There are lots of more important facets, and OCaml fares pretty poorly at these (e.g., tooling, ecosystem, community, pace of improvement, mindshare, etc).

And for language toolchains in particular, I wouldn't lock myself into a language with suboptimal performance--it's too hard to back out of that decision later, and it's easy enough to recoup some of that pace of development from Rust by just refcounting everything today if necessary (optimize it later).


> OCaml fares pretty poorly at these (e.g., tooling, ecosystem, community, pace of improvement, mindshare, etc).

I think this is unfair and OCaml has a lot going for itself. The fact that it also has an active academic community of people who care about Fancy Type Systems does not strike me as an intrinsic downside.

To me, OCaml has a very different paradigm from most languages, and that can be a big strength. Implementing something in OCaml after you've written out the types can feel like you are doing a duet with the compiler.


Setting up an OCaml build system for the first time is an absolute nightmare. There are a number of tooling issues with OCaml that require a lot of work to figure out and work around. It is not batteries-included, either.

Once set up, though, I agree that it drives very much forward. In compiler writing in particular, it allows you to express the things you care about (ASTs and transformations over them) with next to no effort.


A few years ago this was still true, but since then Dune (originally called jbuilder) has dramatically changed the ease with which OCaml projects can be set up.

As for batteries included, there are several options for foundational libraries; I found there to be too many choices rather than too few when I first started developing in OCaml.


Maybe things have progressed in the intervening years, but I found dune/jbuilder to be complicated and poorly documented. I've forgotten many of the issues I had with it, but IIRC there were a couple of different configuration formats (S-expressions and something else?) and the examples online never worked out of the box.

As for foundational libraries, I agree that there were too many choices, but IMO fragmentation in the standard library is a bad thing (you need to know how to interop between them as you integrate third party libraries that use different standard libraries). Never mind the confusion that creates for newbies.

Moreover, for newbies, getting editor integration working well, understanding the unfamiliar (to put it nicely) syntax, and a myriad of other challenges without good documentation are other significant obstacles.

These are pretty significant downsides, and as much as I like the upside of a nice type system, I don't need it to build a product, but I need good tooling and a good ecosystem and a healthy supply of developers. I can ship a product with Go or Python (languages with impoverished type systems)--there might be more bugs but finding and fixing bugs is manageable--but OCaml presents significant challenges.


> The fact that it also has an active academic community of people who care about Fancy Type Systems does not strike me as an intrinsic downside.

No one is arguing that this is a downside. The downside is that there aren’t many people who care about making useful software, or at least there is a relative dearth of content devoted to that end.

> To me, OCaml has a very different paradigm from most languages, and that can be a big strength. Implementing something in OCaml after you've written out the types can feel like you are doing a duet with the compiler.

I actually agree with this, but as much as I love a good type system, it’s gravy. I can ship software with Go because it has a decent runtime, tooling, ecosystem, learning curve, mindshare, etc even despite its simplistic type system; however, it’s much harder to do the same in OCaml. Also, there are languages like Rust with great type systems and concern about the more ruggedly practical concerns of software development.


I think a GC is almost always the most important facet. Can you name a set of languages that doesn't include Rust, some with GCs and some without, where the most different pairs are the same in terms of having a GC or not?


I don't understand the question. I wouldn't say having a GC is the most important facet--I'd rather have good tooling and ecosystem. Of the mainstream languages without GCs, Rust is the only one with good tooling and ecosystem (or arguably C/C++ have a "good ecosystem", but the tooling makes it prohibitively difficult to integrate well). Further, the other GC-less languages have a lot of other issues, such as poor error messages and bad ergonomics. Basically success correlates with having a GC, but this is a classic case of correlation != causation.


> I don't have a problem with garbage collectors in general, but that's only one facet in deciding a language.

In the same sense that the president of the US is only one voice on US policy. If you've been writing OCaml and then you want to change your project to another language, you will most definitely miss the garbage collector. All of a sudden you have to solve problems a second time in a different way, which is something you wouldn't have to do using F#. There's no way I'd shift a serious project from GC to non-GC unless I had absolutely no choice.


> In OCaml, it feels like people spend an awful lot of time discussing Functors.

I mean, if you want to use OCaml and not even bother to want to learn about functors, which are fundamental to the module system, then I don't know what to say. It would be somewhat like wanting to use Java and not learn how to use classes.


The point is that the community’s interest appears to be in abstract language design and not building software products.


I suppose someone who had no exposure to OOP would say the same to a group discussing an object-based approach to solving a problem.

That said, the OCaml community is very interested in using the type system to eliminate entire classes of errors (e.g., "make illegal states unrepresentable"). The Rust community is the same. Jane Street has built a multi-billion dollar company using this approach. If you view that as "abstract language design", then that's your loss.


To be clear, the claim isn't that other communities never discuss academic concerns, but that they are also interested in the practical aspects of building software products while OCaml is (more or less) singularly focused on the academic aspects.

Yes, the Rust community is interested in the academic aspects of eliminating classes of errors, but it's also very interested in addressing pragmatic problems. Consider all of the sites for Are We (Web | GameDev | IDE | etc) Yet as well as the remarkable progress they've made in such a short time. Rust's entire history fits in the amount of time that OCaml has been struggling to solve parallelism.

> Jane Street has built a multi-billion dollar company using this approach. If you view such as "abstract language design", then your loss.

That's really great for Jane Street, but it's really not the badge of honor you (and the rest of the OCaml community) think it is that a single company has managed to muster some success with your preferred language in its 25-year history. It also doesn't do much to illustrate that the OCaml community is broadly interested in practical matters--only that one company has, and they have done a lot of work to make OCaml manageable (like Google has done with Go, except that Go enjoys marketshare outside of Google).


As mentioned elsewhere, I believe that automatic memory management + affine types will win in what concerns mass adoption.

Rust has been the catalyst to force language designers to look more seriously into affine types, but its use can only be justified in "no way GC/RC" scenarios like MISRA.


Overall, I think the big win for Rust is when you need low-level, low-cost, low-memory, high-performance applications, at the cost of anything else. Rust is a great replacement for low-level C.

Unfortunately, those are not the features that you care about most in a compiler. In a compiler, your priorities are ease of extension, ease of specification, and ease of transformation. The last thing you want in a compiler language is to worry about ownership when trying to implement expression-rewriting optimizations. Conversely, OCaml is excellent at these things.


Its use can be justified elsewhere, but it should always be understood that manual memory management is always going to be more difficult.

The problem I see, and the reason I sometimes reach for rust when I don't really require manual memory management, is that nothing else is really better for strongly-typed statically-compiled self-contained executables.

Go: it's got the convenience and packaging and ecosystem, but an absolute shit type system.

F#: carries around the baggage of the CLR, and the best tooling is Windows-exclusive. Type system is good but not as good as others.

Scala: absolutely amazing type system, but lots of unneeded stuff too. JVM packaging is a huge pain, and pretty heavyweight. Native compilation options are few and have slow executables in comparison.

OCaml: tiny ecosystem, multicore is always one year away, some weird ergonomic choices.

Haskell: lazy, IO has terrible ergonomics, and the community consists of pie-in-the-sky type theorists.

I would kill for a Dotty-like (Scala 3) native programming language that could omit all of the bullshit that was required to fit into the JVM and work with Java libraries.


The state of natively compiled GC'd languages is actually pretty bad, at least ones with an ecosystem/job opportunities.

You either have to run into the same problem as Dark when it comes to libraries and ecosystem (OCaml, Haskell, D, etc.), use Go and give up any nice language features, or run on a VM and deal with all the issues of packaging/weight (CLR, JVM).


When packaged alongside the binary, the VM for all purposes is just the language runtime, e.g. natively compiled Java.


I just don't think natively compiled Java is there yet. It's not trivial to get it working and the ecosystem isn't ready for it yet.

There are certain things that just work (i.e. quarkus, which I have a web service running on 13 mb of ram), but I have never been able to successfully compile to native on my own because of library dependencies. Getting things like Weld or H2 to work is pretty difficult.


It has been there since 2000.

AOT compilation has been a feature of most commercial JDKs, especially the ones targeted at embedded development like PTC and Aicas.


jlink potentially helps with weight.


Don't forget SML. A wonderful, simple language with great compilers and multithreading, but little tooling and library support.


To me, it looks like the sweet spot might be RC + weak references, which in terms of ease of use is close enough to a tracing GC as to not matter, + the ability to tune away the RC ops in specific circumstances, for example using affine types.


One reason to use Rust is speed. For example https://www.phoronix.com/scan.php?page=news_item&px=Facebook...

Of course, not all projects have this as a high order concern, but some do, and it can make sense to use Rust for those.


This does not mean much without the context. On top of that, it is usually the libraries where most of the speed is wasted. In Java you can create HFT programs, yet an average Hadoop project is 10-50 times slower in terms of latency for a specific operation. It is not a problem with the JVM or Java as a language as much as it is a problem with code quality in user libraries.


I don't think dotnet is faster than Rust or anything overall, but ASP.NET Core is 6th in the TechEmpower composite rankings and first or second, I believe, in the plain text benchmark. This is excluding the large amount of performance work that has gone into the soon-to-be-released .NET 5.


dotnet and Rust are essentially neck and neck on plain text. On any benchmark that actually does work though, dotnet is nowhere to be found (less than half the performance of actix).


Yea, actix is definitely faster than ASP.NET Core and in many benchmarks it is 2x performance-wise. My only point really was that ASP.NET Core offers better performance compared to almost any other widely used web framework out there. Additionally, performance improvements have been and continue to be a focus, with steady progress to show for it.


Speed is also a reason not to use it, as programming and compilation are relatively slow in it.


A bit peculiar that they selected a .NET language after running into a lack of libraries in OCaml. I'd be perfectly happy to use C# or F# the language, but I still find .NET lagging behind the JVM/Python/Go ecosystems. Two large OSS projects I'm using for work are Scylla and Apache Beam, neither of which has native support for .NET, and I'm sure we can find plenty of others.


> but I still find .NET lagging behind the JVM/Python/Go ecosystems

It probably depends on what libraries you seek. In the enterprise software world the only options for a vendor's library/SDK are often .NET or JVM (and asking them "Do you have a Python or Go library" will probably elicit a response of "What in tarnation do snakes and board games have to do with programming?").

As for whether it's CLR or JVM, the question's usually answerable with:

    if vendor.preferred_os == 'Windows':
        'CLR'
    else:
        'JVM'
And since Windows is still pretty commonplace in a lot of smaller enterprise software shops, CLR ends up being the norm for a lot of the COTS products out there.


It's not necessary to speculate; the blog post about moving away from OCaml specifically talked about cloud provider SDKs (AWS and GCP): https://medium.com/darklang/leaving-ocaml-fce7049a2a40?sourc...


I was talking more in general, beyond Dark's specific needs. But yes, in Dark's scenario the library availability issues are clearer.


Go?! I don't think there's even a comparison here -- the .NET ecosystem is huge compared to such a newcomer. I've been looking at .NET libs (for F# and C#) and JVM libs (for Clojure) recently, and I've come across way more old, crufty, poorly-documented Java libs than .NET ones.


As an aside, this article made me want to try out F# on Linux. I followed the steps on Microsoft website and was basically up and running a toy program in 10 minutes. Then I spent next 40 minutes trying to figure out how to configure ionide on VSCode to respect my line breaks and not make if/else statements a super-wide line. Considering both are Microsoft products, this wasn't a great developer experience. Or maybe I'm spoiled by the rich formatting customization provided by IntelliJ's formatter for Java/Scala.


I had a similar experience. The syntax formatter is https://github.com/fsprojects/fantomas, and there are some controls, though not as many as I'd like.


I tried Fantomas by installing it and then using the VSCode plugin, but it kept crashing when I tried to format with it.


Note that Ionide, while the recommended VS Code plugin, is not a Microsoft product (it is mostly the work of Krzysztof Cieslak).


I know - however, it is mentioned in the official steps[0] for configuring F# with VS Code.

[0]: https://docs.microsoft.com/en-us/dotnet/fsharp/get-started/g...


From yesterday’s story about leaving OCaml:

https://news.ycombinator.com/item?id=24974907


F# is probably one of the best languages out there, but this again seems misguided, considering F# doesn't get the love from Microsoft that C# gets. You might as well use C# since it's slowly becoming F#.


While C# is slowly adopting features from F#, the biggest value that F# provides over C# isn't its feature set. The biggest value is having good default behavior and how it nudges programmers to favor predictable and composable solutions.

There are a million features and defaults in C# that constantly undermine a programmer's attempts to create good compositional code. If you and your team are disciplined enough to avoid falling into the traps, that's good. But a lot of teams are not that disciplined and tend to churn out the same buggy OO code with the same subtle bugs caused by the same default semantics.

Also, F# is a pretty powerful language for language nerds. There's a reason there are literally a half-dozen F#-to-JavaScript compilers, not to mention Erlang, CUDA, PHP (I know), WASM, etc.

Until they start removing features from C# (maybe something like a strict mode), it will never be F#.
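To make that concrete, here's a tiny sketch (made-up types, nothing to do with Dark) of the defaults doing the work for you:

    // Records are immutable and compared structurally by default.
    type Money = { Amount: decimal; Currency: string }

    let price = { Amount = 9.99m; Currency = "USD" }

    // "Changing" a value means building a new one.
    let discounted = { price with Amount = price.Amount * 0.9m }

    // Structural equality out of the box -- prints "true".
    printfn "%b" (price = { Amount = 9.99m; Currency = "USD" })

You can get there in C# too (especially with records in C# 9), but in F# it's the path of least resistance.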


> good default behavior and how it nudges programmers to favor predictable and composable solutions

Agree. Mark Seemann calls these the "Functional pits of success":

https://www.youtube.com/watch?v=US8QG9I1XW0

I think the toughest thing about F# is that it doesn't offer as many Google-able easy solutions, so I'm stuck using it for utility and client libs at work. Trying to get a whole team on board is a non-starter. It takes a good 6-12 months to adjust the mindset, and most people just want to come into work, find their tough questions answered on Stack Overflow, and head out for the day.


Refactoring and reuse are a bit of a pain compared to functors or typical OO code in C#, etc., especially when moving from idiomatic F# to more performant code, when this is needed.

For example, I may use a List at first, as it's natural to start with and offers niceties like [H|T] pattern matching or List.filter somewhere; then if I decide to use a .NET List (ResizeArray) or Array or Set instead, I'm screwed. The pattern matching needs to either change syntax or go away, and all the List.Foo calls need to change to Array.Foo or whatever -- or I have to use Seq.Foo everywhere to get extensibility, even if I don't want to pay for lazy evaluation. Long-time F# fan here! -- but still some sore points.
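A rough sketch of the kind of churn I mean (made-up example, not real code from anything):

    // First pass: idiomatic F# list with cons patterns.
    let rec positives xs =
        match xs with
        | [] -> []
        | h :: t when h > 0 -> h :: positives t
        | _ :: t -> positives t

    // After deciding the data should be an array, the cons pattern is gone
    // and every List.* call becomes Array.* (or Seq.*, accepting laziness).
    let positivesArr (xs: int[]) = xs |> Array.filter (fun x -> x > 0)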


If you already have OCaml, the OCaml->F# transition is a lot smaller than OCaml->C#. Once you move your core stuff over to F# (with the necessary changes), you can more easily decide from there whether any particular new work should be F# or C#.


At this point I mostly view C# as an "unsafe" F# for all practical engineering purposes. With some amount of discipline, one can drive a very functional C# codebase and fall back to imperative regimes when it makes sense to do so.

Most of our platform infrastructure code is imperative, but as you get higher up in the abstraction hierarchy, you will start to see more things declared as functional or data-driven. We would really like for everything above our service/platform layer to be functional, but there are some areas where it is a better engineering choice to go imperative (at least for now).

I have always thought of imperative vs functional as complementary. The imperative code is the glue between your fictional/business reality (the functional world) and the real world (CPU/OS/networks/etc). Functional code can serve as a perfect model of the business if the imperative code (platform) supporting it is well engineered.


Currying and partial application are a major reason to use F# over C#.
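For example (illustrative names only):

    // Every function is curried, so partial application is just a call
    // with fewer arguments.
    let log level msg = printfn "[%s] %s" level msg
    let warn : string -> unit = log "WARN"

    let add x y = x + y
    let addTen = add 10

    [1; 2; 3] |> List.map addTen |> List.iter (string >> warn)

In C# you'd typically emulate this with lambdas or Func wrappers, which is noisier.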


It is true that C# is gradually adopting features from F#, but F# will always be a much better F# than C# is. C# is great in its own right, but if you prefer F# why not use the real thing?


When looking for employees, using F# will surely help filter for the type of employees they would want.


the advantage of F# over OCaml is that your ecosystem is way larger... you can always 'snarf' c# libraries.


Indeed, and not only C#'s but all of .NET's. So that gives you quantity. But more important (at least to me) is that the quality of the libraries is much higher than in the JS ecosystem (which is what I'm more familiar with).


and you can use threads


burn


Sure, but the C# ecosystem is nothing compared to the JVM, Rails, Go, JS, etc.


Huh, I haven't had any library issues with C# for years. Although you're right, the JS and JVM ecosystems are larger; I don't think it is an issue.


There is definitely a quantity vs quality issue in the JS world. I don't think it's fair to say JS' ecosystem is better than C#'s because it's bigger. There's a hell of a lot of shit code on NPM.


Sure, but there's no equivalently mature OCaml-like language for the JVM (Scala doesn't count)


Asking as someone with no OCaml experience but who quite enjoys Scala, why does it not count?


They are radically different languages, even if they're from the same family. Scala is about as similar to OCaml as Rust is to OCaml. Rust and Scala are a world apart from OCaml and Standard ML in terms of language complexity.

And then F# literally has an OCaml compat mode because they are so similar (F# has a bunch of convenient syntax extensions if you don't mind not using the compat mode).


I don't know OCaml, so I can't give you a great answer, but I know that OCaml has a different type system that has some features that Scala 2 lacks, such as union types.


OCaml doesn't have union types. It has what it calls variants, and what is generally known as Algebraic Data Types. So does Scala 2.


Anyone here used dark? Any anecdotal experience report you could share?


It has great potential. I think the unique integrated web IDE, tracing, function-level versioning, and deployment model are game-changing. Anyone not going in this direction with their FaaS products should stop what they are doing and reevaluate.

That said, the motivation for the rewrite is the right call: Dark does not have enough of a standard library for third-party integrations (among other issues), or a way to easily contribute them, which makes it tough to build in the large.

I’m really rooting for Dark - even with the rough edges it feels like the future and how I want to eventually build software.


Yes. The initial PR was a bit misguided IMHO, and left a big "huh?".

The futureofcoding interviews were much better, what they are doing is really interesting, though I am unconvinced that all the stuff they're doing is actually required to achieve their goals.

For example, they claim that a structured editor is required, but if per-method granularity is good enough, then a normal method browser would work just as well. As far as I can tell.

https://futureofcoding.org/episodes/043


Did not use it, but looked at it quite a bit. Felt like Force.com and the Apex language for the modern age.


The OCaml-to-F# transition is more incremental than a transition to Rust would be, because so much of the existing code base can be reused.

.NET Core has gotten better and faster, and is nearing parity with the upcoming release.


I read good things about F#, including that it is a first-class citizen at Microsoft... but I worry that it could become second-class if some financial decision deems it not worth the effort for them to support.

From a functional perspective, I would think the two primary candidates would have been F# with its .NET ecosystem or Clojure with its Java ecosystem. Both are similar in more ways than other languages, especially considering the rich set of libraries available...

But to me the deciding factor would be ClojureScript for the front end. They stuck with ReasonML, if I read correctly, because they already had so much code (50k lines?). But at the same time, the previous post complained about the build tools. Plus it's still OCaml...


F# is 15 years old and there has been an "F# Foundation" for the past 5 of those.

If your concern about Microsoft abandoning F# is genuine, then perhaps that can reassure you.


I keep reading good things about F# but at the same time it seems to fly under the radar, with not enough learning material out there

I did love this one article http://tomasp.net/blog/2018/write-your-own-excel/

EDIT: Submitted the link here https://news.ycombinator.com/item?id=24980325


I agree about the difficulty of finding abundant, good examples for F#. I'll throw out some plugs for ones that I liked.

* https://pragprog.com/titles/swdddf/domain-modeling-made-func... Uses F# as implementation language, though I cannot recommend this book enough for general type-safe design modeling regardless of the language.

* https://fsharpforfunandprofit.com/ Website of the author of the book mentioned above. He also has some good Youtube videos on the topic.

* https://www.demystifyfp.com/FsApplied/ A book that will walk you through developing a webserver, and also introduces you to Paket for dependency management.

I don't develop F# (or .NET) professionally, but I had fun when I played around with it. I found the interop with C# very seamless as well.


I find Microsoft's documentation pages on F# to be a great reference as well, particularly if you're already well versed with .NET

https://docs.microsoft.com/en-us/dotnet/fsharp/


Thank you!!


The author, Paul Biggar, was the founder of CircleCI which does use Clojure(Script). I don't know if he has written on why he chose not to use the same stack for Darklang.


He didn't list Clojure(Script) among the programming languages in his CV (neither under significant experience nor others): https://paulbiggar.com/paulbiggar.pdf


The CV appears old. His most recent position is listed as July 2010 to present, but looking at his LinkedIn we see he finished at Mozilla in 2011. https://www.linkedin.com/in/paulbiggar/

You can read about CircleCI's continued use of Clojure in 2017: https://circleci.com/blog/tips-for-optimizing-docker-builds/ and again in 2019: https://circleci.com/blog/update-how-circleci-processes-over...


I wrote that in LaTeX a decade ago, and don't know how to update it anymore!


I've worried about Microsoft's commitment to F# in the past, but as of late they seem clearly committed to signal boosting the language as a way to grab mindshare from people who are averse to C# for whatever reason. I recall from their F# Jupyter notebooks demo they specifically implied they're trying to steal Python's lunch in the data science world

At this point though I believe the F# community is active and resourceful enough to carry themselves even if Microsoft directed resources away from the language. One of the things I noticed early on about the F# community is they tend to be mindful about their investments, and like to mold existing, established work into something more ergonomic for the F# community. The Giraffe, Fable, and Bolero projects are all wonderful examples of this mentality.


ClojureScript creates a lot of runtime overhead; I wouldn't recommend it for UIs that need to do more than very lightweight processing on the client and still be really snappy. A 50k-LoC codebase (as in Dark's case) in ClojureScript would be a nightmare. Scaling a Clojure(Script) codebase to that size requires lots of documentation and discipline (and lots of tests!); it can be done, but it's not easy.


> Initially, I expected to go to Rust. Rust has excellent tooling, great libraries, a delightful community, etc. But after spending about a month on it, I can't say I like writing Rust. Especially, I don't like writing async code in Rust. I like a nice high level language, and that's kinda what you need when you have a project as big as Dark to build. And Rust is not that. I'll publish "Why Dark didn't choose Rust" next.

I'm interested in the author's next post elaborating a bit more on why Rust wasn't an option here. As I understand it, the author discarded Rust because he wants a "nice high level language" (async aside), which is a relative judgment IMO.


I got a chance to start using F# for the first time for some cryptography course assignments and supremely enjoyed it. Having a language that's easy to script with, has access to .Net libraries for everything and is easy to run on any major OS is kinda great.


F# is a good choice for cloud orchestration. MBrace was doing something similar to this at least about 5 years ago. I've written an ML pipeline entirely in F#, deployment included.

My 2p re: Dark: rather than creating a completely new language, Dark could just provide an API or a framework and use F# as the deployment scripting language.

F# is succinct and easy for the customers (DevOps) to learn and work with. Beats YAML any day.

Writing a new language that is production-ready is no small task; it takes years to get the syntax, stdlib, and tooling right. By that time cloud computing may have evolved into something else entirely.


I'm really enjoying the trend in the F# community these days of using computation expressions for configuration. It means you get great IDE support and type checking for your configs. Some examples (plus a tiny home-grown sketch after the list):

1. https://compositionalit.github.io/farmer/

2. https://twitter.com/Cody_S_Johnson/status/132322777503415500...

3. https://github.com/UnoSD/Pulumi.FSharp.Extensions

4. https://github.com/SaturnFramework/Saturn/blob/73855f08d9c50...
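To show the shape of the idea, here's a toy builder (not Farmer or Saturn, just a made-up sketch):

    type ServerConfig = { Host: string; Port: int; UseTls: bool }

    type ServerBuilder() =
        member _.Yield(_) = { Host = "localhost"; Port = 8080; UseTls = false }
        [<CustomOperation("host")>]
        member _.SetHost(cfg, value) = { cfg with Host = value }
        [<CustomOperation("port")>]
        member _.SetPort(cfg, value) = { cfg with Port = value }
        [<CustomOperation("tls")>]
        member _.SetTls(cfg, value) = { cfg with UseTls = value }

    let server = ServerBuilder()

    // Misspelled keys or wrong types are compile errors, not runtime surprises.
    let cfg =
        server {
            host "app.internal"
            port 9000
            tls true
        }

The libraries above do far more than this, but the basic trick is the same: custom operations over an immutable config record.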


How does the dotnet virtual machine compare to the Java virtual machine (memory use, performance, startup time)?


ASP.NET Core (Linux/Kestrel/C#) has the fastest plain-text performance in the TechEmpower benchmarks (7.3M rps):

https://www.techempower.com/benchmarks/#section=data-r19&hw=...

It's also one of the more full-featured web frameworks benchmarked.


I haven't used the JVM in a long time, but .NET Core starts up pretty damn quick from cold IL. I believe the tiered compilation approach has something to do with the fast startup + insane web benchmarks.


The benchmark performance isn't down to the JIT alone, but to .NET being more "pragmatic" with memory slices (usable with stack allocs, mmap), value types, and in/out/ref passing. These little things together make .NET code run with far less GC pressure and more cache locality, like C++ code, without making the code non-idiomatic. (Yes, a JVM system can be written like QuestDB to achieve similar perf, but it won't be idiomatic Java code.)

Then the Kestrel server and the new ASP.NET Core runtime take advantage of this (for example, the new JSON parsing APIs are built to consume byte memory directly without sacrificing too much extensibility).
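A small F# illustration of the same ideas (just a sketch, not code from the runtime): a struct avoids per-object heap allocations, and Span<'T> gives a no-copy view over part of a buffer.

    open System

    // A struct record: stored inline in arrays, no per-element GC allocation.
    [<Struct>]
    type Sample = { Time: int64; Value: float }

    // Span<'T> slices the array without copying it.
    let sumSecondHalf (data: Sample[]) =
        let half = data.Length / 2
        let tail = Span<Sample>(data, half, data.Length - half)
        let mutable total = 0.0
        for i = 0 to tail.Length - 1 do
            let s = tail.[i]
            total <- total + s.Value
        total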


.NET has a Global Assembly Cache[0] which I think (please correct if I'm wrong) is used to pre-compile the system frameworks and so makes startup faster.

[0]: https://docs.microsoft.com/en-us/dotnet/framework/app-domain...


Only the legacy Windows .NET Framework has a Global Assembly Cache; it doesn't exist in .NET Core.

Although the cross-platform .NET Core runtime does get pre-installed with a number of default Microsoft packages in its local NuGet package cache.


Why doesn't the JVM cache its JIT compilation results?


Did Microsoft Research publish papers on how the dotnet virtual machine pulls this off?


Comparable. Most likely a toss up based on your exact program characteristics and running environment.


Depends.

Java has several implementations between open source and commercial vendors.

Likewise, .NET has several.

So with the right choice of benchmarks and runtimes you can tick the boxes either way, depending on what one wants to prove.

Hence why I rather work with both.


My experience has been that there's a lot less tweaking on smaller (<4 GB) .NET VMs than equivalent Java VMs, but I don't have experience on anything larger.


What about using F# just for the compiler and C# for the rest of the backend? That way, the friction of using libraries goes away; for example, using ASP.NET Core or other third-party libs will be smoother. Though juggling two languages may seem to carry higher cognitive overhead, I think you are already doing this by having to call C# APIs in lots of places.


There is no friction in using C# libraries from F#; it's just that you will be using the imperative/OOP features of F# rather than the functional programming ones.


There's overhead in reading documentation, e.g. all of the ASP.NET Core docs and examples are in C#.


Does your proposed solution remove the need for reading documentation?


No, but the overhead is in constantly working out how best to map the C# in the docs onto F#, versus not having that mental juggling at all in C#.


There's barely any overhead to consuming C# libraries from F#. You type "open SomeLib" instead of "using SomeLib;" and "let x = SomeClass (foo, bar)" instead of "var x = new SomeClass(foo, bar)". You check return values with "match Option.ofObj value" instead of "if (value != null)".
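For instance, calling ordinary BCL/C#-style APIs from F# looks like this (standard .NET types, nothing F#-specific required):

    open System
    open System.Net.Http

    // Plain C#-style classes: construct with new, call instance methods.
    let client = new HttpClient()

    let body =
        client.GetStringAsync("https://example.com")
        |> Async.AwaitTask
        |> Async.RunSynchronously

    // Null-returning APIs map onto Option instead of null checks.
    match Option.ofObj (Environment.GetEnvironmentVariable "HOME") with
    | Some home -> printfn "HOME = %s" home
    | None -> printfn "HOME not set"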


There's some level of friction in working with databases. Not insurmountable but painful.


I was recently making a similar language decision about a project that I've foolishly started. Since I wanted my code to be easily portable and also (for my own use) wanted it to be able to run on iOS and MacOS, I ended up selecting C++ as the language [1]. I've got about 20 years of C++ evolution to catch up on and there are a lot of things that I haven't had to think about in those twenty years (I spent a couple days making sure I grokked universal references and why I would care), but as I've explored what libraries are available for my use case, I think it's the right choice.

[1] https://www.finl.xyz/2020/10/21/choosing-a-programming-langu...


If I may recommend, try to find and read a copy of John Reppy's Concurrent Programming in ML. The book is a little bit of an eye opener if only for that.

Hopac is a library that offers an excellent delivery of this model of concurrent programming on .NET and has top notch F# primitives.


Guys, does it make sense to switch from C# to F#? I mean, C# adds more and more F# features each year, and I think it will probably be on a par with F# in 3-4 years. By the way, native immutable records will be added to C# next week in C# 9.


IMO it depends, especially if you're coming from C# - sometimes yes, sometimes no for existing code bases.

On the features you mention: while some of them are being added to C# (and not all of them are planned), from what I've seen they often aren't as powerful/useful as the F# versions, aren't as consistent/concise due to legacy syntax, or don't interact as well with other features. F# is also taking C# features, so it's not like you'll be left behind either (e.g. Span). More importantly for me, the underlying defaults are a feature that can't be ported, and IMO F# wins there.

C# is a fine OO language - I've worked with both extensively - it's just that F# IMO is slightly better, with less ceremony than typical C# code. Is it worth switching? It depends on your problem and where you're starting from. In this case they are switching from OCaml, so F# probably makes more sense. If you're starting from a clean slate, it honestly depends on what you find easier to learn - IMO many JS/Python/Go/etc. programmers trying .NET Core might find F# easier than moving to C#/Java, given their personal preferences - users often call it a "typed Python". For C# users, probably less so; but learning it might change your C# code style for the better.


I'm rooting for Dark, it's such a breath of fresh air, really hope this works out!

I hope HN can forgive Paul for choosing F# over Rust :)

Does anyone have experience porting code from OCaml or other MLs to F#, how does it work out?


I'll preface this by saying I'm not even sure what darklang is - I didn't find a concise description of what it is and what it's for, only some videos, which I really don't have time to watch right now... But doesn't moving to F# pretty much mean that the users of darklang will be forced into the Windows ecosystem?


F# has been open source and cross-platform since .NET Core.


If a technology nobody will ever use changes backends, does it make it to #1 on hackernews?

Turns out this is an easy one, yes.


No one was more surprised than me!


I’m trying to make the same decision right now. I’m the first product engineer at a AI startup and started using node.js but... meh (i use crystal for stuff and have exp in ocaml, elixir, ruby)

I’m leaning towards f# or Scala. I love static typing.

Since it’s a data science, machine learning company Scala makes more sense.


Whoa, yeah, those initial choices for an AI startup seem so wrong. Scala sounds most reasonable; I'd say even consider Java, to be honest.


I honestly don't see a huge value proposition in Scala, now that Java has reduced pain points around FP. Scala has higher-kinded types, but you see people tie themselves in knots with those as often as not. It is just such a huge language.

Scala 3/Dotty sounds like it will be a big improvement.


Java doesn't have one of the fundamentals of typed FP--algebraic data types with exhaustive pattern matching. And just in general--Scala, despite its warts, can be very succinct and packs a lot of power in simple-looking code.
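For reference, this is the feature in question, written in F# since that's the thread's language (Scala's sealed traits plus case classes play the same role):

    // An algebraic data type: the compiler knows every possible case.
    type PaymentMethod =
        | Cash
        | Card of last4: string
        | Invoice of dueDays: int

    // Matching is checked for exhaustiveness: drop a case and you get a warning.
    let describe payment =
        match payment with
        | Cash -> "paid in cash"
        | Card last4 -> sprintf "card ending in %s" last4
        | Invoice days -> sprintf "invoice due in %d days" days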


No, sorry. None of the languages I listed are being used at the startup (except node). These are just what I’ve used previously.

Everything is in python but I’m building out the main user-facing app to interface with all the models.


Before choosing Scala I think you should watch this talk: https://www.youtube.com/watch?v=uiJycy6dFSQ (note: he's not a troll, he worked 5 years with Scala)


Legit thought this company was dead. Glad to see they’re still trying.


Thanks! I think?


My bad. I think this tweet from a few months back confused me: https://twitter.com/paulbiggar/status/1275433005822214146?s=...

How many engineers/employees are you guys now after the restructure?


Ah yeah, that confused a lot of people. It's just me at the moment. Planning to scale up a little bit next year once I have a solid plan of how to get to product-market fit.


I appreciate the vision - most apps I see these days are glorified CRUD sprinkled with modest business logic and still end up being thousands of lines of code, testing, and configuration apiece.


Such posts are really annoying and largely meaningless. Even worse, this suggests that their own language is not stable enough to invest in. I see no business value proposition either. It seems more like a bunch of folks geeking out on stuff. I've also heard from sources that the founder is quite an arrogant guy and not easy to work with.

It's sad that investors place bets on such wonky projects without understanding how they translate into a use case.


Curious how this compares to js_of_ocaml or ReScript, which give OCaml the JS ecosystem?


The thing that killed any interest I ever had in Dark was the license[1]. It’s beyond vendor lock-in, it’s suicide at an undetermined date.

tldr: Help build it but don’t use it.

-[1] https://github.com/darklang/dark/blob/main/LICENSE.md


Dark is a proprietary product with source available for the runtime. https://en.wikipedia.org/wiki/Source-available_software

AFAIK their eventual target market is the “οἱ πολλοί”.

No need to be so negative, you are just not their target market.


It's great I didn't invest a moment with this vaporware.


Nice


Dark seems nice, but since it's going .NET it should codegen to a .NET Core 3.1 application.


LOL


Thus reducing the number of people who care about it from 24 to 7.


HN needs a like button!


Why would anyone pick this over, say, Python?

Call me pessimistic, but I can't imagine the appeal of a closed-source proprietary language when so many good open source ones are free.

It just seems like really nasty vendor lock-in.


We're trying to do a new thing. If people don't like it, that's fine. So far, people love the concept (no infrastructure, trace-driven development), but it hasn't been complete enough, so that's what I'm working on.


Any way I can just use Python, which I already know?

The base concept is cool; I was more caught off guard by having to learn a new language here.


Assuming competitive pricing it seems like a high brow alternative to a small LAMP app. I used to contract for a telco in my country and we had a shared server with a bunch of virtual hosts for various purposes. Data capture forms, family and friends schemes, corporate social responsibility stuff etc.

The core systems stuff was all Java enterprise, inter-operating with off-the-shelf Oracle and TIBCO products. Building those small one-off apps into the main infrastructure would have required involving the bigger suppliers, who were more expensive and would be stonewalling you with change requests before long.

If you weighed the risk and determined that you were happy with these non-revenue-path applications being subject to that risk, I think it might be ok. That said, using something like Heroku is close enough, and at least your application code is mostly portable.


> I can't imagine the appeal of a closed source proprietary language

I really feel like you're focusing on the wrong thing.

To me, the pitch is that Dark is FaaS-native (meaning you don't have to bend over backwards or use complex tooling to get a good write/debug/deploy experience).

The language itself being new and proprietary is actually a drawback. They can make it very friendly and productive, of course, but the other features are what should really sell someone on using Dark.


Maybe because F# and OCaml are vastly superior languages compared to Python. Mind, I said language (not ecosystem).


You're comparing apples and oranges.

I mentioned Python because Dark appears to be heavily influenced by it. Python is meant to be dirty and fast to code in.

To be honest, I've never used a functional programming language. What Unity does is essentially allow you to call various APIs, with C#, in the game engine, which is C++-based.

But it's real C#, and I can hop over to my day job and utilize it. Dark isn't a portable skill set.



