
I think the problem is we keep using the same crappy tools because people are scared to be early adopters.

Meanwhile, there are tools sitting on the shelf that solve these problems in a principled way and make things simpler:

- ML languages

- Nix

- Bazel

- Erlang

- Rust

Some tools that are great and have large adoption:

- Git

- Terraform




> people are scared to be early adopters.

ML was first developed in 1973, OCaml in 1996, and SML in 1997. Great tools that haven't become popular in 20-40 years probably have something beyond fear of early adoption inhibiting them.


I'd say herd behavior and network effects are the main issues.

Do a quick search on YouTube for "what programming language to learn". You'll find video after video using popularity as a primary factor, or even the primary factor, in the decision.

Non-technical management tends to do the same, based on the belief that languages are all pretty much equivalent, and on a desire to hire easily swappable "resources".


Yaron Minsky joined Jane Street in 2003. The success of Jane Street and its use of OCaml there is well known in finance circles. Finance is full of people who don't exhibit herd behavior - many firms have a culture of studying fundamental causes and taking contrarian positions. That raises the question: why did OCaml not get widely adopted in finance?

Jet was purchased by Walmart in 2016. Marc Lore from Jet was put in charge of Walmart's US e-commerce business to turn it around (to compete effectively with Amazon). Jet's tech was mostly developed in F#. Yet Lore did not push for its wide adoption within Walmart.

IMO explaining away the failure of ML languages to gain market share over multiple decades as “herd behavior and network effects” is lazy.


> why did OCaml not get widely adopted in finance?

I think it has been pretty successful. The bigger success story in finance is F# though, which takes a bunch of ideas from OCaml.


I think it's tooling. But tooling follows from adoption in most cases.


Agreed. The stuff "sitting on the shelf", as your parent comment said, has problems too (e.g. tooling). It might solve some problems, but it's far from the silver bullet we're looking for.


For your typical scaling org, I think the data layer is often the main issue. Moving from a single Postgres/MySQL primary to anything else is the biggest hurdle.

Some companies are "lucky": they have a natural sharding key for their primary business line, sell an easily cacheable product, or can just scale reads on replicas. Others aren't, and that's where things get complicated.
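
To make "natural sharding key" concrete: it's a value that cleanly routes every query to exactly one shard. A minimal, hypothetical sketch in OCaml (the shard count, URI scheme, and hostnames are all made up for illustration):

    (* Route each customer to one of N database shards by hashing a
       natural sharding key - here, the customer id. *)
    let shard_count = 4

    let shard_for customer_id =
      Hashtbl.hash customer_id mod shard_count

    (* Hypothetical per-shard connection string. *)
    let connection_uri customer_id =
      Printf.sprintf "postgres://shard-%d.db.internal/app"
        (shard_for customer_id)

Every query scoped to a single customer then hits exactly one primary; the pain starts with queries that span customers.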


Tbh, that's why we've completely ignored relational databases for our new projects. They're a pain in the ass to manage and they scale poorly.

DynamoDB, on the other hand, trivially scales to thousands of TPS and beyond, and doesn't come with footguns. If it works now, it'll keep working forever.


This is funny to me, since modern relational databases can handle thousands of TPS and more on a single node. My dev machine reports ~16k TPS on a table with 100M rows and 100 clients:

> pgbench -c 100 -s 100 -T 20 -n -U postgres
> number of transactions actually processed: 321524
> latency average = 6.253 ms
> tps = 15991.775957


Yep. And 98% of software written today will never need to scale beyond 10k TPS at the database layer. Most software is small. And for the software that does need to go faster than that, most of the time you can get away with read replicas. Or there are obvious sharding keys.

Even when that's not the case, it usually ends up being a minority of the tables and collections that need the additional speed.

If you don't believe me, look through the HN hiring thread sometime and notice how few product names you recognise.

Most products will never need to scale like GMail or Facebook.


How do you handle cases where strict synchronization is required?

A banking app is the classic example.


I think it will take a long while for Nix adoption to grow, because Nix is hard to learn. The language is new, the concept is new, the mental model is new. And you need to master all of those to use the tool effectively.

Same goes for the other items in your list. Git had enough time and force behind it, and I believe the other tools will succeed as well. But it will take time.


I would add the E language [1] to that list.

[1] http://erights.org/elang/index.html


> ML languages

What are those?


I suspect he means OCaml, SML and BuckleScript.


Agreed. ML [1] is short for "meta language". Because its syntax is not very C-like, it can feel exotic and alien to most of today's unix graybeards, 90s enterprise folks, aughts startup folks, and even the JS crowd.

See also its "newer incarnations": Scala, F#, Nemerle... (don't slay me) Rust(ish).
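
For a taste of the syntax, here's a tiny OCaml sketch (a textbook-style example, not from any particular codebase): a variant type plus pattern matching, with every type inferred by the compiler.

    (* A sum type and a function over it - no type annotations
       anywhere; the compiler infers and checks everything,
       including that every case is handled. *)
    type shape =
      | Circle of float          (* radius *)
      | Rect of float * float    (* width, height *)

    let area = function
      | Circle r -> Float.pi *. r *. r
      | Rect (w, h) -> w *. h

    let total shapes =
      List.fold_left (fun acc s -> acc +. area s) 0.0 shapes

No braces, and the pattern match does what a C-family language would do with a tagged union plus a switch - which is exactly why it reads as alien at first.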

[1] https://en.m.wikipedia.org/wiki/ML_(programming_language)


Oh, I see. Thanks!



