If you look at the redplanetlabs GitHub repos, there's a ton of low-level manipulation of 'JVM bytecode assembly language', e.g. https://github.com/redplanetlabs/defexception/blob/master/sr..., in projects that aren't even compilers; one would assume their "compiler" does this even more.
The founder has released successful large projects in the past, e.g. Apache Storm used by Alibaba, Yahoo, Twitter, etc. Probably just need to wait a bit longer.
You can write 'type signatures' easily in Clojure code; there are dozens of options. They're just only checked at runtime (and usually relegated to developer builds). If you combine that with good code coverage, it stops being a weakness and starts being a strength, because you can declare data types with far more specificity and precision than a static type system allows.
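For instance, with clojure.spec (just one of those options; the ::user spec and `promote` function here are made up for illustration), a runtime-checked "signature" can encode constraints no parametric type can:

```clojure
(require '[clojure.spec.alpha :as s])

;; Specs can be arbitrarily precise predicates, not just shapes:
(s/def ::age   (s/and int? #(<= 0 % 150)))
(s/def ::email (s/and string? #(re-matches #".+@.+\..+" %)))
(s/def ::user  (s/keys :req [::age ::email]))

(s/valid? ::user {::age 34 ::email "a@b.com"})  ;=> true
(s/valid? ::user {::age -1 ::email "nope"})     ;=> false

;; fdef attaches a signature to a function; with instrumentation
;; enabled (typically only in dev builds), calls are checked at runtime.
(s/fdef promote
  :args (s/cat :user ::user)
  :ret  ::user)
```

The checks cost something at runtime, which is why they're usually switched on only during development and testing.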
As Rich likes to say, the type signature of your stereotypical statically typed language, e.g. reverse :: [a] -> [a], doesn't carry much information.
I write clojure, and still 'think in types' & category theory.
> the type signature of your stereotypical statically typed language, e.g. reverse :: [a] -> [a], doesn't carry much information.
It's a little unfair to imply that any alternative to bleeding-edge Clojure is going to have a simple 70's Damas-Milner type system. Today, static types can describe a lot more. Refinement types (e.g. Liquid Haskell) and/or dependent types (e.g. Idris or Agda) can describe and enforce the desired properties of reverse statically.
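To illustrate the dependent-types point (a sketch in Lean 4, not taken from any of the libraries above): with a length-indexed vector type, the signature of reverse alone guarantees it preserves length, which plain [a] -> [a] cannot express.

```lean
-- A length-indexed list: the length n is part of the type.
inductive Vec (α : Type) : Nat → Type
  | nil  : Vec α 0
  | cons {n : Nat} : α → Vec α n → Vec α (n + 1)

-- Any function with this signature is statically guaranteed
-- to return a vector of the same length it was given:
--   rev : Vec α n → Vec α n
-- (Implementation omitted here; it is routine but needs a few
-- lemmas about Nat addition to satisfy the type checker.)
```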
I'm not downplaying anything, it was the parent who was dissatisfied with static checking of reverse. Personally, I think any static checking is better than none at all. Statically typed languages are perfectly capable of checking additional contracts/properties at runtime too (as Clojure does).
Firstly, I was being a bit tongue-in-cheek, but secondly you seem to be saying that a "70's Damas-Milner type system" isn't already an extremely powerful type system. I would say that it is!
I was just trying to combat the common narrative that static types are "what Java has". There is so much potential for static typing that has not been explored by industry (e.g. structural typing).
(That common narrative is also quite negative; Java's type system is surprisingly strong and can express many more things than people give it credit for. See jOOQ, or several FP Java frameworks.)
jOOQ is not a good example, as it is essentially dynamically typed. Queries over the relational algebra can be statically typed, with structural typing and a few type-level features like sets; there are lots of papers on this.
It is quite statically typed; look at the examples. You might be thinking of the stringly-typed version, but it can generate classes from DB inspection and use columns with their respective types throughout a query, failing to compile when a GROUP BY value is incorrect! It can't verify everything that a SQL engine can, of course, but it is quite remarkable in my opinion.
You can find tons of used 10-40Gbps NICs on ebay for $50 or less. For a simple point-to-point connection with both sides having PCI-E slots that's going to be easier & cheaper to set up than thunderbolt will be. It'll also be more reliable & more practical since getting 40Gbps over Thunderbolt requires extremely short cables, whereas you can get DAC cables for 40Gbps QSFP or Infiniband up to 7M or so. And there's no confusion around what the cable can actually do, unlike the nightmare that is finding the right USB-C cable with the right signal integrity requirements at the right distance for what you actually need/want.
This has not been my experience -- I use both thunderbolt 3 and Mellanox ConnectX-3 40gbps cards, and have been looking for an excuse to add a Mini M1 to the homelab.
The cables and cards for PC Thunderbolt retrofit are far (like 3-5X!) more expensive than the MLNX cards and optical cables.
> The cables and cards for PC Thunderbolt retrofit are far (like 3-5X!) more expensive than the MLNX cards and optical cables.
I added new Thunderbolt 3 PCIe cards (Gigabyte Titan Ridge) to two of my 2012 Mac Pros for around $100 each a bit over a year ago, and prices have dropped since, so I don't see how that is 3-5x more expensive than MLNX cards. Expense was a far smaller factor than the finesse it took to get macOS to support them, but it was the only path that worked for my needs (a different need than the OP article), so I endured the pain of getting it working.
I like Salvia the most (not for the faint-hearted), partly because of how unusual/odd an experience it is, and because it doesn't imprint a bunch of spiritual beliefs on me afterwards. I feel like I learn more about the mechanism of perception from it, albeit in an uncomfortable way.
Isn't Tesla's competitive advantage that they don't buy their batteries from the existing supply chains? Even the world's total supply of lithium-ion batteries wouldn't be enough to provide for the Model 3 program.
That and they have some of the best battery scientists working for them. John Goodenough was a co-inventor of the lithium-ion battery and works for/with Tesla on refining their chemistry and battery tech. JB Straubel is also widely considered one of the world's foremost battery experts, and happens to have been at Tesla longer than Elon Musk. Then there is Jeff Dahn and his university's partnership with Tesla on longevity, etc. Tesla is a battery company that happens to sell cars (and giant battery packs for electric grid storage).
I give Musk a lot of credit for giving JB Straubel the latitude to just Get Shit Done. It’s clear he’s adopted the “hire smart people and do what they tell you” mantra, and it’s paying off.
Entirely agreed! I was just pointing out that Tesla's "business moat" is in their people, who literally lead or created the industry which they're dominant in (battery tech for their electric vehicles).
Tesla are transitioning into building their batteries in partnership with Panasonic in the biggest factory in the world. Do you mean the raw materials rather than the batteries?
Om.next shows its strength when you're working on a complex application, 5k+ lines of code. A core abstraction of om.next (and GraphQL) is to express the data dependencies of your React classes in a query language (e.g. component X depends on title, author, created, updated). Then all the logic around how to fetch data, refresh data, cache data or access cached data, update data, deduplicate data when it's needed in many places, optimistically update client-side data, and wait for server-side data to synchronize is disentangled from the logic for rendering complete data, missing data, data being loaded, reacting to user events, etc. In a smaller code base expressing these data dependencies would just seem tedious (writing GraphQL or query expressions), but it pays off big time in larger applications.
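A rough sketch of what declaring such a data dependency looks like in om.next (ClojureScript; the :post/* keys and the Post component are made up for illustration):

```clojure
(ns example.ui
  (:require [om.next :as om :refer-macros [defui]]
            [om.dom :as dom]))

;; The component declares *what* data it needs via IQuery; om.next's
;; parser/reconciler decides how to fetch, cache, and refresh it.
(defui Post
  static om/IQuery
  (query [this]
    [:post/title :post/author :post/created :post/updated])
  Object
  (render [this]
    (let [{:keys [post/title post/author]} (om/props this)]
      (dom/div nil
        (dom/h2 nil title)
        (dom/p nil author)))))
```

The render logic never says where the data comes from; that separation is what makes the up-front query ceremony worth it in a large app.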
I absolutely agree that larger apps are complicated, and that Om has thoughtful solutions to those problems, at the right level of abstraction. But I found it an absolute _pain_ to get going, and at every step it presented me with so many different primitives I felt lost.
If I had to compare it to any other tool, it'd be git. I love git. I use git every day. Do I understand the underlying mental model of git, or the 348 different git commands? Hell no.
Yes, I think those are all valid points/weaknesses. While I don't personally struggle with understanding the internals of git or om.next, the learning curve is no joke. The community is aware of this, but it's still alpha software; they're working on it.
I actually think of om.next more as a library than a web framework. As a library it has many degrees of freedom in how it can be composed, and I imagine the author prioritized flexibility/generality over concreteness, so a lot of work is still left up to the user in filling in the blanks of a full-fledged web app. Untangled (https://untangled-web.github.io/untangled/) is one project that tries to fill in the missing gaps and expose a framework API; it has much better documentation and is more concrete than om.next.
I'd have to agree. Reagent was able to expose the beauty of the concept to me more clearly than Om. When I approached Om, I found it to be on a completely different level with regards to things to understand and absorb.
Having said that, I'm also pretty sure that Om yields bigger payoffs further down the road.