> I hope in the process of doing it we will find new ways of doing things.
HTML & CSS themselves have become a major bottleneck to quality and creativity. The arcane layout model, the baggage of backwards compatibility, the cognitive dissonance — played out over decades of design-by-committee — between "this is a document engine" and "this is an app engine."
The rendering-thread-is-the-main-thread architecture of JS, plus the JS garbage collector, causes jitters & frame drops that most people don't consciously recognize but everyone subconsciously registers. These little bits of jank are how consumers can tell webview-wrapped apps from native apps.
Don't get me wrong — HTML and CSS and JS have brought us far, and the zero-trust execution environment that is the browser is an amazing feat of humanity.
I hope the "new ways of doing things" you describe include a major innovation on HTML and CSS and JS. WebAssembly makes this possible today — and I have dedicated the last few years of my life to proving this concept[0], and I hope others explore similar avenues. We deserve a better substrate, and this can be done without reinventing the browser.
I disagree. There has never been a better markup language. HTML allows precise control of atomic elements that make up a component. It has been the best thing for accessibility we ever came up with.
I can imagine no better way for readability and accessibility than a text-first representation of content, which is what HTML is. I will even argue that having the modern web emerge from a text-content first approach is the best thing that could have happened to us.
All other content-representation infrastructures that weren't text-first, like Java applets or Flash, died because they were no match for HTML.
Agreed! Markup-as-content-backbone is one of many things HTML/CSS do extremely well, and I agree not building around something like this was a major weakness of Flash.
Silverlight sought to do this with XAML, but it had other problems (required a plugin right as plugins were on their way out; CLR was a huge dependency; too locked down and proprietary under Ballmer leadership)
Can you imagine a text-first content representation, which is visually editable with the UX of a vector design tool? This is the rabbit hole that drives our work on Pax.
Care to point us to a better combination than HTML/CSS when it comes to laying out flexible interfaces?
Don't get me wrong, I also could imagine better markup languages for that purpose. But everything I have seen in the wild was worse in multiple, show-stopping ways.
Maybe instead of coming up with a new thing, we just need a clear way to solve the pain points and put the solutions into CSS4 and HTML6
Both Figma autolayout and HStack / VStack are careful subsets of flexbox; not quite the same thing. Consider whether a marble statue is "just a subset" of the block that contained it.
None of these is a perfect technology, but I believe the most practical answer to the question driving this thread[0] is to study/understand the landscape, then build towards a better future.[1]
[0] > Care to point us to a better combination than HTML/CSS when it comes to laying out flexible interfaces?
> Alternate layout engines for the web might be a fun experiment, PhD thesis, or talent retention program, but it's not practical.
Flexbox was once an "alternate layout engine for the web," as was Flash player, as is Figma. Framer, Retool, and Squarespace all offer alternate layout engines tailored for visual building. All of these seem practical to me.
1. not Figma or Flash: it's not practical or performant to manipulate HTML/CSS to achieve the creative freedom of a vector design tool[0]
2. the rest of these that build on HTML: not a single one of them exposes that code for manual editing, so they're not developer tools and their "alternate layouts" are proprietary + locked away.
The root issue: HTML was not designed to be a substrate for design.
[0] I don't claim this casually; I spent several years seeking to do exactly this with github.com/famous/famous and https://www.haikuanimator.com/
`Waiting for fonts.googleapis...` — the Pax website was blank for a couple of minutes. You might want to make third-party fonts non-blocking so the site doesn't appear broken/empty when a third-party CDN edge is slow.
It's one of the core things that has enabled the web to be as useful as it is. One of the things that draws people in, and keeps them using it.
Yes, there are problems with what we have. But if you break compatibility, you'll either not be adopted, or part of the crowd that the audience yells at for taking away their favourite things. You'll kill efforts and bury knowledge bases.
I don't think the world should abandon HTML, nor break backwards compatibility across the HTML spec. The first cars drove on roads designed for horses, and horses are still around. At no time did we gather a committee and decree that horses were deprecated.
Taken to an extreme, "don't break backwards compatibility" has an insidious failure mode, which is "don't innovate." The car could not have come about without a willingness to break backwards compatibility with horse drawn carriages, plows, mills, hitching posts, etc.
The adoption of a radically new technology like this is voluntary, collaborative, and progressive. Provided it offers enough value to exceed the switching costs, there's no need to kill efforts or bury knowledge bases.
Doesn't the rendered page run in a different thread from its JS? If the website uses some SPA framework to lazily render components, load the new view, or execute some logic, then maybe — but when I run `(() => { const s = Date.now(); while (Date.now() - s < 10000); })();` and try to scroll the page, it doesn't freeze.
Interesting project. Have you considered any non-XML markup languages for the user-interface declaration, so that metadata doesn't take up as much space as the data, as in
<Text>Hi {firstName} </Text>
which also requires double the number of brackets?
There are cleaner modern markup alternatives like KDL.
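For illustration, here is the same hypothetical component in an XML-like syntax versus a KDL-like one (both are rough sketches — neither is real Pax or KDL grammar, and the element names are made up):

```
<!-- XML-like: structure carried by paired tags -->
<Stack direction="vertical">
    <Text>Hi {firstName}</Text>
    <Button label="Continue" />
</Stack>

// KDL-like: structure carried by braces, fewer brackets overall
Stack direction="vertical" {
    Text "Hi {firstName}"
    Button label="Continue"
}
```

The information content is identical; the KDL-like form trades explicit closing tags for brace nesting.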
We chose an XML-like syntax because it's clear & explicit & established (HTML) — you can know where you are in a hierarchy clearly at any point thanks to closing tags. The major downside is verbosity, both for reading and for typing. Pax's closing tags compile away so they aren't transmitted across the network like they are for HTML.
We predict 95%+ of Pax will be written by machines, especially visual design tools and LLMs, so verbosity becomes less important (LLMs may even benefit from explicit closing tags.) We're innovating on multiple axes, including language, so we chose to make the syntax itself as boring and uninnovative as possible.
But clarity is exactly what's lacking: because it's verbose, it obscures content, with no benefit over simple matching-bracket highlighting.
Also, will it really be read by machines 95% of the time during design? That's not what the demo shows with the side-by-side XML and output, which I expect is a much more common workflow than 5%.
The innovating part is what puzzled me and prompted this question - why cling to the old garbage when you're doing new design?
Point taken! And yes, especially for developers (our core target demographic,) you are right that hand-writing should account for >5% of code volume.
Syntax is fairly contentious. "One man's trash is another man's treasure," on your note of "old garbage." It's hard to please everyone, however:
We could offer syntax "skins," since the data storage mechanism is a layer separated from the AST (thus different ASTs/syntaxes could de/serialize from the same persisted data.) So folks who want closing tags can have them; folks who want a YAML-like format could have it; folks who want a KDL-like language could have it.
At the language level, Pax's distinguishing characteristic is that it's the union of an expression language and a markup language; this is the reason we couldn't use an off the shelf markup language / parser. But again, KDL or YAML could be extended with PAXEL to make pax-kdl and pax-yaml flavors alongside pax-xml.
We had to start somewhere, and starting as close as possible to the markup lingua franca (HTML) made sense to us. It'd be a dream for Pax to be loved / adopted enough that we or anyone else cares to make a syntax skin.
I won't claim Pax is better than any of these — rather, that we seek to solve different goals.
Unlike Flash: driven by a markup language, fully open source, no plugin required, solves a11y, compiles to native apps incl. mobile
Unlike Silverlight: fully open source, no plugin required, no heavy VM
Unlike JavaFX: no plugin required, no heavy VM, visual builder is a vector design tool
Unlike Flutter: designed for web (small footprint, a11y out of the box); first-party and foundationally integrated visual builder, visual builder is a vector design tool
Thanks for explanations. However, instead of making yet another framework, wouldn't it be better to contribute to Flutter (e.g. optimize its wasm target, add a vector visual builder), so there is one strong open cross-platform alternative to HTML/CSS/JS?
Pax's primary goal is "designability" — to enable a vector design tool to read & write the same content & behavior you may write by hand. (.pax files)
Our solution requires language constraints — a hermetic separation of concerns between the declarative description language (.pax) and a Turing-complete programming language (starting with Rust.)
These language constraints and rendering requirements[0] are so core to our solution that building anywhere other than the systems level was not tenable.
[0] Specifically, the runtime must render in "design tool coordinates," to enable a vector design tool authoring UX. This authoring experience must also be extremely fast, like Figma's. The rendering engine must be designed around this requirement and neither HTML/CSS's or Flutter's were.
Swift is pretty next level compared to Rust. Rust seems like a fairly constant level of slow, degrading linearly as the project gets larger. Swift tends to have huge degradations from seemingly innocent changes that send type inference off the rails — although at least if you spend the time to track those down, you can fix the worst offenders.
Pardon me for not getting your context, but are compile times a big issue in software development? I have never programmed professionally and all my experiences with code is from a couple of classes taken in college a decade ago.
When doing UI work, a common workflow is to have the code and the app side-by-side: make a small tweak in the code, recompile/hot-reload, look at the result, repeat. A long compile time makes this workflow a pain.
In general, you're right. But there are at least two contexts where they're absolutely vital: anytime you're dealing with a UI, and data exploration in data science (since you make a lot of frequent, small changes while fine-tuning something). Everywhere else, good testing practices and partial compilation make it moot. There are probably other contexts where it matters, but I've never had to deal with those.
There's been a lot of work on this in the last few years, with Windows support landing; preliminary Android support is also being worked on and should appear at some point in Swift 6.
There probably won't be a cross-platform UI layer and many Apple frameworks won't work, although apparently the Foundation Swift rewrite already works. For a lot of companies though, simply being able to share business logic will be a big plus. At my current company we have a lot of talented Swift people and existing code written in the language so being able to share anything would be a huge advantage.
I'm biased, but I personally find the language really productive to work with; it runs fast enough for my needs and lets me target more and more platforms.
Actually, it sounds like they only just decided to use it in the future once Swift 6 is released. I would guess much of it will still be in C++ for a while.
For everything. Swift is—at least in theory—a general purpose language, it's not exclusively for Apple technologies.
> First off, Swift has both memory & data race safety (as of v6). It's also a modern language with solid ergonomics.
> Something that matters to us a lot is OO. Web specs & browser internals tend to be highly object-oriented, and life is easier when you can model specs closely in your code. Swift has first-class OO support, in many ways even nicer than C++.
> The Swift team is also investing heavily in C++ interop, which means there's a real path to incremental adoption, not just gigantic rewrites.
> > First off, Swift has both memory & data race safety (as of v6)
> But v6 is not released yet, right
As of Swift 5 there is zero data race protection, and the process model (DispatchQueues, if memory serves) is woefully inadequate. No advantage over fork, and much more convoluted.
I am well clear of the Apple development world now (thank goodness) but the tools were of very poor quality, albeit very nice looking, as of this time last year
For the sake of my friends still in the world I hope Swift 6 is better, but I fear the dumpster fire that is Xcode is too far gone to rescue.
The comparison with Rust demonstrates the utility of "design by committee". Rust is far from perfect, but feels like the future, where Swift feels like warmed-over Objective-C.
> As of Swift 5 there is zero data race protection, and the process model (DispatchQueues, if memory serves) is woefully inadequate. No advantage over fork, and much more convoluted
Swift Concurrency is the replacement for Dispatch and has been around since Swift 5.5 in (IIRC) 2021. It’s a completely different system, uses lightweight tasks (a la tokio in rust, or goroutines in go, etc), has a concept of “Sendable” for thread-safety a la rust’s Send, async/await, and a native `actor` type, among other things.
Swift 5.5 didn't get all the way towards rust-style data race safety: a few checks had to be warnings instead of errors (to avoid breaking existing code), and keywords like `@preconcurrency` were introduced for importing frameworks that predate Swift Concurrency, to facilitate incremental adoption. They've also been adding more checks in each minor release to tighten things up.
IIUC Swift 6 is mainly going to turn all the warnings into proper errors and tweak some defaults so that you get proper data race protection on a default compile.
Point is, it’s totally inaccurate to say that Dispatch Queues is all that exists in Swift 5. You’ve had much better stuff for a while now (although SC still has a ton of issues worth discussing.)
> Swift 5.5 didn’t get all the way towards rust-style data race
When I experimented with it, it was trivial for one thread to interfere with another. So Swift got nowhere towards data race safety. Still stuck in the 1990s.
I know not what you mean "Swift Concurrency". When I was doing it all we had was DispatchQueue which was an obfuscation of `fork`. Quite shameful really.
I think the main point is that Swift is a failure.
"although SC still has a ton of issues worth discussing" — once I would have cared, but this year (decade, century) I am just very glad putting meat in my fridge no longer depends on those rogues from Apple, who treated me so badly when I was sweating so hard making software for their platforms (not to mention paying them so much money). In 2024, for a company like Apple, why would their flagship developer offering still have "a ton of issues"?
Apple is now an example of why popularity is a terrible metric to estimate quality of technical offerings. What a shame. How far the mighty have fallen
> Okay, so you're saying you don't know what Swift Concurrency
I just looked it up. It is Swift's version of async/await. That is a different thing from threads. I know what that is, used it a lot, because using threads was such a nightmare in Swift.
> language with many compile time guarantees of thread safety
From two separate threads you can access the same memory. No trouble (apart from crashes, memory corruption...) at all.
Async/await is always a bad idea, and without a garbage collector it is a nightmare (look at the mess Rust has gotten into). Whatever, async/await is no replacement for parallel programming with threads. It is a different beast.
> I just looked it up. It is Swift's version of async/await.
Since you “just looked it up”, maybe don’t make blind assertions about something you clearly don’t know very much about?
It’s a lot more than async/await. It is a way to offer compile time guarantees about thread safety (through Sendable, which is part of SC), it’s an actor model (allowing shared mutable state to be isolated, leveraging async/await for coordination so that callers can pause if the actor is currently servicing another message) and a bunch more stuff.
I explained all this in my post you replied to, maybe read the whole thing before making wrong claims about stuff you spent 1 minute looking up?
> Whatever, async/await is no replacement for parallel programming with threads.
Is it not for the vast majority of use-cases?
Sure, you can use async/await without parallelism, via a single-threaded runtime to just get single-threaded concurrency, but with a multi-threaded worker-pool async/await-like tasks or fibers I think mostly cover the use-cases you'd have for parallelism?
You have to make sure that you e.g. don't starve other tasks via having no yield points in a task that does a lot of computation (if you're doing cooperative tasks which Swift is doing iirc), but that's not a big one, and can mostly be solved by the runtime too (e.g. Go had cooperative fibers for a long time, until they chose to introduce preemption).
Async/await may or may not be a replacement for highly concurrent and parallel programming, depending on what is the execution model of the async runtime.
If Swift's model is anything like .NET's lightweight Tasks + async/await or Rust's async/await Futures and Tasks as implemented by Tokio or async-std, then it is such replacement.
> Whatever, async/await is no replacement for parallel programming with threads. It is a different beast.
Have you missed Tasks and Task Groups as well? And Actors? For now, they are an abstraction over threads, and IMO a good one. It's actors + structured concurrency, borrowing from Kotlin's Coroutines and sprinkling some Erlang on top. Additionally, in Swift there is AsyncSequence + AsyncStream, a (woefully incomplete) Kotlin Flow alternative.
Xcode 16 rewrote much of the autocomplete behavior, and the new engine is blazing fast in terms of UI blockage compared with 15, and stable — everything to do with tagging etc., not just completion itself.
I don't know if this is irony or you've been in Apple ecosystem way too long.
I had to do a very simple macOS app for my personal consumption recently, and the Xcode DX is nothing to write home about. The only reason I finished the project there was that I couldn't set up VS Code quickly for a Swift/Cocoa project. I had to endure the slow compilation times and the slow reaction time of the IDE UI. You make a change and keep seeing squiggly lines for a while; it's as though the UI is booting up each time. It was a horrible experience coming from daily IntelliJ and VS Code use.
My computer is a 32 GB Apple M1 Pro. Imagine what happens on some 8 GB i5 MacBook.
Swift is useless outside of macOS/iOS dev… the thing doesn't even have namespacing. I don't know why anyone would use it outside the Apple environment.
You can disambiguate two types with the same name from different libraries, e.g. `Factotvm.URL` and `Foundation.URL`. Do you mean something more full-featured? You are not prefixing types with three letters, if that's what you think has to be done.
I don't know if it's still the case, but there was an annoyance where you couldn't have a type with the same name as the package. But that is hardly a lack of namespaces.
Objective-C had some minimal adoption outside of Apple (probably due to NextStep nostalgia), so if Objective-C managed to get some traction, Swift will do it to, probably.
However, Apple's history is very much stacked against Swift becoming a mainstream language outside of Apple's platform.
HN majority doesn't like hearing that ladybird et al might just be wandering around, even if the goal is catnip for the bleachers, and we should be skeptical this is the year of multiplatform Swift, because it wasn't last year if you actually tried it. Or the year before last. Or the year before that. Or the year before that year. Or the year before that one.
I’m slightly more ambivalent than you about it. Swift is a nice language and has better ergonomics than C++ and I imagine a Swift codebase might find more contributors than a C++ one (maybe I’m wrong about that!)
I also think it’s separate from the dream of “multiplatform Swift”. For that you need a healthy package ecosystem that all works cross platform, Swift doesn’t have that. But a lot of Ladybird is written at a low enough level that it won’t matter so much.
Problem is Swift engineer supply is low, there's not a viable business case to learn Swift because it's not actually viable cross-platform for development unless you have $X00 million to throw at your macOS/iOS team to build it from scratch, platform by platform (to wit, sibling comment re: Arc Browser)
So best case we're looking at: Swift isn't ready yet, the next major version will be, and we can't build UI with it, so we'll put in the effort to bootstrap a cross-platform ecosystem and UI frameworks. Or maybe we'll just do our business logic in it? It's a confusing, irrational mess, even with great blessings of resources — e.g. with the $X00M Arc has, they obtained one incremental platform after a year. And "all" they had to do was write Swift bindings for WinRT and connect them to the existing C++ engine.
All of this is easy to justify if we treat it as an opportunity to shoot for how we wish software worked in theory, instead of practice. I hope I'm wrong, but after being right the last few years, I'm willing to say out loud that it's naive wishcasting, even though it's boorish. I see it as unfortunately necessary; a younger me would have been greatly misled by the conversations about it on HN. "We've decided to write the browser in Swift!" approaches parody levels of irresponsible resource management and is a case study in several solo-engineer delusions that I also fall victim to.
It's genuinely impossible for me to imagine anyone in my social circle of Apple devs, going back to 2007, who would think writing a browser engine in Swift is a good idea. I love Swift, used it since pre-1.0, immediately started shipping it after release, and that was the right decision. However, even given infinite resources and time, it is a poor fit for a browser engine, and an odd masochistic choice for cross-platform UI.
> Problem is Swift engineer supply is low, there's not a viable business case to learn Swift because it's not actually viable cross-platform for development
The Swift business case is that in many situations native is strongly preferable to cross-platform. Excluding some startups that want to go to market super fast and consulting companies that have to sell the cheapest software possible, the benefits of native usually outweigh those of cross-platform.
For this reason now there are plenty of companies of all sizes (faangs included) that build and maintain native apps with separate iOS/Android teams. There are very good business reasons to learn Swift or Kotlin in my opinion.
Right -- no one read my comment and thought I meant Swift was unnecessary or businesses don't use it. Contextually, we're discussing Swift for cross-platform dev.
Well I was trying to be kinder than just leaving you downvoted and confused why. I guess I shouldn't have bothered, my apologies. Hope your week gets better!
I wonder what convinced Andreas Kling to abandon his own language Jakt [1] in favour of Swift.
In the long run, it would be good to have high-level languages other than Java that have garbage collection (at least optionally) and classes, and that are still capable of doing cross-platform system development. I don't know if Swift fits that bill, besides cross-platform ecosystem (a la Java), submitting the language for ISO standardization (not just open sourcing one implementation) would be a good indication of being serious about language support.
> In the long run, it would be good to have high-level languages other than Java that have garbage collection (at least optionally) and classes, and that are still capable of doing cross-platform system development.
One of the major differences between Ladybird as part of Serenity and Ladybird the separate project is the use of 3rd-party libraries. When what you are building is for fun and you build everything yourself, it makes sense to also build a language.
Ladybird as a separate project, though, has the goal of being usable in the shorter term. So, as with switching to 3rd-party libraries for some things, I don't think it makes sense to spend potentially years building a language first before building the browser.
You might be powerfully dry here, imparting a Zen lesson. If not — or if you, dear reader, don't see it — it is worth meditating on the fact that there was a language, an OS, and a web browser; then on the common characteristics in decision-making that would lead to that; then on the yaks that have to be shaved (the ISO standardization / more-than-one-implementation point hints at this).
It’s not really a question of fairness. The existing codebase is C++, the new stuff is Swift. Hence the comparison.
I’ve written both Rust and Swift while being an expert in neither. I wouldn’t say Swift has no pluses in comparison, reference counting is often a lot easier to reckon with than lifetimes, for one. I’m actually curious what a large multithreaded Swift codebase looks like with recent concurrency improvements. Rust’s async story isn’t actually that great.
I agree — people's perspectives differ. I abhor `async/await` in Rust. It has poisoned the well, IMO, for asynchronous Rust programming. (I adore asynchronous programming — I just don't need to pretend my code is synchronous.)
But that is taste, not a comment on poor engineering!
The lack of the borrow checker in Swift is what makes it approachable for newcomers, as opposed to Rust which is a harsh mistress.
But reference counting is such a silly idea. Swift really should have a garbage collector with a mechanism or subset to do that very small part of programming that cannot be done with a garbage collector. That would have been design!
I fear that Swift is going to get a borrow checker bolted on - and have the worst of both worlds....
Reference-counted objects have deterministic lifetimes, like Rust or C++. Garbage-collected languages don't have that. Essentially, a reference counter automates some of the ideas of the borrow checker, with a runtime cost rather than a compile-time one.
The great thing about automatic reference counting is that you can elide the incrementing and decrementing a lot of the time. Stronger guarantees, such as saying "this object is single-threaded", lead to even more optimizations.
I don’t think Apple would want to sacrifice the determinism that refcounting provides for garbage collection. iOS apps are still more buttery smooth than Android in many cases.
I mean, this year we did get the port of Arc browser to Windows. I use it on my gaming PC and it is starting to feel like it has a similar level of polish to the macOS version.
Ladybird exists and is funded because the main author is an extremely skilled software developer with a background in building web browsers.
He also built a totally from scratch operating system, and built a very productive and skilled community around it, which branched into building their own web browser as SerenityOS has an 'everything from scratch' policy. How is Serenity not an interesting feat of engineering?
You seem to call it a 'toy' because of where the development is at - this is grossly unfair criticism. By your measure, is all software that isn't at a 100% finished state a toy?
Why is it weird that a company that built their entire business on the web wouldn't want there to be a web browser monopoly controlled by an advertising company? Shouldn't you fund projects that will be beneficial for your company in the future?
I think they were referring to the version of Ladybird taking advantage of Swift v6. I agree, in part, that it is only a nice idea until a production-ready binary gets released; right now it is just experimental.
Is there a good software-engineering resource about the complexity of a web engine? I mean, we all know it is complex, but what are the critical areas? Performance is one, compatibility another.
This is a good post; as far as Ladybird is concerned while they may have started things as fun, they seem to have taken a turn towards seriousness recently.
The problem with these smaller webbrowsers is that you have nobody to sue if the browser turns out to leak your personal information or credentials etc. Therefore it is better to stick with browsers made by big corporations.
I'm curious what exactly you think suing Google or Apple in such a case would accomplish?
On the (long) odds that you get all the way to a successful class-action suit, the lawyers get rich, and the class members eventually end up with a free year of credit monitoring
(we have seen this over and over again in settlements for high-profile data breaches)
This is true for any software though, so if we push that reasoning to its extreme, no one should ever use any free and open source software, which I find a little bonkers.
Some people just get warm fuzzies at the idea of theoretically being able to sue someone if they need to. Usually these are people who've never actually looked into what it would take to even start a lawsuit.
According to https://servo.org/about/ Servo currently passes ~60% of the web platform tests. Does anyone have experience how far that subset gets you on the open internet?
LibWeb was started in 2019. Anyway, I'm really sour that 90% of the discussion on a thread about the incredible project that is Servo is wasted on Ladybird.
Ladybird was mostly developed by one person (Andreas Kling) with no financing whatsoever; Servo was mostly developed by a team being paid with Mozilla money.
That was true at the start, but it's most definitely funded development now and that's what will probably get them over the finish line.
If development work went into debating, specifying, and expressing required behavior as a written spec more exactly (beyond the W3C specs, towards the more pragmatic reality of what current browsers actually do), then in the very long term we could probably have engines that are AI-built [or just more easily developed by humans] from a combination of the written specs and the set of tests they need to pass.
Using AI for adversarial development (e.g. one group tries to break and hack it, the other group defends and refines) could get interesting and wasn't really an option before. Anything that's now available to reduce the human resource cost of development could make a big difference.
> So Andreas working on webkit means he has no browser engine experience?
Who said "no experience"? (except you, of course)
I've said, and I repeat myself so maybe this time it'll work: Andreas had no money whatsoever, while Servo was developed inside Mozilla, which poured millions of dollars into it and created a dedicated team to build it.
It makes all the difference in the world; the actual experience of building a web browser is irrelevant, given the initial disparity of time, money, and resources available.
It makes all the difference between a random guy building a working twitter clone and Meta building a working twitter clone.
The first one is an amazing accomplishment, the second one is a mehhh at best.
it has been underway for much longer and was built with people with actual browser engine experience
The key point is that Ladybird was developed by one person (not people) with some browser engine experience, over a relatively short period, using only personal resources. While Andreas Kling worked on WebKit, his experience wasn't at the level of building an entire engine, which is evident from his videos. Experience alone isn't enough; he learned much of what he needed while developing Ladybird. While Andreas Kling is talented, many other developers on his team were equally skilled, and yet he's the only WebKit developer I am aware of who built a browser on his own.
Not long ago this was considered too hard a task to tackle; he proved it can be done even by someone with relatively modest experience building a browser.
It should be highlighted that Andreas's main skills are his tremendous communication skills and the way he builds a mental model of the problem he's trying to solve, not his past WebKit experience (he wrote an entire OS before building a browser for that OS, as a side project).
The web standards have changed since the introduction of Acid3. The updated Acid3 test is here: https://wpt.live/acid/acid3/test.html and modern browsers should score 100/100.
Edge being a Chromium-based browser and somehow scoring lower is genuinely hilarious. Congrats Microsoft, on taking something almost perfect and only slightly ruining it.
Edit (with Firefox): the above text is from Verso. Logging in works (though the session is not stored across restarts), as does both commenting and editing comments. Since space can't seem to be entered, word wrapping doesn't work with text entered in Verso, though it does seem to work when there is text with spaces (eg, this edit). A cursor also doesn't appear for me, making editing a challenge.
I have a branch here https://github.com/servo/servo/pull/32619 that makes this significantly better (by greatly improving flexbox support and adding CSS grid support).
New Reddit rendering “properly” is as useful as a chocolate fire guard. Do you want to view this content on the app? You Can Only View This Content In The App, THE APP is the WAY, JUST FUCKING DOWNLOAD THE APP!!!
Can you imagine being on the team responsible for the mobile web Reddit experience? Having to implement a feature that tells every user, repeatedly, that their work is subpar and to please use something else.
The newest Reddit - not the React version, but the Lit version that's replacing it - uses very, very modern HTML and JavaScript. I'd love to see Servo get to the point of rendering it correctly!
TIL. new.reddit seems to be a return to old.reddit in a lot of ways. It also seems really really fast unlike the current www.reddit. If they remove some annoying ads I might even be convinced to stop using old.
The only downside is that new is still being developed. Old is nice because it is shielded from their post-success incompetence. Whatever bad ideas they decide to implement will hit new first.
new.reddit seems almost... sane. Like they kicked out whoever pushed/implemented reddit aka the current version and figured out that old.reddit had a lot of virtues.
It is so crazy. I normally use "reddit.com" but a few weeks ago i followed a link to an article on "old.reddit.com" and saw I had notifications. I clicked the item and it was all notifications from days ago. So the read status of notifications on old reddit and current reddit aren't linked?
Most pages, logged-in and logged-out, seem to be using the Lit version for me. I don't think it contained many visual changes, so you have to look for the presence of web components.
Pretty amazing how a company can overengineer a basic feed that displays images and text (and ads). New Reddit is a dumpster fire, can't imagine another rewrite is going to fix it.
Yeah, one of the weirder things is that if someone blocks you, you are not allowed to reply to any comment in the thread (for example, if someone posts a toplevel comment and doesn’t reply to anything, and you reply to some comment a dozen layers below and have an exchange with some third party, if the toplevel commenter blocks you, you can’t reply to anything including comments left by others in response to your own comment) and you get a generic “internal server error”-type pop up with no information about why the request failed.
This is most likely a server “feature” and thus a rewrite of the frontend won’t fix it but it seems like at very least the frontend could display a sane error message (assuming the backend forwards some information about the cause of the error).
Rust can absolutely crash. It crashes ("panics" in Rust terminology) in a memory safe way but it's still a crash (please don't try to redefine the word "crash" to be more specific than it actually is).
And Rust can still have unsafe code so it can crash in memory unsafe ways too (though it is very unlikely unless you're doing things very wrong).
> And Rust can still have unsafe code so it can crash in memory unsafe ways too (though it is very unlikely unless you're doing things very wrong).
From when I grokked the code a bit (back in 2017), there was a non-trivial amount of unsafe code, especially related to integration with SpiderMonkey (the JS engine), so it wouldn't even be particularly surprising to see segfaults in Servo, unlike most Rust projects.
Most crashes I've seen trying this out seem to stem from panics. As an end user I would call that a crash, but from a programmer standpoint you could say it's an unexpected normal shutdown.
I just compiled it following the Linux instructions. I honestly thought Servo was better than this. Maybe there's something up with the Linux builds? I'd much rather use Ladybird in all its pre-beta glory than this.
There are two window title bars, one by the OS and one from the application. The text in the URL bar is misaligned and is shifted down by half its height. There's a black bar between the browser chrome and the web view. Entering a domain name without http or https and hitting enter crashes the entire application.
Clicking refresh spawns a new window that sort-of-but-not-really shares the same website being rendered.
Very few websites work. Anything with a cookie banner just plain breaks. I can't tell how to edit the URL bar after failing to load/loading a page. Google.com is very wonky. The search box on Google doesn't seem to take space bar for some reason.
For the websites that do work, rendering is very fast and scrolling is pretty smooth. I can see the potential, but there's a lot of work to be done.
In other exciting Servo-browser news, Servo and Redox OS have submitted a joint proposal to fund the porting of SpiderMonkey and WebRender to Redox: https://www.redox-os.org/news/this-month-240731/
One big reason to want change is to change the funding model that currently supports the existing browser projects, which isn't great and is threatened too...
Does this do anything to improve the browser as a user agent? That is, an agent that obeys the user over the server. None of the current browsers are user friendly, and none are scriptable except by experts wielding large external programs.
It should be possible to write a simple shell script to navigate the web, to log in to web sites, to extract information. Or something like Visual Basic.
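For what it's worth, basic information extraction is already possible with nothing but a standard library. A minimal sketch in Python (the page content here is hardcoded to keep the example self-contained; real navigation and login would also need `urllib.request` plus cookie/session handling, which this skips):

```python
from html.parser import HTMLParser

# Minimal link extractor using only the Python standard library.
# Fetching a live page would be urllib.request.urlopen(url).read();
# here we parse a hardcoded document instead.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = '<html><body><a href="/login">Log in</a> <a href="/news">News</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/login', '/news']
```

Anything dynamic (JS-rendered content, real logins) is where this approach breaks down and people reach for the large external programs the comment above complains about.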
That's a really interesting case: most Puppeteer and Playwright deployments use Chrome. If Verso is faster, it might have an advantage in crawl/scrape/QA work.
"Let me throw shade on this open source project that does incredibly ambitious thing X and that tons of people are devoting lots of time to, by suggesting they should instead do this other esoteric thing Y that 99% of users don't care about but that I, the entitled power user, think they should be doing instead."
It'd be nice if people stopped recommending "yet another new and exciting package manager for Windows" that you have no idea whether you can trust or not.
Git, Python, LLVM, CMake, and curl all have perfectly normal Windows installers available from their own websites, and if you're a programmer who has to, or chooses to, work on Windows, it's a good bet you already have most or all of these installed, making the job of completing your bonus objective probably one, maybe two installs at most.
That was my favorite part of this (especially since the build didn't work). Whatever scoop is, it actually worked and installed those things without any complaint... might be my new favorite package manager
I’ve typically used chocolatey. I think it typically uses the regular installer for a package, just in a headless/unattended mode configured by the maintainer.
Ideally, winget would be the way to go, but I haven't had good luck with it. There are a number of limitations and it doesn't feel like a polished and curated repo. It just feels like a wrapper around chocolatey, which has its own problems.
Microsoft is the reason alternative package managers exist
I'm sure, but it's also "yet another package manager" because everyone has their own favourite package manager. It's a nit, but it's nice when a README.md goes "you need X, Y, and Z" and doesn't pretend you need a specific method for that. Tiny phrasing change, zero real world difference of course, but it's a nice little sign that the folks running a project know that you know what you're doing on the OS you're working with (either by choice, or by paycheque =)
Scoop is the main package manager I've been using for years on windows apart from chocolatey. Dunno many others aside from the official windows one: winget.
I've been using Windows as a dev platform since it was called "DOS but don't look too closely". I know chocolatey, ninite, winget, and PowerShell's own built-in nonsense, and yet had never heard of scoop until just now. So that really just tells us that any package manager we think is popular, ubiquitous, and the obvious choice is still really just a niche program =(
Contrast that with brew on macOS: even non-devs know about brew.
And from all of those, which I'd heard of, I think scoop is the only one to allow a package author to just create a git repo, and publish a package that way.
Like brew does. That's why I use scoop.
Chocolatey would rather charge money for that, for some reason, and yet people are still willing to donate their free time to it.
I don't understand what the advantages are over Servo's inbuilt web browser¹. So far they look the same when opening but the inbuilt web browser is more stable (I see rendering bugs with Verso and it panics when entering a domain without the http(s):// prefix).
I'm happy to see work being done in integrating Servo into custom browser chrome. My dream browser would be a servo based Qutebrowser. I can only hope.
Apple makes it relatively difficult to support old versions of their OS. You pretty much have to keep an old computer around and just not update its software. So it might not have been a conscious choice, rather they just didn't go to the extra effort to support older versions.
Their build and testing platform is on Github Actions which supports macOS 12. Furthermore, Apple supports running macOS in a virtual machine or a dual-boot setup.
Have a read about Servo; it binds to SpiderMonkey as its JS engine.
Is there any chance that Servo could be decoupled from SpiderMonkey? If not, I don't think anyone could tell the difference between Firefox and some other browser using Servo.
There's a lot more to a browser than the JS engine. And in general, we would want all of the browsers JS engines to be essentially the same. I think starting with an existing engine is the most logical approach
That still leaves you with at least the layout engine, all the DOM apis, all the networking, multi process model & sandboxing, web extensions support, that are different implementations.
Interesting. Kinda reminds me of the 360 Safe browser that's one of the most popular browsers used in China. It can render a website using WebKit, Blink, or Trident (Internet Explorer)
"Servo is a prototype web browser engine written in the Rust language. It is currently developed on 64-bit macOS, 64-bit Linux, 64-bit Windows, and Android."
So, this browser seems to be about using Rust, and somewhat Mac-centric. Not criticizing, just emphasizing.
------------------------------------------
2. The repository does not explain:
* How far along the project is.
* What are the benefits / points of attraction of the browser (or - perhaps it's more of a proof-of-concept?)
------------------------------------------
3. The project has a highly repressive Code of Conduct:
* Forbidden behavior is open-ended and at the discretion of whoever handles a complaint.
* No due process: Anonymous complaints, in-abstentia proceedings, no right to face accuser, no right to access and review evidence, etc.
* The project leaders/owners presume to forbid community members from interacting with people whom project leaders decided to ban. This is a bit like how when the US sanctions a state, it also strong-arms everybody else to observe its sanctions or themselves get sanctioned by the US.
Bottom line: I would avoid getting close to that project, if the CoC is actually applied. If it isn't - very much recommend removing it.
Their best deployment/installation instructions are for Macs, and for Linux they suggest things like nix-shell or flatpak, rather than straight-up packages for the popular distributions/package management systems. That's my reading of things, anyway.
Running `./mach package` will create a tarball you can use easily in any distribution. Not perfect, but I would say good enough at this stage of the project - remember Servo is not a browser, it's a browser runtime that needs to be embedded. The `servoshell` itself is just a basic example to show how the embedding works.
Interesting but it seems like a super early stage (or maybe Windows support isn't there yet.)
Nigh unusable on Windows (11). Mostly it just opens an empty window that has stopped responding. I finally tried running as admin and it works more consistently now. The webview for https://www.google.com looks like a messed-up mobile view. My company's website (Next.js) doesn't work at all, so I guess they don't have a JS engine yet?
There are still zero-day exploits found in Chromium; wouldn't using this put you at a huge risk of running into malware in the wild that this browser can't protect against?
I somewhat wonder—Firefox and Chrome are in a constant race to have the best JavaScript performance.
In general, the sites I want to browse use minimal JavaScript, prudently, if at all, just where it is strictly necessary to add little dynamic features. So, I don’t really care about JavaScript performance at all.
Optimization sometimes introduces additional complexity, which might open up the possibility of security holes (at least it seems to be the case to me, as a not-security-related programmer. I don’t know anything about security on a technical level, so I’m interested in other perspectives on this from people that actually work in those sorts of fields). I wonder if there’s room for a browser engine that ditches performance and just focuses on correctness and safety.
Rendering documents ought to not be computationally intensive, right? Advertisements of blazing fast JavaScript performance make me worry what corners have been cut.
I assume he means something like turning off the JS JIT, not turning off JS completely. IIRC iOS turns off Safari's JIT when in lockdown mode. Ladybird browser also abandoned its JIT apparently due to security concerns. JS JIT is one important example but also in general if you write your code to only focus on correctness and not performance then you will get safer code (all else being equal).
Haha, well that’s what I use now. I think it is the opposite though. I’d like a JavaScript implementation that doesn’t break any sites, but which makes absolutely no security compromises, even if that means they have to give up a lot of performance.
Sometimes, I just have to load a site that has JavaScript running. It's unfortunate, but some work sites don't work without it, etc. I'm fine with those sites being slow (I'll naturally minimize my use of them), but totally blocking them is slightly inconvenient.
Disabling (all of) the JITs is a decent approximation of this. It's very site-dependent as to how much of a performance impact it makes, but for many sites it'll be fine.
Obviously this isn't the same as making "absolutely no security compromises", but in practice most JS-related security exploits go through the JIT iiuc. Your JS will be executed with a safe interpreter, where by "safe" I mean the dispatching and basic value manipulation are going to be simple enough to be bulletproof, and also slow enough to prevent most timing attacks. The underlying implementation of all of the built-in methods is still going to be more vulnerable, but those tend to be relatively safe as compared to JIT-optimized versions of them. They also don't change much, so have been tested for much longer as compared to the JITs that tend to get refactored and rewritten relatively frequently.
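To make the "simple enough to be bulletproof" point concrete, here is a toy stack-machine interpreter in Python. The opcodes are hypothetical and this is not how any real JS engine is structured, but it illustrates the auditing argument: the entire dispatch surface is one small loop with no generated machine code to verify, which is exactly what a JIT gives up.

```python
# Toy stack-machine interpreter: the whole dispatch surface is this one loop.
# Hypothetical opcodes for illustration only; not any real engine's design.
def run(bytecode):
    stack = []
    pc = 0
    while pc < len(bytecode):
        op = bytecode[pc]
        if op == "PUSH":      # push the constant that follows the opcode
            pc += 1
            stack.append(bytecode[pc])
        elif op == "ADD":     # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":     # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode {op!r}")
        pc += 1
    return stack.pop()

# Evaluates (2 + 3) * 4
print(run(["PUSH", 2, "PUSH", 3, "ADD", "PUSH", 4, "MUL"]))  # 20
```

A JIT replaces this loop with dynamically generated native code specialized per call site, which is where most of the performance (and most of the attack surface) comes from.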
Malware generally can't just "hijack" any arbitrary browser. You have to have exploits for a specific browser / browser family / JS engine. Chromium has defense in depth techniques to try to raise the difficulty bar and to try to limit the fallout of an exploit in various parts of the tech stack, but that still requires a Chrome-specific attack in the first place. Attacks for Safari / WebKit / JavascriptCore would similarly require a different vector for the exploit (even if some techniques are common, the specific details should be quite different).
There is a newer class of generic malware that exploits CPU bugs (e.g. Spectre) - are you perhaps referring to that? If so, that's a fair concern but unlikely to matter much in practice. For Spectre itself, I believe the mitigations were applied within the major JS engines directly (or at least for v8 they were).
Anyway, security comparisons only become meaningful once a browser gets a lot more attention from attackers. But given that a huge number of exploits depend on buffer overflows, which are simply impossible in safe Rust, this browser is likely to suffer mostly from architectural issues and far fewer implementation issues, whereas other browsers still have architectural issues plus implementation issues on top that prevent them from being addressed. Yes, a newer browser likely means a more immature architecture, but at the same time there are fewer implementation bugs with which to exploit those architectural issues in the first place.
They said some of the code can be rewritten in Rust but the majority cannot. Therefore they are stuck with C++ forever. Scary thing to think about.
Do you have a citation for "cannot"? Maybe "prefer not because of non-functional requirements" but if it's a choice between a ~~webpage~~ ad vector loading 10 seconds slower versus the goddamn plague of RCEs coming out of Chrome, I know which one I'd take
> Over the past decades, in addition to large Java and Go memory-safe codebases, Google has developed and accumulated hundreds of millions of lines of C++ code that is in active use and under active, ongoing development. This very large existing codebase results in significant challenges for a transition to memory safety:
> We see no realistic path for an evolution of C++ into a language with rigorous memory safety guarantees that include temporal safety.
> A large-scale rewrite of all existing C++ code into a different, memory-safe language appears very difficult and will likely remain impractical[0].
Long story short: they are stuck with millions of lines of C++ and can't transition to Rust completely because of the enormous complexity of various gigantic codebases, e.g. Android, Chrome, or whatever else they want to move to memory-safe languages.
They said they will try to write as much new native code in Rust as they can plus they will interop from C++ to Rust in order to reduce memory-safety bugs.
Eventually, maybe. In the short term, the transpilers mostly translate from other languages into unsafe Rust - which is a helpful tool during porting, but there is still a mountain of work to refactor unsafe idioms into safe before you really reap the benefits.
Also doing that doesn't solve the standards problem we currently have.
Google is able to push through any "standard". And those standards "unwittingly" help them maintain their search/ad dominance or prevent competitors. For example see manifest v3 or FloC (a.k.a Topics API).
Once they gain enough traction and become indispensable, you either implement them or risk losing users.
Are you being intentionally ignorant? The more browsers in popular use, the less control each has. Currently Chrome has more usage than all the other browsers combined:
They don't have to. But we have seen how Google can play this game of open core & official versions with AOSP and blessed Android versions with playstore.
There will be two classes of products: one officially sanctioned version, and the others used by enthusiasts. Apps or sites may choose to work on one and not the others. Imagine a new wave of "works best on IE", this time as "best viewed on Chrome variants a, b, c". It's not far-fetched, as some sites already do this.
As easy as it is to maintain a fork, it is just as easy to give up, change path, and accept upstream changes.
I'm healthy at the moment and I'm still not sure I understand. Poetic perhaps, but a bit nonsensical.
Old world as in the past? Older technology? Older ideas? Bad ideas?
Blues, as in the musical genre? Or the feeling it conveys? Are we riffing on it here? Plays strongly suggests music, but the blues originated from specific cultural roots tied to the end of slavery (which is implied even further by 'old world blues').
New world, as in a better tomorrow, or something more akin to a new world order?
Serious Question: With risk of being hated, this is an honest question:
I've never really sought out any browser other than the big three...
Just because I have never had any personal workflow/painpoint/interest in any of these other browsers/engines, which frankly I had never heard of, and then every few years yet another new one pops up that I haven't heard of either, but they all seem to have lively communities...
The question is:
What is the primary drive/utility that you/others are seeking/gaining with these non-FF/Chrome/Edge things?
Servo is written in Rust so one big aim would be to have better security. I believe it's supposed to also be much better parallelized.
On my part, I'm very much looking forward to an embeddable browser engine. Neither Firefox nor Chrome are interested, and QtWebEngine exists, but takes extensive patching, and so depends on the Qt project remaining to exist and keeping up with upstream.
Diversification and modern codebases. Without these two we’re locked to made in US browsers full of CVEs. I don’t think any nation state can audit Chromium today, not even US DoD.
All mainstream web browsers are bloated and use a lot of resources. I am looking for a tiny lightweight web browser with good HTML5 support but without bloat for older computers. Servo, Ladybird and Ultralight (https://ultralig.ht) are promising. I even started developing Qt Ultralight Browser (https://github.com/niutech/qt-ultralight-browser).
I primarily use Safari on Mac / iPhone because it's integrated with iCloud Keychain. I also use Brave because it's the only browser with which I've been able to screenshare streaming sites for group viewing without getting a black page due to DRM.
Stylo (CSS Engine) and WebRender (compositor) came from Servo and are still used in Firefox. I believe the Mozilla team still upstreams patches as well.
As an acceptable compromise we can separate all punctuation and put it at the end of the paragraph Although some oldfashioned enthusiasts of the Chicago Manual of Style will object But I think its good to bring some fresh ideas to our old and worn orthographyevery now and again,.-…’—.
Or write an entire book without them, then publish a second edition with an appendix of just punctuation marks, for the reader to place wherever they feel is appropriate. Oh wait... Timothy Dexter already did that... https://en.wikipedia.org/wiki/A_Pickle_for_the_Knowing_Ones
Genuinely interested, if the ' represents skipped letters, how does this read to you?
As an aside, if you have the sentence "This is Lewis' reply to the parent comment" the ' at the end of Lewis is used to avoid Lewis's with the extra s at the end.
This is wrong. Servo development has picked up steam since it moved to the Linux Foundation, and multiple increasingly used Rust projects are all betting on Servo for the long term: Dioxus, Tauri, and some others I forget.
Don’t think this really fits in with HN guidelines.
Seriously, check out https://servo.org , they post monthly updates and the progress seems great. And their repo, too - too bad GitHub can't display code frequency because there are too many commits in the project.
If anyone is willing to put an effort in building a new browser, I have just one wish - allow embedding open-source transformers in the browser; available for both the user and websites/extensions.
I mean AI transformers. I understand we have pre-embedded STT, but I would like the option to include my own, say an English grammar GPT, and expose it via an API for progressive enhancement of content.
I hope in the process of doing it we will find new ways of doing things.