sprash's comments

As long as they don't get rid of the Qt dependency, the project is a bit pointless. If you are using Qt anyway, Qt WebView already offers a superior way to render HTML compared to Ladybird.


Qt WebView is just Chromium in a trenchcoat. The point is to create a new browser engine, not a new Chromium UI (and hopefully one much less bloated than Chromium).


GUI libraries and browser engines are vastly different things. Think of Ladybird as the counterpart to Blink, WebKit, or Gecko.


I think the point is the opposite; they have decided to build a new "web stack" from scratch, not just build a new "browser" (or invent a new GUI framework). Hopefully the web engine is not deeply tied to Qt, but you need something in order to draw an interactive window. The article mentions that they will also use existing libraries for things like font rendering. Seems like a pragmatic decision to me.


No, because with the current architecture they can more easily migrate to different GUI libraries.


On macOS there's a native AppKit backend; no Qt needed, as far as I can see.


"Chat Control" is already real. This is just codifying prevalent practice done by a multitude of agencies into law.


Sorry, is there any evidence that E2EE, e.g. in WhatsApp, Signal, etc., is routinely broken? I am not talking about the exceptional hacking of individual high-value targets' phones by nation-state-level actors, but about mass surveillance.


If public evidence were allowed to be released, obviously nobody would be using those algorithms. The point of those algorithms is to make them hard to break for the public and easy to break for the agencies. For example, none of the products you mention use quantum-hard encryption. It is not far-fetched to assume that all the relevant agencies have access to a working quantum computer. But I doubt you even need sophisticated hardware. Most "government approved" encryption algorithms should be considered compromised from the get-go.


I seriously doubt that agencies have more capabilities than the scientific community of mathematicians. Perhaps there are weak points in implementations, but I don't believe any agency has the capability to crack encryption, not even some of the older algorithms.

There is no evidence yet that a quantum computer can break classical encryption. Even if the agencies tried, they would not have the means to stop the spread of such information.

And finally, we wouldn't get laws like this.


Almost 100% of the "scientific community of mathematicians" is funded by the government. They can't be trusted either. If they want to publish something that is considered a "threat to national security", the agencies have multiple avenues at their disposal to "convince" them not to publish.

> And finally, we wouldn't get laws like this.

Codifying covert practices into law has the big advantage of making the whole oppressive surveillance state much more efficient. Gone are the days of "parallel construction". Also, the chilling effects of total surveillance alone might be enough to prevent the opposition from being effective.


Sorry, but these are conspiracy theories without good evidence. You don't think other countries have good mathematicians? And the long arm of your government agencies reaches all of them?


Signal uses post-quantum encryption


Oh, indeed, they're adding a quantum-resistant layer. Nice. Not sure it's in production yet.

https://signal.org/blog/pqxdh/


This is why the XRender extension was introduced. There you have antialiasing, all the blending modes you could wish for, subpixel coordinates, and advanced drawing operations like gradients, and it is fast because it is fully hardware accelerated, all working over a very efficient wire protocol. Cairo, for example, uses XRender as a backend.
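
Roughly what that looks like from the client side: a minimal sketch using cairo's xlib surface (window creation elided; dpy, win, visual and the sizes are assumed to already exist, and draw_gradient is just a made-up helper name). Whether a given operation actually takes the XRender path depends on the cairo build and the server, so treat this as illustrative:

    #include <X11/Xlib.h>
    #include <cairo.h>
    #include <cairo-xlib.h>

    /* Draw an antialiased, alpha-blended gradient onto an X11 drawable.
     * With an XRender-capable server, cairo can hand the compositing to
     * the server instead of doing it pixel-by-pixel in the client. */
    void draw_gradient(Display *dpy, Drawable win, Visual *visual, int w, int h)
    {
        cairo_surface_t *surf = cairo_xlib_surface_create(dpy, win, visual, w, h);
        cairo_t *cr = cairo_create(surf);

        /* Linear gradient with per-stop alpha: one of the advanced
         * operations the extension exposes over the wire. */
        cairo_pattern_t *grad = cairo_pattern_create_linear(0, 0, 0, h);
        cairo_pattern_add_color_stop_rgba(grad, 0.0, 0.2, 0.4, 0.8, 1.0);
        cairo_pattern_add_color_stop_rgba(grad, 1.0, 0.8, 0.2, 0.2, 0.5);

        cairo_set_source(cr, grad);
        cairo_rectangle(cr, 0.5, 0.5, w - 1.0, h - 1.0); /* subpixel coords */
        cairo_fill(cr);

        cairo_pattern_destroy(grad);
        cairo_destroy(cr);
        cairo_surface_destroy(surf);
    }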


It's not all hardware accelerated though, is it? Both the X server and Cairo depend on the pixman library, which is a CPU/SIMD-optimized pixel manipulation library.

Even xf86-video-intel, the Intel X11 driver package, depends on pixman on my system.


Not powerful, but extremely simple, especially if you want to avoid JavaScript at all costs:

https://chartscss.org


Wayland is also a "client-server" model. If you use DRI3, X11 and Wayland even do the exact same thing "under the hood". Only the sub-millisecond negotiation phase is slightly more complicated on X11. We are talking about very cold code paths here.

Wayland not only forces vertical sync, it also requires every application to be double buffered. This can be detrimental to some performance metrics, which is why Wayland has worse performance despite being "simpler".
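
To make that concrete, here is a minimal sketch of the commit/frame-callback cycle every Wayland client goes through (surface and buffer creation elided; submit_frame is a made-up helper):

    #include <stdint.h>
    #include <wayland-client.h>

    static void frame_done(void *data, struct wl_callback *cb, uint32_t time);

    static const struct wl_callback_listener frame_listener = {
        .done = frame_done,
    };

    /* Attach, damage, request a frame callback, commit. Once committed,
     * the buffer is owned by the compositor, so rendering the next frame
     * needs a second buffer: double buffering is baked into the protocol. */
    static void submit_frame(struct wl_surface *surface, struct wl_buffer *buffer)
    {
        wl_surface_attach(surface, buffer, 0, 0);
        wl_surface_damage(surface, 0, 0, INT32_MAX, INT32_MAX);

        /* The compositor fires this callback only when it wants the next
         * frame, typically once per vblank: vsync by default. */
        struct wl_callback *cb = wl_surface_frame(surface);
        wl_callback_add_listener(cb, &frame_listener, surface);

        wl_surface_commit(surface);
    }

    static void frame_done(void *data, struct wl_callback *cb, uint32_t time)
    {
        wl_callback_destroy(cb);
        /* Render into a free buffer and call submit_frame() again here. */
    }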


The X11 DRI3 buffer-swap mechanism is identical to the one used in Wayland. The fact that Wayland still has worse performance across the board (especially on latency metrics) might be an indicator that Wayland is fundamentally mismatched with the needs of a modern graphics stack.


You are basing your conclusions on a single blog post. Measurements from other people show that X11 wins in some cases, Wayland wins in other cases; there isn't a clear winner.

https://zamundaaa.github.io/wayland/2021/12/14/about-gaming-...


> X11 wins in some cases, Wayland wins in other cases

Even according to your blog post, uncomposited X11 wins in all cases (or is tied within the one-millisecond margin of error). With immediate rendering in particular, it wipes the floor with the alternatives.


Uncomposited anything is madness in 2024. You have the VRAM; use it and save battery life and power, which is arguably far more important. Being tear-free is a very nice bonus on top of the power savings, too.

(And no, composition is not just for 3D cube effects or anything like that. Although it certainly enables them.)

macOS has moved on, Windows has moved on; Linux should do the same.


No, they decided working on X11 would not rake in enough consulting money. So they engineered a completely non-working solution called Wayland that is broken by design and takes years and many consulting hours to fix.


Calling it "broken by design" is an extraordinary claim and requires extraordinary evidences to back it up.

Now, X11 being un-sandboxable is truly something broken by design.


> "broken by design" is an extraordinary claim and requires extraordinary evidences

There are many technical aspects that make Wayland broken by design (like default forced vsync, forced double buffering, a fucked-up event loop for single-threaded applications, and severely lacking functionality for things like window positioning or screen sharing). But the biggest problem is the design philosophy: Wayland makes life extremely easy for gatekeeping "protocol designers" and extremely hard for application developers.

> un-sandboxable

Not true. The quick and dirty way would be using Xephyr. Besides that, access-control hooks like XACE have been present and standardized in the X11 protocol for many years; application developers simply choose not to use them. So if X11 is not secure enough for you, blame GNOME and KDE, not X11.


> like default forced vsync, forced double buffering,

A few things:

1. Vsync-by-default is the norm. X11 was the outlier.

2. Wayland does triple buffering, not double buffering.

> a fucked-up event loop for single-threaded applications

I dunno, I have written Wayland applications and I did not notice any peculiarities w.r.t. the event loop, at least in comparison with other platforms like Win32.

You need to expand a little more.
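
For reference, the single-threaded skeleton is just the usual connect-and-dispatch loop. A minimal sketch (registry and surface setup elided):

    #include <stdio.h>
    #include <wayland-client.h>

    int main(void)
    {
        /* Connect to the compositor named by $WAYLAND_DISPLAY. */
        struct wl_display *display = wl_display_connect(NULL);
        if (!display) {
            fprintf(stderr, "cannot connect to compositor\n");
            return 1;
        }

        /* Blocks until events arrive, then runs the listeners registered
         * on each proxy object; returns -1 when the connection dies. */
        while (wl_display_dispatch(display) != -1) {
            /* all event handling happens inside the dispatch call */
        }

        wl_display_disconnect(display);
        return 0;
    }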

> window positioning

I suggest reading up on the GitLab MR for the in-development window-positioning protocol. The basic TL;DR is that window positioning has implications for tiling window managers and other unusual desktop use cases, e.g. VR.

> screen sharing

I just shared my screen this morning.

> Not true. The quick and dirty way would be using Xephyr.

...So what you are saying is that you'd need a separate server running. Thanks for confirming that X11 is unsandboxable.


"broken by design" proceeds to work perfectly, lol, i don't can that broken by design


The real reason people stopped developing GTK applications is that GNOME tends to arbitrarily declare widely used APIs to be "internal interfaces" and makes incompatible changes even in minor version updates.

It's a constant chase that isn't worth any developer's time.


Precisely this. The moment they announced they wanted to break backwards compatibility with new GTK versions every 6 months (or whatever it was), they made it very clear that GTK is exclusively a GNOME API for OSS projects and nothing else.

They backtracked a bit on that, but they'll still replace GTK4 with GTK5 at some point, probably deprecating context menus or whatever else this time. Clowns.


Nine years: that's how long it has been since the last GTK API break. And never for GLib.


The first release of GTK3 that could be considered stable was 3.20. Either way, I'm talking about GTK4, where the original plan was six months of breaking releases, stabilization in version 4.6, followed by starting work on the incompatible GTK5, to be released two years after the first GTK4 release. The backlash was so big that they at least changed the versioning scheme, but "break shit every two years" is still in full force.


GTK3 did not break its C API. The CSS you could inject was arbitrary and undocumented; they stabilized and documented it in 3.20, yes. I think it's very dishonest to say the toolkit was breaking; the vast majority of GTK3 apps didn't have regressions, IME.


It affected desktop environments not called GNOME. Either way, the problem is GTK4 (and onwards), not 3.


I would be interested in a comparison with uncomposited X11 running xterm using this measurement method.

Xterm outperforms everything "on paper" with Typometer, but is it really real?


I have tried a bunch of terminals over the years.

I never did tests as in-depth as the OP's blog post, but no terminal I've used has ever matched how good xterm feels when typing. It's like you're typing within a zero-resistance system.


Not the author, but I did this same project a few years back. These were the results back then: https://jnsn.dev/posts/fastisslow/ along with a guide, https://github.com/DelusionalLogic/Frametime, if you want to replicate it.


This is also not what happened. I remember that she was once proud of her work and of the fact that her image was used so widely. Then FOSS, and engineering in general, was targeted by a wave of cultural-Marxist psyops that brought us useless, stupid things like CoCs. And within that wave she was convinced by "activists" (= operatives) to demand the retirement of her image.

It's the perfect angle to sow division and cause endless discussions about irrelevant BS. Congrats, psyop successfully deployed! (Probably the CCP is behind this, in order to weaken Western engineering.)

