Reviving Old X Code (keithp.com)
247 points by fcambus on Jan 3, 2021 | 55 comments



Note: the author is Keith Packard, who wrote a ton of the original X Window System code. I adored that system! The idea of running graphical code on multiple machines, then displaying the results locally, has yet to be surpassed.

"In 2011, O'Reilly awarded an open source award to Packard, as "the person behind most of the improvements made on the open source desktop in the last ten years at least."" -- https://en.wikipedia.org/wiki/Keith_Packard


Keith has been super friendly and supportive in my experience. Back when I started working on a compositing window manager something like a decade ago, I used his xcompmgr as a reference and he was always responsive and helpful whenever I emailed him questions about it.


His talks are also delightful. Extremely brilliant people like him often either have quirks or are not the greatest speakers (don't get me wrong – I don't expect a single human to have every skill), but Packard seems to excel at everything: he's obviously a brilliant engineer, he's a wonderful speaker, he seems to be an all-around pleasant and friendly human, and he seems to manage a sane and healthy personal and family life. Just a stellar person!

The single fact that he's doing RISC-V work at SiFive boosts my confidence in what they're doing.


My understanding is that X11 documentation was how O'Reilly got started. It's fantastic that 20 years later they're still celebrating the authors and maintainers.


I attended a couple of FOSDEM talks from him, very nice experience.


Been a Keith Packard fan since the 90s. He's an incredible contributor, and his body of work speaks for itself.


> The idea of running graphical code on multiple machines, then displaying the results locally

What does this mean? It's like a remote desktop or something?


"Remote app" instead of remote desktop would be a better analogy.

You run code on multiple machines (which might not even have graphics hardware or any concept of a "desktop"), and each program can display its graphical results in windows on your machine.


X is a native remote desktop system. It just happens that, nowadays, we generally all run our apps on the same machine where X runs.

Sadly, playing around and learning how it works is trickier than it used to be, now that X doesn't listen over TCP by default (...because the protocol is unencrypted and unauthenticated... can't argue there). But google "SSH X forwarding" for a reasonable starting point that's still fairly universally supported (you can even get X servers for Windows that work with PuTTY, and it all works natively on Linux and macOS).


Must resist urge to comment on HN at $ZZZ-o-clock in the morning... pedantic clarification: nowadays X generally uses MIT-MAGIC-COOKIE authentication, but it can be configured to use others (I think it can use Kerberos?). I don't think the protocol is encrypted under any circumstances, though.


yes, except it's up a layer. remote desktop moves pixels, and X is a protocol that distributes the rendering api.

so you start up your app, and query what kind of display you're connected to - there are all kinds of cruft about color models and bitplane depths. choose fonts, register for events (mouse, resize, etc), draw your elements and wait for the user to interact with you.
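
(To make that lifecycle concrete, here's a minimal Xlib sketch - error handling mostly omitted, and note the display string can just as easily name a remote machine; build with cc demo.c -lX11:)

    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        /* NULL means "use $DISPLAY" - which can point at a remote
           machine, e.g. DISPLAY=otherhost:0 */
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int scr = DefaultScreen(dpy);
        printf("depth: %d\n", DefaultDepth(dpy, scr)); /* the bitplane cruft */

        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0,
                                         300, 200, 1, BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));

        /* register for events: exposure, keyboard, mouse */
        XSelectInput(dpy, win, ExposureMask | KeyPressMask | ButtonPressMask);
        XMapWindow(dpy, win);

        for (;;) {                 /* wait for the user to interact */
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == Expose) /* draw our elements */
                XDrawString(dpy, win, DefaultGC(dpy, scr), 20, 30, "hello", 5);
            else if (ev.type == KeyPress)
                break;
        }
        XCloseDisplay(dpy);
        return 0;
    }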

it makes absolute sense in the context of the early 90s 'let's make this a network api' - and almost none from a modern perspective

as bloated as the browser model is, i think shipping code to run on the users machine is a much more powerful model... if you can keep it small and deal with the security issues.


Funny, he mentions the Xaw fonts; I was just looking at something similar. I've been using XTerm with Xft support for years but never really thought about it: how come the text in the menus and toolbars is so crunchy?

I eventually tracked it down: Xaw3dXft operates in compatibility mode by default. A one-line patch enables all the new features, and makes a world of difference.

Here's a screenshot, I think it's night and day: https://twitter.com/taviso/status/1344779126767435776

I mailed a patch to the maintainer, hopefully the next version will fix it!


it looks better, but why did they add all that whitespace? yecch


I'm more surprised anyone is actually maintaining Xaw lol. Does Athena even use it anymore?


>Imagine trying to build Windows or Mac OS code from the early 90's on a modern OS...

I guess in Windows' case it should also be relatively easy?


Yeah, though pure X11 code should still be easier (ignoring anything not related to the APIs at hand, of course, like the assumptions the existing code had about 32-bit-ness) since the API hasn't changed at all. Win16 vs Win32 is 99% the same, but there are still a few minor differences, like changes in how WPARAM (which became 32-bit) was interpreted in a few messages.
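
(The classic example - my recollection of the common case, not something from the article - is WM_COMMAND, whose parameters were repacked for Win32; the "message cracker" macros in windowsx.h exist to paper over exactly this:)

    #include <windows.h>
    #include <windowsx.h>  /* GET_WM_COMMAND_* portability macros */

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg) {
        case WM_COMMAND: {
            /* Win16 packed: wParam = control id; lParam = (hwnd, notify code).
               Win32 packs:  wParam = (id, notify code); lParam = control hwnd.
               These macros expand correctly for whichever target you build. */
            int  id   = GET_WM_COMMAND_ID(wParam, lParam);
            UINT code = GET_WM_COMMAND_CMD(wParam, lParam);
            HWND ctl  = GET_WM_COMMAND_HWND(wParam, lParam);
            (void)id; (void)code; (void)ctl;
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }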

Mac OS would be a complete rewrite though, no salvaging that.


Early 1990s X11 clients are sometimes broken on recent X.Org setups in my experience, in subtle ways usually involving the keyboard.

And that's assuming running under X.Org.

As for Win32, if the application didn't do stupid things like setting bitflags that were "reserved for future use", things apparently work without a recompile for basic GUI code from Windows 1.01 to Windows 10.


Only on Windows 10 32-bit, which, according to Steam stats, is used by 0.10% of users.


The post above was about compiling, not running, older programs. In terms of running, Windows is better, though in either case you do need to find workarounds (what you describe sounds like such a case).


No workarounds - the 16-bit DLLs were compatible all the way up from 1.01 to Windows 10 32-bit, and from WinNT 3.0 all the way up to Windows 10 64-bit.

The problem is that a lot of software... disregarded the specs. Including, infamously, using "reserved for future use" areas of bitfields, which for example caused a lot of app crashes in the early days of Windows 95 (where a certain common API gained a new flag that conflicted with reserved bits some applications were already setting - everything worked fine so long as you didn't set them concurrently, which all properly written applications avoided).
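
(A contrived sketch of that failure mode - all the FOO_* names here are made up:)

    #include <stdint.h>
    #include <stdio.h>

    /* Win16-era header: only the low bits are defined; the rest are
       "reserved, must be zero"... */
    #define FOO_OPTION_A   0x0001
    #define FOO_OPTION_B   0x0002
    /* ...until a later release assigns one of them a real meaning: */
    #define FOO_OPTION_NEW 0x8000

    int main(void) {
        /* A sloppy old app stashed its own state in the reserved bit: */
        uint16_t flags = FOO_OPTION_A | 0x8000;

        /* On the newer OS the old binary now appears to request the
           new behavior, even though it never meant to. */
        if (flags & FOO_OPTION_NEW)
            printf("new-style behavior triggered by accident\n");
        return 0;
    }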


The specs might be compatible but the programs aren't, exactly because of the assumptions (or plain old buggy code) you mention. So you need workarounds.

I run a lot of older software, especially (though not only) games, and I often have to use workarounds like wrapper or replacement libs. I've also tried running older stuff on Linux (e.g. I have a collection of Linux game demos from the late 90s to early 2000s) and while most of it runs, you do need to find and install a bunch of missing libraries and wrappers (e.g. for OSS support).


> Mac OS would be a complete rewrite though, no salvaging that.

I bet you could get a NeXT application from the early 90s running without too much pain.


The early (earliest?) NeXT APIs start with NX.

macOS doesn't have support for any of these.

I found this out when I tried porting Tim Berners-Lee's WorldWideWeb to macOS as a weekend hack.


Porting 16-bit assembly is harder but porting C code from Win16 to Win32 is pretty easy.

https://github.com/dbremner/cake is a 16-bit Windows 1.0 application that I ported to Win64; most of the changes weren't required.


It's really amazing that win32 has remained unchanged for so long. I guess they got it right the first time (or rather, Palo Alto Research Centre got it right the first time). And if it ain't broke, don't fix it.


TL;DR version - people who love programming aren't shy about going from merely updating copyrights to converting K&R to ANSI C, fixing pointer bugs, adding new widgets, and even finding new card images in SVG to make it all look better! Hope these KGames updates land in Debian unstable soon.

Also, I wondered from the headline why keithp is into Xcode - "X11 code" might have been a better choice of words.


> Also I wondered from the headline why keithp is into XCode

Yeah, I dismissed it at first thinking it was about the Apple IDE; really glad it dawned on me it's about X. Good article.


Some of the code was from X10!


> A very basic port to X11 was done at some point, and that's what Debian has in the archive today

So, no?


Looking at the bug – https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=974011 – there is all this talk about the copyright of the xmille and mille computer programs, but I don't see any discussion of the copyright of the underlying card game.

Mille Bornes was invented by Arthur Dujardin (using the pen name Edmond Dujardin), illustrated by Joseph Le Callennec, and first released in 1954 [0]. Dujardin died in 1964 [1], so using the EU copyright term of author's life + 70 years, its copyright would expire in 2034. Le Callennec died in 1988 [2], so copyright on the original illustrations would expire in 2058. (I don't know how dependent the illustrations in the computer game are on the original.)

I'm not sure who currently owns the copyrights to the original card game. But Dujardin founded the Dujardin company, which still sells Mille Bornes [3], so I would assume they still do. Since 2007, the Dujardin company has been owned by the French TV network TF1 [4].

The copyright owners to the card game have probably never heard of this obscure non-commercial computer game. Who knows if they'd react to it indulgently or not.

[0] https://en.wikipedia.org/wiki/Mille_Bornes but actually the French article has a lot more information: https://fr.wikipedia.org/wiki/1000_bornes

[1] https://fr.wikipedia.org/wiki/Edmond_Dujardin

[2] https://fr.wikipedia.org/wiki/Joseph_Le_Callennec

[3] https://www.jeuxdujardin.fr/produit/milles-bornes-pegboardab...

[4] https://fr.wikipedia.org/wiki/Dujardin_(%C3%A9diteur)


I believe that game mechanics are not copyrightable. The text and artwork are, but not the rules themselves.

My dad has a naval-themed clone of Mille Bornes, which we played a lot when I was a kid:

https://boardgamegeek.com/boardgame/6806/nautic-miles

I liked it so much that when we had a school project to make a game, I made a space-themed clone of that!


According to Wikipedia, Mille Bornes is almost identical to Touring, a card game first produced in 1906.

https://en.wikipedia.org/wiki/Touring_(card_game)


IANAL, but I don't think games are copyrightable per se. You may have a registered trademark on the name and logo, you have copyright over the specific artwork and text in the game, and you may have a patent on some specific mechanics of the game, but the game itself is not covered by copyright.


> IANAL, but I don't think games are copyrightable per se

IANAL either, and you may well be right, but maybe this is the sort of question in which the answer may depend on which country's laws we are talking about? Even if what you say is true in country X, it might not be true in country Y.

> You may have a registered tradermark on the name an logo

Indeed, Dujardin SAS owns a registered US trademark on the phrase "MILLE BORNES" for a card game – https://tsdr.uspto.gov/#caseNumber=72156515&caseType=SERIAL_... – applied 1962, granted 1965, last renewed in 2015 (up for renewal again in 2025).

I don't know whether a registered trademark on a physical card game applies to a computer version of it, but I would suspect it does.


> the answer may depend on which country's laws we are talking about? Even if what you say is true in country X, it might not be true in country Y.

Not really; through WIPO, the US has "rationalized" every other signatory nation's IP laws to match its own. And pretty much every country that has any IP to protect, is a member of WIPO.

Some countries (e.g. China) might not bother to enforce their IP laws as strongly as the US does; but they do still have those very same IP laws, rather than materially-different ones.


> Not really; through WIPO, the US has "rationalized" every other signatory nation's IP laws to match its own.

That's not really true though, there are some real differences between copyright laws in different countries:

* US copyright law appears to recognise the ability of a copyright owner to relinquish their copyright to the public domain pre-expiration; German copyright law denies them that ability

* US copyright law has the concept of fair use; Australian copyright law lacks the concept of fair use, although it does have the significantly narrower concept of fair dealing

* Most EU countries recognise the concept of "moral rights of the author" in their copyright law, US copyright law generally speaking does not (it does for copyright on visual arts works under The Visual Artists Rights Act of 1990, but not in general)

* EU has the concept of a database right, which protects the contents of a database from copying even if the work of assembling was mere "sweat of the brow" involving no originality; the US does not, and mere "sweat of the brow" does not meet the copyright standard in the US (see Feist vs Rural Telephone Service Co)

* US law says that accurate photo-reproductions of public domain 2D works are themselves public domain – see Bridgeman Art Library v. Corel Corp – there is no independent copyright in the photo since there is no originality involved (selection of camera angles etc), and mere "sweat of the brow" is not enough for a copyright to exist. It appears – although not completely settled – that UK law, by contrast, has a weaker threshold of originality, such that mere "sweat of the brow" is enough. This was the crux of the dispute between the UK National Portrait Gallery and Wikipedia. Since it never went to court, we can't know for sure what the UK law actually is on this topic

* UK law provides special protection to specific works. For example, even though Peter Pan is out of copyright, there is special legislation requiring commercial productions based on certain Peter Pan works to pay royalties. Likewise, there used to be an "eternal copyright" on the King James Bible and Book of Common Prayer – that "eternal copyright" has been abolished, but still applies by a grandfather clause until the late 2030s. By contrast, the idea of specific works being legislated special IP protection is rather unheard of in the US (outside of trademark law, wherein the US does it too)

There's a lot more fiddly little differences. The Berne Convention, WIPO, TRIPS, etc only mandate the broad outlines of IP law, the fine details are up to each country and different countries do different things. Every country has its own statute law and case law and differences inevitably emerge


Fortunately, Xmille graphics have nothing in common with the original ones (which are really adorably old-fashioned and oh so typically '50s).


I'm not saying we're going to bring this guy's site down... but we're totally going to bring this guy's site down.

Just going to leave this right here... https://web.archive.org/web/20210103182219/https://keithp.co...


Good call, it’s down for me. Thanks!


A few years ago, I rebuilt my projects from my undergrad graphics class in 1991. They were written on 32-bit DECstations running ULTRIX, and worked after only a few changes on FreeBSD/amd64. Most of the changes were actually just to fix up some header file includes. I was surprised that they worked just fine.


I've never seen a UI described as "rustic" before, but I love it.


He's a pretty good writer. To me, it read as a feel-good programmer story. I loved reading it.


I'm wondering a few things regarding network transparency:

- Are vector graphics drawn by the X toolkit, or is the bitmap transferred in uncompressed form to the X server for drawing?

- The same regarding antialiased fonts: does the antialiasing happen at the client, or the server-side?

I would tend to guess that both happen at the client side, which ends up transferring bitmaps to the server. There's hardly any advantage to X's architecture if you end up transferring bitmaps...


Since he is using the Cairo library, everything is rendered locally and transferred in uncompressed form to the X server for drawing. The same is true for all font engines, which (in the case of Xft) use the XRender extension.

I never understood why they didn't include proper font rendering and Cairo in the server. This would not only solve a lot of issues like mixed-DPI rendering and vastly improve network transparency, but would also eradicate duplicate effort and fragmentation in the toolkit space.


>Since he is using the Cairo library, everything is rendered locally and transferred in uncompressed form to the X server for drawing. The same is true for all font engines, which (in the case of Xft) use the XRender extension.

So this is not necessarily true; it depends on how you're using Cairo. If you're using Cairo's xlib surfaces, those will try to accelerate with XRender if it's available: https://cairographics.org/manual/cairo-XLib-Surfaces.html

As far as why the toolkits never got on board with it, some do actually use XRender in certain circumstances, but it's hard for all applications to make use of it properly. Most of the advantages go away if you have any images coming down the pipeline at all, or if you need to do any text layout or reading back from the pixmap, or if you have any GL/Vulkan usage in your application, etc. Plus historically I've heard the implementation of this was very inconsistent and on some servers it actually resulted in a slower rendering path, although this should not necessarily be the case if you're using the current Xorg with glamor enabled.
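
(For the curious, a minimal sketch of that path: bind a cairo xlib surface to a window and let cairo decide whether to route the drawing through XRender. Build with something like cc demo.c $(pkg-config --cflags --libs cairo-xlib x11).)

    #include <cairo.h>
    #include <cairo-xlib.h>
    #include <X11/Xlib.h>
    #include <unistd.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        int scr = DefaultScreen(dpy);

        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0,
                                         200, 200, 0, 0, WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask);
        XMapWindow(dpy, win);

        XEvent ev;                 /* wait until the window is visible */
        do { XNextEvent(dpy, &ev); } while (ev.type != Expose);

        /* An xlib surface: cairo may route these operations through
           XRender when the server supports it, rather than rendering
           to a local buffer and shipping raw pixels. */
        cairo_surface_t *surf = cairo_xlib_surface_create(
            dpy, win, DefaultVisual(dpy, scr), 200, 200);
        cairo_t *cr = cairo_create(surf);

        cairo_set_source_rgb(cr, 0.2, 0.4, 0.8);
        cairo_rectangle(cr, 20, 20, 160, 160);
        cairo_fill(cr);
        cairo_surface_flush(surf);
        XFlush(dpy);
        sleep(3);                  /* keep the window up long enough to see */

        cairo_destroy(cr);
        cairo_surface_destroy(surf);
        XCloseDisplay(dpy);
        return 0;
    }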


> those will try to accelerate with XRender if that's available

The XRender extension does all the rendering client-side. The one exception is glyphs, which are rendered by the client only once and then stored in the server.

XRender also does not solve the problem that a line from A to B will/should look completely different on displays with different dpi / color space / subpixel arrangement.

> if you have any GL/Vulkan usage in your application

The original implementation of OpenGL on X11 was GLX by SGI, which serialized all the OpenGL commands and was completely network transparent as a result. Considering that all communication with the GPU is serialized via PCIe anyway, there is no reason this shouldn't work with reasonable performance today when done right.
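
(Requesting an indirect context is just a flag at context-creation time; whether you actually get one depends on the server - I believe modern Xorg ships with indirect GLX disabled unless started with +iglx. A minimal sketch, built with cc demo.c -lGL -lX11:)

    #include <GL/glx.h>
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;

        int attrs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
        if (!vi) return 1;

        /* False for the last argument asks for an *indirect* context:
           GL commands get serialized into the GLX protocol stream, so
           they can cross the network like any other X request. */
        GLXContext ctx = glXCreateContext(dpy, vi, NULL, False);
        if (!ctx) return 1;

        printf("direct rendering: %s\n",
               glXIsDirect(dpy, ctx) ? "yes" : "no (indirect)");

        glXDestroyContext(dpy, ctx);
        XCloseDisplay(dpy);
        return 0;
    }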


>The XRender extension does all the rendering client-side.

I'm not sure where you heard this, but it's not correct; you might consider checking the extension spec and the implementation in the server:

https://www.x.org/releases/X11R7.7/doc/renderproto/renderpro...

https://gitlab.freedesktop.org/xorg/xserver/-/blob/master/re...

The reason it's not used in all cases is that it doesn't map 1:1 to cairo and there are things missing, curves probably being the most important. You would have to ask Keith for a definitive answer on why this is, since he wrote both XRender and Cairo. (I assume it's because Cairo already included the necessary tessellation steps at the time.)

Indirect GLX does work great for simple use cases, but once you start to add textures into the equation the amount of bandwidth used would probably make it unusable over the network. There has been no serious work on getting indirect GLX working past OpenGL 1.x. I think another of the main problems is that nobody really came up with a good way for GL extensions to work well there; the server needs to manually add an implementation for every extension you want to use.


What would be the fallback in your world if the server didn't support the new ops? If the fallback is client rendering, why would anyone want to risk using the (unknown) server implementation when they can just rely on the same (and tested) client code everywhere?


X11 has a strong reference implementation, hence there should be no need for a fallback. If you really wanted to launch a program that needs the new protocol on an old/outdated server for whatever reason, you could use a wrapper like Xephyr/Xnest as a fallback. (Besides, Xorg already has many optional extensions, and if one of those extensions is not present the program simply fails to launch. The fallback problem has not been a real problem so far because X11 development is very centralized.)


Using Xephyr in that way is essentially the same thing as client rendering. (Xephyr draws the whole framebuffer and then sends that)


No, it is not the same thing, because in the hypothetical Xephyr case the client doesn't need to know who or what executes the rendering command. The client also doesn't have to implement its own rendering as a fallback.


From a client perspective it is still the same thing if you're using cairo since that includes the fallback code already. But if you're making XRender requests directly then yeah it wouldn't be. (I don't know of many applications that do this, ideally it's something you want in the toolkit or drawing library)


I've been building ANCIENT Unix programs from the tuhs site, almost everything just works on modern Debian. Even the odd X11 program builds and runs fine.


live xmille, great job keith!

one of the new cards says "vehicle prioritaire". given the card names are in French it should be "vehicule prioritaire" i guess....


Or even véhicule prioritaire.



