Quake 1 port for Apple Watch (github.com/byteoverlord)
336 points by IdeaVoid on Nov 19, 2022 | 87 comments



Reminds me of when I got GTA 3 working on an Android watch. Here's a video of it in action: https://twitter.com/Martin_Adams/status/860604357028577282?s...


You mean you managed to install the official Android port of GTA 3 on an Android device? How much actual coding did this involve?


This is so cute and cool! While I don't know whether I want to play long Quake sessions on my watch, it shows that the Apple Watch is quite a powerful compute device. Actually, it should be more powerful than most workstations of the '90s, as it has a dual-core 64-bit processor.

It also shows how much the watch and some other Apple devices are held back by software restrictions. Basically any software released in the '90s should run easily on the watch. Of course, due to the small screen size and the lack of a keyboard, a totally different UI would be needed, but if people were really free to experiment and distribute, a huge field of new software could be opened up.


TBH I would say in the case of the watch, it's held back much more by energy budgets than software; the software is just enforcing that. It's cute that you can run Quake on it, but for how long before it killed the battery? 20 mins? You can see Apple pushing the bounds of what it can get away with and still maintain acceptable battery life from generation to generation.


Yeah, you can't directly use Metal or MetalKit on the watch, so you can't do completely custom rendering of anything on the GPU (which is why this port has to use software rendering), but even if you could, you'd probably just be blowing through the battery in under an hour.

SceneKit is available. Not sure what the battery life is like when it's used for anything other than briefly displaying a simple 3D model (e.g. the fitness medals).


Who is stopping you from doing what exactly? What would be true if you were not stopped? Who is not free to experiment and distribute? What does the huge field consist of?


I second this - people keep making this claim, but I haven’t heard anyone articulate what would come into existence, especially since whatever it is could presumably be made on Linux.


Apple only allows running software on Apple devices if Apple has signed it, and the SDK for watchOS has some artificial limitations.


> it shows that the Apple watch is a quite powerful compute device.

I would have honestly expected the watch to be able to play much more than Quake.


I was half expecting this to have gyro controls so you could look like a madman while "arm looking".

Really neat, in any case.



By the way, it looks like you forgot to disable f.lux or similar before taking some of the screenshots. They are a bit orange.

To continue on this topic: blue light filters may look pleasing to some, but it's not clear whether they actually have an effect:

https://onlinelibrary.wiley.com/doi/full/10.1111/opo.12406


> To continue on this topic: blue light filters may look pleasing to some, but it's not clear whether they actually have an effect

I don't know if they have an effect or not, but I much prefer having it on. It feels easier on the eyes. When I turn it off I immediately feel a kind of discomfort.

I don't believe it actually has an effect on my health. But the warmer colors definitely have an effect on my comfort.


I had constant migraines until I started wearing blue light coated (plain glass, no prescription) glasses. Before that I was wearing dark glasses indoors on gloomy days.

Certain lighting (especially fluorescents) and just about any screens always set me off.


Even if they don't have the claimed sleep-related benefits, surely they can't cause any harm (except for slightly orange screenshots!)


True. We can tolerate orange screenshots if it makes some people feel better. Even if it’s a placebo effect.

One study found that people using such blue light filters tend to have more screen exposure at night. Perhaps they think the filter allows them to look at the screen more, which may not be good.

https://pubmed.ncbi.nlm.nih.gov/32197951/


Or do people use blue light filters to try and remedy their preexisting extra screen time?

(I don't have access to the article text and the abstract just seems to point the correlation)


> Lastly, we analyzed the use of blue-light filters, according to an answer to a simple question: whether participants did or did not use a specific filter for filtering blue-light on their screens. Only 10.6% of the sample (N = 74) reported using filters, whereas 622 participants did not. The most prevalent means of filtering blue light were f.lux (Windows) and Twilight (Android) software. No significant differences were observed for all sleep-related variables mentioned in the previous analyses. A statistical trend was found for the duration of sleep on workdays (489 vs. 461 minutes, t = 3.595, p = 0.058, Cohen's d = 0.23), meaning those that used blue-light filters slept in average approximately 28 minutes longer on a workday than those that did not use any means of filtering blue-light. No other differences were observed. An interesting finding, however, is that the group of people who use filters had more (albeit without statistical significance) total screen exposure (8.6 vs. 8.3 hrs) on average and more screen exposure on PC (4.3 vs. 3.6 hrs) and mobile devices (3.6 vs. 3.4 hrs), but less exposure to TV (0.7 vs. 1.2 hrs).

The article is quite positive about blue-light filters and presents excuses for its findings in the discussion. They ask for a better study with more participants and more samples.

It also cites an interesting paper about patients with insomnia (so not the average hacker) who benefit from blue-light filters to sleep better: https://www.tandfonline.com/doi/full/10.1080/07420528.2019.1...

(I have access to these papers from work but there is a very convenient and famous website that we aren’t supposed to name on HN).


People don't use filters just for sleep-related benefits. My optician recommended that I stay away from blue light and sunlight because of my eye conditions.


f.lux doesn't usually change the color of screenshots if you have a proper graphics card. It changes the color of the output to the screen.


they absolutely have an effect, I know firsthand


Quake 1 was roughly 80 MB once installed. Why is this version taking 1 GB?


What Moore giveth, Gates taketh away...


Bill Gates had a hand in Apple’s Watch OS?


Let us consult the Book of Jobs.


So, Lord "Objective-C" Swift's binary blob size is harder to beat than Quake?


looks like music (.wav) files...?

from the github:

  > Extract game music from the gog game files:

    bchunk -w "game.gog file location" "game.cue file location" track

    (Music tracks will extract into the current working directory (track02 - track11.wav).)


The original soundtrack was streamed from the CD. Seems like compressing to MP3 would be a big help.


Decoding would probably be an issue. It's already on the cusp with the current audio files; MP3s would probably throw it over the edge.


Assuming for the sake of argument that playing back MP3s during gameplay is beyond this device, resampling the wav files to halve the sampling rate when extracting them certainly isn't. You'd get half the storage back, the audio presumably wouldn't suck any worse than it already does on that little device, and it should all work since the 44.1 kHz sampling rate isn't baked into the WAV file standard (but who cares, you've got the Quake source code anyhow).

It'd be an interesting exercise to see how much you could lower the sample rate without it sucking. On second thought, you could convert stereo to mono first, without even worrying about the sampling rate, and get half the space back. But I'm sure you could do more.
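For example, just as a sketch (assuming the extracted track02-track11.wav names from the README step above, and picking 22050 Hz arbitrarily), something like this with ffmpeg would do both at once:

    for f in track*.wav; do
      ffmpeg -i "$f" -ac 1 -ar 22050 "mono_$f"   # downmix to mono, halve the sample rate
    done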


Quake port for ________ is one of the most guaranteed HN #1s out there.


As it should :)


> Quake port for ________ is one of the most guaranteed HN #1s out there.

"Quake port" is the new "Beowulf cluster"


In my most recent experience, BeOS does as well. Just imagine if someone did a new Quake port for BeOS...


Nice work!

How about Quake on Amazon Echo + Alexa? Build it like a text adventure, but use NLP + computer vision to describe the scene, and you shoot and move by describing what you want. Great for people with poor vision or who have nostalgia for the '80s.


It makes me think that doing Zork on an Alexa shouldn't be tremendously hard.

I don't know diddly-squat about programming for Alexas, but if my local TV station can do it, anyone probably can.


Quake 1 port for Apple Watches that uses software rendering and has mostly working audio playback. Runs shareware and registered versions of the game with optional “cd” audio.

https://www.youtube.com/watch?v=89TAt72eYt4 (Series 5 gameplay video)

This port started from the original Quake Watch port by Tomas "MyOwnClone" Vymazal.

Changes by ByteOverlord:

- Save and load game on watchOS
- Music playback ("cd" audio)
- Camera look and tweaked controls
- Autosaving options
- Map quick select and cheats screens
- Automatic native resolution on watches


How long can I go before I have to rebuild the app to my watch?

This is one of those things I'd love to install and play around with like, twice a year, but I suspect I have to reinstall it every 7 days to keep it working unless I pay an Apple dev tax.


Wow! How long does battery life last while playing?


Not yet tested. Will inform if there is time to test.


And here I am just hoping someday for a Quake 3 or Team Fortress port to the latest macOS...



I'm so thoroughly impressed with the speed!


Serious question: why? Quake 1 is a 25-year-old game. It ran well on the first Pentium. The Apple Watch CPU is much more powerful.

I’m not saying it’s not cool. But it should be fast.


I suppose because I've seen my Apple Watch struggle with the most basic of things. Like displaying a text message, or scrolling through a full set of apps. It's never felt like a speedy device, and seeing something render so smoothly like this gave a contrast to how I usually feel. That help?


The first gen watch was very much like the first gen iPhone - a cool tech demo on underpowered hardware.

It was supported far too long, and with its final updates it was barely suitable as a timepiece. It couldn’t handle the most basic of tasks, like receiving messages or playing MP3s on your AirPods.


That is quite surprising, given that a 166 MHz Pentium, 1998 hardware, had zero issues playing MP3s in Winamp.


1998 hardware meets 2022 code


At least watchOS doesn't run Electron.


Yet


Hell, you could play MP3s on a 486 if you put Winamp's decoder in half-quality mode.


Maybe, except Pentiums were the current generation when Winamp came onto the scene, hence why I mentioned it.


Did it have a five gram battery?


How do you think it would keep BIOS settings stored in RAM between power cycles?


More like 1995 hardware.


Could be; the date I mentioned is when I could afford to buy a P166, and I didn't bother to check the interwebs for the release date. Now that I've gone and looked it up: depending on the exact variation of P5 or P6, it could have been released between 1995 and 1997.

https://en.wikipedia.org/wiki/Pentium#Pentium


A game similar to Quake coded in a modern high level language would probably be too slow on that watch.


Indeed. Currently playing Prodeus, which is made by a couple of people using Unity. It runs at about 100 FPS, whereas Doom Eternal hits about twice that while looking much, much better.

In fact, there's a recent game called HOAT or something, built by a single guy using the Quake engine. I had to return it because it was running at 30 FPS on my machine.


It's not using the Quake engine, or it wouldn't be running at 30 fps on current hardware. It's probably Unity or something more modern and bloated.


If the parent comment meant HROT, it’s using a custom engine written in Pascal.


Yeah, that's the one. Custom engine in Pascal. Now where did I get this Quake engine idea from?

Thanks anyway


Because the source code was published years ago, giving more life to the game.

This is the power of Open Source.



There is no way you can complete Quake on Nightmare using Apple Watch.


The code is so simple to read. Thanks for the inspiration.


Yes!

Now I know what I’m doing tonight.


Amazing... and what awesome documentation on the repo.


[flagged]


This comment reads like it was written by a GPT


GPT-2 no less.


You may not be the intended audience.


This made no sense at all


A truly marvelous proposition, which these comments are too narrow to contain.


Maybe you're thinking of differentiable rendering?

https://blog.qarnot.com/an-overview-of-differentiable-render...


I have some experience with ML and I have no idea what you're talking about. It kinda sounds like neural architecture search and sparse models created using weight pruning. Lots of people are working on both of those things, but IMO the latter (if that's what you mean by "Can you add weights non uniformly?") is a dead end for most use cases where you have some sort of accelerator or deep learning instructions available like on every modern desktop, laptop, phone, and server. Models that use their weights "efficiently" in terms of FLOPs tend to perform like crap on real hardware.


I’m unfortunately the sickest I’ve been in years, so this will have to wait. Maybe it’s part of why my comment sounded strange.

There is an idea here, and it’s a mistake to dismiss it out of hand. Adding weights non uniformly during training (not after) is the key to smaller models that outperform present day GPT3.

A sketch of the algorithm is to start with a 2x2 block of weights, sum the gradients across 10 training steps, then subdivide the quadrant with the highest delta.

Doing this recursively is prohibitive, which is where megatexture comes in.

Many advantages. At runtime you don’t need weight compression because you can simply switch to a lower miplevel if running on a phone. Different accelerators during training can focus on different areas of the network. Weight dropout is an automatic feature. Etc.

If you’ll excuse me, it’s back to hugging the porcelain bowl.


You mention this technique has been shown to outperform GPT3, do you have a citation for that? Would love to read more details about this interesting concept.


They didn't say it was shown; he stated that those yet-to-be-developed models would do that.


"Adding weights non uniformly during training (not after) is the key to smaller models that outperform present day GPT3." seems to greatly imply a certainty of result that has already been discovered. Though this commenter is familiar to me, and I know he has made silly claims in other threads throughout the years here.


I would hope so. It’s in my name!

I ended up calling an ambulance so I’ll postpone this until later. Feeling a little better but a full explanation will have to wait.

The answer is that of course it’s not proved yet, since no one has implemented it (or at least efficiently). It’s fine to be skeptical.

Current techniques are blocked by the technical challenge of getting 10GB+ to fit on a pod. Very few people have those skills. If there’s even a chance that this will work, it’s worth exploring, so I will be.


Sounds kinda like progressive growing except you're not doubling the resolution uniformly. See ProGAN and its successors. You'd still need to add a large block of weights at a time for performance reasons.

Edit: Ah I checked your profile and you already know all this. You probably should have mentioned that lol


Network compression is already a thing that is studied and forms of it are already used in production neural net models where latency / cost is important.

The way pruning works is not like how a "megatexture" works.


I think he's talking about lossy compression with some sort of high-importance areas where losses are minimized.


For those who prefer not to leave a mess on their file system, have default permissions changed and security degraded, or have Google Analytics snooping on them; who would prefer access to upwards of six times as many packages, an easy choice between a binary install and a full source build (including dependencies built from source), and a more recent version (1.6 < 1.6.1); and/or who are recovering alcoholics: innoextract has been available on MacPorts[1] since January 2019.

[1] https://www.macports.org/install.php


> For those who prefer not to leave a mess on their file system ... innoextract has been available on MacPorts[1] since January 2019.

Speaking of making a mess, instead of somewhat vague sniping, you could have made it clear that your beef is with Homebrew, rather than risking the impression that it's with this neat Quake 1 port. Because at the moment you have the top-rated comment, and it looks like you're slagging off this project rather than making a potentially valid argument for MacPorts over Homebrew.

The project author probably chose Homebrew because they already knew how to use it, it works, and they were more interested in making their Quake port than with evaluating different package managers. People make these kinds of choices all the time, particularly with side projects where they only have limited time to work on them.


> it looks like you're slagging off this project rather than making a potentially valid argument for MacPorts over Homebrew.

It looks like you're personally attacking me rather than speaking to any argument I may have made.

> The project author probably chose Homebrew because they already knew how to use it, it works, and they were more interested in making their Quake port than with evaluating different package managers. People make these kinds of choices all the time, particularly with side projects where they only have limited time to work on them.

As I made no mention of the author nor the project, and no one but the author knows the reasons for their choices, this is a mind-reading fallacy wrapped in a straw man.


    brew analytics off
And in general it has done everything as the user, not as root, for a few years now. (Actually a good idea, if you were using it before the M1, to brew bundle dump, remove the installs and brew itself, then reinstall from the dumped Brewfile.)

https://sparanoid.blog/749577
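
Rough sketch of that migration (assuming brew bundle is available; the Homebrew uninstall/reinstall itself is per its own docs):

    brew bundle dump                       # writes a Brewfile listing installed formulae and casks
    # ...uninstall Homebrew, reinstall the native ARM build...
    brew bundle install --file=Brewfile    # reinstall everything listed in the Brewfile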


> brew analytics off

The issue is that by default on install Homebrew uses Google Analytics.

> And in general it does everything as user not su for a few years now.

Homebrew always did everything as the admin user and not as root, and that was always the problem. It munges the permissions of /usr/local; all files, not just directories. And it's not so much a package manager as it is a needless frontend for git. MacPorts, alternatively, respects default file system ownership, is a full-featured package management system, and, like pkgsrc, is based on the FreeBSD ports system.

And when uninstalled, MacPorts doesn't leave a mess. With these 3 commands, it leaves no trace (with the exception of the /opt directory in the event it is used for something else):

      sudo port -vfp uninstall --follow-dependencies installed

      sudo port -vfp uninstall all

      sudo rm -rf /opt/local /Library/Tcl/macports*
If the day ever comes, good luck uninstalling Homebrew.[1][2]

[1] https://github.com/Homebrew/legacy-homebrew/issues/48792

[2] https://stackoverflow.com/questions/32895800/cant-reinstall-...



