Apple is building the metaverse substrate (codevoid.net)
206 points by grork on June 21, 2021 | 230 comments



Something that's dawned on me over the last few years is that Microsoft basically settled on the idea that the modern operating system is effectively done (which I suppose is true), but then they stopped creating technologies to build on top of Windows altogether.

It was as if they said, OK, here's the supermarket, now vendors, create stuff and we'll stock the shelves.

Except everyone publishing software today that isn't a game developer is targeting the web. Which means the best software you're going to get for Windows that isn't long-lived incumbent software like Creative Cloud is going to be... over Google Chrome (or Microsoft Edge).

OK, but Microsoft is in the best position to create great, integrated, first-class technologies that build on how great Windows is. But they don't. I have mixed feelings about this, because when Apple does it, they basically put people out of business.

But here's the thing, in 10 years time, you're going to have Windows, which is still just, and will continue to be, Windows. And macOS, and iOS, and tvOS, and every other Apple OS, is going to be so much more. And they already are so much more.

Those who haven't experienced it yet, or who can't afford Apple products, are missing something that hasn't happened in computing in any other period of time I can immediately think of.

Or, idk, I'm blind because I think Apple products are so great. The latter is probably more likely.


I think Stratechery accurately called Microsoft's strategic shift: https://stratechery.com/2018/the-end-of-windows/ around the time that Nadella showed up to save them.

I think Apple is better positioned, with total vertical integration, to lay the groundwork for the next platform (AR), and has been doing so for years now. They'll ship some hardware when the time is right.

They've successfully repeated this approach since the iPod: never first to market, but laying the groundwork before shipping the best-in-class product.

FB (Zuck specifically) recognizes the next platform, and Oculus is a bet to win it. Their issue is that they don't have their own phone OS, so they have to build the Oculus platform up from scratch (which they've done a decent job of). They also have a brand issue (I personally dislike their ad-driven business model).

It'll be interesting to see what hardware Apple ships - the UI potential for AR is enormous and very cool. I know Zuck sees this and is public about it; Apple is acting in a way that suggests they definitely see it too - they're just quiet about it until they ship.

Looking at tiny glass displays will be a funny anachronism of our time.


That AR will eventually be visual is the obvious thing, but the soft start is audio-based. Siri and transparency mode on AirPods are the unacknowledged AR present that hasn't been fully unveiled.


I wrote an iOS app a while ago which provides the functionality of game audio middleware (something like FMOD, I guess), but for sounds placed at locations in the real world: it tracks the user’s location and orientation relative to all the sound sources and spatializes the audio based on that. You can make and layer together sounds that follow coordinate paths, sounds that ‘wake up’ at or after certain times, sounds triggered or stopped by proximity, sample-accurate playback at different pitches/volumes for each ‘grain’/chunk (as far as possible on iOS devices), randomised ordering in a couple of different ways, looping playlists for variation, and so on. It uses standard ideas nicked from game audio to save on the download/disk/memory size of sounds (though there’s an obvious trade-off with CPU/RAM, and perhaps other OS stuff going on I don’t understand). It’s basically an amateurish game audio engine overlaid on the ‘real world’.

It’s likely got loads of holes in the code I haven’t detected or am unaware of, and it’s probably poorly structured/commented, with badly named functions/variables and so on, as I’m primarily a guitar teacher/music producer/sound designer for my ‘day’ job. It’s the first thing I’ve written in Swift, really just an experiment to see if I could get it working, but I’m kind of wondering what to do with it and what it might be useful for.

I made a fully functioning demo that is a historical recreation of an area of the city I live in, with lots of evolving/generative sounds in it: e.g. a horse and cart trotting past, a blacksmith, pubs, trigger zones for a narrator voice, music played back ‘2D’, etc. I don’t really know what to do with it now. I think it might be similar to ‘Microsoft Soundscape’, but I’ve not actually tried their app yet. Do you think anyone might be interested if I put it on GitHub?

(I’ve never used GitHub, though I used SVN and other version control systems as a sound designer on games before, and I know it has a social component.) I guess it’s trivial for most ‘proper’ programmers to make something similar, as it just uses the native Apple APIs and Swift functions (sorry if this is not the correct terminology), which are as low-level as they go on the sound stuff anyhow, nothing custom. The sound ‘definitions’ are all XML/.plist files at the moment, so it’s not user-friendly for sound designers, musos, or the general public, though I could make some sort of graphical front end that generates the XML and uploads sounds, etc. I know nothing about network coding, though, and it looks scary!! I guess Apple could come out with some superior system tomorrow… Anyhow, if anyone reading this is at all interested or has any advice (even if it is to get knotted! :P), that would be gratefully received. Apologies for the overly long and spammy post…
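To sketch the kind of geometry such an app needs (this is not the poster's code; the function name, parameters, and flat-plane coordinate convention are all my own assumptions): before a spatializer such as AVAudioEnvironmentNode can pan a sound, you need each source's bearing relative to the listener's compass heading. A minimal Swift version:

```swift
import Foundation

// Hypothetical helper: compute the azimuth of a sound source in degrees,
// clockwise from the direction the listener is facing. Coordinates are
// flat-plane metres with the y axis pointing north; headingDegrees is the
// listener's compass heading (0 = north, 90 = east).
func relativeAzimuth(listenerX: Double, listenerY: Double,
                     headingDegrees: Double,
                     sourceX: Double, sourceY: Double) -> Double {
    let dx = sourceX - listenerX
    let dy = sourceY - listenerY
    // Bearing of the source, clockwise from north.
    let bearing = atan2(dx, dy) * 180.0 / Double.pi
    // Subtract the listener's heading and normalise to [0, 360).
    var azimuth = (bearing - headingDegrees)
        .truncatingRemainder(dividingBy: 360.0)
    if azimuth < 0 { azimuth += 360.0 }
    return azimuth
}
```

In a real app the position and heading would come from Core Location / Core Motion updates, and you would re-run a calculation like this (plus distance attenuation) for every source whenever either changes, feeding the result to whatever spatializer you use.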


Sounds cool to me. The ability to select your preferred ‘layer’ or collection of sounds for that environment would be good. I imagine a Wikipedia for real world locations could resonate with a lot of people.


Thanks! Yes, you could basically select from various downloadable ‘audio experiences’/layers for that area… The localized, user-contribution-based audio wiki/encyclopedia type thing is an interesting idea... Even people telling stories or guiding people around an area… I did think about how users might be able to upload content to a location for others (like an audio graffiti-ish type thing), but I imagine the moderation/networking/storage part of it would be tricky to get right…


Yeah definitely put it up online with a license to get folks to look or contribute to it!


Thanks Karrot_Kream - much appreciated! …I’ll try and do some research about the best way to go about that/tidy it up some/update it for iOS 14 first… (think it was last built for 13)


Reading your HN profile, sounds like you might find office hours.global interesting…


IMO this is insanely underrated—people think of AR as visual, but it could easily be “a voice in your head”, which AirPods come very close to accomplishing.


FB has 10,000 ppl working on VR and AR. Yes, they are serious about it.

https://www.theverge.com/2021/3/12/22326875/facebook-reality...


> the Oculus platform they have to build up from scratch

It's Android, right?


I think it's not as much the 'OS-platform' as it is the 'ecosystem-platform' the poster was thinking about.


It's Android.


Well, it's easy to get excited fantasizing about the future, but your predictions are about as likely to occur as the flying cars we're flying around right now.

In fact none of the major platforms we use today was just predicted and constructed. They've evolved and shown their benefits naturally and not entirely in expected ways. No one actually planned the web to be an application platform. It was a university paper exchange program. And the Internet before it was a military communication network.

Your predictions about the grand future of AR remind me of the excitement around VRML a couple of decades ago.

"Looking at stale 2D web pages will be a funny anachronism of our time" we thought. Turns out making existing content more fancy in 3D wasn't that useful, it actually was more cumbersome both to create and to use, so 3D web pages died before they even had a true chance to live.


You can see the direction things are heading.

The iPhone was a UI step change improvement over previous 'smart phones' and the app ecosystem came out of that.

The ground work being set in the OP's post is about getting things ready for hardware that can then take advantage of it.

It's possible to make predictions based on trends and the capability of hardware that becomes possible when it previously wasn't: https://www.youtube.com/watch?v=sTdWQAKzESA (Also see: Douglas Englebart's the Mother of All Demos: https://en.wikipedia.org/wiki/The_Mother_of_All_Demos). Xerox PARC too - computing history is filled with examples of people pulling the future down because they recognized what was possible.

Just because 3D websites are a bad UI doesn't mean looking at little hand-held glass displays is the best one. Likely in AR we'd still pull up flat 2D websites a lot of the time, you just wouldn't need to pull out a little glass display to do it.

Michael Abrash used to have a blog post about the hardest AR problems from when he was at Valve (he's at Oculus now): drawing black, and latency. The latter is mostly a hardware constraint; I'm not sure anyone has solved the former (the Magic Leap sucked).

If the hardware is possible, the UI benefits seem big.


I don't know if you remember when Google Glass was a thing, but it turned out that wearing glasses on your face 16 hours a day is far more annoying than those "little hand-held glass displays" you're trying way too hard to be dismissive of.

Google Glass didn't die because it wasn't AR, it died because wearing glasses all the time wasn't practical.

Not practical technologically in terms of battery life, weight, and not practical in terms of simply that you don't need to interact with some digital UI every waking moment of your life.

Also while in our imagination we can conjure up virtual displays in AR and use them for complex UIs, actually waving your hands in empty air, aside from being super weird, is also super inconvenient, compared to handheld multitouch glass.

You're not making AR predictions based on "current trends". You think you are. Instead you're trying to draw a straight line from the present reality to your favorite sci-fi movies that have shaped your idea of what the future is going to be like, while also skipping over all the pesky details that can trip up that idea from concept to realization.

In other words, it's the same reason everyone was dead set that flying cars were coming. And yeah, the generic "no one believed in Xerox PARC, no one believed in trains and car engines, no one believed in airplanes" argument was brought up about flying cars too. Turns out that this argument is not an automatic win for believing whatever you wanna believe is coming.

Just because someone didn't believe in airplanes doesn't mean I can't roll my eyes at predictions that faster-than-light travel is just around the corner.


Google Glass died because the hardware sucked; the UI and utility were not there. The General Magic device was also a failure, but mobile computing is obviously not.

The timing has to be right and the hardware has to be possible - if you're too early it won't work.

Flying cars are a bad comparison - they mostly don't exist in widespread use for reasons not related to computing: risk, fuel, control, etc. Even then, rich people do have helicopters (though that's mostly different).

Pointing out failed predictions does not imply that all predictions are similarly wrong. In computing - the examples I showed (and there are others) are more relevant.


I remember the one time Sergey Brin said helicopters are surprisingly affordable. Sure, if you are like the 9th richest person in the world, that might be true.


Helicopters are surprisingly affordable in that you'd be surprised how cheap a bare-bones helicopter like you'll find all over the outback is. It's still shitloads, but it's less than you think.


> Google Glass died because the hardware sucked, the UI and utility were not there.

And nothing has changed about that.

> Pointing out failed predictions does not imply that all predictions are similarly wrong. In computing - the examples I showed (and there are others) are more relevant.

Most predictions are actually wrong. Let's see the hardware that "doesn't suck", let's see the UI and utility that "are there" and then I'll tell you if we have a winner or not.

Right now we have nothing except bold fantasies powered by sci-fi movies full of cheap hologram VFX.


> And nothing has changed about that.

Yet - it's a prediction based on the capability of future hardware that seems plausible.

> Let's see the hardware that "doesn't suck", let's see the UI and utility that "are there" and then I'll tell you if we have a winner or not.

Yeah sure, it's way easier to make predictions in hindsight after other people have already built it. Even then - when the iPhone launched in 2007 it was largely panned in a similar way to what you're doing now.

Making accurate predictions is hard - I agree with that. If you dismiss everything you'll be right a lot of the time, but you'll also miss every big and interesting change until someone else builds it.


> it's a prediction based on the capability of future hardware that seems plausible.

And nuclear fusion reactors are only 20 years away. That's a concrete goal and it's more likely than some vague new "metaverse". I promise, you'll still be posting on HN when anything resembling a shadow of this is released.


> wearing glasses on your face 16 hours a day

I do this every day of my life.


Me too, but I've never posted photos of myself wearing them in the shower.


Seems like you are just really in deep with Apple. I've used Apple professionally for over a decade now, and personally for a long time before that (a Mac SE was my first). When I'm on my personal laptop (Windows/Linux) I miss absolutely nothing about Apple. Well, that's not quite true: I appreciate the uniformity, but the uniformity costs too much in terms of a walled garden, which is antithetical to how I think things should be.

I think Microsoft does far more interesting research than Apple, which largely (but not completely) just buys new ideas like Cisco. An effective strategy, granted. I feel MS has a feel for where development is going better than Apple, with VS Code and the purchase of Github.

"missing something that just hasn't happened in computing in any other period of time I can immediately think of" is just hyperbole. To me, Apple has always been the company that takes other people's ideas and polishes the hell out of them - which is amazing, but when I look at my 30+ year tech career, I don't get very excited about Apple.


Keep hearing about “walled garden” as though it’s a bad thing — Archibald Craven doesn’t want Mary to play?

The word “garden” suggests bespoke curation for appreciation and abundance, while “walled” suggests within this boundary the experience is purposefully tended and safe.

The Japanese put them in the middle of their homes. The British write children’s literature set in them. Just look — the very idea of a walled garden is lovely:

In film: https://i.pinimg.com/originals/5b/52/41/5b524125c48b35bfe0ca...

In life: https://en.wikipedia.org/wiki/Great_Maytham_Hall#Gardens

In my experiences with the non metaphorical variety, most anyone who can afford one prefers it.


I think HN and similar value free (as in freedom) and openness of a system a lot more. So "walled garden" is a bit of a buzzword against those ideals.

> bespoke curation...purposefully tended and safe

These ideas are antithetical to what many on HN seem to want. I personally love Apple products, but walled gardens are great when you don't bump into the walls all the time, which a lot of other people seem to do.

In these cases (wall bumping) I think it's reasonable to want alternatives. But there are fanatics and anti-fanatics that I think skew any discussion about Apple.


I realise this is a controversial view here, but in my opinion the real question is really whether you value Apple’s freedom to build what they believe is the best version of their product. You might not care about that, and that’s fair.

Personally I think they should have that freedom. If Apple really had no competition, I would absolutely change my mind. But so long as Android exists, my view is that consumers deserve the right to choose Apple’s garden if they want. And they can choose to leave it. Millions of people do every year. Let the market decide.


If it's anti-competitive at 95% market share, it's anti-competitive at 50%. Your position is hypocritical and wrong. Everyone should follow the same rules, not make up their own.


Apple did follow the rules as they currently exist. There is presently no rule against operating a walled garden in the manner Apple does. If such a rule existed, we'd all know about it. You would have cited it. Apple's critics would be able to quote the rule verbatim. And Epic's case against Apple would have been a slam dunk.

Don't be embarrassed to admit that you're the one wanting to make up new rules. It's okay to want new rules. Be honest about it.

If you are interested in having your perspective challenged, the commentator I most agree with on this is Hoeg Law, who has published extensive commentary on the Epic vs Apple lawsuit. If you are interested in hearing a view that differs from your own, I can commend his publications to you:

https://www.youtube.com/playlist?list=PL1zDCgJzZUy-TrSXpg6ir...


Really well said. I believe Apple are doing what they think is best for everyone, including their shareholders, within the rules (which sometimes aren't clear and can be easy to break without realising it.)


The Epic case is a slam dunk, Apple has monopolized the iOS app distribution market since the beginning of the iPhone. Epic is just the first one with motive and the money to bring a challenge against Apple.

Yes, after a couple of decades of large companies which have consolidated platforms to a handful of sites and services that the majority of the population uses, yes it's time for some additional rules. I completely support the Cicilline bill.

If you'd like to get out of your bubble, I'd read the open markets institute's briefs in the Epic case and the Cicilline bill.

https://www.openmarketsinstitute.org/publications/epic-games...

https://cicilline.house.gov/sites/cicilline.house.gov/files/...


> The Epic case is a slam dunk

This belief suggests that perhaps you might be the one stuck in a bubble. I've been greedily consuming alternative views on this case but seen very few legal analysts describe the case as being anything other than an uphill climb for Epic.

I have read numerous amicus briefs from the Open Markets Institute, but I'm not aware of them issuing an amicus brief in the Epic vs Apple case. If this exists, can you please offer a link to that?

The latter is a proposal to change the rules and is therefore not relevant to the question of whether Apple were "follow[ing] the same rules," as you put it. Needless to say, I don't agree with the Cicilline bill or any other attempts to frame competition on a platform level. Most egregiously, they almost always carve out special interest exceptions and protections for video games on televisions, but not video games on mobile devices.

Have you got anything else you think I should read?


The tech press could do a better job here IMO, I think this article is ok: https://www.forbes.com/sites/paultassi/2021/05/23/the-epic-v...

Rules can be better defined and updated, so I don't think the Cicilline bill is "changing" the rules. It's just giving the FTC an explicit mandate instead of the hands-off position it has taken so far wrt Apple. The other platforms get some new/updated rules too.

I don't see the "special interest" distinction you're making at all, Apple is free to set the rules they want in their App Store. They just can't disallow alternate app stores like they have been doing.

Apple themselves argued "people buy devices" so why shouldn't we frame competition at the platform level? Why shouldn't people be able to do what they want on their phone instead of needing two phones?

The open markets page I linked to has a link to the Epic brief they filed: https://static1.squarespace.com/static/5e449c8c3ef68d752f3e7...


> I think this article is ok:

That article is utterly rudimentary reportage. There is no legal analysis, or really any depth of analysis at all. You really should seek out specialist analysts like Richard Hoeg who have in-depth, intersectional understanding of both competition law and technology platforms.

https://www.youtube.com/watch?v=I5WS5D6GydY

If you don't like the idea of watching long-form commentary, his speaking style is clear enough that I find his voice eminently clear at 2X speed. Use the YouTube shortcut keys shift–comma (<) and shift–period (>) to quickly change the playback speed.

> Rules can be better defined and updated

In the excruciatingly pedantic context of law, "better defined rules" and "updating the rules" are synonymous with changing the rules. You can spin it however you like, but as the law stands right now, what Apple is doing is legal.

Out of curiosity, do think that Sony, Microsoft and Nintendo should also be forced to dissolve control over their platforms as well?

> They just can't disallow alternate app stores like they have been doing.

Until such time as a law is established to disallow walled gardens of technology platforms, they absolutely can. You are welcome to argue that they shouldn't be allowed to disallow alternate app stores, but arguing for the status quo to be changed is an entirely different argument.

It's also worth noting that the law is absolutely on Apple's side when it comes to licensing Apple's intellectual property, which all iOS developers must do in order to use the software development tools they supply. So even if Apple is forced to allow side-loading of alternative stores, there is absolutely no question that Apple would be allowed to require a percentage cut of revenues from apps developed using their tools—just as Epic is entitled to ask of developers who use Unreal Engine.

> a link to the Epic brief they filed

That is not a brief filed in the Epic vs Apple trial. It doesn't have anything to do with Apple. As far as I can tell, no such amicus brief exists. Yet you seem convinced it does exist and that you've read it.


> but as the law stands right now, what Apple is doing is legal.

That's an interesting prediction. So you are claiming that the judge in the Apple vs Epic case will not rule against Apple in any way, on any of its behavior.

That is a strong claim for you to be making. And since you have made this prediction, I will be sure to come back to your comments in a couple months.

Because I am pretty sure that the judge will rule against Apple on at least some issues. Probably not the whole thing. But there will be some things that Apple is doing, which will be ruled illegal.

The real question is: if this happens, will you back off on your extraordinarily strong claim that absolutely nothing Apple was doing will be deemed illegal by the judge? Will you admit that maybe you didn't think this all through?


I never made such a claim.

It is not inconsistent for something to be legal today and then become illegal after a Judge issues a ruling. Right now, what Apple is doing is legal. It may become illegal in future.

My prediction of the outcome is, I think, a coin flip between the Judge ruling entirely in Apple's favour, or a ruling that is mostly in Apple's favour but requires a minor relaxation of certain rules around messaging inside apps about alternative methods of payment. (However even in the event of the latter, this would not represent any impediment to Apple requiring a percentage license fee when payment occurs through an alternative method. In which case it would be moot.)

For anyone following the trial directly—not just consuming other people's interpretations—it was clear that the Judge was repeatedly seeking some sort of bone to throw Epic's way, slicing the thinnest possible piece for Epic. It all hinges upon what the Judge determines to be the relevant market, and what is considered an acceptable substitute under the Sherman act. Somewhat ironically, Epic is a terrible plaintiff in this respect since their games are available on many platforms.


> then become illegal after a Judge issues a ruling.

No, actually. It would mean that it was already illegal. That's why the judge would rule that way.

Judges interpret the law. So yes, if the judge rules this way, then this means that you were wrong to claim that Apple's actions were not illegal.

And I look forward to coming back to this post when this happens, so that I can show you the explanation of why the actions were illegal on some fronts.

> Right now, what Apple is doing is legal.

Well if the judge says otherwise, on some fronts, we will be able to look back on this post, and re-evaluate then, won't we?

> become illegal in future.

No, becoming illegal would be if a legislator changed the law. Instead, this case is about interpreting existing law. So if the judge rules this way, then it means that Apple's actions were illegal. They did not become illegal. They were always illegal.

But, of course, I can already see people making up excuses to ignore the judge if the judge disagrees with them. If the judge disagrees with them and she says that the actions were illegal, they have already made up an excuse for why the judge is wrong.

Personally, I trust the opinion of the legal system. And that means that if the judge rules that some of the actions were always illegal, then that means that the judge is probably right, and you were wrong.

> it was clear that the Judge was repeatedly seeking some sort of bone to throw Epic's way

Ah, so then you agree that some of Apple's actions are illegal, and you take back your previous statement. Got it.

Glad you agree that the judge will likely rule that some of Apple's actions were always illegal.


> Judges interpret the law. So yes, if the judge rules this way, then this means that you were wrong to claim that Apple's actions were not illegal.

The matter in question is distinctly novel and any ruling from the Judge will establish a new interpretation of existing law—laws which date back to 1890 and cannot possibly predict the creation of computer software ecosystems. A ruling could make Apple's past actions retrospectively illegal. But that doesn't make them illegal today, before the ruling has been issued.

> And I look forward to coming back to this post

As do I. I've bookmarked this thread and we shall reconvene when the judge returns her verdict.

> But, of course, I can already see people making up excuses, to ignore the judge, if the judge disagrees with them.

On both sides.

> Ah, so then you agree that some of Apple's actions are illegal

No, I don't. In my opinion, the correct verdict should be entirely in Apple's favour. I disagree with the argument made by Epic that Apple has done anything illegal. But even if the Judge makes some relatively trivial concessions in Epic's favour, it would be a bit rich for pro-Epic commentators to crow if none of Epic's substantial demands are rejected.

To the extent I think Apple should change any of their policies around the App Store, it is for the sake of PR and not law—to protect Apple's business model from the coming onslaught of badly conceived "anti-monopoly" legislation.


> No, I don't

You literally just said that you think that the judge is going to rule that at least some of Apple's actions are illegal. That is what your previous comment means.

If you think that the judge is going to rule against Apple, in any way at all, no matter what, it means that you agree that some of Apple's actions are illegal, currently right now.

So if the judge in the Apple case takes any action at all against Apple, no matter what, on any issue, then that means Apple was in fact engaging in illegal conduct.

No matter how much you dismiss the judge's ruling, if there is any order at all, against Apple on anything, then that means that Apple's actions were indeed illegal, on the things that the judge says.

> it is for the sake of PR and not law

Well, since you said "not law", then that means that if the judge does anything at all, on any issue, against Apple, then that means that you were wrong about the "law" part of it.

> and in my view the correct verdict should be entirely in Apple's favour

Oh, the excuses are coming out already! Before even the verdict has been made, you are coming up with reasons to dismiss it. As expected, no matter what the judge says, you will come up with an excuse, to ignore the evidence.

What is even the point of pretending like you care about the legal system, if you are going to make up excuses, already, about why the judge's opinion is wrong?


> If you think that the judge is going to rule against Apple, in any way at all, no matter what, it means that you agree that some of Apple's actions are illegal, currently right now.

I find it difficult to believe that you can't see a distinction between what I predict the Judge might do, and what I believe the Judge should do. This is so beyond disingenuous that I'm no longer interested in this conversation. I have deleted the bookmark to this thread and will not be returning in future.


> I have deleted the bookmark and will not be coming back to this thread again.

Don't worry, I'll remind you. But, I fully expect that you will come up with an excuse, when the judgement happens, as for why the judge is wrong to declare that at least something that Apple has done, is illegal.


https://news.ycombinator.com/item?id=27462371

You are being disingenuous in your interactions. It appears you have form in this regard. As this other person said, you seem pretty intent on deliberately misinterpreting as strongly as possible.

I'll leave you to it.


> law is established to disallow walled gardens of technology platforms

Like I said, this is already disallowed under the Sherman act, and regulation will make it explicit. If an OS maker had sued Apple for tying a device to the OS, it would be illegal under the Sherman act too.

No argument on Apple's tools, there are other tools available.

I'm kind of taken aback by your misrepresentation of the Open Markets Institute links I posted. You are being disingenuous when you say there's no such brief.


> I'm kind of taken aback

Wow, did you seriously paste a link without reading it? Open Markets Institute did not file an amicus brief in Epic vs Apple. What you linked to is an amicus brief in the matter of Shah vs VHS San Antonio, a substantially different case which pre-dates the Epic lawsuit. Seriously, click on your own link.

OMI did release two paragraphs about Epic vs Apple in the form of a press release. Did you think that was an amicus brief? I'm starting to think that maybe you don't know what an amicus brief is.

> this is already disallowed under the Sherman act

No, it isn't. You do know that the Sherman act doesn't make it illegal to operate a monopoly, right?

> If an OS maker had sued Apple for tying a device to the OS, it would be illegal under the Sherman act too.

You've just demonstrated that you don't understand what "tying" is in the context of antitrust.


I did wrongly assume the brief you mention (Shah vs VHS San Antonio) was submitted in the Epic case but it wasn't. I did skim it and found it relevant on the subject of tying. All I can say is it's a bit confusing, linking to said brief (on August 3), from their statement on the Epic case: https://www.openmarketsinstitute.org/publications/epic-games...

If you've followed the case, it's clear what anti-competitive harms Epic is alleging from the iOS app distribution market.


So when I asked you for reading material that might convince me of your side of the argument, to "get out of [my] bubble", the best you could come up with was a tangentially relevant amicus brief that you hadn't even read, and didn't even know to which case it was submitted?

You seriously don't have anything more substantive than that?

I have listened to hours upon hours of the actual court proceedings in Epic vs Apple. Have you listened to even a single minute of it? I found Epic's case to be muddled and oftentimes contradictory. Most of their ambitious claims fell quite flat when tested in court. It certainly isn't helped by Tim Sweeney making some fairly ridiculous claims on Twitter, e.g. that Apple's commission should be more like credit card rates. Fairly ridiculous when Epic runs their own store, which they admitted isn't yet profitable at a 12.5% commission. And I'm really confused why they couldn't make money on 12.5%, since the Epic Games Store is barely more than a shopping cart and game downloader.

Hoeg Law's final video includes a contextual refresher on the Sherman act. It's really worth listening to. Just six minutes, or three minutes at 2X speed.

https://www.youtube.com/watch?v=I5WS5D6GydY&list=PL1zDCgJzZU...


I'll leave the difference between "skim" and "read" be. I've read quite a bit of the documentation from court listener: https://www.courtlistener.com/docket/17442392/epic-games-inc...

I'll leave it here.


The problem is when many people you want to socialize with are inside the garden and the walls don't let you talk to them, and the garden keeper actively works to shape people's perceptions of the outside as scary and confusing, so you just get blank stares when you try to suggest ways to talk to people who can't or won't go inside.


In terms of Apple, I consider it a bad thing, full stop. That's not to say there aren't some benefits to it, but overall it's awful. Real walled gardens are a totally cool option. I vastly prefer natural forests and meadows, but I'm cool with people having what they want, because they have a choice. I can't have Apple software anywhere but their walled garden, which sucks.

I'll take the wild, chaotic natural environment over a walled garden any day of the week. So much more interesting, even though there is danger.


> I can't have Apple software anywhere but their walled garden, which sucks.

Which makes perfect sense, because you can't take the tree from an existing garden and plant it in your own without taking the roots and soil with you. Or put another way: you can take the tree, but you'll be leaving a hole for everyone else to fill in.

It's a solution to a problem and everything is integrated.

Not every solution has to be super modular, plug-and-play, work with everything everywhere forever, and allow the end user to self-repair; not everyone wants that.


Disclosure: heavily invested Apple user. A walled garden can indeed be a good thing, as long as it is tended properly. But do we want the gardener always deciding which plants are allowed in? I'm in favour of curated experiences and strong security, but Apple has gone too far. 'Our' garden now has no gates and barbed wire on the walls :(


Both walled gardens in your example are owned by their home owners.

Which makes all the difference.


That's an important distinction between Apple and Microsoft that goes way back to their founding: Microsoft was always a developer-focused company (remember Steve Ballmer's infamous "Developers! Developers! Developers!") whereas Apple has always been focused on the end user ("the computer for the rest of us"). In the early days when developers and technical people comprised the majority of the personal computing market Microsoft dominated. Nowadays we live in a world where there are several million developers and technical people and several billion non-technical people. Since Apple has always catered to that market it should be no surprise they're now dominating. In that vein it's important to understand that Apple is not a technology innovations company, they're a technology solutions company. Their raison d'être is to put sophisticated technological solutions into the hands of non-technical people and make them successful using it. The walled garden which developers frequently complain about is what allows me to hand an iPad to an 8-year-old and not worry too much about what they're going to put on it or how they might compromise the device.


I'm a developer myself and use Apple both professionally and personally. I hear the walled garden excuse frequently from those who choose not to use Apple products, but honestly I don't run into limitations often on macOS. I have to ask - what is it that you're trying to do that makes you feel as though macOS is a walled garden? Short of really serious kernel extensions, I don't find myself pressing up against a walled garden on macOS.

That being said, iOS is absolutely a walled garden, though I rarely find myself in need of something that it doesn't accommodate anymore (with the exception of WebXR, which isn't exactly popular yet). But Microsoft doesn't have its own mobile OS anymore, so not sure this is analogous.


Like what exactly? What is Windows missing? They have a way to integrate with Android phones, I can share my Galaxy's screen with PC and use apps that way even. They are working on AR stuff with Hololens and VR. You can do wireless screen sharing, sync your Microsoft Edge tabs, do 3D audio in games.

I was all in the Apple ecosystem the past couple years except for the Watch. And I'm struggling to think of something that's impossible to do with Windows and non-Apple devices besides Airdrop and that didn't even work half the time between my iPhone and Mac mini. Sidecar is a cool feature to have built in, but apps like Duet and Astropad do the same thing...

And I don't think it's a bad thing that MS doesn't Sherlock devs either...


If you have Apple Watch, you can unlock your Mac instantly - no password input or TouchID required.

If you have Apple Watch, you can authenticate services on your computer just by double-tapping the Watch side-button. It also unlocks your nearby iPhone if you have the Watch on.

You can trigger Do Not Disturb across the ecosystem from any of your devices.

Find My lets you keep track of all devices in the ecosystem with a global lost and found mesh network built in.

Apple Pay works seamlessly across iPad, Mac, Apple Watch, and iPhone. My Mac uses my iPhone's FaceID or a confirmation on my Watch to authenticate payments - no setup or weird configuration required.

AirDrop works perfectly for me with all my devices. I can beam documents and data from device to device without thinking about it.

Handoff works across all my devices. I can start an e-mail on my iPad and continue on my Mac instantly.

I can see the charging/battery state of my iPad, iPhone, Apple Watch, and Apple Pencil from any device in the ecosystem, including Mac.

AirPlay works across my iPad, iPhone, HomePods, Apple TV, Mac, and even Apple Watch. AirPlay 2 is incredible and has no delay. Rock solid connection - much better than any other protocol I've used.

If I get a new device, setup and integration with the rest of my devices is instant - no configuration required. This is especially great with AirPods - I don't have to pair them with each device. I just pair with 1 and I'm good to go on the rest.

Apple Store support for everything in my ecosystem.

etc etc etc


That first one sounds like throwing away security. Unlock my computer because my watch is nearby? No thanks. Why even have a password or TouchID at that point especially when the watch has none of that...

I can configure Windows to lock my PC if my phone is not nearby. Which makes a ton more sense than unlocking it if it is.

As for the rest, I've never had much success with all that being "seamless". I could hardly ever get my AirPods to show the battery level of the case by opening it and holding it near my iPhone.


> That first one sounds like throwing away security. Unlock my computer because my watch is nearby? No thanks. Why even have a password or TouchID at that point especially when the watch has none of that...

Well the Watch has to be on your wrist, unlocked, authenticated with your iCloud account and close enough to trigger NFC.

It's actually one of the best non-fitness related things about the Watch imo.

Point taken on some of the features not always being seamless but I've had much more success than not and I really appreciate all the ways the ecosystem works together - I can't imagine not having some of these features.


> Unlock my computer because my watch is nearby? No thanks

You know that if you don’t want it, you can simply choose to not set it up that way, right?

Remember this feature competes with unlock mechanisms that can be defeated by a photo held in front of the camera, not with Fort Knox 6 factor identification.


> If you have Apple Watch, you can unlock your Mac instantly - no password input or TouchID required.

On a windows laptop, you have Windows Hello, which is very reliable and seamless. Most of the times you won't even notice it. No $400 watch required!

> You can trigger Do Not Disturb across the ecosystem from any of your devices.

This is just a single click. I don't see the big deal.

> Find My lets you keep track of all devices in the ecosystem with a global lost and found mesh network built in.

Cool, but I don't see much use for it. I have literally never needed to track my laptop or phone or keys.

> Apple Pay works seamlessly across iPad, Mac, Apple Watch, and iPhone. My Mac uses my iPhone's FaceID or a confirmation on my Watch to authenticate payments - no setup or weird configuration required.

My credit card works seamlessly across Apple, Android, Windows, real-life. My browser stores the card number and pre-fills it for me, no matter the device. No setup or weird configuration required.

> AirDrop works perfectly for me with all my devices. I can beam documents and data from device to device without thinking about it.

Very useful in some situations. However, I hardly need to use it since my android phone is auto-syncing photos and videos to my Onedrive, which is syncing to my laptop. This works across all ecosystems, unlike airdrop. When I do need to send a particular file, Your Phone app works beautifully.

> Handoff works across all my devices. I can start an e-mail on my iPad and continue on my Mac instantly.

I can do the same across all ecosystems using gmail and/or outlook.

> I can see the charging/battery state of my iPad, iPhone, Apple Watch, and Apple Pencil from any device in the ecosystem, including Mac.

Cool ability, I haven't thought about it, but I think I would enjoy it. Your Phone does show my phone's battery on my laptop.

> AirPlay works across my iPad, iPhone, HomePods, Apple TV, Mac, and even Apple Watch. AirPlay 2 is incredible and has no delay. Rock solid connection - much better than any other protocol I've used.

I am not sure what Airplay does. Is it something like bluetooth to stream music? Bluetooth works fine for me most of the time, and Chromecast also works well. Both of them are not locked into any particular brand, too.

> If I get a new device, setup and integration with the rest of my devices is instant - no configuration required. This is especially great with AirPods - I don't have to pair them with each device. I just pair with 1 and I'm good to go on the rest.

Same with Galaxy buds. I don't even need to have bluetooth on. The phone will still detect the buds.

> Apple Store support for everything in my ecosystem.

Not sure what you mean by this.

As you can see, I don't find any great benefits to the Apple ecosystem compared to other, more open systems. And it doesn't hurt that almost everything is much cheaper than iStuff too.


I don't think you realize just how much better Apple's solutions are compared to what you're talking about. I could go through and list how, but I don't think that would do it justice.

It's the details, integration and quality that you can't capture in words that make all the difference.

It's easy to downplay the features like you're doing, but the magic is all of them working together in one ecosystem. Even the tiny features, like being able to ping my phone from my Watch — I use that feature every single day to find my phone in my house (if I left it in a jacket pocket, for example).

Sure if price is an issue, Apple's ecosystem is not cheap.


Windows is a gigantic market so in general the vast majority of things one could do in MacOS are going to end up being possible in Windows (assuming Apple even was the first to market some feature in the first place).

It's almost unthinkable that some in-software feature that a lot of people like can be done in an Apple OS and not a Microsoft/Android OS.

The question is not whether it can be done, but how easily it can be done. If you have to jailbreak your phone or cobble together a collection of different applications to simulate the experience, for 80%+ of people that's basically equivalent to impossible.


You don't have to jailbreak your phone or root it to accomplish these things. Stuff like MS "Your Phone" for integrating your Android phone and Windows is built into some devices like those from Samsung and Windows will even walk you through the steps at first boot. Most of this stuff is built in and I'm not entirely sure app installs are onerous. 80%+ seems a little ridiculous.


You mean like playing all the VR games, or AAA games in general?

I own both. From my POV I'm able to do more on Windows even if I prefer MacOS for day to day use and iOS for my phone (though again can do more on Android)


I'm curious what you use the Galaxy screen share feature in Windows for, because I've tried it a few times and always found it to be a bad user experience.


Not much lol. Checking my expensive Japanese-English dictionary that I only have on Android on occasion.


> OK, but Microsoft is in the best position to create great, integrated, first-class technologies that build on how great Windows is.

For a few years now there has been some noise that Apple’s Pro hardware lineup isn’t Pro, that macOS is becoming more locked down and more iOS-like, that the software is buggier than ever, and - until they dropped their App Store split to 15% for small developers - that 30% was too high a price.

With WSL, and VScode and GitHub Microsoft are making a (messy, so far) play for developers. But as you point out, they didn’t take it to the nth degree and nail the execution. I wonder if they’ve missed the opportunity now.

I develop for web on macOS, and have all Apple hardware, so maybe I’m missing some of the other things Microsoft have done.

Maybe it's the lack of verticality MS has, and the integration between devices they can never match Apple on, that means there hasn't been the transition they were aiming for. That's why I've not switched.

I’ll be very interested to hear what people think about this.


I would say MS nailed VS Code remote development. It is seriously, seriously good. I used it during WFH, and while VS Code didn't work nearly as well as the version of IDEA I was using, the remote part worked so well I would forget I was working remotely.


> Something that's dawned on me for the last few years is that Microsoft basically settled on the idea that the modern operating system is effectively done, which I suppose is true, but then they stopped creating technologies to build on top of Windows all together.

What happened was open source software, linux dominating the server side, and android dominating mobile (iOS as well, to a lesser extent).

> Except everyone publishing software today that isn't a game developer is targeting the web.

Except for nearly all backend development... which mostly targets linux first.

> OK, but Microsoft is in the best position to create great, integrated, first-class technologies that build on how great Windows is. But they don't. I have mixed feelings about this, because when Apple does it, they basically put people out of business.

For example? Apple software is average, and Microsoft has a long history of 'putting people out of business' (not in a good way).

> But here's the thing, in 10 years time, you're going to have Windows, which is still just, and will continue to be, Windows. And macOS, and iOS, and tvOS, and every other Apple OS, is going to be so much more. And they already are so much more.

You forgot... android and linux. How does apple 'already have so much more'?

> For those who haven't experienced it yet, or can't afford Apple products, they're missing something that just hasn't happened in computing in any other period of time I can immediately think of.

That just sounds like an opinion of the average non-tech user who lives inside an apple bubble and likes icon shapes and default keybindings. You're not saying anything specific, just poetry about apple being 'first-class' and 'something that just hasn't happened in computing in any other period of time' (???)


> But here's the thing, in 10 years time, you're going to have Windows, which is still just, and will continue to be, Windows. And macOS, and iOS, and tvOS, and every other Apple OS, is going to be so much more. And they already are so much more.

Some of those, perhaps, but MacOS has been on a steady decline for the past five years. Apple doesn't seem to care about the Mac platform except as a lifestyle accessory/devbox for iPhone owners. Windows may be limiting itself by merely trying to provide a decent daily-use desktop environment, but Apple isn't even doing that anymore.


Apple products have every right to be great: after all, Apple is the largest company in the world. But they also have a duty to extend their services, one that they have sorely neglected for the past 15 years. I wouldn't be so angry at Apple if things like Airdrop, Handoff and Airplay were all open protocols. The only reason Apple doesn't extend these protocols is to have leverage over their competitors. It's not like any of the aforementioned technologies are impressive, either: they were all preceded by Warpinator, XDG and MPRIS, respectively. So Apple embraces these open-source concepts, extends them with proprietary interfaces and then extinguishes the source... where have I heard this one before?


EEE is more insidious than this so it's important to not dilute its meaning. EEE is about taking a previously open, successful interface and then making it proprietary. It's destroying what was already flourishing. In Apple's case, they just built their own thing.


To be fair, they have already done that (or rather, attempted coups) with LLVM and Clang, no?


No? They hired Chris Lattner and paid him to work on LLVM and Clang for around a decade. Clang and LLVM are still open source, and the Swift compiler is also open source.


EEE is about actual intellectual property and compatibility, not just concepts. As far as I know AirDrop, Handoff, and AirPlay are proprietary all the way down.


Like it or not, AirPlay will always be proprietary for DRM reasons.


That might be true, except that Google's Cast and Miracast both got away with wireless mirroring years before Apple even entered the game.


Google Cast includes DRM


> But here's the thing, in 10 years time, you're going to have Windows, which is still just, and will continue to be, Windows. And macOS, and iOS, and tvOS, and every other Apple OS, is going to be so much more. And they already are so much more.

And what is wrong with this? A stable, long-lasting platform is a beautiful thing.

Maybe I'm an old fart (at 36...) but I don't really want "so much more". I want something that I can count on and bank on now, and that will be there for me for as far into the future as possible. None of this lofty futuretech seems to promise that, at least without bleeding me dry in pointlessly recurring interactions like subscription fees.


> Except everyone publishing software today that isn't a game developer is targeting the web.

All pro or semi-pro creative applications target native.

live streaming / video editing / animation / DAW / image editing

There are some toys for parts of these workflows in the webspace, but no serious tools for people who do this stuff seriously.


ok cool, so 2% of all laptop users?


Am I the only one who thinks this is scary? A future of Microsoft not giving a damn, and a future where Apple dominates and dictates everything.

I am not an old timer like some here with a Mac SE, but in 20+ years this is the biggest comeback Apple has had, and one many of us were once hoping for. For me it was a simple thesis: something that is so much better deserves to be in a better position. Maybe it was more of a hatred of Microsoft in a way, since they were shipping crap after crap.

And here we are, Apple dominating and continuing to do so. But since 2016 Apple smells exactly the same as Google in 2003: the "don't be evil" hypocrisy.

Except this time around Apple has the better products, services and brand. From a strategic point of view there are no major flaws or weaknesses. Google's only major weakness was relying on the browser for access; that is why they created Chrome. They also saw what could happen if everyone were using iPhones: access would be dictated by Apple, so they needed to build Android.

But Apple has no access problem. They are the final end point (or starting point, which would be more appropriate) of all technology and digitalisation.

> I'm blind because I think Apple products are so great.

Not really; objectively speaking, it is hard to argue Apple's products are not some of the best on the market.


Yeah. And in my opinion, Amazon is eating Satya's lunch. So, it's not clear to me what Microsoft is a leader in anymore.


But Microsoft already has the AR/VR platform, one that all devices can use: Oculus, iPhones, Androids, Macs, Windows, other VR devices.

And they are the only ones who have this.

It is called: Mesh.

https://techcrunch.com/2021/03/02/microsoft-debuts-its-ar-vr...


> Or, idk, I'm blind because I think Apple products are so great. The lattermost is probably more likely.

As a fellow cult member who hasn't drunk enough kool aid, Apple products have been losing their luster in recent years.

Apple TV still doesn't have hands-free voice control. Fire TV has had this for years now. Using your iPhone isn't great because it doesn't have the microphone setup to consistently hear your voice well. Do Siri and Apple TV finally work together with the HomePod, like how Google TV uses smart Google speakers?

Apple Macbooks have suffered immensely with the near useless touch bar and terrible keyboard.

Thinness as a feature seems to override everything else. I don't care about having a thin desktop. I want something I can open and replace parts in without paying about $10k.

The list goes on. Including how there's still no word on the VR / AR unit, while Facebook mops the floor.

The only light in the darkness is the M1 chip.


> Apple TV still doesn't have hands free voice control

Yes it does. I use it everyday. Which Apple TV do you have?

> Apple Macbooks have suffered immensely with the near useless touch bar and terrible keyboard.

Still the best laptops in the market by a mile.

> Thinness as a feature seems to override everything else.

Many recent products have actually been thicker. New iPhone is thicker/heavier. New iPad is thicker/heavier. AirPods Pro are chunkier than regular AirPods. AirPods Max are some of the heaviest headphones in the market. Their new Mac Pro is a massive, modular beast. etc etc

> Including how there's still no word on the VR / AR unit, while Facebook mops the floor.

Apple is never first to market. But when they do enter, they tend to wipe the market clean. They also never talk about unreleased products.

> The only light in the darkness is the M1 chip.

This chip and associated tech will power their entire ecosystem and make every product better.


> Yes it does. I use it everyday. Which Apple TV do you have?

4k - 1 model older than M1. I don't mind buying a Homepod to enable it (I'm not happy about it, but less happy about being forced to do it with iOS devices not designed explicitly for hearing you anywhere in the room), but to my knowledge commands from the Homepods won't make it to Apple TV. Am I wrong?

> Still the best laptops in the market by a mile.

They are still considerably worse than older MacBooks, since thinness and removing the ability to upgrade hardware were the priorities. The only thing that keeps me in the Apple computer ecosystem is macOS.

> New iPhone is thicker/heavier. New iPad is thicker/heavier. AirPods Pro are chunkier than regular AirPods. AirPods Max are some of the heaviest headphones in the market.

That's a great trend since Ive left. Looking at the new iMac line was disheartening.


> 4k - 1 model older than M1. I don't mind buying a Homepod to enable it (I'm not happy about it, but less happy about being forced to do it with iOS devices not designed explicitly for hearing you anywhere in the room), but to my knowledge commands from the Homepods won't make it to Apple TV. Am I wrong?

My brain didn't register the "hands-free" part of your comment. I think you're partially right but not completely.

I have a pair of HomePods hooked up to my Apple TV 4k and I can do basic hands-free commands like "Hey Siri, pause", "Hey Siri, continue", or "Hey Siri, go back 20 seconds".

But more advanced commands like "Hey Siri, open YouTube" don't work - you still need the remote to navigate around afaik. I agree with you though, it would be cool to have much more advanced support with HomePods and Apple TV. I want to be able to navigate the interface with no remote at all.


I'm an Apple user at home, but Google really has Apple beat on the voice control stuff

I set up a Google Nest speaker thing for a friend's parents (they were all Android users, so it seemed like the best fit). I connected it to their network, plugged it in, then said "Hey Google, turn on the TV" — their smart TV turned on. I said "Hey Google, play Cory Carson on Netflix", and it opened Netflix on the smart TV and started playing an episode of the kids' show

I was blown away because I had no idea what TV they had, how it was connected, and didn't do anything to set it up other than connecting the Google Nest speaker to their WiFi

I really wish my HomePod + AppleTV could do that! The best I get is "Hey Siri, turn on the living room TV" and even that only works by the grace of HDMI CEC (randomly, it will just turn on the AppleTV and will fail to turn on the TV)

The Google Nest speaker was able to turn on a random smart TV in the same room with zero setup, and not even a direct connection. Play specific shows and more. My HomePod + AppleTV can't do anything close to that

All that said, I refuse to allow any smart TV in my house to be connected to WiFi. So I doubt the Nest solution would work for me. It was just impressive to see it in action


I have Alexa, Siri, and Google Home in the house. Like you, I was blown away by Google's voice assistant, so much that I decided to switch to the Google/Nest ecosystem, buying everything Google, like their security system. Fast forward one month and Google cancels their Nest security system, and that's also when I realized I could never depend on a company as flaky and unreliable as Google to buy hardware from.

I'm hoping that Apple steps up their game, but it's been 8 years already, and only now are they releasing a less expensive version of the HomePod. I don't feel that Apple knows what people want anymore, which leaves Amazon to continue dominating the space with their cheap products. In Amazon's defense, not all of their cheap smart home products are bad, and no one is still close to really touching them in that space.


On the other hand, if they had no-touch voice control on the Apple TV, people would be complaining about how it's an always-on spying device.


> Or, idk, I'm blind because I think Apple products are so great. The lattermost is probably more likely.

Isn't Microsoft already shipping AR headsets to the US Army? if anything they are the leaders in the space

anyway, as far as "Metaverse" goes I'd argue we already have multiple and the presentation layer is just gradually changing


On the other hand, Microsoft did a fairly decent job capturing cloud revenue with o365 and Azure. Apple is leaving most of that on the table.

Apple has cloud services, but they are, for the most part, intended to help sell more iPhones. Can't argue with Apple's approach, since it works well revenue-wise though.


Does no one remember Continuum? That was the future. Too bad no one actually bought Windows phones.


> and every other Apple OS, is going to be so much more... [non-Apple users are] missing something

Android hasn't really faced many obstacles in cloning things. Consumers in China, for example, still buy Android phones, despite iPhones and despite having enough money for those iPhones. I don't know if anyone is missing out on all that much. I wouldn't say they are like, unenlightened. If anything the Android ecosystem has led to more digitization of life in China than in America, far faster - it looks way more like the metaverse in terms of computing taking over daily life than here, which is to say that you're really far off the mark in terms of what really matters.

Anyway, this metaverse stuff. It blows. Roblox blows. Second Life blew. World of Warcraft is great, but you can hear about why it's great direct from the source (https://www.media.mit.edu/videos/conversations-2014-05-07/) and it's all about really carefully curated and purposeful design choices, the user driven parts of it are not why the game was so good.

Existing metaverse experiences blow not for lack of immersion! So the technology will change little of that. If Roblox was photoreal, it would still blow. Minecraft AR didn't matter. None of that shit matters.

A company that barely supports 1 external video card vendor, that releases shit graphics APIs, that sues game companies - they're not going to break ground in the "metaverse." Apple Arcade games are good, but that's because they are fighting the real antagonist, the real horror show: free to play gaming. That is the antagonist of the metaverse, having to be free and monetize people via Robux or V-Bucks or whatever it is that activates neurons in a 10 year old's brain. It's got nothing to do with the technology.


Regarding the topic at hand... MS have an actual product that real live humans use today...

https://www.microsoft.com/en-us/hololens


That's because they got sued for antitrust the last time they tried to do this.

And given that they don't sell hardware, there isn't money in it. And if there isn't money in it, it isn't worth getting sued over trying it.

So here we are.


Interesting. So you think they are not even working on making this thing faster? Shame. Windows needs some performance boost.


Can you name a few examples? From what I've seen I can now run a Linux within my Windows, that solves my long-standing gaming vs developing problem. I can finally run Docker properly on my Windows box, and literally everything else that was problematic before. I'd pick Windows over MacOS 12 times out of 10.


I was a contractor for Apple for 4 years, and when I left a few months ago my new company let me choose Windows or Mac. I chose Windows (mainly spite), but it's really good now. I live in the command line, and having full-blown Ubuntu inside Windows is great. Docker works as expected. On top of that, MS Office is wayyyyyy better on Windows than Mac. I won't be switching my personal devices off Arch anytime soon, but I think Windows is a better development platform than Mac now (can't believe I just wrote that).


I also couldn't believe it when I realized it a while ago! Also...I hear Arch more often lately. Why would someone choose Arch over let's say Ubuntu? I'm curious. Never used it.


Arch is great for desktops and laptops because it’s rolling and always up to date with upstream. Install it once. Update it once a week. Never have to do anything else. They also have a user repository where they pretty much have everything. There’s nothing on my laptop that isn’t version controlled with their package manager.


>> Except everyone publishing software today that isn't a game developer is targeting the web.

What?

Even your example (Creative Cloud) has an iOS and iPadOS app.


Sorry, to clarify: anyone targeting Windows users is probably only doing so by proxy, because there's no compelling reason to create a first-class Windows desktop application over just a web app.


I think you are right on this and it's a real bummer.

In Ellen Ullman's book Life in Code, she writes about Whitfield Diffie's (of Diffie-Hellman fame) speech at the 2000 Computers, Freedom, and Privacy conference in Toronto.

> "We were slaves to the mainframe! he said. Dumb terminals! That's all we had. We were powerless under the big machine's unyielding central control. Then we escaped to the personal computer, autonomous, powerful. Then networks. The PC was soon rendered to be nothing but a "thin client," just a browser with very little software residing on our personal machines, the code being on network servers, which are under the control of administrators. Now to the web, nothing but a thin, thin browser for us. All the intelligence out there, on the net, our machines having become dumb terminals again."

I've been thinking about this a lot lately. Smart people like Diffie saw this happening more than 20 years ago.

It's really made me rethink web apps that don't need to be web apps (including stuff like electron). Like you said, Microsoft has thrown in the towel and it seems now Apple is really the only platform making a compelling argument for native apps and keeping the computer smart.


Even better, the network dependency means piracy is impossible.


Aah I see what you mean


The curious thing about the past, present and future of AR: if there were low-hanging fruit (in terms of irresistible applications) that people absolutely must have, love-at-first-"sight" experiences, wouldn't it have happened already? With whatever crude software and hardware? Not as a gimmick, but as persistent interest and investment[0].

People got busy with computers when computers were the size of houses. They carried mobile phones when phones were the size of suitcases. Is the proto-AR revolution happening somewhere out there without anybody noticing?

It's possible that there is no low-hanging AR fruit. That somebody has to do the difficult job of climbing the AR tree, so to speak (refining the technology until it feels like magic).

It just seems that this would be an exception to how things have played out so far in the "digital transformation" journey. It's the nature of the human brain to fill in the gaps and overlook the rough edges when it really has an incentive to do so.

[0] investment in the sense of personal time by users, creators, business people etc to really learn and use the technology to scratch whatever itch they found it is scratching...


there is no low hanging fruit. For AR to really work so many things need to happen.

For one, it arguably requires even more CPU/GPU than VR because, unlike current phones and current VR, you really want ALL the apps to run concurrently in AR. You want your social media app to show the names and/or interests of the people around you. You want your mapping app to show you directions. You want your restaurant app to point out which restaurants have seats available or a low waiting time and price / cuisine / style. You want your virtual pet to run around your feet and body, you want to see each person's virtual fashion, you want your virtual monitors showing your older 2D apps running, you want your virtual cooking instructions, and while sometimes you'll want to turn down one app over another, usually you'll want many running at the same time highlighting different things. You can even argue you want the various apps to share 3D spatial data (sans privacy issues) so your virtual pet can interact with your friend's virtual pet and the virtual furniture they have placed in their house via some furniture app.

one app at a time is just not good enough for the general population to get iPhone level interested in AR.

Getting full-view glasses with fast enough depth sensing that things can be correctly occluded, that run all apps simultaneously, and whose batteries last all day seems many years away.


I'm not a graphics expert, but the reason why VR is so intensive is because it has to render the same frame twice from two different perspectives (one for each eye). This is compounded by the fact that low resolution / frame-rate VR hurts to look at, and so there is a minimum resolution requirement on top of a minimum frame rate requirement on top of a double render requirement. There's a lot of compute power necessary to build a smooth VR experience.

AR doesn't have to meet those same requirements, because you're not going to get nauseous looking at a buggy label flickering in the distance. You don't have to spend compute cycles to trick your brain into believing what you're seeing is real, because it actually is real (minus the overlaid 3D models, of course).
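To put some rough numbers on the stereo-rendering point above (all figures here are illustrative assumptions, not the specs of any particular headset):

```python
# Back-of-envelope pixel throughput: stereo VR vs. a single flat display.
# Resolutions and refresh rates below are assumed for illustration only.

def pixels_per_second(width, height, fps, views=1):
    """Raw pixels the GPU must shade per second."""
    return width * height * fps * views

# A 1080p monitor at 60 Hz, rendered once.
monitor = pixels_per_second(1920, 1080, 60)

# A hypothetical headset with two 2160x2160 eye buffers at 90 Hz: the same
# scene rendered twice, at higher resolution and a higher minimum frame rate.
headset = pixels_per_second(2160, 2160, 90, views=2)

print(headset / monitor)  # 6.75x the raw shading work, with these numbers
```

The double-render requirement compounds with the resolution and frame-rate floors, which is why the gap is closer to an order of magnitude than a simple 2x.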

Now, sure, you will have to spend compute cycles on depth sensing and mapping your immediate area. But that's something the OS should/will do, not every app simultaneously. If you think about what an AR operating system would be responsible for doing, mapping your surroundings and providing that data via API is probably one of the first features it would have. It's no different from Windows or macOS communicating with your monitor so that your applications can draw on it. Similarly, every app likely won't be responsible for drawing its UX onto the user's vision - it will probably submit some graphic or model to the OS, which will then anchor that model "onto" reality and handle the user moving their head around it. Much like how every Windows app is not responsible for resizing or moving its window, that's the window manager's job to do. In AR we would probably have a reality manager or something.
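The "reality manager" idea in the paragraph above can be sketched roughly like this. Every name here is invented for illustration; no real AR API (Apple's or anyone else's) is implied:

```python
# A purely hypothetical sketch of a "reality manager": the OS owns the world
# map and anchoring; apps only submit content and never draw to the display.
# All classes and method names are made up for this comment.

from dataclasses import dataclass, field

@dataclass
class Anchor:
    """A pose in the OS-maintained world map that content attaches to."""
    anchor_id: int
    position: tuple  # (x, y, z) in world coordinates

@dataclass
class RealityManager:
    """OS-side compositor, analogous to a desktop window manager."""
    _next_id: int = 0
    _scene: dict = field(default_factory=dict)

    def create_anchor(self, position):
        self._next_id += 1
        return Anchor(self._next_id, position)

    def attach(self, anchor, model):
        # The OS re-renders `model` at the anchor's pose every frame as the
        # user's head moves; the app does not handle head tracking itself.
        self._scene[anchor.anchor_id] = model

    def visible_models(self):
        return list(self._scene.values())

rm = RealityManager()
label = rm.create_anchor(position=(1.0, 0.5, -2.0))
rm.attach(label, model="restaurant-menu-card")
print(rm.visible_models())  # ['restaurant-menu-card']
```

The design choice mirrors the window-manager analogy: the app's job ends at "here is my content and where it belongs"; occlusion, head tracking, and re-projection stay with the OS.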

What I will agree with you on is that we probably need desktop-level rendering power to solve AR, not mobile-level. However, with the release of M1, it does seem like we're pretty much there already, and I would not bet against Apple's chip team failing to make a smaller M1 that fits into AR glasses.


> every app likely won't be responsible for drawing its UX onto the user's vision - it will probably submit some graphic or model to the OS, which will then anchor that model "onto" reality and handle the user moving their head around it. Much like how every Windows app is not responsible for resizing or moving its window, that's the window manager's job to do. In AR we would probably have a reality manager or something.

I don't think this is true. The problem with an app just giving 3D data to the OS is that no app can then compete on quality. There's a reason Cyberpunk 2077 (or whatever the top graphics game is) looks different than Wii Bowling. Even with 2D windows, each app does its own rendering using the GPU and provides the results as a rectangle of pixels that the app computed. Some apps look amazing because they used 100% of the GPU's power. Some look so-so, or maybe they have a great aesthetic (Animal Crossing?) but aren't spending power on rendering. Now, add that up across apps. Your virtual pet app wants to render photorealistic fur where the fur appears to bend in relation to things in the real world, your virtual fashion app wants to show your friends' outrageous auto-reactive clothing, but your navigation app just wants to draw some mostly solid-colored lines. The virtual pet app will hog all of the GPU just to draw its fur, bogging the entire OS down.

It won't be as simple as it was for desktop PCs, because you can't let any app slow down the system (which is what happens on PCs) since you need the system to always run at 60 or 90 or 120fps. At best you could maybe make a preemptable GPU (not sure any current GPUs are preemptable), or use 2 GPUs: one for the OS, and one for apps.
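One way to picture the scheduling problem described above is a fixed per-frame GPU budget shared by many AR apps. The policy and numbers below are invented for illustration; real GPU schedulers work very differently:

```python
# Illustrative sketch only: a per-frame GPU time budget split across apps,
# clamping any one app so the frame never overruns. Not a real scheduler.

def schedule_frame(apps, frame_budget_ms):
    """Grant each app GPU time under a naive equal-share policy.

    `apps` maps app name -> requested GPU milliseconds this frame.
    No app may exceed frame_budget_ms / len(apps).
    """
    fair_share = frame_budget_ms / len(apps)
    return {name: min(want, fair_share) for name, want in apps.items()}

# At 90 fps a frame is ~11.1 ms. The fur-rendering pet app asks for all of it.
requests = {"virtual_pet_fur": 11.0, "navigation_lines": 1.0, "fashion": 3.0}
grants = schedule_frame(requests, frame_budget_ms=11.1)
print(grants)  # the pet app is clamped to its ~3.7 ms share; the others fit
```

Even this toy version shows the tension: clamping the pet app protects the frame rate, but it also means the "100% of the GPU" quality the app was designed around simply isn't available once several apps share the view.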

I think Apple or someone will ship an AR device without solving these issues. I think it will fail just like the Apple Newton failed. Too early. Running AR glasses with an iOS-like OS that runs one app at a time is just not a useful paradigm for AR, and solving the problem of letting 5 to 20 apps all run at the same time, each using as much GPU as possible for its specific idea of quality, AND having that all run fast enough to keep latency low so the virtual stuff doesn't swim over the real stuff still has a long way to go. Maybe an M17, but not an M1.


I’m also not an expert but I believe that AR is that much more difficult because any latency between the projected image and the real world will show up as displacement when you move or turn your head. Whereas in VR, because the world is obscured, the brain will accept your artificial world even if there’s ~20ms latency.
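The latency point above is easy to quantify: during a head turn, motion-to-photon latency shows up directly as angular misregistration of the overlay. The head-turn speed below is an assumed, illustrative figure:

```python
# Worked example: how display latency becomes visible drift during a head
# turn. 100 deg/s is an assumed, moderate head-turn speed.

def drift_degrees(latency_ms, head_speed_deg_per_s):
    """Angular error of a world-locked overlay during a constant-rate turn."""
    return head_speed_deg_per_s * latency_ms / 1000.0

# 20 ms of motion-to-photon latency at 100 deg/s:
error = drift_degrees(20, 100)
print(error)  # 2.0 degrees: roughly four full-moon widths of "swim",
              # glaring in AR against the real world, but tolerable in VR
              # where there is nothing real to compare against.
```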


I believe it is happening and it's called Pokemon Go. There should be many more industrial applications than consumer ones though, eg imagine what construction workers could do with it.

The main problem is that it's so personal and low latency, but it also takes so much processing power. You need to spend a lot of time on performance and power work before anyone will use it without their phone melting.


It is happening now, which is one of the points of this piece. Apple has been laying the groundwork bit by bit since 2019, but there’s also lots of AR today, just not in “the metaverse” sense as discussed here quite yet. Snapchat filters, AirTags and UWB, and Google Lens are some of the most popular examples right now. Anything that adds a digital layer to the physical world via a computing device, is AR.


Is this just a fancy/confusing way of saying “Apple is building an AR ecosystem”?


Yes. Apple could be building AR/VR hardware, but if you’re convinced they are, then you’ll start to mentally force everything they publish as a building block for said hardware.


It is a kinda fancy / highfalutin way of saying that. Partly because the article is intended to be somewhat pithy. But splitting it into layers (say, substrate) helped me think about the parts involved, how they needed to build upon the layer below, and work together to have a chance of pulling it off.


And is the "AR ecosystem" the "metaverse" or the "metaverse substrate?"


Apple's Bluetooth LE tags don't seem to be mentioned, but to me this is one of the most notable beachheads for Apple to mix the metaverse with physical reality. We still don't see a ton of uses, but there's something super enticing about being able to casually discover digital things amid the physical. I'm not surprised adoption has been slow, and that we don't have a ton of neat use cases to wow people, but also: I was shocked as heck to see Google retreat from their open-standards competitor, Physical Web/Eddystone.

Apple also makes good use of Bluetooth for AirDrop & Wi-Fi password sharing, something else Google is light-years behind in.

Good writeup, interesting, even if it seems a bit (quite) oversold to me. Makes me think of Benjamin Bratton's The Stack, the many tiers that compose the digital.


That was touched on in the "Find My" section.


> “Find My”-style devices will help tighten up the mapping of the physical world to enable augmentation — really this is part of the platform substrate, but the same tech underlies a key bridge between physical and virtual.

The section has good stuff in it, but to me it came off as me-centric, as about high end experiences like VR, accurate spatial systems.

And that feels like it's a very different piece of the platform than what I was talking about, which is beacons that we can leave for each other. Getting to a bar & having the menu advertise itself. Art installs that come with interesting mini-sites on the browser, or which are cross-media. QR codes are the closest thing we have today, but QR codes require very explicit intent to use, and I've long been interested in the promise of more ambient computing, of seeing the numerous micro- / edge- clouds about me, & seeing what they might offer.

I didn't get any of that from the Find My. I'm looking for Find Your, I guess. Or Share Mine. Which Physical Web did, which Apple's iBeacons continue to do.


This is the feature set needed for Hyperreality.[1] Which is hell with interactive overlays.

AppClips. Drive-by installs. What could possibly go wrong?

[1] https://youtu.be/YJg02ivYzSs


This is what the Android equivalent would look like.


Fantastic video, thanks for linking it. And I'm in 100% agreement.


Hell is exactly how I would describe this.

Well-made, cool video though!


Interesting take, but SwiftUI and Combine are really bad. They claim ‘functional programming’, but there’s so much (leaky) magic involved. The sub par automatic code generation yields slow and laggy experiences. I don’t think Apple has any plan to improve this beyond ironing out obvious bugs and waiting for natural performance yields to kick in.

It is bad enough that it reminds me of Carbon and the related legion of missteps that hurt Apple badly some 20 years ago and pissed off a lot of loyal developers.

Honestly, SwiftUI combined with Apples now complete market dominance is enough to make me reconsider my career path.

Just because they’re so big doesn’t mean they can do no wrong and several of their recent technologies have been regressive.


I am curious why you have such a low opinion of SwiftUI, most third party developers I have heard from are eager to drop Storyboards and switch immediately.

The magic involved is not really magic at all, just Swift syntax sugar. Nothing about SwiftUI is proprietary, you can make your own SwiftUI variant by copy and pasting its code. It's just another library after all. Granted, some Swift features do seem like they were explicitly granted to make SwiftUI prettier, but it's not like you can't use those features in your own code or libraries too.


It’s not just syntax sugar. There is a lot of compile time code generation running behind the scenes.

It also means there’s a lot of hidden determinism where there didn’t used to be.

Storyboards were rubbish, I agree, but SwiftUI is more fundamental. Apple is abandoning the PostScript model which permeates every graphical computing paradigm (WinForms, GTK, Qt etc.), and if you don’t like programming with cotton gloves on, you’re basically not welcome.

All very much a shame from my point of view, because UIKit is easily the most powerful UI toolkit around. It'll be interesting to see how long third parties will continue to be able to link against it directly.


The funny thing about Apple is how people confabulate mythical powers to it. I can't wait for Apple iTulpa[1].

Jokes aside. What is the mission statement here, what is the dream? Make you never want to interact with the real world again? Help you manage your to do list better?

I feel there was much more of a mission statement for mobile back when everyone was stuck on public transit and at the office for hours every day. You could finally make your life on transit and at the office more meaningful by integrating your personal life with that through your phone.

What's the dream now? I don't get where all this is going unless it's like you're never going to leave your house again, so escape into VR.

[1]https://en.wikipedia.org/wiki/Tulpa


At least for me the dream is not having to have information gatekept by the location and size of a screen. With glasses apps can be as tall and as wide as you want them to be, and many apps won't need traditional UI at all because they can just overlay their information over real world objects.

Anything that currently has a paper label or barcode in the real world, could be a candidate for an AR app that helps users find stuff quicker. And rather than having to look through aisles or buildings to get to what you want, you could just use a search bar or virtual assistant to look up the thing you want and get a persistent marker to its exact location.

Basically just imagine the kind of UI a video game presents to a player, and how a well-executed UI blends in subconsciously to the point that you can navigate menus and issue commands without having to think of it. Now imagine that experience but overlaid over real life, where an app can help you remember someone's name or birthday as they approach and you can take a note with your location and current view attached to it. Something as silly as "funny car" while looking at the funny car, or as serious as telling the nurse after your shift where the patient's pain point is.


> SwiftUI [...] doesn’t feel like an AR platform component (Where’s the third dimension?).

I'm unfamiliar with SwiftUI, but fwiw, I found the extensions needed for CSS3D to support AR to be surprisingly small.

A few years back, having a custom browser-based VR stack with passthrough AR, I sketched a talk for the BostonVR meetup, to give around April Fools. It would have purported to be an introductory onboarding demo, of newly available CSS3D extensions for AR. With support for placement in realspace, billboards, HUD overlays, integrated multiple displays, position aware and 3D displays. The extension needed was surprisingly minor. The talk would have been basically "introductory CSS positioning, in a slightly enriched context", which I expected to be quite believable, followed by a "surprise! - the making of the demo spike". I don't quite remember why it didn't happen.

> 2(.5)D experiences

Shallow-3D UIs seemingly have a lot of potential, but don't get much discussion.

Meta: I wish HN discussions around AR were of higher quality. It's been clear for years that Apple was pursuing this. Being unfamiliar with the Apple ecosystem, I'd have liked to see more discussion of what those pieces and their characteristics might suggest about the future. Or, for instance, whether Apple is doing any CEP (complex event processing), or retraction of app state changes, to support input with diverse latencies.


Shallow 3D definitely has a lot of potential. Not everything needs to be ILM effects - even in the long term. Sometimes buttons & text are all you need.


> Sometimes buttons & text are all you need.

And most displays have some privileged focus depth(s). Vergence-accommodation conflict, and eye strain from long use, can be reduced by staying close to that. Ahem, and as my eyes have grown old, the annulus where I can focus well has narrowed.

And maybe, it might be possible to have text in layers, close enough to avoid vergence-accommodation conflict eye strain, but far enough to be tolerably readable, and ideally to allow eye tracking to determine which layer is being read.

I was layering a keyboard-cam-with-annotations over desktop for a bit, to see the kbd in an HMD, but also to explore "graphic artist looking at screen containing processed view of tablet input"-like UX for software dev. The layered clutter was somewhat annoying, so I'm unclear on viability, but it did seem intriguing.

Also, "shallow" still permits some depth. A cm or few at 50 cm (laptop screen) seemed comfortable for me (for some limited frequency of focus switching, and I've only a little experience, and ymmv). That's depth enough for more than buttons and text, but still a very different design regime than "fill the room with XR UI". ILM effects, hmm... someday I'd like to find a stereo drone straight-down view of beach surf, shallow but 3D, to use as desktop background. :) Maybe.


Can someone please define 'metaverse' in the context of... whatever this article is about?

Like jfc people, 'End users will experience the metaverse through the default device delivered experience.' sounds like the most doublespeakiest doublespeak ever to have been written.

I love tech and this lofty shit makes me want to gouge my eyes out. Speak plainly and simply, folks.


Of course, I think I can help.

What is trying to be communicated here is psychographic content will be delivered via standard device substrates existing in well aligned usage patterns that will no doubt delight the end-user with value. Metaverse here simply means they will deliver an experience that is parallel to user expectation yet orthogonal to the digital representation of the physical model of the prevailing social zeitgeist.

Or maybe someone just read Ready Player One recently, and is just thinking: We're going to build the Oasis, but nobody could possibly understand what that is, so I'll have to break it down into easy-to-misunderstand corpobabble.


> We’re going to build the Oasis and

The Black Sun was there first.

Can i interest you in a loose collection of extremely janky perl scripts? The bar’s owner hates it but, who cares? If he kicks us out we can just race bikes for a while until the fuss blows over.


Any idea where a guy can score The Librarian these days? Wikipedia is cool and all, but it's lacking a certain... finesse.

On a more serious note, I think these are different metaverses. Ours is pretty much locked in VR as far as I can tell, and Apple is building an overlay on zeroth-order reality, aka AR. Arguably harder and more powerful if done properly. Which I'm sure it won't be.

:P


> Arguably harder and more powerful if done properly. Which I'm sure it won't be.

My response is kind of stuck between "that's overly cynical, isn't it" and "statistically speaking, that's pretty likely".

(I was actually working at an early AR project at Nokia a bit over a decade ago, at the time called Point & Find, later called CityLens. It gave me a feeling that good AR may be one of those things that remains "just a couple years away" for decades.)


Define "good AR"? Pokemon Go is nearing its 5th birthday. A VRchat overlay you can use on the street? If you get the Epson AR glasses, combine it with the Quest 2's Snapdragon XR2, and mess around in Unity for a bit, that could be done by the end of the year. Create an app that's killer enough to make up for the bulkiness/inconvenience/price of a version 1 streetwear AR headset and we'll be getting somewhere.


That's fair -- with the caveat that your last sentence is asking for a lot. :) I think Pokemon Go did a great job of being an AR game, although I couldn't help but notice that literally everyone I knew playing it turned off the camera functionality after a few weeks and gave up on the AR overlay part.

I don't know what the "killer app" for AR is really going to be. CityLens and comparable apps like Yelp Monocle on smartphones clearly haven't cut it. (Show of hands: how many people remembered Yelp had an augmented reality mode? I don't think it's there anymore.) I think the big challenge now is thinking of applications that aren't games where using goggles/glasses give you more than incremental improvements over putting the UI on your wrist or in your ear(s).


>I don't know what the "killer app" for AR is really going to be.

I wonder if it really is a navigation HUD. For example, i’d love this, with some auditory interface, for things like cycling and motorcycling and also probably normal driving too. Safely presenting salient information in the same depth of focus as the road, and giving me tools to interact with it with voice, would be good AR in those situations, and i’m in them pretty frequently.

Is that a big enough market? Will it inevitably descend into the inverse of They Live?

we’ve given the hardware a shot at least a few times now and wearable glasses tech has not yet been useful. But i can’t think of anything else.


They can also just project the image onto your retina directly, but this requires some serious tracking I don't understand or know if it really exists.


people have been playing with it since the 80s, but it has yet to become more than alpha/research stage.

https://en.wikipedia.org/wiki/Virtual_retinal_display


- High resolution

- Low latency, ideally imperceptibly low

- "Delta-zero", meaning the overlaid virtual reality should never drift from reality itself. This is probably theoretically impossible, so let's just be imperceptible here too.

This should boil down to a completely continuous interface over the sensory domains of all humans (other animals?)

For now I assume we are only talking about the visual and audio senses.

All eyes and ears.


This has been the case in PC VR for years. The Valve Index has a combined 4.6 MP at up to 144hz, with reprojection and motion smoothing so that the image still updates with head movement even if the application hitches. Or now you can get the Vive Pro 2 with 12 MP at 120hz. Or wireless oculus quest 2 with 7 MP at 120hz. The novelty would be a clear screen, or forward facing cameras, and a useful AR overlay.


You remember Blaxxun from the '90s? Man, that was a Metaverse.


That's where my mind immediately goes when I hear 'metaverse' too


Like a hipster wearing a monocle fashioned with metaverse-substrate. The new age user interface is like cup of coffee and a pipe with eco-friendly, translucent tobacco.


I appreciate the irony of your second paragraph


Does this have something to do with the reality distortion field?


>> Speak plainly and simply, folks.

> Of course, I think I can help. (…)[jibberish]

Lol, well done!


Listen to him. He speaks jive.


Completely agree. The words "metaverse substrate" just set my BS meter to a thousand, so I didn't even bother reading the article at first. Then I did read (most of it), and still my BS meter was at a thousand.

If you can't explain what you mean in simple, easy to understand language, and you're not talking about quantum physics, you're just bullshitting. Should be required reading: https://en.wikipedia.org/wiki/Politics_and_the_English_Langu...


As alluded to in another reply, it is a bit of a BS term!

But it’s also not totally unfair. I think much of the metaverse & AR ecosystem doesn’t have great terminology.

I mean ‘slap some goggles on and touch floaty doo-hickeys that you see but they aren’t really there’ doesn’t quite have the same curb appeal…


I assume it's a reference, at least in part, to the virtual reality world in Neal Stephenson's "Snow Crash".

https://en.wikipedia.org/wiki/Metaverse


Which is where this falls apart. In the book the Metaverse is basically a completely novel location in “VR”.

If we’re pursuing a Metaverse, VR Chat is closer to a Metaverse than Apple.


Ads that follow you around in VR. There will be no escape.

Also, Apple will try to make useful things appear in your VR goggles - like showing you how to get home on a map, when in fact you're planning to go out for a drink, or go grocery shopping, or anything but.

This is a terrible article, but the idea is - if not sound, at least potentially interesting.

It's basically enhanced location-aware cognition, taking input from everything around you - sound, video, tactile input, GPS location, event history, AI-enhanced memory - and combining it in real time to produce (checks notes...) "a new class of life-enhancing interactions."

The problem? It needs a lot of moving parts working together seamlessly with better-than-human performance and reliability.

Otherwise it will be a kludgy distracting nightmare of failed meta-everything - like the most annoying and useless intern anyone has ever had, only everywhere.


HYPER-REALITY from 2016 is a very well done imagining of that AR advertising dystopia. https://www.youtube.com/watch?v=YJg02ivYzSs


> Otherwise it will be a kludgy distracting nightmare of failed meta-everything - like the most annoying and useless intern anyone has ever had, only everywhere.

Clippy?


Can I just get Alexa on a drone that perpetually follows me around and broadcasts (or projects visually) occasional targeted advertising? /s


Don't give them ideas, the marketing team is everywhere

/s (I hope)


Except it won't go on a coffee run.


Great, then I look forward to another useless technobauble being hamfistedly shoved down my throat.


> Can someone please define 'metaverse' in the context of... whatever this article is about?

Personally, I dislike the term and I believe that it causes people to lose credibility when they use it. However, I can define it to some extent.

Consider the term 'universe' in the context of 'a particular sphere of activity, interest, or experience'. For example, the 'Harry Potter universe'.

The 'metaverse' is simply such a 'universe' that you can't really opt out of, because it's a layer on top of, rather than distinct from, the regular old world. This is why AR is often considered to be such an intrinsic aspect of it.

Now, you might say that 'the internet' is just such a metaverse, but it isn't because it doesn't benefit the people who peddle the buzzword.


I guess I don't get what's so meta about it. We're not talking about a universe that describes universes.

Metaphysics = the physics of physics

Metascience = studying science itself with science

Metaphilosophy = Philosophy about philosophy itself

Metadata = data describing data

Gaming 'meta' = the game of how to best play the game

'Meta' is one of those ultra-abstract terms that obfuscates the meaning of whatever word it is attached to unless the listener gives or has given the word a fair amount of thought. Ah, then maybe that is why it is being used here.


> Metaphysics = the physics of physics

Metaphysics isn't really the "physics of physics" though. It is part of philosophy, not physics. It isn't even the "philosophy of physics", since that names a branch of philosophy distinct from metaphysics (albeit they do have some points of overlap.) This is an example of where the meaning of a term is different from the sum of its parts.

The philosophical implications of quantum theory (and its interpretations) are one part of the philosophy of physics, but the study of those implications goes beyond just metaphysics proper and also extends to other areas of philosophy such as the philosophy of mind, epistemology, the philosophy of logic, etc. Conversely, there are some debates in metaphysics with which contemporary philosophy of physics doesn't concern itself much, such as the relationship between existence and essence, or the problem of universals.


I think that in the context of AR they actually are trying to talk about "universes that describe a universe".

A "metaverse" is a "universe" in the sense that it's a shared 4D spacetime that you can explore (using some technology to give you a window into it). It's "meta" because it's layered on top of our "base universe": It's there to describe and augment the physical universe we exist in.

For example: If, in your vision (via glasses, or a HUD, or a pane of glass or a brain implant), there's a big 3D arrow hovering above a road then the arrow exists in the "metaverse" and the road exists in the physical universe that the metaverse is describing.


The definition of meta has changed over time. My understanding is it got the current abstract "thing about itself" definition long after the term "metaphysics" was coined, which probably originally meant "things we don't have a good name for but should probably file after the stuff about physics".


Exactly, I believe the term comes from Aristotle's work "τὰ μετὰ τὰ φυσικά", literally "the [books] after the Physics", which may have just had a very practical meaning: writings that come after the physics writings.


Why wouldn't you be able to opt out of some AR world?

I thought metaverse just meant Second Life 2.0.


You're right, that wasn't the best choice of words. I just meant that it would be difficult to ignore or be uninfluenced by it, at least if "THE metaverse" came to exist, rather than several fragmented attempts.


In general, the term 'metaverse' is commonly used in the AR/VR space to describe virtual worlds. Today, in (virtual) reality it feels somewhat analogous to an "immersive operating system". It's the entry point for doing other things on a VR system.

Some examples: Roblox, Facebook Horizon, VRChat, NEOS


It's right there in the first paragraph:

"I’ve been less sure they understood the full scope needed to build a compelling AR ecosystem – a metaverse."


I saw that, and the definition still says nothing.

Ok, so a metaverse is somehow the realization of the "full scope" needed for a compelling AR ecosystem? And what exactly does that mean, and how does that define this particular usage of the (heavily overloaded) term 'metaverse'?

Meta meaning 'about', so a universe about a universe?

edit/ugh: googling the term says 'a virtual-reality space in which users can interact with a computer-generated environment and other users.'

Like wtf does that have to do with the meta root word here?

There is far too much abstraction in this vocabulary defining a thing that is very real.


To understand the metaverse first you need to understand the metaverse. That's what I take from these comments and the article.


mind_blown.gif /s


It's the universe (the regular meatspace one) augmented with metadata, traditionally named such in AR/VR circles. Metadata like names, statistics, directions, etc. Such as a HUD you'd see in first-person video games.

Another use of the term (not here) would be a way to jump between VR systems, but they never generalized well enough to make that a significant thing. Right now Steam would count.


The Metaverse in this context is a continuously updated digital copy of the real world, where the "environment" is generated passively by users.

Imagine a city park where a handful of people have glasses with cameras on them that are continuously sending a geo-located feed of images to a map server, which processes it into a 3D reconstruction. You could watch the city park being reconstructed and updated in real-time 3D, add virtual content, make content "layers" for different digital assets, track interactions, etc.

It's a globally crowdsourced, consistently updated real time digital twin of the world.
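To make the "continuously updated" part concrete, here's a toy Python sketch of the bookkeeping such a crowdsourced twin implies: quantize each geo-located observation into a voxel and keep only the freshest sample per voxel. All names here (Observation, VoxelTwin) are hypothetical; a real system would run photogrammetry/SLAM, not this, but the merge-latest logic is the same idea.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    lat: float        # degrees
    lon: float        # degrees
    alt: float        # metres
    color: tuple      # (r, g, b) sample from a camera frame
    timestamp: float  # seconds since epoch

class VoxelTwin:
    """Toy 'digital twin': quantize observations into voxels and keep
    only the most recent observation seen for each voxel."""

    def __init__(self, cell_size_m=0.5):
        self.cell = cell_size_m
        self.voxels = {}  # (i, j, k) -> Observation

    def _key(self, obs):
        # Crude metres-per-degree conversion near the equator; a real
        # system would use a proper geodetic projection.
        x = obs.lon * 111_320
        y = obs.lat * 110_540
        return (int(x // self.cell), int(y // self.cell),
                int(obs.alt // self.cell))

    def ingest(self, obs):
        key = self._key(obs)
        current = self.voxels.get(key)
        if current is None or obs.timestamp > current.timestamp:
            self.voxels[key] = obs
```

Many users streaming `ingest` calls against a shared `VoxelTwin` is, in miniature, the crowdsourced-update model described above: stale data gets overwritten by whoever looked at that spot most recently.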


I was also confused and clicked just to try to figure out what the headline meant. In the very beginning of the article there's this line:

> I’ve been less sure they understood the full scope needed to build a compelling AR ecosystem – a metaverse.

Seems this is about augmented reality being integrated into the regular experience of a significant population.


I could be wrong, but sifting through that utterly terrible corporate buzzword filled whatever the hell that was...

I think it's what Apple's calling their new Augmented Reality platform...I think...or what the author is calling what could be Apple's potential AR platform...maybe?


A highly-reflective model of Earth's visible and intangible aspects outlined by David Gelernter (1991) in his book “Mirror Worlds”.


Basically a bunch of wanking about futuristic AR sci-fi technology that the author has decided is proven to be coming, but actually isn't.

See flying cars 20 years ago.


Slightly adjacent to the article, I don't think Apple cares all that much about the AR metaverse/ecosystem.

My own prediction, using many of the same data points as the author, is that Apple is trying to create a suite of features that act as a sort of Software Personal Assistant. For years Apple has consistently put in better sensors and larger TPUs than strictly necessary for the expected lifetime of the device. We're already seeing some of the results of this with the Health app.

Apple understands the profit potential of platforms, so they'll make some of the data that enables these features available to app developers, and that in turn may enable AR, but I doubt any Apple executives are seriously focused on bringing AR capabilities to developers.


I hope you're wrong. I understand that the technology for the dream that "Google Glasses" was selling isn't quite there, but I hope Apple has a research team moving closer and closer to it.


Non-gaming AR tech will have to be extremely good out of the gate to dodge the issues that sank Google Glasses. I suspect it will remain an active research project for as long as it takes, and in the meantime Apple will chart their product roadmap along a sequence of technologies that they can be confident of having ready on 1-5 year timescales.


They may still make that, but I suspect developers will not get access to the full power of the platform.

A thought I should have fleshed out in the above post: the kind of sensor data that enables AR can also be used to deduce a great deal of personal information[0]. With Apple taking a strong position on privacy I suspect they will make only a limited subset of data available to developers.

[0]: https://www.researchgate.net/publication/332386880_Privacy_I...


> I suspect they will err on the side of hiding this data from developers.

More likely, they will allow developers some mechanisms for leveraging the data on device, without being able to exfiltrate it.


"Not quite there" is an understatement. The advancements necessary in optics, battery technology and GPU performance for a practical AR wearable are considerable, this is without considering the need for a fashionable form factor and practical heat dissipation. IMO we're at least a decade out from a serious MVP.


>> SwiftUI is getting more capable with every release, and appears to be the long-term UI platform for apple. It also isn’t just about ‘below the glass’ experiences.

SwiftUI 2 is so much better. I am taking a 50 hour course, and spent a few hours hacking this morning. So much better than when I tried it 14 months ago.

Spatial Audio works really well. They are probably using something called Head-Related Transfer Functions, which I used at my company's VR lab in the mid-1990s.
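For the curious, the core HRTF trick is just convolving a mono source with a per-ear impulse response (HRIR). A minimal numpy sketch, with the tiny hand-made arrays below standing in for real measured HRIRs:

```python
import numpy as np

def binauralize(mono, hrir_left, hrir_right):
    """Render a mono signal binaurally by convolving it with a
    head-related impulse response for each ear. The two HRIRs must
    have equal length so the channels line up when stacked."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)  # shape: (samples, 2)

# Toy stand-ins for measured HRIRs: sound reaches the left ear
# earlier and louder, as if the source were off to the left.
hrir_l = np.array([1.0, 0.3, 0.0])
hrir_r = np.array([0.0, 0.6, 0.2])
```

Real spatial audio also updates the HRIR pair continuously as your head turns (head tracking), but the per-frame math is just this convolution.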

The AR Swift Playgrounds examples are also pretty cool.

I agree with the general premise of the article!


The conclusion almost followed the points, except for one major failure of imagination: VR. Apple will release a VR product, but it can only do so after the migration to Apple Silicon. Anyone familiar with M1 performance and VR games on Intel will immediately see why. Did anyone else notice what specific use case Apple touted for the newest generation of Mac Pro? VR content creation. Very resource hungry. Apple can't release the VR device we know they could without giving content creators the tools.

What makes VR more than just a gaming product? Giving everyone with an Apple laptop the ability to create content for a metaverse. You might be thinking Ready Player One; I'm thinking more like a better Roblox. There's massive ARR potential in a VR metaverse ecosystem where everyone can be a content creator / inhabitant. Apple is the only company that could put it all together and make it work from soup to nuts.


Some of this stuff is cool, and I can see how it might fit together (although I doubt half of it will pan out). But some of it just seems like nothing.

The Shazam stuff seems very limited in use case, and I'm annoyed at "AppClips" because they should just be websites.

Notes+Spotlight+Shared with you all seem like they're inventing new paradigms to avoid ever adding a user file system on iOS, since Apple is opposed to that. It's possible that those new paradigms will be great, but I think they'd be better if Apple gave up a little control.


https://en.wikipedia.org/wiki/Files_(Apple)

That's existed for 4 years. A user file system is effectively available on iOS, even if it's not exactly what you might expect coming from a desktop OS.


AppClips and AR actually fit together really well. If a museum, theatre or other space has an AR app, it’ll be a lot smoother to get people to use it if they just have to tap their phone on a thing and then hit “OK”. Then when they leave the venue and no longer need it it’s gone.


> Device-local speech recognition is fundamental to AR — we can’t tap our feet & twiddle our thumbs while we wait for a round-trip to the cloud to conclude how to handle “Group these five items, and remember them for next week when I’m at the office”

I'm not convinced that this is true. Round-trip latency has gotten a lot lower, especially if you take edge computing into account. 20-30ms round-trip is not unusual. If your mic feed is being streamed in real-time, and the program is able to achieve a high-confidence prediction at (or before) the moment you finish speaking, I think it should be possible to deliver an experience that feels instantaneous, at least for users with cutting-edge connectivity. For visual interactions, 100ms feels instantaneous; I bet tolerance for spoken interactions is even higher.


Author seems confused about what a metaverse is. Here's my attempt at actually building a metaverse substrate: https://substrata.info/


This guy is way too excited. He's thinking he's gonna get Cyberpunk 2077 but in reality he's getting a combo of Wall-E and The Final Cut.


I both welcome, and fear, my floating armchair to take me on my doomed Buy-n-Large interstellar mission!

I wouldn’t say /too/ excited. Maybe ‘had his brain pulling a Charlie from Always Sunny in Philadelphia with that Pepe Silvia meme thing’.


Are you out of your ducking mind?

Sent from my iPhone


Technobabble aside, I'm betting the battery life on the Apple Glasses will be as bad as on the Apple Watch or worse.

And people with prescription glasses won't be able to afford the Apple Prescription(tm) anyway.


Marketing buzzwords.


Anyone aware of any good podcasts about AR/VR?


This very light grey text on white background = almost unreadable. (Firefox 89 on linux)

https://contrastrebellion.com/


Apologies for this -- I apparently failed to test on dark mode in Firefox. I fixed it up just now (https://github.com/grork/personal-blog/commit/b9df1e924ef296...), and it should be much more readable.


Much better, thanks!


Reader mode to the rescue! It is almost the default way I consume most blogs and news sites, reading experience is so much better!


Same here. I use it on every other website and it's just great.


"Please don't complain about website formatting, back-button breakage, and similar annoyances. They're too common to be interesting. Exception: when the author is present. Then friendly feedback might be helpful."

https://news.ycombinator.com/newsguidelines.html

Edit - ah: I see the author is present. Ok then!


Interestingly, the background is black for me in Chrome, and white in Firefox (both under macOS)


Almost seems like Chrome has some default styles for dark mode, such as a black background on <body>.

Edit: TIL about the color-scheme (draft) CSS property. FF doesn't support it.


similarly, it renders light on black/dark grey for me in safari.


> "Please don't complain about website formatting, back-button breakage, and similar annoyances. They're too common to be interesting. Exception: when the author is present. Then friendly feedback might be helpful." - https://news.ycombinator.com/newsguidelines.html


Something is screwy with the dark-mode CSS.


Yep, it's fine unless you use a dark mode, in which case it looks like this:

https://i.imgur.com/7XwxrSb.png

If you're not using dark mode, it's #EBEBEB on #121212, which has plenty of contrast:

https://contrast-ratio.com/#%23EBEBEB-on-%23121212

If you're using dark mode, it doesn't simply do nothing, which would be the reasonable approach with those colors; instead it gives you that light gray text on a #FFFFFF pure white background. It's basically the opposite of dark mode.
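For anyone who wants to check numbers like that themselves, the WCAG contrast formula is short enough to inline. A quick Python sketch, using the hex values quoted above:

```python
def srgb_to_linear(c8):
    """Linearize one 8-bit sRGB channel (WCAG 2.x formula)."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hexcolor):
    """Relative luminance of a 6-digit hex color (no leading '#')."""
    r, g, b = (srgb_to_linear(int(hexcolor[i:i + 2], 16)) for i in (0, 2, 4))
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(fg, bg):
    """WCAG contrast ratio; order of the two colors doesn't matter."""
    hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

print(contrast("EBEBEB", "121212"))  # intended palette: ~15.7, plenty
print(contrast("EBEBEB", "FFFFFF"))  # the broken combo: ~1.2, unreadable
```

WCAG AA asks for at least 4.5:1 for body text, so the intended palette passes easily and the gray-on-white dark-mode result fails badly.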


Apple makes the idea of Plan9 viable using wireless technology and smart protocols.


[flagged]


Well, 'metaverse', here, comes from 1992's Snow Crash and has been in use for nearly 30 years. Not really just the past few months.


If this is correct, Apple's vision is that in the future, all their customers will be glassholes. They could be right. I never expected people would walk around wearing iDweebs, those silly headphones.


Apple keeps trying to make AR a thing. Bless their heart...


"Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something."

https://news.ycombinator.com/newsguidelines.html


Bless their heart, and their $204b in cash, and their team of 1000+ engineers dedicated to AR.


I'm really looking forward to not engaging with any of this stuff at all!


Some of the ideas in the article make a lot of sense in the context of AR, specifically the parts about on-device Siri and Spatial Audio.

However…I’m sorry, the article as a whole just leaves me with the taste of Apple butt-licking.

So many parts of this "metaverse" are typical recurring-revenue and ecosystem lock-in plays.

Find My: Let’s sell some high profit margin trackers and keychains.

SharePlay: let’s sell more TV+ subscriptions.

Universal Control: platform lock-in; makes you question using a Windows desktop instead of an iMac if you already own an iPad.

Spatial Audio: sell more headphones

Notes features: sell more News+ subscriptions, more vendor lock-in.

ShazamKit: data-mine advertising info. Yes, Apple has ad platforms, and with each passing day their business model creeps closer and closer to Google's (e.g., when they inevitably drop Yelp in favor of their own review system, or if they launch their own search engine).

(Apple can still mine data and use it to sell you things and understand what you're likely to buy even if they "respect" your privacy and leave all the data on-device. They can still get you to click an affiliate link or buy an app without leaving the realm of on-device ML.)

And remember that Apple’s marketing is not the same as their documented privacy policy.


Heh, I'd say at least Apple is differentiating...

Google:

GMail -> Let's sell some ads

Android -> Let's sell some ads

Android Auto -> How can we put ads in cars?

YouTube -> How can we show people more video ads?

Search -> Let's sell some ads.

Google Music -> Did this sell enough ads? If not, let's rebrand it and try to sell more ads.

etc.

Anyway, not a slight against Google, but you can do this with any large company probably. I'm not sure how meaningful it is.


Oh please describe all of these Apple ad platforms.


Sure!

https://searchads.apple.com/

https://www.businessinsider.com/apple-hires-chaos-monkeys-au...

The App Store has sponsored apps.

Apple News has ads in the app, before you click on any articles.

Apple Music/iTunes Store/Books has promoted artists/albums/movies/TV. It’s unclear whether any of these are paid promotions by the content producer but they’re in a banner so I have to assume Apple gets something out of it.

Apple Podcasts also has a featured section, likely paid ads.

And, as I mentioned, it’s extremely likely that Apple is working on a replacement for Yelp for Apple Maps, which will potentially be a big source of revenue.

And of course, there are ads for Apple’s own in-house TV+ content. Not an ad network, but if they know what you like they can better promote their TV shows. By default the TV app sends random push notifications.

The more Apple gets into content the more knowing your personality and interests is important to their business model.


So… one search “platform”, which is just app ads in the store. Ok.


This is the most biased take on useful platform differentiating features I’ve ever seen.


Indeed, it’s totally nuts to suggest that the world’s most capitalized company might be focused on revenue growth instead of the “metaverse substrate”.


Meh, I think that's not out of line with how other vendors (e.g. Google) are treated here right now.



