Apple's Shifting Differentiation (stratechery.com)
292 points by kaboro on Nov 11, 2020 | 317 comments



> it is possible that Apple’s chip team is so far ahead of the competition, not just in 2020, but particularly as it develops even more powerful versions of Apple Silicon, that the commoditization of software inherent in web apps will work to Apple’s favor, just as its move to Intel commoditized hardware, highlighting Apple’s then-software advantage in the 00s.

I think Ben is missing something here: that the speed and specialist hardware (e.g. neural engine) on the new SoCs again give developers of native apps the ability to differentiate themselves (and the Mac) by offering apps that the competition (both web apps and PCs) can't. It's not just about running web apps more quickly.


It's a nice idea in theory, but I don't see Apple putting in the effort to make this fruitful.

For example, we just saw an article rise to the top of HN in the last couple of days about the pathetic state of Apple's developer documentation. Their focus seems to be less on providing integrations into their hardware and more on providing integrations into their services. Meanwhile, developers increasingly distrust Apple because of bad policies and press around App Store review. It's a mess.

I agree that Apple could and should help app developers use this cool new hardware. I'm sure there are good people at Apple who're trying. But the company as a whole seems to be chasing other squirrels.


There are some areas where Apple is prioritizing getting developers on board with their hardware, and the neural engine seems like one of them.

Over the past couple of years, coremltools [1], which is used to convert models from Tensorflow and other frameworks to run on Apple hardware (including the neural engine when available), has gone from a total joke to being quite good.

I had to get a Keras model running on iOS a few months ago, and I was expecting to spend days tracking down obscure errors and writing lots of custom code to get the conversion to work -- but instead it was literally 3 lines of code, and it worked on the first try.

[1] https://github.com/apple/coremltools
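For anyone curious, the happy path with the unified converter in coremltools 4+ really is about that short. A minimal sketch, assuming a saved tf.keras model (the paths and names here are placeholders, and exact arguments vary by model type):

    import coremltools as ct
    # "my_model.h5" is a placeholder path to a saved Keras/tf.keras model
    mlmodel = ct.convert("my_model.h5")
    mlmodel.save("MyModel.mlmodel")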


You're earning money with a model deployed on an iOS device? Now that's an achievement. It's rare enough to actually get productive models in the first place, but then doubling down on less powerful hardware than you could get with AWS is just mind-blowing to me in a production context.


It's the age-old thin client vs. fat client debate repeating itself. It seems like as the chips & tools get more mature we'll see more and more model deployments on customer hardware. Transmitting gigabytes of sensor/input data to a nearby data center for real-time results just isn't feasible for most applications.

There's probably lots of novel applications of AI/ML that remain to be built because of this limitation. Probably also good fodder for backing your way into a startup idea as a technologist.


Suppose you want to do object detection on a phone’s live camera stream. Running your model on AWS is probably infeasible, because you’re killing the user’s data plan while streaming frames into your remote model, and network latency kills the user experience.

On-device detection (“edge ai”) is gaining steam. Apple recently purchased a company called xnor.ai which specialized in optimizing models for low power conditions.
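A rough back-of-envelope sketch (the frame rate and frame size below are assumptions for illustration, not measurements) shows why streaming frames to a remote model is a non-starter:

    fps = 30                                  # assumed camera frame rate
    frame_kb = 100                            # assumed size of one compressed frame
    mb_per_sec = fps * frame_kb / 1024        # ~2.9 MB/s upstream
    gb_per_hour = mb_per_sec * 3600 / 1024    # ~10 GB per hour of camera use
    print(round(mb_per_sec, 1), round(gb_per_hour, 1))

Even before latency enters the picture, that kind of upstream traffic rules out a cloud round trip for live detection.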


One clear example of this is audio-related apps. iOS has a rich ecosystem of DAWs and VSTs because the platform seems to be much better at low-latency audio. You don’t find the same on Android.

That’s a result of Apple putting effort into hardware + software to make that happen.


In the keynote they showed DaVinci Resolve running on a MacBook Air with impressive performance. They could have easily stuck to demos using Final Cut like they often do, so this seems like a pretty good sign that from day 1 they do care about 3rd-party software running well. They've also been showing more and more games, which are obviously also performance sensitive. I fully expect Tensorflow models and other major libraries will be able to take native advantage of the Neural Engine in the near future as well.


Very largely agree (and the chasing squirrels analogy made me laugh!) but of course the speed comes without any extra effort from Apple - so if your native app becomes attractive because it's now that much quicker - say some form of video editing - then you're good to go.


Not quite. If your native app becomes attractive, Apple might replace you with a built-in clone and then use that as the reason to kick you out of the app store.

If I remember correctly, that's what happened with f.lux.


f.lux was never allowed in the iOS App Store, because it needs private APIs to change the screen color temperature.

Was it on macOS App Store at one point and then kicked off?


The term GP is referencing is Sherlocked[1]. Speaking as someone familiar with the iOS jailbreaking ecosystem circa 2010, the term could definitely be lent to apps that come from outside the walled garden.

That said, it would be silly of them not to in some of these most obvious cases: a flux/redshift-comparable feature is now built into most OSes as we’ve become attached to our devices, and Sherlock was argued by critics of the term to be a natural progression of iterating on their file-indexing capabilities.

[1]: https://en.wikipedia.org/wiki/Sherlock_(software)#Sherlocked...


It's not properly built in. OLEDs support full conversion to red-only light, allowing you to preserve your night vision. No other app or built-in implementation except f.lux and cf.lumen allows turning off all colors except red. This is the main reason that I jailbreak my Android phone (a OnePlus). Not ad blocking, not side-loading apps, but because I don't want my eyes destroyed every night when I try to go to the bathroom and use my phone as an impromptu flashlight...

What the fuck guys? Do you just not care about astronomers? Why is it that no one has properly implemented all of f.lux's features?


On iOS have you tried Settings > Accessibility > Display & Text size > turning on Color Filters and sliding intensity and hue to the far right? Maybe set the triple click shortcut to Color Filters?


I am not aware of a sherlocked app being kicked out of the App Store for duplicating features of Apple’s version, though. That was quite a bold claim, asserted without any example.


Yes, but there have been instances of Apple (and in general, ARM as well, though not as much) adding special instructions to the A-series chips to make JS execution faster. They can make web apps run better than on other platforms. And then they can make native apps run even faster. Both are advantages which the competing ecosystem doesn't have.


The special CPU instruction for JS thing is a bit silly. Javascript defines its floating point rounding behaviour as what Intel CPUs do. ARM behaves differently by default, so they implemented an instruction to emulate the Intel behaviour specifically to avoid Javascript running too slowly by implementing the expected behaviour in code.

This doesn't give ARM chips an advantage over Intel CPUs at executing Javascript for obvious reasons, once you know why they added it.
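For the curious, the semantics being accelerated are roughly JavaScript's ToInt32 conversion (truncate toward zero, then wrap modulo 2^32 into the signed range), which the ARMv8.3 FJCVTZS instruction is designed to handle in one step. A rough sketch of that behaviour in Python:

    def js_to_int32(x: float) -> int:
        # Approximate JavaScript ToInt32: truncate toward zero,
        # then wrap modulo 2**32 into the signed 32-bit range.
        if x != x or x in (float("inf"), float("-inf")):
            return 0                      # NaN and infinities map to 0
        n = int(x) & 0xFFFFFFFF           # truncate, keep the low 32 bits
        return n - 2**32 if n >= 2**31 else n

    print(js_to_int32(2**32 + 5.7))       # 5, same as (2**32 + 5.7) | 0 in JS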


> This doesn't give ARM chips an advantage over Intel CPUs at executing Javascript for obvious reasons, once you know why they added it.

I believe the point is that it's one example (of hundreds, maybe thousands?) of performance paper-cuts addressed by Apple hardware that result in significant performance-per-watt advantages over devices using Intel CPUs.


And it effectively just sets the FPU flags, runs the instruction, then resets the FPU flags. It just saves the JS VM from having to change FPU flags repeatedly.


Could you point to one of those (an Apple-only instruction added for the sole purpose of accelerating JS)?


Ah, also Jazelle.


There may always be a small group of dedicated Mac development enthusiasts, but the last decade has shown that companies will trade performance and frame rates for maximum usefulness and time to market.

Apple's performance boosts will reward those companies who never valued performance. Why would they change their approach now?


I side with you. It is not about a faster chip, nor hardware over software. What Apple understood is that specialized chips or hardware allow for even better software and therefore services.


> specialized chips or hardware allow for even better software and therefore services

Based on the front page of HN, "better" is not a word I'd choose


Apple is also working against itself in that department. As far as I know a webapp does not need to be approved by Apple to go live.


The assumption that Apple wants to approve all apps is wrong.

Apple sees App Store apps, and web apps as having different advantages, and it is in their interest to have the best platform for both.

It’s not just me saying this, they keep saying it too, and proving it by investing in making web apps run better.


And yet they've completely nerfed PWAs on iOS. So they can say what they like; I don't believe it.


Not everything they do will fit your view of how they should support the web.

Believe whatever you like.

But this means you are also choosing to ignore absolutely enormous investments at every level of the stack that they have made to increase web performance, adopt standards and improve web user experience.


It's hard to square the claim that they've made enormous investments when you compare feature support against budget. Apple's operating budget is 200x Mozilla's ($85B vs $0.45B), and yet Safari lags significantly behind Firefox in features (and for some use cases, in performance as well).

Disclosure: I work at Google on JS libraries, and at one point was in the Chrome org, but my opinions are my own.


It's reasonably clear that Apple's main focus for Safari is on memory, power use and performance rather than features so that doesn't in any way disprove that they are investing heavily in it.

You'll probably know better than me but Apple's work on Webkit was presumably worthwhile enough for Google to fork it into Blink (no problem with that but maybe worth acknowledging that fact).


I think it highlights a distinction between Apple's past work and their recent work.

Google forked WebKit into Blink because for whatever reason they were unsatisfied with the state of WebKit -- nominally because they wanted to take a different approach to multi-process, but there may have been other technological and project direction/pace disagreements. Since then, a number of browsers have switched from WebKit to Blink/Chromium as their engine, and arguably Safari is falling behind on new features and overall quality (weird quirks that require web devs to work around).


Don't disagree on features and some aspects of quality but I think it's a mistake not to recognise that the overall user experience also depends on other factors.

If Apple's focus is on getting better power consumption and memory use (esp on mobile) then that's still investing and arguably that does as much if not more for users and the web than adding more features.

PS Let's not forget that Apple are still standing behind WebKit when Microsoft have given up on their own rendering engine so let's give them some credit for helping to avoid a Chrome only web.


Google has a strategy to destroy other platforms and replace them with Chrome.

It’s hardly obvious that this is best for users in the long run.

I point this out not to say it’s wrong for them to attempt this, but because it makes no sense to use Google’s strategic decisions as evidence of Apple’s intentions or investment.


Safari is known among many web developers as the worst browser to develop for.


I would agree if we poor souls didn't still have to support IE11.


> Apple sees App Store apps, and web apps as having different advantages, and it is in their interest to have the best platform for both. It’s not just me saying this, they keep saying it too, and proving it by investing in making web apps run better.

The method to pin a webapp to the home screen is substantially worse than it used to be.

There was a period where javascript running in Safari ran significantly faster than javascript running in a webapp opened from the home screen, or in any other app opening a web view. Is that still the case, or did they decide to share that function?


It hasn't been the case for years.


I think you're confusing the Mac with iOS. Native Mac apps don't have to be approved by Apple unless they are on the Mac App Store.


Apple has made publishing apps without their approval significantly harder during the last couple of years. With pretty much mandatory notarization, several app developers have just stopped developing macOS software due to the increasing amount of restrictions and process involved.


Very fair comment. I guess that most developers don't see it as a huge issue though (I've not seen any issues with apps that I use on Catalina). The direction of travel is towards more onerous Apple involvement.


To the contrary, most apps I have to install from outside the App Store now aren’t signed and need the workaround (right click, Open, OK the scary dialog, open again) to run.


You make this sound like a new problem. You have had to do that for unsigned apps for years, and at least in my experience the majority of apps that come from outside the App Store are unsigned. This has been a thing for so long that I almost instinctively right click, Open the first time I run an app I downloaded from the internet.


Agreed, but I think it's stretching it a bit to call it a 'workaround' when it essentially tells you what to do :)


Are you really ‘scared’ by the dialog?


Do you not think we're moving towards a future where both platforms converge into just iOS?

I think we can't ignore that the iOS install base has been larger than the macOS install base. Start looking at it that way and the iOS way of doing things is the norm in the eyes of Apple, and macOS is the odd one out.


> Do you not think we're moving towards a future where both platforms converge into just iOS?

Could be, but as far as the new cross-platform frameworks are shaping up right now, it looks like their strategy is slightly different. Apple is seemingly creating a developer ecosystem to loosely describe interfaces and share them between platforms while they ultimately decide how your UI is rendered. Maybe you're right and one day that means flicking a switch and everything is unified. I also look at something like iPadOS, for instance, which started as extremely similar to iOS and has now diverged and become its own thing, different to both the Mac and iPhone.


"Everything should be made as simple as possible, but no simpler."

There should be limits on this. It's sad to be the baby outside the window, drenched in bathwater.


I think the bathwater is the least of the defenestrated baby's problems. :)


It looks like Apple is pushing towards having a single app store; see the news about the new Macs being able to run iOS apps. I can definitely see them eventually moving to only allowing apps from the iOS store to be installed on Macs, the same as how iPads and iPhones are now.


I wonder if the next step after that will be Apple dictating what new apps are desirable in the App Store, and then a bit further, Apple making apps themselves and only outsourcing the support or signing franchise deals. Once the hardware can only be repaired by Apple, they could move completely into a subscription model where you could subscribe to, e.g., an Office or Streamer package that would include a laptop for two years and a predefined set of applications.


Why would anyone subscribe to that?


You will find that many people would pay to have the worry of choosing the right laptop and software completely removed. They also won't have to worry about repairs etc. as long as any damage is accidental or from a manufacturing fault.


That’s already why people buy apple and AppleCare.

What you are describing is taking this even further in the direction of an information appliance.

I am unconvinced that there is any benefit that your model provides that Apple does not already.

You can already just buy a Mac with AppleCare and install MS office from the App Store.

People may want their choices to be simplified, but they are also going to need to be able to use whatever important new thing comes along. E.g. Zoom or Slack.


I think more people would buy a subscription instead of forking out a few grand at once. It will be like financing, except you won't own the laptop. I also understand this is quite stupid, but I feel that this is the direction Apple is going to go to extract even more money from their target audience.


Leasing the hardware makes sense - basically like the iPhone upgrade program but extended to macs.

I know people who upgrade their Mac every time there is a speed bump and just sell the old one. They would presumably be candidates for this.

I wouldn’t be surprised to see this.

I just don’t see any reason Apple would tie the leased hardware to a limited software bundle.


For sure, but in a world where you have near-infinite CPU power, RAM, and performance per watt, the difference between the two would not be noticeable. I really think that the gains made in performance have enabled people to be far more okay with Electron. 6-10 years ago VS Code would have been seen as a non-option; now it's just 'bloated' compared to editors like Nova.


> that the speed and specialist hardware (e.g. neural engine) ...

Yes. For one, there seems to be no unified API for GPU computing in the PC market. And Microsoft doesn't seem to be interested in doing a DirectX version of it.

And for an NPU, which is increasingly important for things such as speech recognition (for many parts of the world where languages aren't easily typed, it is the default way of input on the phone), photo face recognition without using the cloud, and increasing use in graphics, video and audio productivity apps, there isn't even any specific hardware in the PC market. (That is why Intel is desperate to push Xe as a co-processor.) And it will be years if not a whole decade before a PC part comes out and reaches a large enough market volume.

And it sort of makes you wonder, 3-4 years down the road when the M1 (excluding memory) becomes a $20 SoC, will we see a variant of the Mac cheap enough to hit back at the 1.5B-unit Windows PC market, where Apple currently has roughly 110M Macs?


Good points, but I suspect Apple is more interested in the $1000 business PC market - so, say, a videoconferencing app that was much, much better as a result of using Apple's APIs might be a handy way of selling more Macs to business.

Of course set against that they've lost the ability to run x86 Windows in a VM for legacy business apps (but not sure many were doing that anyway!).


How long 'til there's a WebML spec that abstracts over the neural engine, TPU, etc.? Similar to WebGPU and WebAssembly SIMD.


Being worked on now as the Web Neural Network API:

https://github.com/webmachinelearning/webnn/blob/master/expl...


Too Long


I thought this statement from the article had more than a hint of truth to it: "Figma in Electron may destroy your battery, but that destruction will take twice as long, if not more, with an A-series chip inside!"


This will allow Sketch to kick Figma's ass as soon as they solve their online multi-user editing


UI designer here, migrated from Sketch (used for about 5 years) to Figma. I'm happy to celebrate Sketch as a great Mac app, and I still keep it in the dock for short tasks, but I'm not looking back. Figma won me over.

And it's not just Figma's collaborative features. Figma made fundamentally better decisions about design tool feature set. Better vector editor. Better concept of "symbol" as a component. Much better approach to auto-layout. Much better approach to shared colors / text styles.

At every step, Figma is just a better designed design tool. And a large part of why it's taking the design world by storm is exactly that. Most design work is done alone, not collaboratively dragging elements on the screen. Figma is just a great tool. Sketch is trying to catch up, but they would need to modify a LOT of their past decisions to get to the spot Figma is at.


I am reading all this Figma talk in utter disbelief. Actually, in my view Sketch never had a chance to replace Illustrator or Affinity Designer. Figma? For collaboration, if the use case demands it, maybe. But suddenly the UI design field is filled with decorators doing boxes with drop shadows but holding "Designer" opinions. Maybe I am in the mostly silent minority of professionals who don't chase and prize tools over essential skills. As a designer your work must be detached from workflow sentiments. Tools are tools. Nothing more. I can make a great UI with Inkscape any day I want. I can implement and test it in a pure HTML/CSS prototype. So what? On Apple and cloud computing in general: I like owning things, call me old-fashioned, boomer, or whatever. When I own a piece of software, I can use it without someone logging my mouse and keyboard, without fear of losing my internet connection. Is this a small thing to behold?


That's quite a rant, so I'll pick out probably the core message:

    As a designer your work must be detached from workflow sentiments. Tools are tools. Nothing more. I can make a great UI with Inkscape any day I want.
As software development is changing, UI design tools are changing. Inkscape has no concept of Symbols or Auto-layout. Sure, you CAN establish a design system in Inkscape, but it will be mostly copy-paste and lots of resizing.

Figma takes this a step further. Design components can be connected to JS components using Storybook. Localization can be automatically applied to designs, and elements will automatically adjust because of a powerful, flexbox-like auto-layout.

Sure, you CAN make the same design using Inkscape. But Figma enables previously impossible workflows and makes the end result -- user experience with product -- better.


Please, don't be so quick on the trigger. :) The core message is: tools are tools. Nothing more. Actually, I have been around long enough to design and implement pixel-perfect designs using Macromedia Fireworks - the originator of the idea of symbols. Figma enables collaborative implementation, and in that use case I will consider using it. But the prevalent idea at the current moment is that Figma is a universal cure for a big problem, a tool to end all tools. Which it is not. The biggest problem for me is the lack of originality and deep visual identity in UI design in general. We have powerful computers that can play 4K video games, but UI design is boring boxes and the most important thing for designers is to collaborate online.


> We have powerful computers that can play 4K video games, but UI design is boring boxes and the most important thing for designers is to collaborate online.

I think the hidden truth here is that "boring boxes" solve most problems pretty damn well.

Unless your product is literally art/design, then you don't need anything custom, and boring boxes are probably the correct choice.

There is some wiggle room here - It's easier to attract customers with pretty designs, and you can make it easier to onboard a new user with some nice effects and design flourishes. But if you go overboard you differentiate yourself too much and make your product much harder to reason about and interact with.

As a user trying to get value out of a product I mostly don't care what it looks like. I do care a lot if slow animations or videos keep getting in my way. Nice the first time, fucking miserable on the hundred and first.


Again, quick. From what I read from you I get a feeling that you are not a designer. Professional design has balance and is built upon technical and UX understanding and proper testing. Boxes can be more than fonts with borders and background colours. And actually, when done right, the emotional impact is rewarding for UX. A good design must have dimensions (visceral, behavioural and reflective levels). You can check Donald Norman's book Emotional Design. Highly recommend.


I understand where you're coming from, but I think you're misapplying some of his advice.

Funnily enough, I've read Emotional Design. It's been a long time (I think it was back in like 2006/2007) but my memory is that most of the focus of the book was on the design of physical objects. I don't really think website design has the same freedom. It doesn't invalidate all of his topics, but it certainly limits how they apply.

But there's a different angle here that I'd ask you to consider - Over the last 20 years, websites are eating up interactions that used to be conversations.

You might have walked to a bank and talked to a teller - Now you use their website.

You might have driven to home depot and asked a store associate a question - now you shop online.

You might have gone to blockbuster and rented a movie from the clerk - Now you browse netflix.

You might have gone to the post office to get some mail - Now that content is in an email instead.

Each of those interactions was just a conversation, literally just sounds coming out of a mouth, but they all achieved useful side effects. While you might have a pleasant chat every now and then, the goal was not reflective/emotional investment. The goal was the utility provided by the service.

I think approaching the design of a site with the goal of evoking an emotional or visceral reaction (ESPECIALLY from the literal appearance of the site) is actually turning the advice of the book on its head - Put the user first!

If I'm interacting with your site to achieve a useful side effect, whether that's order an item, get the news, see my mail, deposit a check, watch a show, etc - Then my emotional reaction is heavily biased towards how well and how quickly I can achieve my goal. My emotions don't care a flying fuck whether your button is red/blue/green or if your gray is #d3d3d3 or #878787. And I certainly don't want to have to navigate a crazy custom design, just like I don't want to hit a detour while driving home - even if it happens to be scenic.

I do care, a whole lot, about consistently easy to use services, with a low barrier to entry. On the web, that mostly means boring boxes.

---

As a thought experiment: I'm sure you've been to the DMV before (I'm not actually sure, you might not be US based, but you probably have an equivalent).

Ever had that DMV trip that took 3 hours waiting in line before finally getting seen?

Not happy were you?

Ever had that DMV trip where it was basically empty and you got seen immediately?

I bet you felt thrilled. (probably an overstatement, but at least pleasantly surprised)

It's the same building, same carpets, floors, columns, windows, roof. The only change was how quickly and efficiently you accomplished the goal you had. But your emotional responses were miles apart.

Apply that to websites. I don't want to be looking at my bank's website - I do it because I need to move money or use their services. Make that the priority. Make it with boring, easy to use boxes, and I will love it.

Bury it in menus, or add 10 clicks because "that page looks a little cramped" and I will not be happy.


> Tools are tools. Nothing more.

A screwdriver and an electric drill are both tools.


I used to be a CADD jockey. Taught 100s of people. Did a lot of 3rd party tooling.

CADD is an 800lb angry gorilla sitting between you and your design.

Yes, it's a poor craftsperson who blames their tools. And, sometimes it's nice when the tool doesn't fight you every step of the journey.

I imagine the same is true for graphic design.


As a developer on the receiving end of designs, in my experience, performance isn't really a discussion point as long as it isn't abysmal. Figma hits it out of the park in every way in user experience, functionality, deliverables, and more. This is a personal and very subjective opinion, but Sketch doesn't feel even remotely close to achieving the same level of quality.


Personally, the user experience in Figma is bad, as it wants me to log in through the browser while using the Figma desktop app. Every time you start up the app :(

It also requires me to have a working internet connection to be able to use it. Sketch and Affinity Designer don't have these problems. I hope they will fix that with Figma.


Being so strongly tethered to the web is absolutely Figma's Achilles' heel. Well, that and its needless Electron wrapping for its desktop version. Its file format not being open and well documented (unlike the Sketch file format) is also concerning to me.

So if they…

- Add a full offline mode

- Ditch Electron for a more focused/lightweight webassembly+canvas implementation

- Open and document their file format to allay lockin concerns

…I'd be much more inclined to use it instead of Sketch.


> - Ditch Electron for a more focused/lightweight webassembly+canvas implementation

Actually, that's exactly how Figma is built—their desktop app just wraps their web version while hooking into file system APIs provided by Electron/NodeJS. See https://www.figma.com/blog/webassembly-cut-figmas-load-time-....


Sorry, I used ambiguous language. What I was suggesting was replacing Electron with a standalone webassembly+canvas engine, since much of Chromium is redundant in this particular case.


If all they need is filesystem APIs couldn’t the web app wrapper just be backed by some C++ or similar to provide that?


The document is rendered with WebGL on the canvas but the UI around it (layers panel, properties panel) is React. Not to mention a lot of other business logic for things like permissions that’s shared with the rest of the fullstack app. So either way you need some sort of browser engine.

If you use the native webview, it’ll probably use less memory but be slower because it’s basically running Safari instead of Chrome. It’s probably the wrong tradeoff for Figma because the browser’s memory usage and JS heap memory is pretty negligible compared to the amount of memory the user’s document uses, especially large ones with a lot of images. There’s way more room for optimization there and that has nothing to do with Electron.

It’s fun to think about what the performance would be if it was 100% C++ given infinite resources, but realistically it’d be way less productive and more bug-prone than React. I’ve written UIs in C++ before; I would not repeat that. That time would be better spent optimizing actual bottlenecks, like rendering the design file (where the GPU is the bottleneck).

We actually have a native (not WASM) build with a native webview that we use internally for debugging with Xcode. No, performance isn’t better enough to warrant dealing with Safari issues and shipping that over the Electron + WASM build.


Thanks for the detailed answer! That all makes sense, I actually saw one of your engineers present at a conference a few years ago and was really impressed. The tech blog is also fascinating to read.


As someone who works on multi-user editing - Sketch will not be 'solving' it without a damn near complete re-write and neither will any other piece of software that didn't build a foundation that supports multi-user editing from the start.

For a user, it may seem like 'why can't they just add sync', for a programmer, it's a little more complicated :)


Not if the design community fully embraces Figma and opts not to go back to Sketch. I'm already seeing this transition at my company, and I have a hard time seeing management OKing a switch back to Sketch once they iron out their collab issues.


I switched my team from Sketch to Figma. I was able to replace 4 software tools with one (Sketch, Zeplin, InVision, Box). It doesn't do everything as well as the others, but it does them well enough. I have been managing co-located design teams since before Covid. The mix of visibility into my design teams work, the capabilities of sharing components and libraries, and the ability to easily share work across product and engineering has made a huge impact. My team's work is more transparent, up to date, and we can iterate together more easily.


Ben’s dig at Sketch seemed a little excessive to me, but that is the painful path that Evernote has decided to follow to survive. Although the target market is designers and Sketch is not multi-platform, being fully native is hard.


Unfortunately Evernote has destroyed its performance by doing so. The Evernote support boards are full of angry paying users figuring out how to downgrade versions and discussing alternatives.


Don’t tell me... I am a paying EN customer.

Although I find that the biggest performance impact is not Electron but that now you don’t have a local database and a lot of requests are going through the network before the caching mechanisms kick in.

In my opinion, it’s the correct move to make, but incredibly hard. Now they have launched v10, they have at most an additional billing cycle (12 months) to start cranking out compelling features and polishing the product.

My concern with EN is that they bleed so many paying users that they end up being unsustainable. Time will tell.


Don't underestimate Figma's engineering team, what they have built is astonishing and they're working at a much lower level than Sketch's team.


It won't because Sketch is still macOS exclusive. Figma can be used even in a Chromebook.


Sketch will lose to Figma as long as it is not cross-platform. It's as simple as that.


More like "With this new chip programmers can waste twice as many CPU cycles, making everything twice as slow if you don't upgrade".


>> The iPad has since recovered from its 2017 nadir in sales, but seems locked in at around 8% of Apple’s revenue, a far cry from the 20% share it had in its first year

For me, the reasons are that I love my current iPad Pro, I don't have a good reason to upgrade it, and the pricing isn't advantageous.

I change my MacBook often for work-related reasons (i.e. when changing jobs I need to get a new one).

I change my iPhone often because the new ones have a big enough differentiator (for me at least), but more importantly, the way the pricing is structured with my cellphone provider, I almost get it for free. (I.e. I had to upgrade my plan to get more gigs/month and a few other features, so I only had to pay $200 for the new iPhone 12.)

But for the iPad Pro? I'd need to pay the full $1500 upfront, and the new one doesn't offer a big enough value add compared to the previous ones.


You are aware that just because the true cost is obfuscated by the cell payment plan, that doesn't mean you aren't paying for it in exactly the same way as for the iPad Pro, which you could also buy on a finance plan. There is no such thing as "free" or "almost free"; you are paying for it.


2 years ago, I got a sim-only plan for unlimited minutes, unlimited sms and 10GB data for £15/month.

1 year ago, I renewed with the same provider, who offered me the same plan for £10/month.

1 month ago, same provider offered me 12GB data for £10/month. Shopping around indicated I could get 15-20GB for below £15/month if I jumped networks, and stuck on a sim-only plan.

Instead, I got an iPhone SE 2020 128GB (RRP £449) and 40GB monthly data for £70 up front and £26/month.

Total cost of ownership over the two years is £694.

Cost of the iPhone if I purchased it from Apple is £449.

That means the remaining spend is £10.21/month for 40GB of data.

I definitely got something discounted.


The discount comes at the cost of being locked in to the SE and your data cap for the next two years, which may not at all be a bad thing to you, but may be for someone else.


I failed to note that I wanted to buy an iPhone SE this year, which is why I looked for bundles.

The last time I bought a phone (2016) there was no bundle that was cheaper than buying the phone and a sim-only plan.

This time around, the inverse is true.

The only thing I’m locked into is my plan for 2 years. I can do whatever I like with the handset, and intend to offload it in a year assuming Face ID is seamlessly usable in public again by then.


Who knew a bunch of shrewd financial wizards and a well-functioning market would get consumers more shit than otherwise!

Here in NZ, 12GB is good for 6 months at a cost of £50.

A typical iPhone plan for 24 months is £80 or thereabouts.


This doesn't seem quite so obvious to me.

1. Do we know what the contracts between Verizon-Apple look like? Verizon may get discounts for buying in bulk or for being a strategic partner.

2. Cell providers (like Verizon) may be funding OP's iPhone 12 with the monthly-payments of other customers that don't exercise their ability to upgrade their phone exactly every 24-months. I think this group may actually be rather large.

3. Interest rates for financing your iPhone via your cell provider's 2-year contract plan are likely lower than financing an iPad on a credit card or other third-party financing.

All of the above likely combine to make the iPhones proportionately cheaper than iPads, possibly significantly.


If he's a developer on HN, there's probably a good chance that he knows that.

Shoving the payments onto the carrier isn't always a bad thing.

For example, I don't always have $1,100 lying around doing nothing, and these days I'd rather be liquid in case of sudden job loss or other emergency. So keeping that $1,100 in my emergency fund and shoving the payments onto the carrier at 0% interest is a better idea than paying up-front for a phone.


I don't know about the US but where I live the plans are the same whether you buy a phone or not. So you are much better off buying a phone from the carrier and you get a considerable discount.


They didn't say what cellphone provider. Obviously it's _somehow_ baked into the plan, but would the plan actually be cheaper if they didn't take the $200 iPhone?


More like at least $600 iPhone.


I still have the iPad 2 and I've tried many times to convince myself to upgrade, but I only use it in the kitchen to browse recipes. Then I learned about Apple's stance on right to repair, and that made me decide not to buy any more of their products. If the iPad finally dies, I'll buy an Android tablet.


iPad 2 user here, downgraded to iOS 6.1.3, lightning fast, skeuomorphic UI. Still hacking on it. I hope it never dies.


I have a launch-day iPad I use for simple things like notes and recipes and e-mail, too. Love the skeuomorphism. Too bad Safari isn't usable for much more than Wikipedia anymore, though.


How to?


You mean how to downgrade? There are guides out there. Apple released certain unsigned versions of iOS that you can install on old hardware.


Yes, we have a 2 but Safari is unusable.


Same here, not sure what I'll do about a web browser. I'm just trying to build apps with Objective-C and Xcode.


I accept that the performance of Apple's chips has increased rapidly in the last few years, but the benchmarks that they are using to compare to various x86 CPUs make me suspicious that they are cherry-picking benchmarks and aren't telling the whole story (either in the Stratechery article or the AnandTech article they got the figures from).

Why am I suspicious? THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k! I understand they are comparing single threaded speed. I'll accept that the A14 is more power efficient. I'll acknowledge that Intel has been struggling lately. But to imply that a low-power mobile chip is straight up faster than a high-power chip in any category makes me extremely suspicious that the benchmark isn't actually measuring speed (maybe it's normalizing by power draw), or that the ARM and x86 versions of the benchmark have different reference values (like a 1000 score on ARM not being the same speed of calculation as a 1000 score on x86). It just can't be true that a tablet with a total price of $1k can hang with a $500 CPU that has practically unlimited size, weight and power compared to the tablet, especially when the total price to make it comparable in features (motherboard, power supply, monitor, etc.) makes the desktop system more expensive.

Regardless of whether it's an intentional trick or an oversight, I don't think that the benchmark showing the mobile chip is better than a desktop chip in RAW PERFORMANCE is true. And that means that a lot of the conclusions that they draw from the benchmark aren't true. There is no way that the A14 (nor the M1) is going to be faster in any raw performance category than a latest generation and top-spec desktop system.


> THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k! I understand they are comparing single threaded speed.

FWIW, the 100W i9-10900k isn't even Intel's fastest single-threaded chip: that's the i7-1165G7 (a 28W part). Intel's desktop stuff is ancient: they're effectively still just Skylake (2015) derivatives; for single-threaded stuff the more modern designs they're shipping on mobile (Ice Lake and later) beat the desktop parts because their IPC is that much faster.

Power doesn't really help single-threaded performance, aside from wide vector units, nowadays.


Is it sustainably faster, or only for a few seconds until it throttles? Searching for "10900k vs i7-1165G7" showed the 10900k as mostly faster with a few exceptions.

I'm curious about next week's launch and benchmarks; Apple's claims compare it to a 1.2GHz i7, which I expect to quickly throttle. That's why I also expect the parent comment to be right: current desktop CPUs will still be faster.


You can see a single-thread benchmark for both of those here: https://www.cpubenchmark.net/singleThread.html (comparing desktop vs laptop tabs), they get a similar score.

The perf vs power charts on that website also put to rest the mistake of thinking perf simply increases with watt consumption.

As for performance vs heat, well, you'd expect even better results from the chip consuming much less power. How does that 100W chip perform with a phone-miniaturized heat sink? Or the power-sipping chip with a double-tower fan cooler?


Agreed, I think you make different design decisions - a mobile CPU is not just an underclocked desktop CPU. Apple also introduced a new CPU, not just put an A14 inside.

I'm looking forward to benchmarks and seeing how well the new machines work - compiling speed, lightroom, responsiveness, how well Chrome works with max 16GB RAM :).


power heat perf - pick one

desktops allow all of them


> There is no way that the A14 (nor the M1) is going to be faster in any raw performance category than a latest generation and top-spec desktop system.

Well, no point in arguing here. You may be right, but the machines will be in the hands of users next week. It would be stupid for Apple to make those claims if they weren't true. We'll see soon enough.

Assuming the claims are true, we shouldn't forget that Intel per core performance improvements have been incremental at best for several years. They've really run into some major problems with their fab process development in recent years. TSMC (Apple Silicon foundry) is well ahead. It has been kind of hard to watch since that has historically been such a strength for Intel. They're a strong company, they'll get it together.


Apple is the company that was cherry picking benchmarks for YEARS as PowerPC was being crushed by Intel. Apple has been making false or at least misleading claims forever.

Not all of them, mind you, but you need a boulder of salt.


PowerPC chips had some advantages over Intel, which was why they were used to create supercomputers for a while, and were picked to power the first Xbox.

PowerPC’s biggest flaw was power efficiency, a situation which became critical as Apple’s sales skewed toward laptops. The G5 was a beast but it ate power like a beast too; it was never going to work in a laptop, and so Apple had to switch.


It's the Xbox 360 which had a PowerPC CPU, named Xenon. The first one used a standard Intel x86 chip.


To be fair, all consumer chip companies' PR is hype-ish. Anyway, we should wait for independent benchmarks rather than official ads.


Maybe I just need more education in this realm but I’m not sure why a difference in electrical wattage makes it physically impossible for one processor to produce a better result than another processor.

In the presentation, Johny Srouji seemed to place a bigger emphasis on the reduced power consumption than he did on the speed, saying things like, “this is a big deal” and “this is unheard of”.

In my mind, the argument of wattage seems analogous to saying, “There is no way a low wattage LED bulb will ever outshine a high wattage filament bulb.” I have assumed that we’ve been able to make leaps and bounds in CPU technology since the dawn of computers while also reducing power consumption.

But maybe there is some critical information I am missing here. I’m certainly no expert and would love to hear more about why the wattage aspect holds weight.


So there is technically such a known limit due to a link between information-theoretic entropy and thermodynamic entropy, which provides a lower bound on energy usage for a particular digital circuit via the second law of thermodynamics. In simpler terms, there is an unavoidable generation of heat when you "throw bits away" like AND and OR gates do. However we are several orders of magnitude away from that efficiency bound in today's chips, so your analogy to LED bulbs is more apt than you may realize: LED bulbs are still far away from their theoretical maximum efficiency, but they're still a massive improvement over incandescent bulbs.

If you want to know more about this limitation, I suggest looking at a way of organizing computation that avoids this issue called "reversible computing"[1]. As I said, it won't be of practical significance for classical computing for a long while, but it's actually pretty fleshed out theoretically.

[1]: https://en.wikipedia.org/wiki/Reversible_computing
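To put rough numbers on that gap (the per-operation CMOS figure below is an assumed order-of-magnitude guess, not a measurement of any particular chip):

    import math
    k_B = 1.380649e-23                  # Boltzmann constant, J/K
    T = 300                             # room temperature, K
    landauer = k_B * T * math.log(2)    # ~2.9e-21 J per bit erased
    cmos_per_op = 1e-16                 # assumed energy per logic operation today
    print(landauer, cmos_per_op / landauer)   # gap of a few tens of thousands of times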


(Apologies I’m not sure what your educational background is)

Basically, higher wattage makes chips create more heat in a shorter time.

Heat in general is destructive, like with cooking or with camp fires or when a car “over heats” and stops working. Silicon chips are very detailed and any small change could make them stop working. So the heat applied needs to be below some threshold (i.e. don’t let it get too hot).

If the chip needs more wattage it creates more heat and that heat needs a “heat sink” and fan to protect the chip from degrading.

Heat sinks and fans require a lot of space, high surface area to volume ratios. Take a look at the PS5 tear down, 90% of the insides are a heat sink. Laptops and phones don’t have a lot of room for heat sinks or fans.

Therefore, if the chip can use less wattage then it will get less hot. Meaning it can work better in “fan-less” devices like the MacBook Air and the iPhone and iPad.
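If it helps, the usual first-order model for where that wattage comes from is dynamic power P ≈ alpha * C * V^2 * f; the numbers below are made up purely for illustration:

    alpha = 0.1     # assumed activity factor (fraction of gates switching per cycle)
    C = 1e-9        # assumed total switched capacitance, farads
    V = 1.0         # assumed supply voltage, volts
    f = 3e9         # assumed clock frequency, Hz
    P = alpha * C * V**2 * f    # ~0.3 W with these made-up numbers
    print(P)

Because voltage enters squared, running at a somewhat lower voltage and clock buys a disproportionate drop in heat, which is part of why the same silicon can be tuned very differently for a phone versus a desktop.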


> THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k

"RIM thought iPhone was impossible in 2007": https://web.archive.org/web/20150517013510/http://www.macnn....


They thought the iPhone was "impossible" in the sense that it couldn't offer everything it claimed to offer without having terrible battery life.

And they were absolutely correct: battery life on the original iPhone was abysmal. But it turns out that consumers didn't care.


I don't have the stats, but I know for a fact I've never owned an iPhone (back to the original) that needed to be charged midday. Not much more matters to the average user.

And I'm sure there are power users who killed their original iPhone by noon in 2007, and I'm sure there are power users who do the same today.


This was in the time of dumbphones, which had battery life measured in days, not hours. Compared to cell phones back then, it was relatively abysmal to need to charge your phone once a day. Even BlackBerrys didn't require daily charging back then IIRC.


For sure. I remember my Ericsson T39 was multi-days, and I think the extended battery (which made it thicker) took it to... 9 days? Maybe more? Insane. Of course, the screen was never on because there was nothing to do with it :)


I used / bought the original iPhone and thought it was fine, and the phone was magic.

The next one or two, though, had really bad battery life (iPhone 3G?). I mean, 3-5 hours of active use, down from 7-8 (which meant you usually needed to charge it in the morning or evening).

Remember, the original iPhone had data, but it was pretty slow (but still amazing).


Well, you overlooked the battery side because of other features. But you have to remember that battery life for the average phone back then was measured in the tens of hours. I used to leave for the weekend with my Nokia without a charger. A decade later and smartphones aren't there yet. We've just learned to accept the pain :-)


Put it on low power mode and only use it to make a few calls and send a few texts and an iPhone will easily last a weekend. The real issue is that many people actually use their phones constantly, and it's not just for calls/texts.


THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k!

An A14 is both faster and lower power than a 6502.

Also, why are you shouting? It's just computers. It's not important.


>Also, why are you shouting?

Exactly. It is very annoying, especially coming from recently registered accounts. Instead of posting a question asking why, which I could have explained like he was 5, it is now a massive rant.


Same as a Cray XMP! That baby used 345 kilowatts.

But comparing against 40 year old technology has nothing to do with comparing against current offerings.

We'll just have to wait a week to see how it fares compiling Chrome.


While it's Intel's latest, the i9-10900k is hardly a current offering - it's yet another spin of Skylake, a 5-year-old CPU design, using a variant of Intel's 14nm, a 6-year-old litho process.

The i9 has a density of ~44mT/mm2 versus the M1's 134mT/mm2 (3x)

The i9 has ~9.2B transistors, compared to the M1's 16B (174%)

The M1 is two generations ahead on lithography and has a more sophisticated CPU design than Intel. It'll do fine.


> Same as a Cray XMP!

That reminds me of this classic bit of technology humor, the Apple Product Cycle[1]. It doesn't ring quite as true today as when it was first posted, but the broad strokes are still similar. Specifically it appears we're on the stage where "The haters offer their assessment. The forums are ablaze with vitriolic rage. Haters pan the device for being less powerful than a Cray X1 while zealots counter that it is both smaller and lighter than a Buick Regal. The virtual slap-fight goes on and on, until obscure technical nuances like, “Will it play multiplexed Ogg Vorbis streams?” become matters of life and death."

[1] https://web.archive.org/web/20061028040301/http://www.mister...


I'm not sure anyone is saying that the 5W A14 is faster than a 100W i9 (at least for sustained performance).

I don't see why you should say there's 'no way' an M1 (with appropriate cooling etc.) is going to outperform a desktop-class system - if you read the AnandTech review, it's clearly superior to competing architectures in many respects and is built on a better process technology.



at the end of the article:

"This moment has been brewing for years now, and the new Apple Silicon is both shocking, but also very much expected. In the coming weeks we’ll be trying to get our hands on the new hardware and verify Apple’s claims."

Anandtech doesn't have the ability to say either way since they are also going by the marketing data ... once they benchmark it for real then they'd be able to prove either way, but your statement is premature.


If my statement is premature the parent’s all caps statements are barely more than a zygote.


That comment was about the M1 chip, which hasn't been benchmarked outside of Apple. The A14 has been around long enough (reviewers have had them for at least a few days) for people to run the (flawed/misleading) benchmarks on, which is the chip I was talking about in my original comment.


I read that article too and mentioned it in my first sentence. That's the article that Stratechery pulled their figures from. The validity of that article and those benchmarks is what I'm doubting.

Will the new Macbooks with M1 chips compare favorably against Intel laptops with low power and fanless designs? Yeah.

Is the existing A14 chip faster than a 10900k (even in single threaded performance)? No way. There is something in the benchmark that is messed up to the point where you can't compare them.


See the neighbor comment. If i7-1165G7 (a 28W part) can be faster than 10900k, there’s no reason M1 couldn’t be faster.


The question is "for how long" - if it heats up and throttles after a few seconds, desktop CPUs will still have a major advantage for longer tasks.


2 of the 3 devices announced yesterday come with active cooling (MacBook Pro 13" and Mac mini).


Apple's active cooling systems are also not that great, with the six-core MBP 15" heating and throttling when it was first released, and the MBP 16" also having heating/throttling issues.

The Mac Mini also isn't the paragon of active cooling. I've worked with one of the current-gen Intel Mac Minis, and that thing gets really hot! Like 60-80 degrees Celsius. The insides must be cooking if the outside is that hot.

The 2013 Mac Pro also had heating design issues, only corrected with the current gen Mac Pro.

I'd say active cooling is a consistent weakness of the entire modern Mac hardware design division.


Apple's cooling design isn't great for cooling (maybe great for visuals?), but it should be an advantage for Apple Silicon compared to an Intel CPU in the same Mac.


I mean, the whole point of Apple Silicon is that since it's all made by Apple, they can now control how much heat their MacBooks generate.


So you have no idea why, but you are certain they are wrong? There are several sibling comments with lots of reasons why a lower-power core could perform better than an outdated desktop one.


The RAM is inside the M1 package. That has to make a huge difference in memory access time; it probably saves about a nanosecond in each direction just because it's closer to the CPU. There's probably other stuff going on like little or no microcode compared to the x86 ISA. So yeah, it's plausible that the M1 is really faster in absolute terms than a desktop PC.


I think you should prepare yourself for quite a shock.


Apple’s chip is not just a general-purpose CPU; it is designed for specific workloads.

We have seen similar performance jumps in cryptocurrency mining: GPUs are orders of magnitude faster than CPUs, and ASICs are orders of magnitude faster than GPUs for the same power consumption.


It's supposed to run a Mac, so why isn't it a general-purpose CPU? What specific workload do you have in mind? I could use it for web browsing, programming, media editing...


Anything that has to do with audio, video, graphics processing and so on.

There are videos on YouTube demonstrating that you can edit large video files on an iPhone connected to an external monitor, and do it more smoothly than on a much larger PC.

Here is an example of an iPhone SE strapped to an external screen, editing 4K footage: https://www.youtube.com/watch?v=LmbrOUPFDvg

Notice how smooth everything is.


I thought the mouse support was iPad only... thanks for the TIL.


Perhaps the answer is that x86 is far more general-purpose than that. Many of the instructions are as old as the Millennial workforce at Intel building their chips, from an era when the web wasn't a concern and GUIs were uncommon. A vast number of transistors are dedicated to a different optimization than what the software of this decade cares about.

OTOH, ARM and Apple have tailored their chips to the workload of ~10 years ago (running Javascript at single-digit watts), which is far closer to what actually gets used these days.


But the M1 is a general purpose CPU and it is faster (without any help from Neural Engines etc) than competing devices.


Why without any help from the Neural Engine, etc.? The workload-specific processors are included in the M1.


Plenty of workloads like SPEC and compiling don't use GPUs, neural networks, etc. They just use CPU cores, cache, and memory. Fortunately Apple has gotten the basics right and they have also added accelerators.


I would guess that the tight proximity of the components, as well as the way their communication is designed, also brings something to the table. People used to complain that the RAM is soldered to the board; now it is part of the package.


They are but they don't account for the CPU performance jumps being quoted - these are the result of better standalone CPU performance.

Your parent comment seemed to imply that the leap was due to an architecture shift like CPU -> GPU. It's not, it's just better CPU design.


Well, there's no standalone CPU to talk about, is there?

Purpose-built architecture could mean many things: efficiency cores and high-performance cores, codec-specific hardware, the way memory is accessed, cache configuration, co-processors, signal processors.

Everything counts.


> Regardless of whether it's an intentional trick or an oversight, I don't think that the benchmark showing the mobile chip is better than a desktop chip in RAW PERFORMANCE is true. And that means that a lot of the conclusions that they draw from the benchmark aren't true. There is no way that the A14 (nor the M1) is going to be faster in any raw performance category than a latest generation and top-spec desktop system.

The benchmark is true but misleading. It compares 'Intel vs Apple Top Performance', meaning essentially the maximum speed each could hit. It is not a real-world number and exists purely in a vacuum. If your phone ran at that speed for an extended period of time, I guarantee it would melt. I think the only conclusion to be drawn is that Apple's mobile CPUs are very capable and well designed, and ARM has a lot of untapped potential.


The point about robust collaborative editing as an API is interesting. The HN crowd generally prefers native apps and gets tired of the electron parade, but for major business software it’s increasingly table stakes that another person can see what you’re working on, live, by clicking a link.

Apple does have relatively good live collaboration in its iWork apps. Perhaps there’s a future API there?


The HN crowd generally prefers native apps and gets tired of the electron parade...

There's no significant preference for native; it's just that we won't tolerate bad apps. I'd guess that VSCode is the most popular editor among users here by a long way, and that Google Docs has far more users than MS Word. HN readers don't pick native apps when there's a good Electron or web-based alternative.


>I'd guess that VSCode is the most popular editor among users here by a long way

I don't know about that. Whenever there's an article about Vim, it gets lots and lots of useful, insightful comments. If you search for Vim on HN, you'll see the catalog of Vim posts and threads.

I haven't seen the same thing regarding VS Code.


To add on, the reason why it might seem that

> the HN crowd generally prefers native apps

is because it is hard to write a good Electron / web app that is actually performant + easy to use. Which is why many people here are wary of new Electron apps.


> Apple does have relatively good live collaboration in its iWork apps

Does it? As much as I like the iWork apps, my experience (and my impression of the general sentiment) is that Google Docs et al. continue to blow the pants off iWork in that regard.


The collaborative editing (which I first recall seeing in Google Wave) was essentially carried into Google Docs.

N.B, "SubEthaEdit had this for years!" - I know.

As a whole, it could be argued that where Wave failed, Slack - and predecessors like Basecamp - succeeded.

iWork has always seemed like it has a different user in mind with its collaborative features and never really had much traction in the market, which is already served by offerings whose entire reason for being is collaboration, not just as a general productivity suite.


N.B, "SubEthaEdit had this for years!" - I know.

FWIW, I first saw collaborative editing over a network done on an Amiga. So I guess this has been a thing for "decades."


If I'm reading the specs correctly, there was 10 Mb Ethernet capability on the Amiga...in 1985.

Mind blown.

(I was born in '86, albeit late in the year, so I find this especially hilarious.)


I did say "relatively good" :)

I would agree that collaboration is a little smoother in G Suite, however in my experience this is mostly about ease of sharing. Once you've gotten another Apple user to understand that they can "receive a shared document from you" and work on it, then usually collaboration itself is smooth.


Can you give me some examples where Apple's collaboration SW falls short?


I would have wished that Apple had been the ones acquiring Realm. They really had the best API for building apps with live collaboration I have ever experienced. Instead they went to MongoDB, so who knows how that will end?


Realm was a dead end - it was based on a custom database, which was a mistake.

Them going to Mongo is actually the best possible outcome - by replacing their dead end custom database with MongoDB, they make MongoDB into a more compelling product, making document-based databases more batteries-included than ever before, which is excellent news.


I always liked Realm but have not heard anything about them lately. Where did they go, and is the current work as powerful as the early stuff?


I’m not really a fan of MongoDB but the handling of Realm so far seems well done. I migrated an app to their new sync service and everything worked fine. They also offer a fairly generous free plan and are open sourcing the sync engine.


I tried to use it recently and it was an endless nightmare of suck. But perhaps I'm just allergic to any type of framework that tries to dictate my app structure and threading strategy.


Software is the key for me. I am by no means a power user, but I really am loath to give up even more of my software library to switch to Apple Silicon, after Catalina cratered my Steam gaming library and some older apps from companies long gone.

I don't expect I am alone in this, but the list of software companies they highlighted during the M1 debut was very short, and to be honest I had not heard of half of them until then and don't remember them now.

So to me it matters not how much faster AS is; what matters is whether I can run what I want to run. I am not going to own two separate machines to do what I want to do. If AS machines cannot do all I need, I will keep my current Mac till support runs out and look again.


It's interesting to think about gaming specifically. Genuine platform exclusivity is rarer and rarer these days, as the underlying engine market has become more and more commoditised. It is relatively cheap to port a game built in Unity or Unreal engine from PC to Xbox, PlayStation, etc.

Mac has always lagged because the market wasn't big enough to warrant investment. Catalina wouldn't have cratered your gaming library if there was enough incentive for the developers to update the game. Now that iOS apps (read: games) can run on Mac with relatively low effort, and we already see PC games being easily ported to iOS (eg: Fortnite, legal challenges notwithstanding), I feel that this is the best thing to happen to Mac gaming in years.


Rosetta 2 allows you to run your x86 mac apps on Apple silicon. It's like the PPC->Intel switch again; your software keeps working regardless of the hardware underneath.


To add to the parent's point, AS is a one-way trip to Big Sur, with no viable alternative (no downgrade, no dual boot anymore).

Then it's not just big app companies that need to adapt; Homebrew and short-lived CLI utilities will also need ARM-compatible versions, or we'll be wrapping each individual command in a Rosetta 2 launcher.

It might not be that bad, but I’d sure want to wait and see how it pans out, as it doesn’t look like a trivial transition.


Doesn't Homebrew compile in place anyway? MacPorts does if they don't have a binary package handy. And as for ARM compatibility, most command-line tools come from Linux, where they've already been compiled for ARM ages ago.

So the CLI should be the least of the problems with Big Sur. Unless they somehow cripple it on purpose so you can't use common Unix command-line stuff any more.


It should all work fine eventually, even if not on day 1 at least after a while. But I think there will be a transition phase for edge cases.

For ARM support on Linux, my understanding is that pure server stuff is completely fine and battle-tested, while hobby/side-project-level utilities are a riskier bet. I remember hitting some of those on a Raspberry Pi a while ago, but the situation might be much better already with all the buzz surrounding ARM nowadays.

For reference the github issue tracking page: https://github.com/Homebrew/brew/issues/7857


Homebrew is fine, and nothing about the CLI is crippled in Big Sur.


The Mach-O loader should transparently invoke Rosetta 2.
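For the curious, a process can also ask at runtime whether it has been translated; here is a minimal sketch in C using the sysctl.proc_translated key that Apple documents for this purpose (error handling kept deliberately simple):

    #include <errno.h>
    #include <stdio.h>
    #include <sys/sysctl.h>

    /* Returns 1 if the process is running under Rosetta translation,
       0 if it is running natively, and -1 if the answer is unknown
       (e.g. the sysctl key doesn't exist on older macOS releases). */
    static int process_is_translated(void) {
        int ret = 0;
        size_t size = sizeof(ret);
        if (sysctlbyname("sysctl.proc_translated", &ret, &size, NULL, 0) == -1) {
            return (errno == ENOENT) ? 0 : -1;
        }
        return ret;
    }

    int main(void) {
        printf("running under Rosetta 2: %d\n", process_is_translated());
        return 0;
    }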


They plainly state that software using AVX, AVX2, AVX512, virtualization, and a handful of other instruction-set extensions simply won't run under Rosetta 2.

They seem to be attempting to avoid newer, patented instruction sets (Intel issued a strongly-worded statement about companies that attempt to use their ISA, which seemed aimed squarely at Apple), but that means that SSE3+ (and maybe SSE2) are also avoided, which could mean a lot more software doesn't work than they'd like to admit.


> and maybe SSE2

For x86-64, SSE2 is part of the baseline instruction set, which dates to 1999/2000 (according to Wikipedia, x86-64 was announced in 1999 and the full specification became available in August 2000), so if patents are limited to 20 years, it's already out of patent.


No, they say AVX, AVX2, and AVX512 instructions won't work, and that code using them should be wrapped in conditionals that verify their availability at runtime.
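In practice that's the usual runtime-dispatch pattern. A minimal sketch in C; the hw.optional sysctl key names are what I believe Intel Macs expose, and under Rosetta 2 they should report the AVX families as unavailable, so the fallback path runs:

    #include <stdio.h>
    #include <sys/sysctl.h>

    /* Query one of macOS's hw.optional feature flags; treat a missing key
       or an error as "feature not available". */
    static int has_feature(const char *name) {
        int val = 0;
        size_t size = sizeof(val);
        if (sysctlbyname(name, &val, &size, NULL, 0) != 0) {
            return 0;
        }
        return val;
    }

    int main(void) {
        /* Under Rosetta 2 these should report 0, so the AVX code paths are
           skipped and the SSE/scalar fallback runs instead. */
        if (has_feature("hw.optional.avx2_0")) {
            printf("using AVX2 kernel\n");
        } else if (has_feature("hw.optional.avx1_0")) {
            printf("using AVX kernel\n");
        } else {
            printf("using SSE/scalar fallback\n");
        }
        return 0;
    }

(GCC and Clang's __builtin_cpu_supports("avx") is an equivalent check on the compiler side.)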


Counterpoint: Try to run PPC software on a current mac.

“Keeps working regardless” is a great promise Apple has continued to make, but they only hold that up as long as it takes you to get the latest thing.


"Keeps working regardless" for some period of time during the transition.

x86_64 Rosetta won't be forever, just like PPC Rosetta wasn't.


More recent counterpoint: 32-bit x86 on a current MacOS.


Aha yes, thank you


Don't think that's fair. PPC is very old. An extra 5 years of OS updates that support Intel would be acceptable to me.


Meanwhile, I can still use Windows XP (and even DOS) apps on Windows 10, and many Linux tools are rock solid without any ongoing maintenance.

Some problems are straightforward and have final solutions in software, and that should be seen as a good thing.


Yeah that’s great. But macOS wouldn’t be the way it is if it didn’t make breaking changes over the years.

Windows 10 compatibility is an excellent achievement. But it comes at a cost of a significantly worse user experience (in my opinion, at least). Like legacy menu hell.


What’s a good thing is that different people can do things differently, according to their priorities.


It’s not without some overhead though, so there will be performance issues.

Fortunately those performance issues will probably only affect games released in a fairly narrow band of time, new enough that they’re still resource intensive but predating the transition. It would maybe be a 5 or 6 year span I guess depending on how performant the ARM system in question is?


They've stated that some Rosetta apps actually perform better on Apple Silicon, because the previous Intel models were so slow (e.g. the MacBook Air).


I can see that being the case in the Air but that wasn’t really running a modern/current chipset to begin with right?


Metal will be optimised. They said in the presentation that a game running on Rosetta would benefit from these optimisations, offsetting some of the slowdown resulting from the translation. So games that spend much of their time in Metal calls should be fine. Of course, the proof of the pudding is in the eating, which will start next week.


How is this going to work with games and GPUs?


Nothing with Apple Silicon has a dedicated GPU, so, about the same. Metal calls (and Vulkan, which runs via MoltenVK on top of Metal) hit the same API and software stack on the new hardware; the GPU executes its own native code, so it's mainly the app's x86 CPU code that pays the translation cost.


The overarching strategy is to bring iOS to the Mac in order to grab market share of the PC market. Apple's moat will increase more and more because of the vertical integration they have.

A lot of people have iPads. They have replaced PCs for many tasks. Many I know only use their PCs for word processing or work. Other things we do on a computer are in the browser.

In 2 years I'll be replacing 2 Windows computers in my family. Given the choice between a Mac with a familiar iOS-style interface and access to the App Store, or a Windows PC only good for word processing and browsing the web, I choose Mac/iOS. I am familiar with iOS but not so much with the Mac (I've owned 2 but could never get used to the UI). I reckon millions will be faced with the same choice.

I predict that in 5 years Dell, HP and Lenovo will be struggling. Time will tell...

Oh, and of course the M1 today ain't compelling enough. But wait until it evolves. Apple has a track record of iterating year over year. Fascinating...


> I predict that in 5 years Dell, HP and Lenovo will be struggling. Time will tell...

I mean, most casual users don't use laptops any more, preferring their phones or tablets. Those that do need them for work (or, for a nicher crowd, gaming). And there's no way in hell the primary buyers of Dell, HP, and Lenovo computers, i.e. companies with no reason to pay the Apple tax, will pay for expensive $1,000+ MBPs for each employee.


“IBM says it is 3X more expensive to manage PCs than Macs - Saving up to $535 per Mac per four years in comparison to PCs”

This was in 2015 I think. Anyway, large businesses will look at TCO rather than purchase price. If Apple can perform better in that metric, they will be preferred, and vice versa.

https://www.computerworld.com/article/3131906/ibm-says-macs-...


I can't believe IBM (now effectively an enterprise Apple agency) is impartial here. I'd like to see a fair, independent report.


Businesses already pay $1,000+ for Lenovo, Dell, and HP laptops. Work and gaming are not niche. Huge markets.


As phones are most users' primary computing device, I imagine desktop computers will become stateless machines with big displays and keyboards for users to dock their phone. The user will be able to access their phone's apps and data on the desktop computer, perhaps also offloading computation to faster desktop CPUs.


Fascinating thought. Will Apple make a standalone screen/dock to plug in your iPhone/iPad...?


> the M1 probably costs around $75 (an educated guess), which is less than the Intel chips it replaces, but Apple is mostly holding the line on prices (the new Mac Mini is $100 cheaper, but also has significantly less I/O). That suggests the company believes it can both take share and margin

However, the creation of the M1 wasn't free. There's a significant R&D cost, both initial and ongoing that isn't calculated as part of the Cost of Goods Sold.


Focusing on hardware as a differentiator that is not easy to replicate would also allow Apple to sell privacy and security.

Competitors such as Google are often focused on software and data, relying on data collection.


> Sketch, to be sure, bears the most responsibility for its struggles; frankly, that native app piece reads like a refusal to face its fate.

Ben can write paeans to this new "cloud" business model. But at the end of the day, the question for us, the users is simple.

Do we own what we buy?

When I buy a Mac (and I've bought several), I am buying a computer. A general purpose computational device. And by selling it to me, the company is selling me a general purpose computational device.

What right does the company have to stop me from installing/modifying my device in any way that I see fit? Sure, they may refuse support/warranty, that is their prerogative, but what gives them the right to stop me from having someone else repair it? Or, to boot into Linux? Or, to open my own computer?

I have a MacBook Pro from 2016. Recently, I wanted to give it a thorough cleaning. So I took out my speciality screw driver and unscrewed the screws for the bottom plate.

It wouldn't budge.

It was then I realized that I needed suction cups and strength to move the plate downwards to unlatch something inside to make it "pop".

This design serves no engineering purpose. It exists to make it harder for me, the device owner, to access the device I've purchased without sacrificing dollars at the altar of Apple.

And this was their most "open" product. Prior to the M1 announcement, you couldn't boot into another OS - or significantly alter - your iOS device. And now we can't do so with our Macs. We seem to have collectively decided to blur the line of ownership.

A device we buy isn't ours even after purchase. No, we must continuously give our money to the corporation for the benefit of their revenue projections.

Which returns us to this,

> Sketch, to be sure, bears the most responsibility for its struggles; frankly, that native app piece reads like a refusal to face its fate.

With Sketch you own your data, and thanks to the open format, you can port that data to other mediums.

With Sketch you own a copy of the tool that allows you to do your job.

With these other, less powerful but "collaborative" software, you don't truly own your data or the tools to access it. You merely rent it.

Should there be an event where Figma is acquired or goes out of business, then (in all likelihood) every user of this platform will lack the ability and the choice to preserve their work for future generations (and for their business).

What are the odds of Figma staying as it is, in the control of founders, chugging along as a profitable business a year from now? 5 years from now? A decade? Two decades?

I do not wish to single out Ben, but this post is an exemplar of the shift in thinking being pushed by the current crop of tech cognoscenti. They have made a growing argument that the future is one without ownership, one where you don't own your devices, you rent them. And they assure us that's the future, and because that's the future, it's going to be amazing.

But that sounds like dystopia to me. It is one thing to trade owning a few albums on vinyl for access to all the songs in the world; it is quite another to have the tools of your trade abstracted away.

Spotify and Netflix aren't essential services to me. My computer is. My vector design software is. My ability to write code is. My ability to make things is.

They argue that there are benefits to "collaboration" with the "cloud", but that doesn't need to be so. The only reason they operate in the browser is that the tradition of web apps started there. There is no reason why every other application can't collaborate natively, with combined local and server-based data storage, with other apps across the world.

Video games do it all the time! Games like Counter-Strike are in some ways far more collaborative than a Figma file: the state of the game world depends on every other player, the context is time-sensitive, and the state is additive. And it works beautifully.
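To make that concrete: the hard part of native collaboration isn't the network, it's agreeing on a merge rule so that every replica converges no matter what order edits arrive in. Here is a deliberately tiny sketch of one such rule, a last-writer-wins register, in C (the names are hypothetical, purely for illustration):

    #include <stdint.h>
    #include <stdio.h>

    /* A hypothetical last-writer-wins register: each edit carries a logical
       timestamp and an author id; replicas converge by keeping the edit with
       the highest (timestamp, author) pair, regardless of arrival order. */
    typedef struct {
        uint64_t timestamp;  /* Lamport-style logical clock   */
        uint32_t author_id;  /* tie-breaker so replicas agree */
        char     value[64];  /* the shared field being edited */
    } lww_register;

    static void lww_merge(lww_register *local, const lww_register *remote) {
        if (remote->timestamp > local->timestamp ||
            (remote->timestamp == local->timestamp &&
             remote->author_id > local->author_id)) {
            *local = *remote;
        }
    }

    int main(void) {
        lww_register a = {1, 1, "Layer name: Button"};
        lww_register b = {2, 2, "Layer name: Primary Button"};
        lww_merge(&a, &b);       /* apply the remote edit locally */
        printf("%s\n", a.value); /* both replicas now agree       */
        return 0;
    }

Real tools use richer structures (CRDTs, operational transforms), but nothing about any of this requires a browser.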

If it can work for non-essential entertainment, why should we accept the reduced paradigm for our essential tools? Why should we buy into crippled software that is limited by the fact it runs in the browser? Why should we buy into the abusive business model of having to rent our ability to do work from another company? Why should we buy into the idea that we don't own the fruits of our labor? And that we don't get to have a copy of our work nor access it without paying the toll?

Bohemian Coding, if you ever read this, don't go down with this ship. Add support for Windows. Or, Linux. It will save your company.


Thanks for your comment. I hope that this trend of "computing" that is restricted, and profitable only for megalithic corporations, will end soon. But the problem is that the majority of users cannot comprehend the damage. General users are like kids in a candy store: they will grab everything shiny and sweet without thinking twice. My other concern is cancel culture in the new developer/designer/engineer generations. They simply don't care; critical thinking is something from the past. Apple's idea of the future of computing is "give us money". Suddenly everything that's important is iOS and vertical integration. This strategy was evident when they postponed a new Mac Pro for years. As a designer I have only one problem: I don't like Windows at all. I can live happily with a Linux desktop, but I need graphical software at least as good as Affinity Designer. So I have a big request: if someone here is close to Affinity management, take note that I will pay twice the price of the current Affinity Designer for a chance to run it under Linux. :)


IMO this argument blurs the line between two separate concepts.

I think it's fair to say that a company shouldn't be allowed to sue you or have you arrested for doing whatever you want to a device that you have purchased. For example, see the John Deere lawsuits where they are trying to DRM repairs to tractors and make it illegal to work on them.

What I think is less fair is arguing that a company has a requirement to make it easy or possible for you to work on your device. You're entering the world of engineering and end-user trade-offs and starting to talk about forcing companies to add or remove features that might be in the end-user's best interest.

Would a screw with a proprietary screw-head be ok if the company could argue that it made assembly easier? Would you require that they prove there is some tangible engineering benefit to all decisions? What's the line between "general purpose computing device" and "electronic toy"?

You could probably write a law that made it illegal to add restrictions whose sole purpose was to prevent the user from modifying or repairing their device but the only way you'd ever be able to enforce it is if someone was caught writing incriminating emails. Pretty much anything could have some imaginary engineering justification behind it.

If you can crack a proprietary protocol or re-create a proprietary screwdriver then more power to you, you shouldn't be arrested or sued. But telling a company they're not allowed to use a proprietary screw-head is a messy road to start down, and that's what you're saying when you're asking for companies to be forced to allow you to do what you want on your device.

Vote with your wallet and if the choice you vote for doesn't win then you're free to go create your own. But don't outlaw business models or legislate engineering choices. You should have a right to repair, you should have a right to get your data, you shouldn't have a right to tell me what my engineering choices need to be.


The repairability of the hardware really has nothing to do with software ownership. You can buy a Dell with soldered in components. Android phones are glued together similarly to an iPhone. There is very clearly an engineering reason for this: old laptops were heavy and bulky. Old phones had tiny batteries and simple functionality while a modern phone is trying to pack in as many features per square mm as possible.

You can run open source software on iOS or macOS. There’s nothing in these operating systems preventing apps from working with local files and having options to export and import data. There’s nothing preventing the author of a web application from doing the same (though it’s hard to see a lot of motivation for someone selling a saas subscription to do so). There are self-hosted open source collaboration tools like NextCloud, OnlyOffice, and Gitlab. The fact that VSCode is web based doesn’t change how it’s open source and completely open to tinkering.

What you’re doing here is confusing business model with technology.

And I think what you’re failing to do is look at this software from the perspective of enterprise customers.

A business isn’t worried about the things a consumer is worried about. They have access to their data ensured through a contract. They have legal assurances that they won’t be left out in the cold.

Software is, essentially, business logic. And often, what’s valuable about software isn’t the code itself, it’s the people who are supporting, patching, and improving that code. To a business, a software purchase often feels a lot more like a consulting contract. Businesses could run on 100% free software, but what they really need is someone else to spend the time working out issues and making it “just work.” They need to waste as little of their employees’ time on cost centers, because labor is the highest cost. Ownership doesn’t matter to a business because the only thing a business needs to own is its own core products and services.

You compared this situation to video games - interestingly enough, on that subject, the majority of game consoles have been 100% locked down for almost 4 decades now. Video games are almost entirely closed source. Modded games are mostly banned from online play. Nobody cares that they can’t open up their Xbox to upgrade/replace the components because it’s simply not a priority. Video Games are essentially media content, art, not business logic.


> "...this post is an exemplar of the shift in thinking being pushed by the current crop of tech cognoscenti. ...they assure us that's the future, and because that's the future, it's going to be amazing."

it's good to remind ourselves that stratechery is a marketing vehicle for consulting services, not an inquiry into truth. these blog posts are examples of standard strategic analyses learned in any decent MBA program. absent relevant experience, these analyses may seem oracular, but they're not. they're decent and competently researched, but not amazingly insightful--by design: why give away the real crown jewels (if there are any)?

it's pretty clear that the trend in software has been toward subscriptions (renting software) for the last decade or two, and that subscriptions are more advantageous for profit-seeking businesses, so connecting two very well-known dots provides all the "insight" here. a third well-known dot, that apple has a lead in this regard, provides context.


If Ben is using Stratechery to market consulting services, he's the worst marketer in existence, as I can't find any evidence he does consulting.

He's not giving away the Crown Jewels; the vast majority of his writing is behind a paywall, and, to my knowledge, he's full-time and funded only via that paywall to write Stratechery (https://stratechery.com/about/ indicates this as well).

The free articles are an example of his writing to generate traffic to maybe convince folks to sign up and pay him for the rest of the content (which is pretty good, consistently).


> “...pay him for the rest of the content (which is pretty good, consistently).”

having read this free piece and a number previously posted here, the analyses tend to be fairly average in comparison to mba-level work. perhaps you can enlighten me—who pays, and why? who has interest but also doesn’t otherwise have access to mba-level analyses or analytical tools?

a solid position, line of reasoning, or conclusion is difficult to draw from this article. he seems to want badly to say something, anything, insightful about apple’s new silicon to not miss the short window of opportunity afforded by the recent announcement. but what did he say other than repeat some numbers from anandtech and sidetrack onto sketch vs figma?

apple‘s strategy isn’t new or surprising, and this chip is one (comparatively small) part of jobs’ original vision of ubiquitous consumer computing, with apple at the center of it. the switch to arm isn’t even a strategic surprise. they wanted ever smaller and more powerful chips (which intel could have owned but f-ed up) to drive that ubiquitousness and be at the profitable forefront of it (that’s also why they’re so obsessed with thinness). apple has always wanted to own consumer computing. that’s it. that strategy is not that hard to make sense of. but somehow this article badly flubs that low hurdle.


First define for me what you mean by 'average in comparison to mba-level work'? What analytical tools are you referring to?

As to who pays? People who have things to do, and don't have the time to do the 'mba-level' work it takes to track all these different trends and such, and want just a short overview of 'whats important' and "who's moving where".

And your 'obvious' take is not obvious if you look at where apple is spending money, effort and hiring, etc.

And another aspect of this is that he's been talking about some of this stuff for many years, which isn't at all obvious from one of his pieces stand-alone. The weekly updates blend together and consistently reference stuff he's previously discussed, and he also has no problem calling out where he's gotten it wrong.


it's average in comparison to the many similar papers i read (and wrote) during my mba program. he's doing the same thing, except instead of turning papers in to professors, he's blogging about them. some are better than others, but this one is certainly below average. any manager for whom such information is critical will either have an mba, or have access to similar (and likely much better) strategic analyses (e.g., the strategy group at any mid-size or larger company).


You still haven't given me an idea of what you think is 'lacking' in these 'strategic analyses' that would be appropriate for something like this writing?

As to strategy groups, I've worked with strategy teams from a big chunk of the Fortune 100, did time at a Big 4, and I think your expectations of the level of critical analysis are... overshooting reality :)

Ben's definitely not some strategy messiah, no question - I doubt he'd even claim to be in the top percentage. But he's consistent, thoughtful, and lets me not have to chase tons of other sources on a regular basis.


in this particular case, it'd be great if there was a recognizable thesis and supporting points backed up by discernable research (even cursory).

i'm not suggesting he's a crank, but that adoration should be tempered. we humans tend to get wrapped up in popularity rather than substance. the evaluation of analytical stances should be heavily tilted toward substance.


> notably Apple avoided the trap of integrating hardware (the iPod) with hardware (the Mac), which would have handicapped the former to prop up the latter. Instead the company took advantage of the flexibility of software to port iTunes to Windows

That's how things turned out, but not how they were originally meant to be. Jobs' reluctance to port iTunes to Windows was so obvious that even the Isaacson bio made it clear. And it wasn't simply an atavistic impulse, it was faithfulness to his "Digital Hub" strategy which saw smart-device integration as a way to sell Macintoshes. Jobs was a late and reluctant convert to the idea of a post-PC era. Repasting an earlier comment of mine https://news.ycombinator.com/item?id=9470925 :

>> If you look at Apple's trajectory over the last 15 years, you can see the vision was consistently outlined from the very beginning—the Digital Hub strategy ([link working in 2020: https://youtu.be/AnrM4n6S3CU?t=2585 ]).

> That's not really the case though. As Jobs outlined it in that video, the Digital Hub strategy was to sell Macintoshes by positioning them as something you could dock your consumer-electronics gadgets, mostly from third parties, into. It was a plan to sell hubs, not spokes. This strategy had limited viability for Apple, because a typical consumer wasn't likely to think "I've just bought this $500 camcorder, so now I need to spend twice that or more on a Mac in order to offload and edit the video". If they were going to use any computer as the digital hub for their camcorder, it was probably going to be their Windows PC. That's probably why iTunes for Windows was such a difficult and long-drawn-out decision for Jobs: because it was a decision to mostly abandon the Digital Hub approach in favour of selling more of the spokes. Then Apple's slow and initially reluctant embrace of the "post-PC era" with over-the-air iDevice updates and cloud storage to partly displace iTunes means that it's increasingly taking nearly the opposite of Jobs' 2001 stance: "We're clearly migrating away from the PC as the centrepiece" and "We don't think of it in terms of the PC business anymore" are things that Tim Cook could say today without really startling anyone.

At his 2007 joint interview with Gates at D5 https://www.wsj.com/video/bill-gates-and-steve-jobs-at-d5-fu... Jobs is coming round to the post-PC agenda, reluctantly.


Mobile and device proliferation more broadly snuck up on a lot of people. (Including me. I wrote a piece in 2003 about Apple morphing into an entertainment company and it was way too living room-centric. http://bitmasons.com.s3-website-us-east-1.amazonaws.com/pubs...)

The general view in the 1990s into the early 2000s was that you would have a centralized home computer/storage hub/etc. that everything else connected to.


> Android is more flexible and well-suited to power users, and much better integrated with Google’s superior web services

... if (forced) integration with Google's web services is a plus, that is.

> (Apple) The company has the best chips in the world, and you have to buy the entire integrated widget to get them.

Does it? Those claims of 3x faster seem carefully cherry picked.


It's difficult to think different if you're the world's largest company and dominate the phone market. In laptops, it looks like they are differentiating themselves very well.


Apple has a small piece of the smartphone pie. That said, their CPUs are consistently in a class of their own.


Small is relative. Some estimates claim 1b active iPhones: https://www.aboveavalon.com/notes/2020/10/26/a-billion-iphon...


A lot of people are still using an iPhone 5S or an iPhone 6. From iPhone 6 to now, that's actually close to 1.7b iPhone sales. 1b active iPhones sounds very reasonable after accounting for breakage and the unused spares forgotten in a drawer.

Apparently there's approx a total of 3.5 billion smartphone users today. https://www.bankmycell.com/blog/how-many-phones-are-in-the-w...

iPhones could be 30% of the market.


Small in units shipped, but dominant in design, aesthetics, and, most importantly, profit margin.


> but dominant in design

Debatable. If anything, modern Android designs are cleaner than the iPhone's.

> aesthetics

Again, debatable.

> and, most importantly, profit margin

This is the key factor, and it's tied to something you missed.

Apple products and iPhones are ahead in profit margins because Apple consistently delivers reasonable-quality goods, with few disappointments, so users trust them. They've gained user trust despite obvious "design, aesthetics" missteps such as the notch or the Touch Bar.

The key words are: consistent delivery, reasonable quality and few disappointments. That's how they hook users in. Apple mostly delivers on time something very close to what they promised and that thing doesn't have catastrophic flaws. That's a much taller bar than you'd think, in the tech sector.


For any other company, the notch probably would've been viewed as a design misstep. For Apple, it became a distinctive feature that was actively imitated by phones that didn't even need to have such a large cutout. That's what it means to be dominant in design and aesthetics.


> the notch probably would've been viewed as a design misstep

Why? The alternatives are a smaller screen or terrible audio when using one’s phone as a phone. It’s different. But it makes perfect sense and made perfect sense the first day.


There is only one problem with the notch IMHO... and it's a software issue: taking video fullscreen in the horizontal orientation should only have ever filled the rectangular area.

Otherwise, it's great: the status indicators you want anyway go up in the corners and don't take out a full row of the "actual" screen.


They weren't the first. Sharp and Andy Rubin's Essential Phone had that terrible design first.

OEMs come up with something and the designs echo through the industry.


> modern Android designs are cleaner than iPhone looks

Design is about more than the way apps & the interface look. I recently switched from Android, and the thing that most struck me is that iPhone usability is more consistent. I was able to do everything I wanted with Android, and honestly I loved it, but it's not as intuitive as iOS.


I use mostly Apple devices, both Mac and iOS, because I like the hardware. But I also have a cheap Android phone, and even though I'm not using it very often I find it more consistent/intuitive than iOS, mostly for two reasons:

One is iOS's arbitrary separation of settings and other app functionality. It makes no sense. I'll never remember what goes where.

Secondly, Android's back button is simple. I can use it without thinking even though you can probably find a lot of inconsistencies in its behaviour. iOS has multiple inconsistent one-off solutions for going back that cause a lot more mental friction.


The back button was something I missed when I went to an iPhone 6S in 2015. Apple had back arrows near the top of the screen, which I could not reach with a thumb and one hand. Annoying.


I think people also forget the dominance of the iPhone (and iMessage) in the US market. The iPhone is simply the most popular, go-to phone in the US. Not so much in similarly wealthy regions like Europe.


And what's up with the camera bumps? ;-)

I know most people use a case, anyway, but I'd rather have an extra mm or 2 and a flat back, and either get more battery or just empty space...


1b active iPhone users out of 3.5 billion smartphone users worldwide.

A quarter of the pie can't really be called a small piece.


> Apple has a small piece of the smartphone pie.

An iPhone is the single best-selling phone on pretty much any given year.

iOS is also 50/50(or more) with Android in a number of countries.


Most of them are tier 1 countries.


So?


Half of USA market is far from "small piece".


The US is only 4.2% of the global population.


And that 4.2% of population created one of the richest corporations in the world. Pretending that Apple doesn't have an important world market share with their mobile products is really a bit facetious.

Even if it reaches a low % of total world population, it does command important amount of profits and dictates trends.


And this year they are creeping back toward the 90s era: the iPhone 12 and iPhone 12 Pro are essentially the same phone. There's no reason for both to exist, except to muddy up the product line.


The key differentiator is the camera on the Pro, which significantly increases the cost. Some people will pay a premium for the privilege of owning the top model and yet others will go for the Pro precisely because of the more advanced camera.

By having separate lines, Apple can sell the 12 for cheaper than the 12 Pro, and those who are willing to pay for the 12 Pro camera can do so. Don’t see any muddying here.


The problem is as a “normal user” you can’t go to your ‘tech friend’ and get a straight answer — there are too many considerations now to direct someone to one phone over another.

It’s not ‘yeah, the new iPhone is great! You gotta have it.’ It’s “so what do you want to do mostly?” — no one knows.

That is what I believe the 90s curse really means — your evangelists are no longer as effective because they give potential customers an overwhelming amount of information that slows down, and sometimes prevents, a sale.

Your point is exactly right: Apple has decided to harvest the demand curve over making something undeniably great.

Edit: let me clarify, watch and homepod mini are currently in the category of 'just get it' products. This is only a critique of the iPhone line.


> The problem is as a “normal user” you can’t go to your ‘tech friend’ and get a straight answer

Uh, it's 3 options.

Small size, best camera option, remaining option.

I didn't know this - I just went to Apple's website, clicked iPhone and it has a single page that presents all this very clearly.

So um, yeah. This is, by the way, how I do 'tech advice' for anyone who ever needs it: I open Google, I type in the question, and the first link has the answer 95% of the time.

Phones haven't been in the 'gotta have it' category since the iPhone 6, when they released a bigger size that a lot of people wanted. Since then, it has been yearly 'better camera' releases, oh, and 'better chip', as if anyone needs a supercomputer to browse Instagram.


Cameras (and the associated computational photography power) have the advantage (for Apple and other manufacturers) of simultaneously still being on a fairly rapid improvement curve and being something a lot of people really care about. It's no coincidence that Apple really hits hard on the photography angle.

There is clearly a difference between my iPhone 6 and iPhone X but I've never been on a particularly frequent upgrade cadence. Under normal circumstances, I'd probably upgrade to this year's model but there's not a lot of point until I get out and about a lot again.


It's four options. Two of them are genuinely different:

- iPhone 12 Mini, the small one

- iPhone 12 Pro Max, the huge one

And then there are these two: iPhone 12, iPhone 12 Pro.

They are exactly the same phone in every regard that a "regular" user cares about. They are so identical that Gruber lumped them together in benchmarks in his review at Daring Fireball.

The only difference is the cameras, which matter to a very small number of people. And even then it doesn't make sense to make two different models instead of one with the new camera setup.


I think you underestimate how many people care about cameras a lot--or at least think they do. Personally, I'm inclined to give Apple the benefit of the doubt that it knows what it's doing by having a cheaper mid-size model and a more expensive one with a better camera.


I like how Gruber put it:

“Pro”, in Apple’s parlance, also simply means “more expensive”.


Ah ok I've had to do a few more clicks.

So the 'latest' options on Apple's website are presented as iPhone 12 Pro (best camera) and iPhone 12 (not best camera) - 2 options.

Then each option has 2 sizes, that makes it a total of 4.

I guess they could've named things better. It's still the same old 'best camera = more expensive' formula from years prior, with the addition of a smaller size to the mix.


Think the issue is that Jobs-era Apple wouldn't do this; it would just be one great phone and one budget phone, which would usually be last year's good phone.

The new Tim Cook Apple philosophy is doing what Apple fans used to criticise other tech companies for: needlessly fragmenting their product line to try and milk more money out of it.

It's things like this that make the difference between OK and good, and between good and great.


Apple's revenue has gone from $65.2 billion in the last year under Jobs to $275 billion in 2019. It's easy to say "Jobs would never!" but Jobs never ran a company almost 5x bigger. It's entirely possible that he would have. It's equally possible that Apple never grows to these heights with him at the helm. It's just an impossible comparison to be making.


Yeah, I also notice there are a ton of 'MBA preferred' job listings at Apple now too.


[Edit] I just re-read your comment... "remaining option" ...


A curious critique of Apple's product line given the strategy of the competition: https://gadgets.ndtv.com/mobiles/samsung-phones

(I would argue it's very simple: "Get the iPhone 12 in the size and colour you like", the Pro models are an up-sale for the people who want different screen-size or slightly better cameras)


I'd argue if you're comparing to your competition, you have no idea what you want to build.

[Edit] Apple is specifically doing this frayed product line to harvest the demand curve, lower supply costs, and amortize R&D across a large unit base... they are playing to the current low interest rate world in this product line. It's an intentional business choice, not a product choice.


To the lovely downvoters: do we have to talk about how this plays perfectly into the tiny form factors required for all-day AR?

However I still believe it’s a bit more aggressive than necessary.

