
This is a great comment, you clearly know what you're talking about and I learned a lot.

I wanted to push back on this a bit:

> The "Objective-C problems" they fixed were things that made Objective-C annoying to optimize, not annoying to write (except if you are a big hater of square brackets I suppose).

From an outsider's perspective, this was the point of Swift: Objective-C was and is hard to optimize. Optimal code means programs which do more and drain your battery less. That was Swift's pitch: the old Apple inherited Objective-C from NeXT and built the Mac around it, back when a Mac was plugged into the wall and burning 500 watts to browse the Internet. The new Apple's priority was a language which wasn't such a hog, for computers that fit in your pocket.

Do you think it would have been possible to keep the good dynamic Smalltalk parts of Objective-C, and also make a language which is more efficient? For that matter, do you think Swift even succeeded in being that more efficient language?




Let me preface this by saying that performance is complicated, and unfortunately far more... religious than you'd expect. A good example of this is the Objective-C community’s old insistence that good performance was incompatible with garbage collection, despite decades of proof otherwise. This post [1] about Swift getting walloped by node.js is fascinating for how a community responds to results that challenge its expectations (“what do you expect, the JS BigInt is optimized and ours isn’t”, “It’s slower, but uses less RAM!”, etc.). As it turns out, questions like GC vs. reference counting often end up being much more nuanced than which one is simply "faster". You end up with far more unsatisfying conclusions, like: one is more deterministic but often slower (and, as it turns out, most UIKit apps aren’t realtime systems).
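To make the "deterministic" part concrete, here is a minimal Objective-C sketch (my own illustration, not anything from the post above; the class name is made up): under reference counting you know exactly when an object goes away, whereas a tracing collector reclaims it at some later, unspecified collection.

    #import <Foundation/Foundation.h>

    @interface Resource : NSObject
    @end

    @implementation Resource
    - (void)dealloc {
        NSLog(@"Resource released");  // runs the moment the last strong reference disappears
    }
    @end

    void demo(void) {
        @autoreleasepool {
            Resource *r = [[Resource alloc] init];
            (void)r;
        }   // under ARC, -dealloc has run by here, deterministically;
            // a tracing GC would get to it whenever the next collection happens to run
    }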

All this to say, it is hard to answer this question in one comment, but to try to sum up my position: I believe the performance benefits of Swift were and remain overblown. It’s a micro-benchmark-based argument, which, as we’ll see in a second, is particularly misguided for Swift's theoretically intended use case as an app language. I think people increasingly agree with this, as they haven't really found Swift to deliver some amazing performance that wouldn’t have been possible in Objective-C. This is for a number of reasons:

1. As mentioned above, the most important flaw in a performance-based argument for Swift is that the vast majority of the stack is still written in Objective-C/C/etc. So even if Swift were dramatically better, it usually only affects your app’s code, and often the bulk of the time is spent in framework code. Think of it this way: pretend that all of iOS and UIKit were written in JavaScript, but then, in order to “improve performance”, you write your app code in C. Would it be faster? I guess, but you can imagine why it may not actually end up having that much of an effect. This was ironically the bizarre position we found ourselves in with Swift: your app code was in a super strict, typed language, but the underlying frameworks were written in a loosey-goosey dynamic language. This is the exact opposite of how you'd want to design a stack. Just look at games, where performance is often the absolute top priority: the actual game engine is usually written in something like C++, but the game logic is often written in a scripting language like Lua. Swift iOS apps are the reverse of this. Now, I'm sure someone will argue that the real goal is for the entire stack to eventually be in Swift, at which point this won't be an issue anymore, but now we're talking about a 20-year plan, where it seems weird to prioritize my Calculator app's code as the critical first step.

2. As it turns out, Objective-C was already really fast! Especially since, due to its ability to trivially interface with C and C++, a lot of existing apps in the wild had probably already topped out on performance. This wasn't a case of taking an install base of Python apps and getting them all to move over to C. This was an already low-level language, where many of the developers were already comfortable with the "performance kings" of the C family of languages. Languages which, for the record, have decades of really good tooling specifically for making things fast, and decades of engineering experience among their users in making things fast. And so, in practice, for existing apps, this often felt more like a lateral move. I actually remember feeling confused when, after the announcement of Swift, people started talking about Objective-C as if it were some slow language. Like, literally the year before, Objective-C was considered the low-level performance beast compared to, say, Android's use of Java. Objective-C just wasn't that slow a comparison point to improve that much on. The two languages even share the same memory management model (something that has a big effect on their performance characteristics). Dynamic dispatch (objc_msgSend) just does not end up dominating your performance graph when you profile your app (see the first sketch after this list).

3. But perhaps most importantly, I think there is a misguided focus on language over frameworks here, mirroring the developer-ergonomics issues I pointed out above. If you look at where the actual performance gains in apps have come from, I’d argue it’s overwhelmingly been conceptual framework improvements, not tiny language wins. A great example of this is CoreAnimation. Making hardware-accelerated graphics accessible through a nice declarative API, such that we can move as much animation off the CPU and onto the GPU as possible, is one of the key reasons everything feels so great on iOS (see the second sketch after this list). I promise no language change will make anywhere near as big a dent as Apple's investment in CoreAnimation did. I’d argue that if we had invested development time in, e.g., async/await in Objective-C, rather than basically delaying that work for a decade in Swift, we’d very possibly be in a much more performant world today.
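First sketch: roughly what "dynamic dispatch" means in point 2. This is my own illustration (the class and method names are made up); the point is that a bracketed call lowers to an objc_msgSend, i.e. a cached selector lookup plus an indirect call, which is cheap enough that it rarely shows up next to layout, rendering, or I/O in a profile.

    #import <Foundation/Foundation.h>

    @interface Greeter : NSObject
    - (NSString *)greet:(NSString *)name;
    @end

    @implementation Greeter
    - (NSString *)greet:(NSString *)name {
        return [NSString stringWithFormat:@"Hello, %@", name];
    }
    @end

    // [greeter greet:@"world"] compiles down to (roughly):
    //   objc_msgSend(greeter, @selector(greet:), @"world");
    // a method-cache lookup plus an indirect jump, on the order of nanoseconds.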
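Second sketch: what "declarative, GPU-offloaded animation" looks like in practice. The CoreAnimation calls are real; the view parameter and the specific values are just illustrative.

    #import <UIKit/UIKit.h>
    #import <QuartzCore/QuartzCore.h>

    // Describe the animation once, declaratively; the render server interpolates
    // the frames outside your process, instead of your code updating positions
    // every frame on the CPU.
    static void slideRight(UIView *someView) {
        CABasicAnimation *slide = [CABasicAnimation animationWithKeyPath:@"position.x"];
        slide.fromValue = @(someView.layer.position.x);
        slide.toValue   = @(someView.layer.position.x + 200);
        slide.duration  = 0.3;
        [someView.layer addAnimation:slide forKey:@"slide"];
    }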

Anyways, these are just a few of my thoughts on the performance side of things. Unfortunately, as time moves on, now a decade into this transition, while I find more people agreeing with me than, say, when Swift was first announced, it also becomes more academic, since it's not like Apple is going to go back and make an Objective-C 3 or something now. That being said, I do think it is still useful to look back and analyze these decisions, to avoid making similar mistakes in the future. The Python 2 to 3 transition provided an important lesson to other languages; I hope someday we look at the Swift introduction as a similar cautionary tale of programming language design and community/ecosystem stewardship and management.

1. https://forums.swift.org/t/standard-vapor-website-drops-1-5-...


To add to the GC discussion: something many people who weren't around for the failure of the Objective-C garbage collection project don't realize is that ARC was a pivot away from a failed project, but in good Apple fashion they had to sell that history in their own way.

The GC for Objective-C failed because, given the underlying C semantics, it could never be better than a typical conservative GC, and there were routinely application crashes when mixing code compiled with the GC and non-GC options.

Thus they picked the next best strategy, which was to automate Cocoa's retain/release message pairs, and sold that as being much better than GC because of performance and such, not because the GC approach had failed.
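Concretely, "automating the retain/release pairs" means the compiler now emits what Cocoa programmers used to write by hand. A rough sketch (the class and setter below are the classic manual-retain-release pattern, simplified; not code from any particular framework):

    #import <Foundation/Foundation.h>

    @interface Person : NSObject {
        NSString *_name;
    }
    - (void)setName:(NSString *)name;
    @end

    // Pre-ARC (compiled with -fno-objc-arc): the programmer writes the pairs by hand.
    @implementation Person
    - (void)setName:(NSString *)name {
        [name retain];
        [_name release];
        _name = name;
    }
    - (void)dealloc {
        [_name release];
        [super dealloc];
    }
    @end

    // Under ARC the setter body is just `_name = name;` (or a synthesized property);
    // the compiler inserts the equivalent retain/release calls at compile time.
    // It automates the existing convention; it is not a tracing collector.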

Naturally, as proven by the complexity of the interop layer .NET needs for COM, and given Objective-C's evolution, it was also much better for Swift to adopt the same approach than to create a complex interop layer similar to CCW/RCW.

Now everyone who wasn't around for this kind of believes and resells the whole "ARC because performance!" story.


Do you happen to have any source/book on why you can't use anything but a conservative GC for C-like languages? I would really like to know why that's the case.


Basically, C semantics are to blame: due to the way C was designed and the liberties it allows its users, it is like programming in assembly from a tracing GC's point of view.

Meaning that, without any kind of metadata, the GC has to assume that any value on the stack or in the global memory segments is a possible pointer, but it cannot be sure: it might just be a numeric value that happens to look like a valid pointer to GC-allocated data.

So any algorithm that needs to be certain about exact data types before moving data is already off the table where C is concerned.
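A minimal C-level sketch of the problem (my own illustration):

    #include <stdlib.h>
    #include <stdint.h>

    void sketch(void) {
        void *obj = malloc(64);          /* imagine this were a GC-managed allocation */
        intptr_t fake = (intptr_t)obj;   /* just an integer, but it equals a valid address */

        /* Scanning the stack, a collector sees the bits of 'fake' and cannot tell
           whether they are a pointer keeping 'obj' alive or an unrelated number, so
           it must conservatively treat them as a pointer. And since any such word
           might really be an integer, the collector can never relocate 'obj' and
           rewrite the "pointers" to it, which rules out precise moving/compacting
           algorithms. */
        (void)fake;
    }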

See https://hboehm.info/gc/ for more info, including the references.


Thank you!


Great posts. Objective-C is still my programming language of choice.

> Now, I'm sure someone will argue that the real goal is for the entire stack to eventually be in Swift, at which point this won't be an issue anymore, but now we're talking about a 20-year plan, where it seems weird to prioritize my Calculator app's code as the critical first step.

It seems like that is the goal for at least some people at Apple. But so many Swift frameworks rely on Objective-C frameworks (SwiftUI wraps lots of UIKit, SwiftData is built on top of CoreData, etc.).

In twenty years Swift will be roughly the same age Objective-C was when Swift was introduced (give or take). By then the Swifties will be getting old and gray. I think it’s reasonable to bet that some young blokes will be pushing a new programming language/UI framework by then. I’m not sure Apple can replace the entire Objective-C stack even if they wanted to. Maybe if they spent the next five years not working on any new features and did nothing but port all the frameworks to pure Swift (we know Apple will never do that).

Unless a new hardware platform takes off, supersedes iOS/macOS, and starts out Swift-only, I just don't think Apple can rid themselves of Objective-C (I personally think they shouldn't even want to get rid of it). But watchOS doesn't have many developers, and visionOS wants all existing iOS and macOS apps to work because they want a large ecosystem.

I sometimes wonder if Objective-C will outlive Swift. Sure it’s the underdog but I always root for the underdog. I hope someone will make an Objective-C 3.0 even if it isn’t Apple.


As someone interested in Apple app dev, would you recommend still starting with ObjC? I notice the dev behind the Swiftcord app (an open-source Discord client in Swift) has noted at length how much you still need to call into UIKit to get things done, as there were a lot of blind alleys in SwiftUI.



