The Death of Cocoa (nshipster.com)
242 points by andrewbarba on Dec 30, 2014 | 110 comments



I like Matt, but I think he's wrong.

Objective-C isn't going anywhere, just like C++ hasn't gone anywhere, nor has C left the building.

There are 124 public frameworks in the 10.10 SDK, and 357 private ones. While a good chunk of these are written in Objective-C, a good deal of them are written in C, or Objective-C with the guts in C++. AVFoundation, for example, is mostly C++ in the "backend".

The amount of effort it would take to move those to Swift would be fairly substantial, and with no real gain. This doesn't mean that new frameworks won't be written in Swift yet still be interoperable with Objective-C; that will probably happen, though I doubt it'll happen in my development lifetime. So we are looking at a gradual replacement of frameworks, or superseding them with newer ones (QuickTime -> AVFoundation -> ???).

He brings up Carbon, but erroneously: Cocoa actually predates Carbon, and Carbon was only meant to serve as a compatibility bridge from OS 9 to OS X. Carbon was meant to die.

I don't plan to switch to Swift unless I absolutely have to. I spend 14 hours a day developing for OS X, but my problem isn't Objective-C, it's the ambiguous documentation and mildly temperamental behavior of certain frameworks coughAVFoundationcough. I didn't need Swift, wasn't looking for Swift. I have nothing against it, I'm sure it's awesome, but language isn't my issue at this point, so I don't really have much to gain by it.


I appreciate your feedback, but I think you're reading an entirely different argument than what is presented in the article. It's not that everything is going to be rewritten in Swift (I'm not sure where you got that idea). Rather, the article is an exploration of what Swift without Obj-C & Cocoa could be, and under what circumstances that might become a reality.

If Swift doesn't have to accommodate Objective-C as it currently does, it's reasonable to assume that system frameworks will be designed and built in ways that make the most of Swift's language features, just as Cocoa currently does with Objective-C.


    What is almost certain, however, is that Cocoa, 
    like Objective-C, is doomed. It's not as much a 
    question of whether, but when.
That was the part I was responding to.


My initial enthusiasm for Swift faded pretty quickly when I tried to actually use it. The tooling is buggy and unstable. Code that calls Cocoa APIs is often no less verbose than ObjC. Every Xcode release brings syntax-breaking changes. It feels like the language design was compromised by the requirement to bridge to Cocoa; all the strictness of the language is just a facade when you're manually casting every time you deal with a collection, for example. I guess I like the strict handling of nils in principle but in practice I've found this often means the user gets a crashed app instead of a more tolerable bug.

I'm getting out of iOS/OS X coding anyway for other reasons, but even if I weren't, I certainly wouldn't use Swift for production yet. Apple should not have pitched Swift as production-ready at WWDC. I think a lot of us would be less critical if it had been billed as the technology preview it was (and still is).


I'm really loving Swift, but Xcode has gotten atrocious. I have a keyboard app with 8 different sub-keyboards, and 80% of the time, when I open my project, Xcode gets stuck in indexing. You can't build, you can't close Xcode, you can't do anything but force-quit and hope it doesn't happen next time. There are also frequent SourceKit crashes, random crashes to the desktop, and incredibly laggy typing and auto-completion. WTF?

"I guess I like the strict handling of nils in principle but in practice I've found this often means the user gets a crashed app instead of a more tolerable bug."

How are you using optionals in such a way that it crashes your app? I've found that using optional chaining and the "if let" block guards your code against accidental nilling even more. My rule of thumb so far is to never ever use the "!" operator except for initialization. I've found that whenever I get a crash — which is rare these days — it's usually because I forced an optional, or possibly used "as" instead of "as?".
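
To make that concrete, a minimal sketch (the class is made up; Swift 1.x syntax):

    import Foundation

    class MyCell: NSObject { func refresh() {} }   // stand-in class for the example

    let item: AnyObject = NSObject()               // pretend this came from a Cocoa API

    // "let cell = item as MyCell" would crash here (forced downcast of the wrong type).
    // "as?" plus "if let" fails gracefully instead:
    if let cell = item as? MyCell {
        cell.refresh()                             // only runs when the cast succeeds
    }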

"All the strictness of the language is just a facade when you're manually casting every time you deal with a collection, for example."

I dunno, I kind of like the explicitness of (cocoaArray[i] as? ABMyObject). It's no more work than you'd be doing with a generic NSArray anyway, and you get type checking for free. (And, of course, all your Swift native arrays and maps don't have to follow this pattern.)


Xcode 6.1.1 fixed most of the problems you mentioned in the first paragraph for me. SourceKit crashes were the biggest annoyance, ugh.


I'm on latest and the indexing issue is just getting worse and worse for me.


Try removing (backing up) all folders from ~/Library/Developer/Xcode/iOS\ DeviceSupport. It helped for me.


I'm glad I'm not the only one who feels that way. I read through the book on night 1 and was stoked to Swiftify my life, but was disappointed nearly as fast when I realized how icky Cocoa(Touch) was to implement in Swift. It felt unnatural, and I've used languages with named parameters before, but forcing the two paradigms together seems wrong.


Initially I was ripping my hair out with Swift, but after a couple of months of using it full time, things are much smoother now. I still have doubts about its production-readiness though.

And the strictness is pretty badass imo. Objective-C feels very imprecise and unsafe in many ways.


Apple used quite a bit of fanfare (they're just good at it), but I never construed Swift to be more than a beta, and I still don't think that it's more than that.


I agree with most of what you are saying. I have no problem with Objective-C either. I never had a problem with Objective-C, and I have done Objective-C for over 10 years now…

But I would like to mention that people seem to overestimate the work required to "port" a framework from one language to another. Take 280 North as an example: a tiny company (funded by Y Combinator) founded by two ex-Apple employees. They developed a JavaScript variant of Objective-C (called Objective-J) and reimplemented the most important parts of AppKit. It took them a couple of months, but in the end it all worked out; many people used (and are still using) Cappuccino (the name of the framework), and 280 North was bought by Motorola for something like $20,000,000. So two guys => 1 language, plus a couple of frameworks, in a matter of months.

Also I am sure you are aware that every year Apple introduces a ton of new frameworks and deprecates some frameworks. It would only take a couple of releases to deprecate 80% of the frameworks used by most apps and introduce replacements for them.

Swift without a new set of frameworks makes little sense IMHO. So either Swift will die, just like GC did, or it won't, and then we will see new frameworks written in Swift.

Talking about GC… When Apple introduced GC they told us that it was the latest and greatest shit. It was the future. When introducing GC they told us "hey, we have ported Xcode to GC, look, it is that easy." So they had more apps using GC from day 1 than they have using Swift from day 1 (well, the WWDC app written in Swift does not count…). Apple had to ship GC-capable versions of their frameworks from day 1. They were even more committed to GC than they are now to Swift, so it would not surprise me at all to see Swift let go in a couple of years and buried deep.

At the moment everything seems possible.


>Talking about GC… When Apple introduced GC they told us that it was the latest and greatest shit. It was the future. When introducing GC they told us "hey, we have ported Xcode to GC, look, it is that easy." So they had more apps using GC from day 1 than they have using Swift from day 1 (well, the WWDC app written in Swift does not count…). Apple had to ship GC-capable versions of their frameworks from day 1. They were even more committed to GC than they are now to Swift, so it would not surprise me at all to see Swift let go in a couple of years and buried deep.

No way will Swift be "let go". It solves a problem (having a modern language for OS X going forward) that Objective-C doesn't.

Besides, Swift makes sense even without a new set of frameworks. People use it already and are productive. Sure, Swift-inspired frameworks would be a better fit, but that's not a show-stopper.

GC wasn't that big a deal to let go, because they found a way to do the same thing without it (ARC). It's not like they had to rewrite anything for the new post-GC era.


> GC wasn't that big a deal to let go, because they found a way to do the same thing without it (ARC). It's not like they had to rewrite anything for the new post-GC era.

The official recommendation from Apple was not to transition existing apps to GC, but only to use it for new projects or for complete rewrites. So it was a big deal for every developer who relied on GC.

And just because something is new does not mean it is modern. All of the concepts that Swift implements are fairly old. Swift is just new.


Sure, I don't disagree.

Though, look at QuickTime for example. That was deprecated in Mavericks, but it's going to be around for another few releases. AVFoundation (the framework I use the most) is used by Final Cut, iMovie, etc., so porting it isn't straightforward because it would break so much stuff. So I do think you'll see deprecation of certain things as they move to design patterns that are a better fit for Swift, but entirely replacing AVFoundation would be straight up insane when it works so well (as long as you are doing everything by the rules laid out in a largely unwritten book).


One of the reasons why QTKit sticks around is that it has been around since 10.3, and its foundation is even older, going back to, I think, 1992.

So I think QTKit is an edge case and probably not used by a ton of developers directly. I think the 80/20 rule also applies here... by porting 20% of the frameworks you get 80% of the functionality.

That being said: I think that if it happens, it will happen over time. Deprecating something in Mavericks does not mean that it has to be gone in the release after Mavericks. GC was with us for way too long... But you can replace almost anything within, let's say, 5-7 years. That would be 5-7 major releases.


btw nice app


What does "GC" stand for in this context?


Garbage Collection. Not that long ago Apple came out with full GC in Objective-C to a lot of fanfare (no more retain/release, yay). Not long after that they said to just use ARC (which is compiler-inserted retain/release) because the GC was not working well. Today, it's hard to find any mention of the GC.


The GC had at least one fundamental flaw:

If your app used GC then everything your app links against or loads at runtime has to be GC-compatible. So Apple had to make all of its frameworks GC-compatible. But many 3rd-party frameworks weren't.

Also: if your app used WebView to load a web site which contained something with Flash (YouTube, ...) then your GC app would crash immediately, because WebView loaded the Flash plugin, which was not GC-compatible.

Thus using a WebView to load arbitrary sites was incompatible with GC.


Minor nit to give credit where it's due. 280 North had three founders: http://en.wikipedia.org/wiki/280_North,_Inc.


And for what it's worth, it took a hell of a lot longer than a couple of months to build something useful. Years later and it still isn't a full port of what was available at the time in Cocoa.


I saw a talk by one of you guys... you mentioned that it took a lot of experimentation before you even considered not using plain JS. But you should also not downplay your achievement! You did a great (maybe not complete) port of Objective-C/Cocoa which was good enough to be used by a lot of people and good enough for Motorola. :)

And Apple is certainly in a better position to do such ports than you were back then: They have the full source code of every framework they developed. I am not saying it is trivial or not much work but it is doable over a couple of years... don't you think? You have certainly more internal knowledge than I do.


14 hours a day, and your problem is ambiguous documentation and temperamental behavior? I don't mean to detract from your primary points, but 14 hours a day is quite a chunk of time. I hope it's not your norm, because that just sucks.


That just sucks for you. I am not you.

10-14 hours a day is the norm and I still get in the gym, dinner with the wife, vigorous sex, a few games of Vainglory and other facets of "normal life".

I also take weeks off at a time because I mostly work for myself. In fact, I've been traveling for the last 4 years (I moved to Vietnam from NYC about a year ago).

So, it might suck for you, but I would hardly say my life sucks because I work so much. Partly because it isn't really work to me, but mostly because at 41 I know what balance works for me.

But thanks for the concern.


I have this image of you, in a dining room with a mirrored wall, having vigorous sex with your wife while she is eating a sensible dinner - trying to safely navigate a forkful of mixed greens. You are looking at yourself in the mirror and flexing interspersed with loud beckonings of "Am I VIGOROUS?! YES I AM! GETTING SHIT DONE!"

You get done, smack your own butt playfully, and walk out with your plates saying, "I have to get some other normal shit done. Love you!... and you too wife!"

I'm not saying there's anything wrong with you or your lifestyle... but when you awkwardly present the style of your sex life as part and parcel to normal living, it underlines that you are a bit detached from social norms, thus normal life.

BBC interviews people on the street... BBC: So how are you getting on with life these days? Person: Ahhh... well, it's mostly normal day to day stuff. Kids, pub, job, vigorous sex, groceries, some football or cricket when I can. Just normal all around.


Sorry, I just reread that and I didn't mean to come off like a dick. I'm just saying, I've found the balance that works for me and that work is not really work to me. I do get the normal things in, though sometimes at an accelerated pace. And I don't do the 14 hours every single day, but 10-12 hours is probably more the norm.


I actually think something like this might be my ideal -- spurts of intense work, followed by several weeks of not working at all. This is what I tended to do on personal projects when I was younger. I think this could be a quite healthy, long-term sustainable way to arrange work.


I take it that you don't sleep?


I've had many 14-hour days that were awesome, many of them including work. I agree; just because you don't like your "work" doesn't mean you should judge me for enjoying my "work". LOL


> I moved to Vietnam from NYC

OT but why did you move and how is Vietnam? Are you from Vietnam?

Just wondering how it is to live there.


Not even doctors work that many hours. Holy shit.


Out of school for winter break and I code pretty much 20/7.


There is no reason not to use Swift; Swift can do all that Objective-C can do and more. The language, framework support and the compiler have reached 1.0; the only thing that keeps happening is an error message in Xcode that I can safely ignore and keep working. Projects written in Swift are safe for production.

The reason you could switch to Swift is that it is safer, less verbose (unless you interact with Objective-C) and more flexible. You can choose to use Swift as if it were just Objective-C translated to Swift and ignore all the extras, or you can choose to use all the extra goodies you get, like map, reduce, currying, advanced switch-case statements and more. These features aren't new, but they are not common in lower-level languages comparable to C++ and Objective-C.
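
A couple of contrived lines to show the kind of thing I mean (Swift 1.x syntax):

    // map/reduce on a native array
    let prices = [12.5, 3.0, 7.25]
    let total = prices.map { $0 * 1.08 }.reduce(0, combine: +)   // add 8% tax, then sum

    // pattern matching in a switch
    let point = (2, 2)
    switch point {
    case (0, 0):                  println("origin")
    case (_, 0):                  println("on the x-axis")
    case let (x, y) where x == y: println("on the diagonal")
    default:                      println("somewhere else")
    }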

Swift is comparable to Rust in that sense. Nobody needed Rust but Rust doesn't stop you from doing anything you could do before while it gives you a lot of extra goodies from higher level languages.

A lot of people tried Swift when it was in beta, ran into problems with it and got frustrated. Then they repeat those experiences all over the internet like it happened to them just yesterday. There's also a significant portion of the iOS development community that rejects Auto Layout because it was too alien to them and the first iteration wasn't perfect.

Last but not least: you're not forced to use only Swift in any project. Of course it's easier to wrap C code in Objective-C and then use it with a bridge in your Swift project. I mix and match every day; I'm not going to rewrite perfectly sane Objective-C code because I'm doing a Swift project. Try to be pragmatic about it.
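
For example (all names here are hypothetical), a C library wrapped in an Objective-C class is one bridging-header line away from Swift:

    // In MyApp-Bridging-Header.h (the Objective-C side):
    //
    //     #import "ChecksumWrapper.h"   // hypothetical ObjC class wrapping the C calls
    //
    // Swift then sees the wrapper with no further ceremony:
    let wrapper = ChecksumWrapper()
    let digest = wrapper.checksumForFile("data.bin")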

Truth is that Objective-C-style libraries really do seem cumbersome and verbose when you use them in much more lightweight Swift code.


>I don't plan to switch to Swift unless I absolutely have to.

Me neither. Lua all the things. Because, this:

>While a good chunk of these are written in Objective-C, a good deal of them are written in C, or Objective-C with the guts in C++.


I didn't know that was possible. Just googled it, and there are several options; what do you use?


Holy clickbait headline, Batman.

Yes, Carbon was replaced in the past, but that involved a $400 million acquisition and 10 years of continual complaining, kicking and screaming from established Carbon users who had no desire to change. I doubt it's an example of how future changes will occur.

Replacing entire application frameworks is hard. Super, super hard. It seems like it might be simple to start by replacing Foundation with Swift's standard library but actually, replacing Foundation would mean replacing every Cocoa framework since they all rely on it. And there's a gigantic amount of work in those frameworks; 25 years of development (all the way back to the early NeXT days).

I think this is why Swift includes such extensive support for Objective-C interoperation: Apple expect Swift will need to link against Objective-C APIs for a long, long time.

I think we're much more likely to see a major deprecation sweep through Cocoa in one or two years' time (probably once Swift finally has all the features Apple have hinted are coming). Not deleting things, per se, but simply saying "these things look ugly or silly in Swift, so use these other things instead".


And what's the practical difference between death and deprecation? As the Swift standard library grows, and it becomes the widely preferred language for iOS and Mac development, isn't that a form of death?


Simply: I don't think Apple are going to suddenly replace Foundation (or any other Cocoa framework) with a Swift-only version. They'll just iterate on the frameworks they currently have to make them a little more modern and a little more Swift-friendly.

Not that they could anyway: according to Chris Lattner, Swift is currently 1-2 years away from including a language "resilience model" that would allow safe linking to frameworks built with a different version of Swift. Until then, the Swift standard library needs to remain as small as possible because it is statically linked into every Swift app (it can't be shipped separately with the OS).


I have no illusions about the amount of effort and investment necessary to change technology stacks. Carbon was only officially deprecated in OS X 10.8. To speak of the death of a technology is only to acknowledge that it is mortal.

My claim isn't that everything will be rewritten in Swift—far from it. Instead, the article is an exploration of what it would mean for Swift to evolve beyond the constraints of Objective-C & Cocoa interoperability, and how that may occur as Swift gains momentum.


I had the same feeling when I did my first side project in Swift earlier this month. My main experience is with CoreData.

An example with 1:n relationships: CoreData returns and expects an (untyped) NSOrderedSet. Now I may either keep the NSOrderedSet, but then have to cast each object I want to use - and my Swift code is just as bloated as Obj-C would be:

  let obj = mySet[0] as MyClass
instead of:

  let obj = mySet[0]
Plus, map/reduce etc. won't work on an NSOrderedSet.

Or I create a typed Array from the NSOrderedSet, which is fine to work with in Swift - except that it is not managed by CoreData anymore. So I'd have to be careful to synchronize with CoreData manually, and it's not just saving that I have to watch out for, but also other operations like rollback etc.

Another example is NSNumber. NSNumber needs to be manually mapped to Int, while the other direction works automatically. That makes sense when it is unknown whether an NSNumber is an integer or a floating-point number, but in CoreData I have specified that it's an Integer 32... (Well, I think it was similar with Obj C, actually.)
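
Spelled out, the friction looks something like this (the model names are made up; "book" is an NSManagedObject with an ordered to-many relationship "chapters"):

    // chapters is the NSOrderedSet that Core Data hands back:
    if let chapter = book.chapters[0] as? Chapter {
        let pages = chapter.pageCount.integerValue   // pageCount is an NSNumber
    }

    // A typed Array is much nicer to work with in Swift...
    let chapters = book.chapters.array as [Chapter]
    // ...but it is no longer managed by Core Data.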

So, working with Swift in the Playground felt like a huge improvement over Obj C at first, but then working with some Cocoa APIs it started to feel ... more clunky again.

I think it was/is similar with Scala. If you can stay in the Scala libraries, awesome. It is a huge improvement over Java. But Scala won't automagically turn a terrible Java API into a great Scala API. Yes, you'll save a couple of semicolons and type declarations, but it's not that big of an improvement. Thankfully (in Scala), there are a lot of better frameworks or wrappers for existing frameworks by now. I guess Swift will have to go the same way...


I don't know about NSOrderedSet, but for number properties on Core Data objects, you can avoid NSNumber entirely by using primitives. There is an option when generating model objects in Xcode to use primitives, which will do this for you. Properties may be a mix of primitives and objects, too. E.g. if I have a "Car" model that has "tankSize" and "milesPerGallon", I could have "tankSize" be a 32-bit int represented by an int32_t, and "milesPerGallon" a 32-bit int represented by an NSNumber*.

I'm using CoreData via Swift and find it mostly more enjoyable than Objective-C.
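
In Swift, the generated accessors look something like this (a sketch; assuming the scalar option is checked for tankSize):

    import CoreData

    class Car: NSManagedObject {
        @NSManaged var tankSize: Int32           // scalar: no NSNumber unwrapping needed
        @NSManaged var milesPerGallon: NSNumber  // left as an object type
    }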


This feature of strings is cool! It sounds like the end of the Unicode encoding mess that most languages drag the programmer into:

"One of the truly clever design choices for Swift's String is the internal use of encoding-independent Unicode characters, with exposed "views" to specific encodings:

A collection of UTF-8 code units (accessed with the string’s utf8 property)

A collection of UTF-16 code units (accessed with the string’s utf16 property)

A collection of 21-bit Unicode scalar values, equivalent to the string’s UTF-32 encoding form (accessed with the string's unicodeScalars property)"
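
A quick playground illustration (assuming the "é" is a single precomposed scalar):

    let str = "café 🎉"
    countElements(str)               // 6 Characters (Swift 1.x spelling)
    Array(str.utf8).count            // 10 UTF-8 code units ("é" is 2 bytes, "🎉" is 4)
    Array(str.utf16).count           // 7 UTF-16 code units ("🎉" is a surrogate pair)
    Array(str.unicodeScalars).count  // 6 Unicode scalar values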


Haven't used Swift or its strings yet, but I have used string/unicode in Python, NSString in Cocoa, and string in Go. The only one that hasn't bitten me in the ass is string in Go; the approach they take is a sequence of bytes, UTF-8 encoded by convention (which is easy to follow); there are methods to work with Unicode code points when you need to, otherwise it's just bytes. It's simple, well-defined, non-magic, and doesn't screw you over.


Python 3 uses UTF-8 as a default as well. Not to mention you can easily specify the encoding in Python 2. Never bites me.


Wish I had documented my exact frustrations, but it wasn't about not being able to choose the encoding... it was something more of the flavor of:

Most of the time I was just trying to shuffle bytes from one place to another and didn't really care about the contents of the bytes, but I hit numerous bugs because of exceptions due to things like "external data wasn't valid utf8" (NOT helpful to throw an exception here), and bugs caused by 'str' vs 'unicode' confusion (made more tricky by the lack of a static type system).

The one time I did actually need to do some calculation involving Unicode, I got burned by behavior that didn't match the documentation, because my bullshit package-manager-provided variant of Python had some insane compile-time option that made it pretend UTF-16 code units were the same thing as actual Unicode code points, to which I can only say "ha ha ha...fuck those guys".

Can't speak to python 3, haven't used it, probably never will :)


It sounds like a fairly terrible tradeoff actually. Presumably one of those encodings is the underlying representation, and accessing it in any other representation is going to cause a horrible hit to performance as it thunks between the two.

In practice it feels much better to define an encoding for the platform (say UTF-16 for Java / Windows, though personally I feel UTF-8 is a better choice) and then have a [string encodeIn:UTF16] method if you want something different. Ideally, the encoding would be specified in the type - e.g. String<UTF16> rather than arbitrary unspecified byte arrays.


> Presumably one of those encodings is the underlying representation, and accessing it in any other representation is going to cause a horrible hit to performance as it thunks between the two.

Which does not usually matter, you're accessing it in a specific representation because you need it in that representation, usually for IO. The "horrible hit" is one you'll have to eat either way. And if you're baking the implementation details of your internal strings into your IO… god help your soul.

That aside, there's not much of a horrible hit unless you're preallocating the whole output string every time. Swift has iterators/iterables built in and I may be mistaken but I believe Swift does the sane thing and exposes noalloc iterable views, you're paying for some bit-twiddling (for the transcoding itself) and stack-allocated int8/int16/int32. Not sure how good Swift's compiler is, but I know Rust's can turn such iterations into the equivalent of the corresponding C loop, there's little to no overhead.

> In practice it feels much better to define an encoding for the platform

Why? What does that give you, aside from exposing broken implementation details as the type's public interface and having unfixable strings for a decade (see: Java and everything Microsoft, because they exposed strings as being O(1)-indexed UCS2 code units early on)?

> Ideally, the encoding would be specified in the type - e.g. String<UTF16>

That's crazy talk, why would you encode the implementation detail of the string's internal encoding in the type interface? I can think of a hundred things I'd put there, but the internal encoding?


A lot of people on the thread are focusing on how hard/not hard it would be to make a wholesale transition of Cocoa from one language to another ... I'm coming from the world of Windows, which might offer an instructive history. In short, Microsoft would think nothing of making a change like this.

The Win16/Win32 API was a thing of beauty and was the whole universe for several years. It was written in C.

Then MFC came. It was the new hotness in the late 90s and it was based on C++.

Early 2000s. Goodbye to C/C++. If you want to build a native Windows app, you would probably think first of C#, which used a completely new, non-C Windows API called Windows Forms.

Today ... want to write a Windows app? Visual Studio's new primary language for you is JavaScript, talking to a complete JavaScript-native API for Windows.

And let's not forget Visual Basic, the most popular language in the world, which has its own Visual Basic-esque toolset and Windows API.

Microsoft certainly has a much bigger vision for its development tools, and a much longer history. Its very first product, after all, was a programming language (MS Basic) - and developers have in a sense been a much larger target market for them than they have for Apple in their shared 30+ year history.

But now Apple, with its skyrocketing income, surely must be aspiring to the herculean scale of Microsoft's year-in/year-out dev tool undertakings.

My bet is Apple will come out with a Swift-native version of Cocoa within 18 months or so. Objective-C Cocoa will be supported for a LONG time (I believe all the above MS dev technologies are still usable in some form or another ...)


Swift has the potential to become a great systems language, but I'm cautious about whether it will replace Foundation/AppKit/UIKit. For example, a Swift string IS an NSString (at the moment at least).

It's impossible to know for sure, but look back 15 years to when Cocoa was the new and shiny next to old-trusty Carbon. Apple was actually writing software in Cocoa internally. Everything they were building towards extended from the NeXT Objective-C world. True, they didn't publicly commit to Cocoa 100% until OS X 10.4, but you better believe internally they were all in. It's just that the world didn't see it until the "We're rewriting the Finder in Cocoa" campaign was announced.

At Apple right now, no one outside of the compiler team is working on anything interesting in Swift. It's still locked away from them. To be fair, it's an evolving language and will cause a lot of heartache for everyone until the language has been baked in more.

I know Mattt is excited for Swift. Plus, a lot of developers are already doing some really cool stuff. So we shall see in a couple years how the story plays out.


Isn't it just an implementation detail that NSString and String are the same? Couldn't the implementation diverge, as necessary, without breaking things? We also have toll-free bridging to CF types.
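
Right now the bridge is essentially free in both directions:

    import Foundation

    let native = "hello"
    let cocoa: NSString = native             // implicit bridge to NSString
    let back = cocoa as String               // and back again
    let range = cocoa.rangeOfString("ell")   // NSString API when you want it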


"Apple is a company with a long view of technology. It's really difficult to tell whether a technology like JSON is going to stick, or if it's just another fad. Apple once released a framework for PubSub, which despite not being widely known or used, still has to be supported for the foreseeable future. Each technology is a gamble of engineering resources."

Apple doubted JSON would be around?


JSON was invented (discovered ;) ) in 2001, the website went up in 2002. Major services like Google and Yahoo were providing their data in JSON format in 2006. If this article is correct, Apple didn't include JSON functionality in Cocoa until 2011, that's 9 years after the public website and 5 years after inclusion in major web services. This is an eternity in technology time. I personally find the explanation that "each technology is a gamble of engineering resources" in light of this a bit ridiculous.

As an aside, I think the lack of built-in JSON support in Swift remains telling of Apple's simply puzzling take on this new programming language. When Apple first announced that Swift would be good for both "systems programming" and "scripting", it set off red flags in my mind. That statement is usually only made by people who have worked mostly in one of these domains and don't really understand the other that well. In my mind, if the second you want to grab some data off the internet in the most popular format you need to either 1) drop down to a bridged API that everyone agrees is terrible (as mentioned in this article, NSJSONSerialization is even more frustrating with the optional stuff), 2) download a third-party framework, or 3) learn monads or roll your own, then this does not feel like a scripting language by any stretch of the imagination. Just look at this: https://twitter.com/andy_matuschak/status/549268259871002624


> If this article is correct, Apple didn't include JSON functionality in Cocoa until 2011, that's 9 years after the public website and 5 years after inclusion in major web services.

JSON was added to Ruby stdlib in 1.9.2, released in August 2010. Available as a gem for years prior. Ruby was probably slow because YAML was the anointed format, and JSON isn't distinctly better than YAML.

JSON was added to Python modules in 2.6, which was released in October 2008. Likely available as an egg prior, but I don't remember.

There were Objective-C libraries for JSON well before 2011 too. The first one I remember using was in June or so of 2009.

So, I half agree with you. It might be hard to remember nowadays, but JSON wasn't universally seen as a Good Thing initially. It came with a lot of JavaScript baggage, which in some circles hung around for a loong time.

But Apple was definitely late to the JSON party, and it was disappointing at the time.


The biggest difference being that both Python and Ruby were module cultures, so saying that a gem existed actually means something. On the other hand, Mac dev didn't have CocoaPods until 2011. Apple dev at the time (and I'd argue in large part still today) mainly meant using Apple's monolithic frameworks (and downloading the AFNetworking source code to include in your project), so it was much more important for it to be included in the standard library. However, the point still stands that we now have a new language that still does not have this basic support.


True, the ObjC culture was undeniably different than the CPAN-inspired code sharing cultures of Ruby and Python. So there were fewer quality libraries.

But the libraries did exist, and Apple devs in general came from a background of C and UNIX programming, where static libs weren't an oddity.

This is all different now, due to the huge influx of ObjC devs from the web world. Expectations have changed, CocoaPods emerged in response, etc.

But I have no explanation for the Swift situation, except that it looks and feels like ObjC and Cocoa. I think Mattt's point is that it needn't.

This might be a bit of C culture still showing through...any other conversion process would be, ultimately, magic. ObjC has never been about brevity or implicitness.


The NeXT/Apple ASCII encoding of their plist format is very similar to JSON and has been around forever. IIRC TextMate 1 made use of it for much of its bundle system.


NeXT Plists were a lot like JSON, but OSX Plists were XML (or binary).

Apple definitely had their anointed interchange format, and it was not JSON, for several reasons -- first among them that Plists predate JavaScript! -- but also because JavaScript types get a bit ambiguous in a ObjC/Cocoa context.

Interestingly, Plists can now be XML or JSON (or binary).


> NeXT Plists were a lot like JSON, but OSX Plists were XML (or binary).

Old style text plists are still supported:

    $ cat > /tmp/test.plist
    {
        "david" = "great";
        "array" = ( 1,2,3,4 );
    }

    $ plutil  -convert xml1 /tmp/test.plist -o -
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
    	<key>array</key>
    	<array>
    		<string>1</string>
    		<string>2</string>
    		<string>3</string>
    		<string>4</string>
    	</array>
    	<key>david</key>
    	<string>great</string>
    </dict>
    </plist>

    $ sw_vers -productVersion
    10.10.1


Dear Lord in heaven, who can possibly prefer the XML version? That's hideous!


XML parsers prefer the XML version. :)

Less glibly, humans never have to deal with it. Data is serialized and deserialized, typically in binary format. It exports to XML merely as a convenience.

And critically, XML can be typed properly for all ObjC/Cocoa types (sometimes with characteristic XML awkwardness). That XML is wrongly typed, due to the ambiguous source input. JSON would have similar problems. Correct beats pretty, in this case.


> XML parsers prefer the XML version.

Well yeah, but an XML parser would also have no problem with '<tag name="p"><chardata><char>T</char><char>h</char><char>i</char><char>s</char><whitespace type="space" /><char>i</char><char>s</char><whitespace type="space" /><char>i</char><char>n</char><char>a</char><char>n</char><char>e</char><punctuation type="full-stop" display-char="." /></tag>', but that would be completely insane.

> Less glibly, humans never have to deal with it. […] It exports to XML merely as a convenience.

For…humans, no?

> And critically, XML can be typed

It looks like the plist supported types too, although I don't know for certain. At least, the numbers weren't quoted in the plist.

> JSON would have similar problems.

Would it? '"2" !== 2', IIRC.

Of course, my preferred syntax would be:

    (dict (david great) (array (1 2 3 4)))
if one wanted to treat numbers as text and:

    (dict (david great) (array ([int]1 [int]2 [int]3 [int]4)))
if one wanted to indicate that they are ASCII decimal-encoded integers or:

    (dict (david great) (array ([bin-int]|AQ==| [bin-int]|Ag==| [bin-int]|AW==| [bin-int]|BA==|)))
if one wished to use binary encoding using network-transfer order, but I am clearly insane.

Not nearly as insane as whoever came up with that XML abomination, though.


Yes, a plist is defined as containing things of very specific types. See my SO answer [1] for an overview and an attempt at a Swift implementation.

[1] http://stackoverflow.com/a/24051062/20371


> Less glibly, humans never have to deal with it

As long as it works, unless we categorize programmers debugging XML issues as non-human drones.


And that's why we have plist editor.


And that's why we have list editor.


> Old style text plists are still supported:

True. Read-only, however. :)

And the type ambiguity is amply demonstrated.


> Read-only, however. :)

Nope! :-) While it's true that plutil doesn't support the format, open the plist in Xcode and edit it and it will save it out in the same old-school format. Editing through the programmatic API will also keep the format.


Well, that's good to know, thank you.

I will now manipulate ancient Plists with less fear. :-)


Xcode project files are still saved in this style of plist.


> Just look at this: https://twitter.com/andy_matuschak/status/549268259871002624

"A pragmatic and intentionally non-abstract solution to JSON decoding / initialization that doesn't require learning about five new operators."

What's scary about operators? An API with five new functions doesn't cause the same amount of anxiety, does it?


1. The operators are abstract (most people have never seen >>=).

2. Even IF they were functions (say, "bind" and "return"), the actual functions themselves are known to be hard for people to understand.

3. Needing to know these when all you want to do is grab Google Maps results is the worst place to encounter them. In JavaScript, you don't need to learn "anything"; just call JSON.parse. It's fine to encourage learning. But when you want to do something completely unrelated that takes no thought elsewhere, it's a bad place to enforce learning.
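
For comparison, the plain NSJSONSerialization route in Swift 1.x looks something like this (the JSON keys are made up); this is exactly the pyramid the operators are trying to avoid:

    import Foundation

    let data = "{\"results\":[{\"name\":\"growl\"}]}"
        .dataUsingEncoding(NSUTF8StringEncoding)!

    var error: NSError?
    let json: AnyObject? = NSJSONSerialization.JSONObjectWithData(data,
        options: nil, error: &error)

    if let dict = json as? [String: AnyObject] {
        if let results = dict["results"] as? [AnyObject] {
            if let first = results.first as? [String: AnyObject] {
                if let name = first["name"] as? String {
                    println(name)   // four levels deep to read one field
                }
            }
        }
    }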


You can't get to online documentation easily, the calls have a maximum of 1 or 2 arguments, you can't provide alternatives that pop up in autocomplete, and the syntax can be harder to visually parse. To be fair, on the last point it can sometimes be easier to visually parse with operators, but not always. For example, compare printf format strings, where substitutions look like %f, vs C++ output streams, where they look like "<<x<<", which breaks up visual flow like nobody's business.


> You can't get to online documentation easily, [...]

That's a problem with common search engines. A specialised API search engine can help there. See e.g. https://www.haskell.org/hoogle/?hoogle=>>=


Apple also released a framework for Publish and Subscribe in System 7, though I don't think that's still supported! http://en.wikipedia.org/wiki/Publish_and_Subscribe_(Mac_OS)


For those wondering, what OP quoted was a response to the author:

> In defense of Apple, I once asked an engineer at a WWDC Lab why it took so long for iOS to support JSON. Their answer made a lot of sense.

However, I'm not sure that it makes much sense to me... I guess I may be missing something essential?


I think vanilla Cocoa is being replaced slowly by Cocoa Touch. They will probably rename it and then force Cocoa Touch onto OS X. I don't see Foundation changing much though. Maybe a "universal" Foundation Touch is coming ;)


I think Swift has promise, but I don't expect it to replace Cocoa anytime soon.

My main complaint about Swift is the lack of available documentation. In C languages, you have static header files which you can read to learn about APIs & types / classes / whatever. API documentation in Swift is generated on the fly when you search for a specific term that happens to be a function / type / protocol / whatever. But how do I find out which functions / methods are applicable to, e.g., a String? There is no way to just `grep` existing headers for 'String' or some such trick to find everything related to that type. This may seem minor to some, but for me it is a major stumbling block that makes Swift effectively a black box.


>Will Swift unseat Javascript as the only viable web scripting language by adding interpreter to Safari?

Adding Swift to Safari would be an interesting development


About as interesting as adding Dart to Chrome.

Except with none of the market share.


Only every iPhone and iPad out there.


Awesome. Let us make Apple-only sites! /sarcasm


I would imagine there would be a JavaScript transpiler, so the browser could request a .js or .swift file depending on its own capabilities.

As a web dev myself, I'd love this solution.


Let's be honest, though. Apple ignoring Flash on iOS did help move the whole web ecosystem away from Flash.


But with a whole lot of 'developer share' that Dart doesn't have.


When I saw how quickly playgrounds compile and run, this was my very first thought. The fact that Apple announced FTL compilation for JavaScript just prior to announcing Swift makes me wonder whether Apple has plans to make Swift the new lingua franca.


>only viable

But wait, since when was Safari the only viable web browser anyway?

Disclaimer: I'm an Apple fanboy and rMBP user.


Safari isn't the only viable web browser. JavaScript is the only viable web scripting language in browsers.


This has been my biggest worry since Swift was announced. I am really happy with Foundation (much more than with AppKit or post-iOS 6 UIKit). But if 2014 Apple were to reinvent it, I am sure it would suck.

Has Apple recently released anything that they dogfooded themselves first, and that was not buggy? Swift itself is a good example of how Apple seems to work now: let engineers build a toy, release it as v1.0, wait for the early adopters on Twitter to sing its praises, then maybe start using it internally. Maybe.

I wish Apple had instead designed better frameworks for UI and persistence and then built a language to make working with them easier.


With Objective-C they had to do a lot of things at a higher level (the standard library or the framework). With the new shiny language, which has tons of features built in that would otherwise remain unused, it makes sense to slim down Cocoa and move things to the language level.

A good example is string interpolation. Swift makes it unnecessary to have things like `stringWithFormat` at the framework level.
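
For instance:

    let name = "Inbox"
    let count = 42

    // Objective-C:
    //   NSString *label = [NSString stringWithFormat:@"%@ (%d items)", name, count];

    // Swift:
    let label = "\(name) (\(count) items)"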


I gave up on Swift when I couldn't just do an import CommonCrypto, and had to do hackish bridging workarounds. For a language that was supposed to be much higher level and just work, it really didn't.


Am I terrible for loving the minimalist design, and simultaneously wondering how I can read the archives from above the fold on the home page, instead of reading the article?


Nope, I love it too. I made my personal website the same way. Long live meaningful and well written text being the focus.


I agree with the click-bait sentiment. This article says "Cocoa" and then goes on to talk only about Foundation. But "The Death of Foundation" wouldn't get all the clicks.

I also agree with the pro-Objective-C sentiment. I understand that some developers thought it strange, and just because the syntax of Swift looks closer to what they are accustomed to, they are ready to throw out the baby with the bathwater. Dynamic dispatch is an excellent design for making GUIs.


I've been working with Cocoa for 20 years, back from the NeXT days. I certainly see where Swift is going, and there's a possibility that it will surpass Objective-C in the future, but for the time being they will live together in relative peace.

Objective-C provides a number of patterns that aren't available in Swift due to its dynamic nature. For example, transparent network proxies are possible in Objective-C using message-forwarding techniques. Similarly, it's possible to swizzle method implementations under the covers so that a replacement implementation gets called instead. These don't sound like much, but they are used to implement some powerful mechanisms, like key-value observing and the bindings built upon it.

Swift is really a much better C++ than a better Objective-C. The language is terser, and the underlying compilation machinery can perform more optimisations than C++ (or Objective-C!) can. For example, Swift has the concept of a module (aka framework), and compilation can provide module-visible functions that can be called between different classes but still be optimised at the module level (e.g. inlined).

There are a lot of compromises in the language at present; it can fall back to generating an Objective-C class, which then means that the Swift optimisations aren't effective; and when data is passed from one layer to another it may have different performance characteristics (a built-in Dictionary will perform differently than an equivalent NSDictionary, even if they can be used in the same way).

A lot of the problems stem from the fact that Objective-C has grown over time and has itself had new mechanisms added. Blocks were only added relatively recently, so some APIs support callbacks with blocks whilst older APIs don't. Those that do support blocks work particularly well with Swift, because you can have a trailing lambda (closure) and pass that in as a block; if an API doesn't, then it involves writing a separate Swift/NSObject class that implements a callback interface, which increases the size of the codebase.
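
For instance, any API whose last parameter is a block reads naturally with a trailing closure:

    import Foundation

    dispatch_async(dispatch_get_main_queue()) {
        println("back on the main queue")   // the closure is passed as the block argument
    }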

I would imagine that Apple will focus on the important things - like fixing the compiler/Xcode bugs (they've got a lot of work to do there still) and introducing newer features/fixes (class members can't be that hard...) - before there's an exhaustive overhaul of the APIs. That said, I think the existing APIs will largely do their own thing (being looked after by different teams), perhaps adopting blocks where they are not yet present, but there would have to be a big shift away from things like UIKit in order to make a change.

Close observers will note the UI layer is already being abstracted with CoreGraphics (a C library) and Metal (a lower layer C library) so it's not too much of a stretch to imagine a new UI layer written in Swift -- but Apple will keep that under wraps for a long time until it is baked.

Disclaimer: I published Swift Essentials (http://swiftessentials.org) and expect Swift to only grow from here, even though Objective-C was my first love.


Interestingly, Swift may not even be a "better C++" for all domains dominated by C++:

The memory model is strictly ref-counted, which is unsuitable for certain tasks (e.g. high-perf game programming), and the runtime performance is still extremely brittle (i.e. dependent on optimization passes).

Lots of apps also use C++ as the common core model. Since Swift isn't multiplatform, it is not replacing C++ there either.


[deleted]


The purpose of Set abstractions is to provide set operations like union, intersection and complement, which Dictionary doesn't provide. Set operations are regularly needed in various algorithms.

The Void vs. Bool issue might be due to the fact that Nate Cook's implementation was written at the time of the Xcode and Swift betas, when there were a lot of type-inference bugs. Maybe Void dictionaries didn't work correctly. For example, using a map that returned Void crashed the compiler in one beta version. This was actually used in Nate's set implementation, and I had to modify it to work.


Cocoa has a ton of bugs and cruft and could very much benefit from a mass refactoring. I'd like to see everything deprecated actually removed and a bunch of other aggressive actions taken to prune APIs that never worked well. Replace all target-actions and delegates with blocks. Redo anything where the solution is to defer an action by one runloop cycle. Rid the world of NSCell. Redo views using layer composition. Admit CoreData hasn't been up to task for years and still lags behind and ditch it. Work with the security team and introduce new entitlements (still a stupid name) to allow more APIs to work under sandboxing. Introduce more fixes and changes throughout the year, not locked to an annual OS update cycle. Open up the bug base for this thing so I can see when something is broken instead of losing hours on end thinking I'm doing something wrong. Be bold. Break compatibility and call it Cocoa 2 for marketing. Do whatever steps are needed to make Cocoa awesome again.

But this shouldn't be done because of Swift; this should be done because it's necessary to breathe new life into the frameworks and rethink them for how we work now.

Swift is not ObjC without the C; it gets the defaults wrong (it should be dynamic by default) and is changing too frequently to provide any of the stability and productivity that good frameworks need to provide. To anyone on the Cocoa team: don't cater to fads (you saw what happened with garbage collection). Make them actually give you Objective-C without the C, or a language better suited than ObjC to app development, before you even consider going out of your way to support it. When that happens, you can have a great refactoring again and call it Cocoa 3...


I'm in love with the Cocoa


Oh god, NeXTStep API:

    NSJSONSerialization
What else does JSON do if not serialization?

    JSONObjectWithData
As opposed to JSON things that aren't objects and don't have data? The signal/noise ratio is so low.


"JSONObjectWithData" means that it produces an object from NSData input. This is perfectly reasonable because valid JSON can also be an array and the input could be something else than a buffer, for example a string.

I don't find Cocoa's verbose names to be a problem because the logic behind them is very consistently applied and Xcode's autocomplete fills them in anyway.

Personally I greatly prefer Cocoa's excessively "literary" style to the arbitrarily shortened style used in many dynamic languages like Ruby. Unlike "JSONObjectWithData", method names like "eql?", "gsub" and "len" can't be deduced; you just have to remember them or look them up.


I agree entirely. It's just as fast to read, and typing abbreviations is actually slower than typing full words most of the time because of keystroke muscle memory. Sure, it can make a screenful of code 'look' uglier, but in the end it reads explicitly, and that can only be a good thing.


I don't think anyone is advocating for abbreviations.


> This is perfectly reasonable because valid JSON can also be an array

Are arrays (and primitives which can be boxed) not considered objects in every OO language ever?

Agreed re: usls ctrctns. That's the other extreme. One wants to convey the required meaning in the minimum amount possible - no more, no less.

Edit: thinking about it, you probably mean that the top level object is a hashtable, since ObjC is strongly typed. If I was going to do that I'd rather check the whole structure (rather than just the top) with an interface.


Yeah, that's what I meant. Because JSON is based on JavaScript, its data types are often called by their JS names: a name/value pair collection is an "object" and an ordered list is an "array"... But of course an array is an object as well, so there's ample room for confusion.


JSONObjectWithData can return an NSArray or an NSDictionary in the general case, and an NSString, NSNull or NSNumber if you direct it to allow fragments. It can also return nil.

So aside from being verbosely named, it's also incorrectly named.

If they'd called it [NSJSON parseData:] you wouldn't have been confused.


I clicked on this link thinking "yeah, hot chocolate has kind of vanquished cocoa hasn't it?" Very disappointing to discover this had nothing to do with chocolatey drinks.


Cocoa was named partly as a pun on Java. Remember, when NeXTStep was folded into Apple, Java was getting a lot of attention.


Cocoa was Java compatible, as a first-class target.


Not sure why you're getting downvotes. Even as someone who develops with Cocoa, the chocolatey drink was my first thought too.



