OS X app in plain C (github.com/jimon)
249 points by dmytroi on May 5, 2016 | 151 comments



There was a thread the other day where people were talking about languages that were easily interoperable with the "C ABI" (although there is no such thing). Languages listed included C++, Rust, and maybe a few others--but no mention of Objective-C.

The ability to use a high-level, dynamic language (ObjC), C, and even inline assembly in a single source file is unique to Objective-C (at least among the "mainstream" languages), and something I think is often under-appreciated.

One comment on the code here: for message calls returning type 'id' (object type), you don't need to cast the dispatch function, which would make the code much more readable. For other return types you do need a cast, but I'd wrap it in a macro. You could even use a C11 generic macro to handle floating-point return values (unfortunately necessary).
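
A minimal sketch of such a cast-wrapping macro (the `msg` name is invented; the `##__VA_ARGS__` comma-swallowing is a GNU extension that clang accepts, and a _Generic version would additionally pick objc_msgSend_fpret for floating-point returns on ABIs that need it):

    #include <objc/runtime.h>
    #include <objc/message.h>

    /* Cast objc_msgSend to the right function-pointer type at the call site.
       RET is the return type, the remaining arguments are parameter types. */
    #define msg(RET, ...) ((RET (*)(id, SEL, ##__VA_ARGS__))objc_msgSend)

    /* usage (assumes 'window' is an NSWindow obtained elsewhere;
       CGFloat is double on 64-bit, so plain double is used here): */
    msg(void, double)(window, sel_registerName("setAlphaValue:"), 0.5);
    id s = msg(id, const char *)((id)objc_getClass("NSString"),
                                 sel_registerName("stringWithUTF8String:"), "sup");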


You should still cast the dispatch function even when your return type is id.

For one, C vararg type promotion makes it impossible to pass certain types in as parameters otherwise. For example, you can't pass a float.

It's also easy to make mistakes because of type mismatches. For example, if you pass `42` to a CGFloat parameter, this works fine when you cast msgSend to the right type, but will fail amusingly if you rely on varargs to make it through.

Perhaps most importantly, vararg calls aren't guaranteed to be compatible with a non-vararg target (as most/all methods you call will be), and you'll see this in action on ARM64.

Currently, you can #define OBJC_OLD_DISPATCH_PROTOTYPES 0 to have the compiler enforce this requirement. (This declares the functions to take (void) rather than (id, SEL, ...).) It's likely that Apple will eventually make this the default.
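
Concretely, a sketch of what that looks like (the resulting prototype is the one dmytroi quotes further down in this thread):

    #define OBJC_OLD_DISPATCH_PROTOTYPES 0
    #include <objc/message.h>
    /* objc_msgSend is now declared as
           void objc_msgSend(void);
       so an uncast call like objc_msgSend(obj, sel) is rejected at compile
       time, forcing you to cast to the proper function-pointer type first. */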


You're right and thanks for the correction. One point I'd add is that C generally encourages not using casts because casts themselves can be error-prone. So it's debatable whether it adds any real type-safety.


I don't see how it's debatable. The version without a cast is just plain incorrect, but happens to work anyway through luck and happenstance on many platforms. Even ignoring that, getting the types right without the cast requires looking at the type of every parameter, which could be variables declared far away, and understanding and applying C's vararg type promotion rules. Getting the types right for a cast just requires writing out those types correctly at the point of the cast.


I said you're right and I wasn't trying to disagree. I'm also a big fan of your blog and comments here (FWIW).

As for whether casts add much safety, I admitted it's debatable. C++ prefers casts (e.g. on malloc) whereas C doesn't. A downside of using explicit casts is that if the type changes elsewhere in the program, the casts (spread throughout the code) can mask errors. If you write code such that casts are unnecessary (which I'm agreeing is not possible in this case!), you will get a compile warning/error right away.

I think both perspectives are valid and there are risks either way. I try to avoid casts when writing plain C though.


I agree that unnecessary casts are undesirable, but that seems like a completely irrelevant point here, seeing as how the casts are not optional.

C++ doesn't prefer casts, it requires them in places where C doesn't. Casting the return value from malloc isn't optional in C++ (unless you're assigning to a void *, anyway).


I was referring to this part of your original post:

> It's also easy to make mistakes because of type mismatches. For example, if you pass `42` to a CGFloat parameter, this works fine when you cast msgSend to the right type, but will fail amusingly if you rely on varargs to make it through.

All of your other claims are completely valid. This one, I'd say, is partially valid.

No need to pick nits with my usage of the word "prefers."


I don't understand what's debatable about that. Maybe an example would help?

    objc_msgSend(window, sel_registerName("setAlphaValue:"), 1);
Versus:

    ((void (*)(id, SEL, CGFloat))objc_msgSend)(window, sel_registerName("setAlphaValue:"), 1);
The former will set a garbage alpha value, the latter works. More generally, the former requires much more care and attention to get the types right, whereas the latter just requires that the cast match the method declaration.


    ((void (*)(id, SEL, double))objc_msgSend)(window, sel_registerName("setAlphaValue:"), 1);
If the method signature changes (unlikely for a system header, but possible if you're calling into your own code) then you're hosed. Or you could just mess up and copy/paste the wrong signature, or forget to change it.

Basically, it's pretty error-prone no matter what you do. Using Objective-C directly is much safer (and luckily you usually can, since ObjC is pretty much a drop-in replacement for C).


I totally agree on all counts here. The cast saves you from this one specific thing but there's plenty more to go wrong. I think some people may overestimate the difficulty of adding some ObjC wrappers to a plain C project.


I believe this is already the default for new projects.


> The ability to use a high-level, dynamic language (ObjC), C, and even inline assembly in a single source file is unique to Objective-C (at least among the "mainstream" languages), and something I think is often under-appreciated.

You can even use C++, too, which is awesome. Putting Objective-C objects in C++ structs "just works" thanks to ARC.

One thing most people don't know is that Objective-C was originally implemented as a precompiler for C.


Objective-C was a modification of the GCC developed by NeXT. NeXT didn't publicly release their patches and the FSF/GNU threatened legal action. NeXT released their sources and the GCC now supports Obj-C.

Source: https://en.wikipedia.org/wiki/GNU_General_Public_License#Leg...


> Objective-C was a modification of the GCC developed by NeXT.

No.

Objective-C was developed in the early '80s (as a C pre-processor, as the Grandparent post said) by Brad Cox, and commercialized by himself and Tom Love via a company they created together called Stepstone. Cox wrote a book about their product, Object Oriented Programming: An Evolutionary Approach, in 1986. I own a copy. It's pretty good!

Some time later, NeXT decided to build their user-space APIs around Objective-C, and bought the rights to Objective-C from Stepstone outright. It was at that point that they started modifying GCC to directly compile Objective-C rather than pre-process it into C.


I have the book too. It's interesting as a history lesson and for the initial motivations behind Objective-C.

In particular, in the beginning Objective-C was a lot looser typed at compile time than it is now: no NSString* or NSWindow*, it was all just id. It was an attempt to make C look like Smalltalk.

The language eventually evolved more toward the static-typed end of the spectrum, culminating in support for lightweight generics and ostensibly birthing Swift along the way.


> Objective-C was a modification of the GCC developed by NeXT.

No. Objective-C started out as a set of C Macros, then an actual pre-processor was created, mostly to deal with uniquing selectors. Once the pre-processor was there it was used to create actual syntax.

Documented in "Object Oriented Programming: An Evolutionary Approach".[1] Still one of the best books on OO out there, because it treats OO as an architectural style with tradeoffs relative to other styles, rather than as a religion to be accepted (or nowadays as the devil to be cast into hell).

It also clearly describes the deliberate hybrid style that seems to be mostly forgotten now: objects largely implemented in C, but connected via dynamic messaging. "Software-ICs".

I am somewhat surprised by seeing the 1986 release date, I guess I must have gotten it pretty soon after it was published. Used it as a template to implement an Objective-C pre-processor + runtime + basic classes on my Amiga (had just gotten a C compiler for it).

One of the reasons I got a NeXT was because NeXTStep was largely implemented in Objective-C, which to me meant they "got" it. And it also meant I could enjoy programming in Objective-C without having to maintain my own :-)

[1] http://www.amazon.com/Object-Oriented-Programming-Evolutiona...


> No. Objective-C started out as a set of C Macros, then an actual pre-processor was created, mostly to deal with uniquing selectors. Once the pre-processor was there it was used to create actual syntax.

I'm curious. What did that early form of Obj-C look like?


More details:

> In order to circumvent the terms of the GPL, NeXT had originally intended to ship the Objective-C frontend separately, allowing the user to link it with GCC to produce the compiler executable. After being initially accepted by Richard M. Stallman, this plan was rejected after Stallman consulted with GNU's lawyers and NeXT agreed to make Objective-C part of GCC.

Source: https://en.wikipedia.org/wiki/Objective-C#Popularization_thr...


you mean objective-c++? c++ has no idea about obj-c...

another thing most people forget is that the entire programming community rejected obj-c as deficient in its very early days... it only persists thanks to the ego of former nextstep employees afaik. and now apple employees got lumped with it... which i'd suspect helped create the internal pressure for swift... so some good came at the end of it all. :)


Objective-C was the major inspiration for Java - you'll notice how little it looks like C++ - and therefore C# and co, so it's been very influential ever since the 90s.

It's not worth listening to anyone who dislikes Obj-C just because of the [] syntax, which doesn't matter after the first 48 hours.


This about mirrors my opinions about Objective-C.

http://twistedoakstudios.com/blog/Post8237_a-years-worth-of-...


> Lastly, neither has generic types.

Not true anymore; ObjC has generics with type erasure. (But not all the cases can be exported to Swift.)

> Instead, Objective-C has nil, which is like null except it won’t stop the program when you accidentally use it.

That's… not quite right… but I guess he has the visible effects down. Anyway, ObjC has nullability annotations now for this.

> For example, in Objective-C many objects have both a “core foundation” form and a “new style” “NeXTSTEP” form

This is not legacy baggage, it's an intentional C bridge.

> This is justified by conventions strongly favoring you not catching exceptions, and apparently the speed benefits are non-negligible, but it still blows my mind that Apple has their language default to incorrect behavior.

He shouldn't use ARC+exceptions, because Cocoa itself is not exception-safe. Just quit the app if you get one.


> It's not worth listening to anyone who dislikes Obj-C just because of the [] syntax, which doesn't matter after the first 48 hours.

That's pretty much my thoughts these days. If you have only the most superficial understanding of these languages, then that's what you'll get caught up on, but that's like saying Java and JavaScript are similar because they both use curly braces and dot notation for their methods. Yea, that is true, but it's a superficial similarity.


Objective-C is what makes developing applications on iOS so much more efficient than on Android, even if only indirectly, through the frameworks provided: if Android cannot provide frameworks as efficient to use as Apple's, it's because they're using Java instead of Objective-C.

https://infinum.co/the-capsized-eight/articles/android-devel...


I think it actually was a precompiler for C up to a point. Still, Objective-C objects have underlying C functions that are called whenever you invoke a method on an Objective-C object. Theoretically you should be able to call functions on Objective-C objects from C.


> Still, Objective-C objects have underlying C functions that are called whenever you invoke a method on an Objective-C object.

The same goes for C++ or Swift. Getting the function pointer for a method can be involved, but all functions adhere to the C ABI.


At least MSVC on x86 uses "thiscall" by default, which puts the `this` pointer in ecx. You cannot call those functions from C without specific compiler support.


Sure you can.

Just write an asm stub to call through that sets up ecx and everything else correctly.
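
A minimal sketch of such a stub, using MSVC's x86 inline assembly (the helper names are hypothetical; assumes a __thiscall method taking no stack arguments):

    void *obj = get_cpp_object();     /* hypothetical: the C++ object ('this') */
    void *fn  = get_method_address(); /* hypothetical: the member function's address */
    __asm {
        mov ecx, obj   /* __thiscall passes 'this' in ECX */
        mov eax, fn
        call eax       /* stack arguments, if any, are pushed beforehand */
    }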


The claim was: The same goes for C++ or Swift. Getting the function pointer for a method can be involved, but all functions adhere to the C ABI.

This is clearly false, since at least one C++ compiler generates functions that do not adhere to any of the common calling conventions for C on that platform (or "C ABIs", if you will).

Besides that, I said you could not call them from C. An asm stub is not C. Neither is inline assembly even though it's a common extension to C.


that's not true if you inspect the details of the specific ABI and cheat a little or use some inline assembler... but cl is easily the worst compiler for a C++ ABI.

they break it on tiny incremental updates sometimes... at least historically (in fairly recent history at least).


> thats not true if you inspect the details of the specific ABI and cheat a little or use some inline assembler...

Some x86 C compilers support calling conventions that let you put things in registers, e.g. Microsoft's "fastcall". But I wouldn't consider that part of the standard "C ABI" for Windows, given that it's practically never used for externally visible functions. As I said, "without specific compiler support".

Using inline assembler is just cheating, it's not C.

> but cl is easily the worst compiler for a C++ ABI. they break it on tiny incremental updates sometimes... at least historically (in fairly recent history at least).

That is more about name mangling schemes and the layout of standard library classes, though. As far as I know, the calling conventions have been quite fixed for a long time.


What C ABI? Can you point me to the spec?


I don't believe C standardizes the ABI across all OS/hardware combinations. But some OS/hardware combinations are standardized; for example, there's the AMD64 ABI[1], which I think just about every operating system that runs on amd64 uses … except Windows.

[1]: http://www.x86-64.org/documentation/abi.pdf


Yeah, but that's a bit of a long bow... That's not a C ABI :-)


I'm going to miss Objective-C terribly. It is, hands down, still my favorite language (and ecosystem). Its interoperability with C got me into C. It was insanely powerful and fun.

I know not many people agree with me. I liked the square braces and crazy long function names. I know Swift is decent... It's not the same.

Oh well. Lamenting my path to software engineering doesn't mean much for anyone else. But I really am going to miss it.


Is objective-C going away?

Also, are there any particular resources that you would recommend for beginners? I'm looking forward to reading this article since it might expose the low-level details of how, say, message passing works under the hood. Most of the guides are a bit too "beginner-oriented"; I'd love to find something that pulls up the carpets to reveal all the infrastructure underneath.


If you are already familiar with high-level Objective-C, I would recommend looking at the headers for the language runtime. All of the inner workings are exposed as C functions you can play with. Also take a look at Mike Ash's blog, which dives into a lot of details: https://mikeash.com/pyblog/?tag=objectivec

Also: http://www.sealiesoftware.com/blog/archive/index.html


Swift is the future Apple is pushing to developers. There is an insane amount of Objective C in the world for sure, so it's not going away completely, but I expect more new work will choose Swift, and eventually Objective C will be relegated to Apple devs working on OS X itself, plus legacy apps.

But maybe I'm wrong, and I'd be happy if I were.


I feel the exact same way about it but hopefully it isn't going to go anywhere any time soon.

WWDC 2016 will hopefully have some nice Objective-C announcements, just like last year's did.


Why miss it? Just go on using it! If important applications are written in Objective-C, that will motivate the spreading of Objective-C.


So I will bite. Why does the C ABI not exist?

C calling conventions (cdecl, stdcall) and memory layout are pretty well standardized. So I'm not sure what else you could be referring to here.


The C standard specifies absolutely no "C ABI" standard. The two "standard" calling conventions you cite, cdecl and stdcall, are Wintel conventions.

However, since C is the low-level implementation language for almost every dominant operating system, a given platform's Operating System ABI is almost always locally synonymous with the C ABI on that platform, and people tend to speak of "being compatible with the C ABI" when what they really mean is that they're compatible with the OS's ABI.

But if you move to, say, a Mainframe OS, the OS ABI is dramatically different than the "C ABI" you're imagining from Wintel.


To put it another way: C doesn't have an ABI; C has an API (or a wire protocol, if you prefer.) The abstraction layer of C doesn't exist between the machine-architecture and the C compiler; rather, the abstraction layer of C exists between the C compiler and the user, and takes the form of uncompiled C source code.

The design of C is predicated on an approach to portability that involves shipping source code, not binaries, around, with the destination machine having a compiler for its own architecture. (Given this, it's kind of shocking that things like Docker work at all. Goes to show how large a server-side monopoly those Intel ISAs currently hold.)

---

But wouldn't it be interesting if the "C compiler" were made into a low-level part of the operating system—basically taking the form of a JIT with a persistent cache—and then an executable format were specified, which really was just an archive of C source that was JIT-compiled by the OS when you first execv(3)ed it? Then you would have portable C binaries, because you'd be using the "correct" C ABI: C source-code.

Well, if you tack on a bit of pre-chewing to annotate the C source a bit (and maybe do some arch-neutral optimizations), and you have Apple's current LLVM-IR-based "Bitcode" approach to binaries.


Check out the architecture of TempleOS.


What, like C as a shell script? Gott in Himmel!


How about an interactive C++ interpreter: https://root.cern.ch/cling

Also it's Gott im Himmel ;-) (unless it's a reference I didn't get)


iPhone autocorrect :-(


That seems slightly pedantic. While it's true that there's no cross-platform C ABI, most platforms have a C ABI, and it's well-defined (and defined in terms of C data types) on that platform.

It's sort of like saying that there's no such thing as "rules of the road", because the road makers don't specify how you're supposed to drive on them, and the rules differ by jurisdiction and sometimes you're in a wilderness with no roads. The term is commonly understood to mean "rules for using roads, in the road-containing jurisdiction which you're in". I think the same thing is true of the phrase "C ABI", although if people are coming away thinking there's a cross-platform C ABI, then yes, we should be more precise.


> That seems slightly pedantic. While it's true that there's no cross-platform C ABI, most platforms have a C ABI, and it's well-defined (and defined in terms of C data types) on that platform.

Every platform has an ABI, obviously, and yes it's generally formed in view of C.

This doesn't change the fact that the phrase "the C ABI" is, in a vacuum, meaningless, and that "the C ABI on Platform X" is just a convoluted, cart-before-the-horse way of saying "Platform X's ABI".


But what an ABI does is specify a binding between language-level concepts and binary representations. That's why mentioning the language matters. If "Platform X's ABI" says that structs are passed on the stack, or ints are a certain size, or pointers have a certain alignment, or whatever, it (usually) means that C structures are passed on the C stack, C ints are a certain size, C pointers have a certain alignment, etc. Another language can use its own data types and even its own concept of a stack. (For instance, Go uses a C-incompatible stack.) And the reason that we care about "the C ABI" is whether the implementation of a language on platform X has data types, stack usage, etc. that matches the implementation of C on platform X.

Put another way, I'm arguing that "C" is a more useful descriptor than "platform X's", because the sentence "Rust is compatible with the C ABI" is short for "forall X, Rust on platform X is compatible with the C ABI on platform X", and means something different from "Rust is compatible with the SysV ABI" (which implies that it follows the SysV ABI on Windows and OS X, too). And since the typical way of doing that is by interfacing with a C compiler or the C library (... and you could make this exact argument about the phrase "C library", I think), it's worth mentioning C. For instance, Vala and Nim, which both compile to C, are always compatible with the C ABI. If you port them to a mainframe they remain compatible with the C ABI, but not with the native ABI.


C is merely flexible, not in some position of authority where it dictates what platforms must do. C would be ABI-compatible on a platform that wasn't designed for it, too (which, historically, was the case on many platforms it was ported to).


Here is how LibreOffice handles this over Unix, Linux, OS X and Windows:

http://opengrok.libreoffice.org/xref/core/include/sal/types....


Even on a platform that wasn't based on C, the C standard is flexible enough to match pretty much any native ABI. That's a small (and under-used) upside to all of that undefined behavior.


Most of the time, you will need a tiny bit of non-standard stuff in your compiler to do so.

For example, I don't think there is a portable way to let your compiler use Pascal calling conventions, or to pass information in specific registers or in processor flags.

As an extreme example, in classic Mac OS, you could register a function to be called for line breaking in text input boxes that had the ABI "Parameters are passed to the routine in registers A3, A4, and D0, and output is returned in the Z flag of the Status Register." (https://developer.apple.com/legacy/library/documentation/mac..., page 2-31 gives several other special-cased ABI's in Mac OS)

Browsing that, I also encountered "The routine follows the C calling conventions employed by the THINK C software development environment. Arguments are passed on the stack from right to left, and a result is returned in register D0."

(Aside: that compiler must have treated vararg functions differently)

So, apparently, one cannot even count on C compilers to push arguments from left to right. So, if your ABI says "push X first, then y", I don't think you can portably call that function from C.


> So, apparently, one cannot even count on C compilers to push arguments from left to right.

Right-to-left is pretty much the most sensible way to implement variadic functions with a stack-based calling convention - the argument that the function uses to interpret the rest of them must be found at a known location relative to the stack "top", and due to how stacks work, that implies the varying portion has to be pushed first, thus right-to-left.

Maybe if C decided to design things a little differently:

    int printf(..., const char *format);
    ...
    printf(5, "There are %d lights.\n");
we would instead have right-to-left being the dominant C calling convention.


Not OP but I'm guessing they were referring to the fact that this stuff is platform-specific and not specified by the C standard.


there is no standard in theory... but in practice one exists.

it's a subtle difference some devs will pull out when they want to boost their egos by talking about things that sound smart, not realising it takes a special kind of "smart-yet-dumb naivete" to consider it worth caring or talking about.


    > The ability to use a high-level, dynamic language (ObjC),
    > C, and even inline assembly in a single source file
    > is unique to Objective-C (at least among the
    > "mainstream" languages)[...]
Well, Perl is mainstream enough to be installed on every *nix box; here are examples of assembly and C functions interoperating with Perl in the same source file: https://metacpan.org/pod/distribution/Inline-ASM/ASM.pod#SYN... & https://metacpan.org/pod/distribution/Inline-C/lib/Inline/C....

    > The "C ABI" (although there is no such thing)
Of course, if you do this in Perl, every one of your calls will need to go through a foreign function interface where Perl's structures are translated back and forth between its idea of data structures and the OS's idea; avoiding that is what people really mean when they talk about the "C ABI".


Thanks for the comment!

On OS X, if OBJC_OLD_DISPATCH_PROTOTYPES is not defined, you get this declaration with zero arguments: void objc_msgSend(void /* id self, SEL op, ... */ ). It's possible to call this from C without a cast by relying on implicit function declaration, but the same call fails when compiled as Objective-C code with the error "too many arguments to function call". I've decided to write in a style that compiles both as C and as Objective-C code.

PS: It would be nice to be able to use functions from the clang Objective-C runtime [1], but for some reason they are not exposed in the public headers.

- [1] http://clang.llvm.org/docs/AutomaticReferenceCounting.html#r...


Haskell has support for inline C [1] and inline Objective-C [2] implemented as libraries.

[1] https://github.com/fpco/inline-c/blob/master/README.md

[2] https://hackage.haskell.org/package/language-c-inline


SVR4 specified standard-conformant calling conventions (stack layout, register use etc). IIRC VMS went one "better" and tried to specify a standard calling convention that was supposed to apply to all programs (sadly, there's more to interoperability than calling convention).


You do actually need to cast the dispatch function, because there is no guarantee for all platforms that the ABI for calling a variadic function is the same as the ABI for calling a non-variadic function with the same arguments.


not really. MS have had this with their C++ for a very long time... easily 20 years +


Unless MS has a custom C++ runtime that I'm not aware of, I wouldn't consider C++ a high level language like obj-c is, for better or worse. The benefits and drawbacks of runtime dispatch are highly debatable, but the amount of runtime metaprogramming you can do in ObjC is comparable to what you can do in a language like Ruby:

* Call any method (send message in objc parlance) based on the string name of the method

* Add methods to existing classes you don't own

* Swap existing method implementations of a class you do not own at runtime (method swizzling)

* Get and set variables by their string name for subclasses of NSObject

* Check if an object has a method with a given name (great for invoking methods conditionally)
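
Here is roughly what those bullets look like as plain C calls into the runtime (a sketch; `obj` and `cls` stand for an object and a class you already have, and the "foo"/"bar" selectors are made up):

    #include <objc/runtime.h>
    #include <objc/message.h>

    /* Call a method by its string name: */
    SEL sel = sel_registerName("description");
    id desc = ((id (*)(id, SEL))objc_msgSend)(obj, sel);

    /* Check whether the object actually responds to it first: */
    if (class_respondsToSelector(object_getClass(obj), sel)) { /* ... */ }

    /* Swap two method implementations on a class (method swizzling): */
    Method m1 = class_getInstanceMethod(cls, sel_registerName("foo"));
    Method m2 = class_getInstanceMethod(cls, sel_registerName("bar"));
    method_exchangeImplementations(m1, m2);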

http://genius.com/Soroush-khanlou-objective-c-isnt-what-you-...


Don't do this at home. It looks so ugly because it's dangerous and vice-versa.

If you really want to write (and read) code like this:

    id titleString = ((id (*)(id, SEL, const char *))objc_msgSend)((id)objc_getClass("NSString"), sel_registerName("stringWithUTF8String:"), "sup from C");

    ((void (*)(id, SEL, id))objc_msgSend)(window, sel_registerName("setTitle:"), titleString);

instead of this:

    [window setTitle:@"sup"];

then I guess it would be easier to write an Objective-C to C converter (if there isn't one already) and convert your Objective-C app into C for added coolness.


One practical example would be tigr [1], a cross-platform window framework for making simple applications that can be shipped as a single-header drop-in lib. You simply cannot make it a single header without calling the ObjC runtime from C. Why is it necessary to make it work as one drop-in header? Well, it's a new trend in C libraries which allows using a library with the least resistance possible; it's like a package manager but nicer - you only have one file! A great example of modern single-header libs is probably the famous stb package [2].

- [1] https://bitbucket.org/rmitton/tigr/src

- [2] https://github.com/nothings/stb


I haven't looked at tigr yet (it seems interesting!) but could you just use '#ifdef __OBJC__'?
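
For reference, a sketch of the guard in question, as it might appear in a hypothetical single-header lib (`window` and `title` are assumed to exist):

    #ifdef __OBJC__
        /* compiled as Objective-C: real message syntax is available */
        [window setTitle:title];
    #else
        /* compiled as plain C: fall back to the runtime */
        ((void (*)(id, SEL, id))objc_msgSend)(window, sel_registerName("setTitle:"), title);
    #endif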


Not really, because everything there is C/C++, meaning the files are .c and .cpp, not .m or .mm; it's a bit tricky from a build system point of view to treat a file with extension .c/.cpp as Objective-C.


A .h file can be compiled as C, C++, or Objective-C. Isn't that how single-file libraries usually work?


Yes, and this .h file will be used from .c/.cpp, because it looks way too bulky to add an additional .m/.mm file just for the OSX/iOS case when creating cross-platform applications. Check tigr - it's possible to run exactly the same C/C++ code on Windows and OSX without any changes at all.


Since Objective-C is a true superset of C (unlike C++) and there's also Objective-C++, you can just tell the compiler to compile all .c/.cpp as if they were .m/.mm. That is, if you can figure out how to coax your build system to pass the right flags.
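
With clang that's the -x option, which overrides the language inferred from the file extension (-x objective-c++ is the analogue for .cpp files; the file names here are made up):

    clang -x objective-c -framework Cocoa -o app main.c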


For the record, this isn't a new trend - it's been fashionable in C for decades to create single-header "library" drop-ins, and is pretty much par for the course.


Seriously, I don't know who's downvoting you. This needs a huge "for academic purposes only disclaimer". Higher level abstractions exist for a reason.


I think it's obvious that this is an academic exercise (or "just because I can" sort of thing), which makes the GP's comment quite unnecessary.


> a Objective-C to C converter (if there isn't one already)

Incidentally, that's pretty much how the language got started -- as a pre-processor add-on for plain C:

https://en.wikipedia.org/wiki/Objective-C#History


There are Objective-C to C converters.

http://users.telenet.be/stes/compiler.html

However, the dialect this one implements is very different from Apple's, and it doesn't quite produce ANSI C (the generated code does some things which you can usually get away with but are technically illegal, like casting object pointers to function pointers). It's mainly of historical interest now.


>> I guess it would be easier to write a Objective-C to C converter

That's what the first Obj-C (and C++ compilers) were


Well, sure. But only in the same way that the first C compiler was a "machine code converter".

* Or B, perhaps, if we're to be really pedantic.


> For some reason if we run the app from command line than menu is not accesible by mouse at first time

This happens with Qt apps on OS X as well, and it drives me bonkers. If you launch from the command line, you have to command-tab away and back for the menu to work.


It's because of the missing .app bundle and appropriate Info.plist. I just submitted a pull request to fix this - maybe it's useful for you too, check it out:

https://github.com/jimon/osx_app_in_plain_c/pull/1


Strangely this happens on my cocoa app too... but it hasn't always been the case and I've had this thing for about 5 years now, so I suspect that somewhere, something got messed up.


It started for me around the Mountain Lion to Mavericks upgrade, so whatever changed then.



This is what Objective-C looks like if you "unroll" it to plain C. Not too surprisingly, it's very verbose. Other than as an exercise, there's no reason to do this, particularly given that Objective-C is a strict superset of C.[1]

[1] http://stackoverflow.com/questions/19366134/what-does-object...


As someone who has done a lot of Win32 programming in plain C, I can recognise some common patterns like an event loop and window creation, but it's quite amazing how much extra "cruft" there is just to deal with what appears to be the contortions of OOP'ing everything. All those string constants are rather surprising too.

The interesting thing is, the few times I've had to use a Mac, the GUI didn't feel quite as responsive as Windows on the same hardware --- it wasn't jerky/stuttering, just what I'd describe as "smooth but sluggish" --- and the binaries were noticeably larger. I now wonder if this increased abstraction and complexity has anything to do with it.

For comparison, a similar Win32 app in plain C:

http://www.winprog.org/tutorial/simple_window.html

X Windows app in plain C:

http://www.paulgriffiths.net/program/c/srcs/helloxsrc.html

...and just for fun, the same simple Win32 app from above in Assembly language:

http://win32assembly.programminghorizon.com/tut3.html

Looking at the code again, if I were forced to write an OS X app in plain C, there would be plenty of macro usage around objc_msgSend.


Windows is effectively already an OO system; you "override" methods (message codes) in your window procedure, with the base implementation provided by the default window procedure. Windows even uses the terminology of window classes to describe the behaviour of window instances.
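
A sketch of that "override" pattern in a standard Win32 window procedure (all names here are from <windows.h>):

    #include <windows.h>

    /* The window procedure is the "virtual method table": handle ("override")
       the messages you care about, forward the rest to the "base class". */
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg) {
        case WM_DESTROY:    /* "overridden" behaviour */
            PostQuitMessage(0);
            return 0;
        default:            /* everything else: the default window procedure */
            return DefWindowProc(hwnd, msg, wParam, lParam);
        }
    }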

Personally, I'd blame UI latency more on a deeper composition stack and rendering sophistication rather than object orientation. Unless you're using late bound method calls for pixel-level drawing primitives, they shouldn't be a significant percentage of the event dispatch loop that turns input into visual results.


This example is extremely contrived; of course it's going to be a nightmare in C. GUI latency probably has little to do with the overhead of ObjC message sending and more to do with tuning how things respond for UX purposes. For instance, it's well known among gamers that on OSX the mouse has acceleration, which makes it smoother but less responsive in gaming. I wouldn't be surprised if this carried over to scrolling, animations, etc.

> If I were forced to write an OS X app in plain C, there would be plenty of macro usage around objc_msgSend

You will have re-invented the early versions of Objective C which were all done in the preprocessor :)


It's amazing and sad for computing that somehow managing to use a sane language to develop software for a very commonly used platform is seen as a serious feat of accomplishment.


C was not designed to create GUIs. So it's as much a feat of accomplishment as opening a bottle with your teeth.


IMHO it depends; C/C++ work very nicely with immediate-mode UI frameworks, for example nuklear [1] and imgui [2]. Retained-mode C/C++ UI frameworks are also quite popular, for example Qt, wxWidgets, and GTK.

- [1] https://github.com/vurtun/nuklear

- [2] https://github.com/ocornut/imgui
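
For a taste of the immediate-mode style, a rough sketch with nuklear [1] (assumes a nk_context *ctx already set up by one of nuklear's backends; do_something() is a placeholder):

    if (nk_begin(ctx, "Demo", nk_rect(50, 50, 220, 120),
                 NK_WINDOW_BORDER | NK_WINDOW_TITLE)) {
        nk_layout_row_static(ctx, 30, 90, 1);  /* one 90px-wide widget per row */
        if (nk_button_label(ctx, "Click me"))  /* drawing and input in one call */
            do_something();                    /* hypothetical handler */
    }
    nk_end(ctx);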


Ummm... I wrote plenty of GUI code in C and so did lots of other people back in the late '80s and early '90s. Quite common. I don't know what it would mean for it to have been "designed to create" GUIs; it's a general-purpose language.


Windows, X11 and Gtk would argue with you.


There's Carbon as well, which was replaced by Cocoa. And as of 4 years ago (and I suspect today as well) there were still some things you could only do via Carbon's C APIs, so lots of CoreFoundation types can be bridged directly to and from their Objective-C equivalents. Pretty neat how well the backwards compatibility works.


Windows is written in C++ mostly.

As for X11 and Gtk, well, I argue with THEM.


Windows itself perhaps, but not the Windows API. If you've ever done any low-level Windows development, it's pure C.

See here for a tutorial using it for basic GUI stuff:

http://zetcode.com/gui/winapi/firststeps/

I've even seen several games that use it directly for keyboard handling and such to this day. And all the new, fancy GUI frameworks which exist on Windows are basically just wrappers on top of WinAPI.


Yes, but WinAPI (which is all that is required to do GUI programming in Windows) is C-centric.


Yeah, but nobody (ie. very few if anybody) does GUI programming in WinAPI, so I don't see how it's relevant in determining whether C is good for GUIs.


Unless something has changed drastically since I was at MS, the Windows kernel and the major OS DLLs are pretty much pure C. The GUI/window manager is more C++.

Even Word and Excel were still C (although Office shared much C++ code in a common DLL)

Gotta love legacy code from the 1980s/1990s.


When were you there?

Since the new "Going native" wave, C++ has taken the role of main systems programming language, even the DDK now supports it.

Regarding Office, check their CppCon presentation on how the code was ported to C++, refactored, and made portable across OSes. It's about a 2-hour session.


"Going native" meant salvaging Vista's low performance by rewriting everything in C++ that was written in .NET -- .NET was the "wave" here, but it's also the culprit for Vista eating memory like cheap sushi.


Yet the replacement for the Win32 API model has everything written in the original design of .NET, with AOT native compilation and classes being COM objects.

What killed Vista was politics between OSDev and DevTools units.


Few languages have intrinsic GUI components.


I had a project once that required me to build a GUI in C using Motif. It was a joy to use. I recall there was a simple concept called RowColumn. All the GUI widgets were either in a Row or in a Column. It was highly orthogonal and allowed you to build good-looking UI's that acted in a reasonable way when resized, etc.


that analogy though


C is a sane language?


Yes.


Really? I don't write systems software but it seems most here would not recommend writing in C/C++ unless it's absolutely the best choice.


I don't know who you've been talking to, but that's ridiculous. C++ is one of the most widely-used languages for writing native apps for any OS. Adobe products, Chrome, anything built with Qt, even the Swift language itself, all made in C++.


> Adobe products

Adobe products have codebases going back decades, of course they are written in C/C++. Furthermore most Adobe apps have fairly advanced processing needs that your run-of-the-mill CRUD GUI does not.

> anything built with Qt

Qt isn't even stock C++, they had to hack a meta-compiler on top of C++ to effectively serve the needs of their GUI system.

> even the Swift language itself

By this logic, thousands of GUI apps are written in x86 assembly.

Between C#, Objective-C, Swift, JS+Electron, Mono, and many other mature, high-level frameworks that don't trash memory at the slightest provocation, you'd need some very good reasons to use C or C++ for a GUI app these days.


Most Apple-platform apps are written mostly in ObjC or Swift, not C++; I wouldn't say it is one of the most widely used languages on these platforms by any means. The Cocoa SDK exposes an ObjC interface, so many system framework calls have to go through Swift or ObjC.

Secondly, compilers are a very different programming task to developing GUI apps; the Swift team choosing to use it has nothing to do with Apple’s assessment of how you should write an application—Apple are the people who have been leading objc development (they own the objc IP) in the last 10 years and use it in all first party apps and many system frameworks.


It depends; most of the important, heavy-lifting parts of the Apple platform, a.k.a. the Core frameworks, are written in C. For example, a quote from the Core Text documentation [1]:

> Many of the low level libraries in OS X and iOS are written in plain C for speed and simplicity.

And don't forget that OSX/iOS are based on Darwin (BSD), where most of the heavy lifting is done by programs written in C.

- [1] https://developer.apple.com/library/mac/documentation/String...


Yeah, I still wouldn’t say that c++ is “one of the most widely-used languages for writing native apps” for OS X.

And internally Apple use other languages a lot, but frameworks like UIKit, AppKit, and many of the other public frameworks they expose in the SDK are ObjC, with Foundation, for example, abstracting a lot of CoreFoundation C code.


Perhaps most smaller apps, yes. And some big ones too. But a very significant percentage of the commercial software industry writes apps for OSX in C++, including Autodesk, Adobe, many major games written using engines like Unreal and Unity (which also support other languages), etc.

That's not a knock against anyone who does or doesn't write apps in a particular language; but folks should know what the reality is, and Objective-C or Swift are by no means the only, or even necessarily the most profitable, ways to write apps for OSX, though those languages are perfectly fine for many circumstances.


Is there an actual half-popular production app for OS X written in Swift yet?


Probably not, but Lyft is an iOS app written in Swift.


This is a very particular crowd, largely into web software and ecosystems for rapid growth of SaaS. I wouldn't extrapolate this crowd into software languages in general. You should always use the best tool for the job.

Many people write quite excellent software in C/C++ every day. Just as many people use chainsaws, drive cars, fire guns and take pills every day. These are all productive activities with low floors and high ceilings. Anyone can jump in and make a mess. It takes care and attention to do them well.


agree.

If this was an enterprise/cio/it/gov discussion web site then all the conversations would be java and .net vs c++

if this was a scientific computing website then python, c/c++ and fortran and would be mentioned etc.

if this was a hardware/firmware website it would be c vs (whatever black magic is used by hardware wizards).


VHDL vs Verilog flame wars everywhere. Also, discussions on people's favorite idiosyncratic, vendor-specific dialect of C for embedded devices.


Gah, this. Also noobs who want to "replace their 2,000 device-per-year microcontroller application with an FPGA so that it can someday be put onto a ASIC." They're basically vinyl hipsters who fancy themselves "EE/chip designerz" and want to immortalize themselves.


Why recommend Objective-C over C/C++ though? It's more or less subject to the same "pitfalls" that go against C, PLUS it is a syntactical nightmare.

[whatTheHell:is this:shit];

Is ARC really that much of a savior here? At least method calls get resolved at compile time.


> It's more or less subject to the same "pitfalls" that go against C

Only if you write C code in Objective-C.

> PLUS it is a syntactical nightmare. [whatTheHell:is this:shit];

Honestly, get over it. This is largely a surface-level detail that makes no difference once you've used the language for more than a week. I take it you have similar complaints against Ruby, Python, Lisp, or any other language that dares to break ranks with the C++/Java syntax idiom.

> Is ARC really that much of a savior here?

Yes, manual memory management is one of the main things that C/C++ programmers seem to consistently screw up.


> Yes, manual memory management is one of the main things that C/C++ programmers seem to consistently screw up.

There's no reason why memory management _should_ be something that C++ programmers screw up now, with smart pointers (std::unique_ptr, std::shared_ptr) and types like std::string. Yes, if you malloc/free stuff and use things like memcpy, you're playing with fire, but the whole point of "modern C++" is that we have better and safer alternatives.

On the other hand, Obj-C message passing (which really means method calls that are resolved at runtime) is another way, like manual memory management, to really shoot yourself in the foot if you make a mistake.

> I take it you have similar complaints against Ruby, Python, Lisp, or any other language that dares to break ranks with the C++/Java syntax idiom.

I actually don't. Forced named parameters and using "array brackets" to denote method calls are arguably poor choices for syntax.

On the other hand, getting rid of semicolons and enforcing whitespace a la Python is okay, and Lisp is really just an extension of RPN.

The point is, your claim that I dislike anything by default that is "not C/C++/Java" is actually pretty incorrect.

I do have opinions, but that isn't the basis through which they are formed, and quite honestly, I'm offended that you even made such an assumption about me in the first place.

> Honestly, get over it.

Bad advice, I'm allowed to have an opinion (and I do substantiate it above).

Regardless, I have "gotten over it" given that a large portion of my day job is spent writing Obj-C, and while I do like it better than Java as a language overall (even not having to deal with JNI for interop is a dream), god help the syntax.


Unsubstantiated complaints of "omg brackets" were already tiring in 2008, when they were repeatedly voiced by many of the new Objective-C programmers flocking to iOS development. If you have some reasoning for why Objective-C's message-passing syntax is bad, please voice it. (You haven't.) Otherwise you are just adding noise to the discussion.


I'm not biting that bait, sorry.

You've already made the point that my opinion is a popular one. To wit, I find it amusing that the Google autocomplete for "Objective C syntax" is "horrible."


If you compile it, you'll see that the ObjC executable with no ARC and the C-only executable are exactly the same size stripped. The ObjC version with ARC is 392 bytes bigger.


And introduced in OS X 10.10 is the ability to build an OS X app in pure JavaScript:

http://tylergaw.com/articles/building-osx-apps-with-js

(Atwood's Law in action)


A few years ago I dove into the idea of trying to make a C only iOS app (well actually, it was a quick exploration of how hard it would be to build your own UI kit without using UIKit). The dive bottomed out pretty quickly, as I couldn't figure out a way to access windows without resorting to UIWindow methods, and `UIGraphicsGetCurrentContext()` acted super wonky calling it without 'proper' bootstrapping around it. I was expecting more, given the usual layered approach that Apple takes with their system libraries.


I'm not a developer, but does it count when they use objc header files?

    #include <objc/objc.h>
    #include <objc/runtime.h>
    #include <objc/message.h>
    #include <objc/NSObjCRuntime.h>


I think those are all valid C header files. They just define typedefs and macros that are about ObjC, but themselves are plain C.


Richard J. Ross III did it a few years ago for iOS:

http://stackoverflow.com/a/10290255/77567


As a side note: http://stackoverflow.com/questions/525609/use-c-with-cocoa-i...

A commenter at stackoverflow provides some code that "shows how to access Cocoa GUI from pure C/C++ and build a truly functional GUI application"

He notes that you must "link against Cocoa framework".


My interest in this comes from wanting to build Objective-C based OSX-UIs using scripting languages that are only embeddable in C, like Lua. When (if?) that day comes, I'll be like a kid in a candy store all over again.


I would recommend trying CodeFlow from celedev [1], a dynamically generated Lua binding to Objective-C, plus a nice editor with hot-reloading support and other cool features.

- [1] https://www.celedev.com/en/codeflow/


I don't buy it. Yes, the language may be C, but if you're linking the ObjC runtime and frameworks, all you're really avoiding here is the nib-loading machinery.

To be fair, I think that's a perfectly reasonable goal in and of itself - showing how to set up an OS X app "from scratch" is definitely something I'm interested in - but we shouldn't pretend this isn't strongly relying on Objective-C to actually get things done.

EDIT: indeed, the makefile copies main.c to main.m before invoking clang, which means it's using the Objective-C compiler.

What would be much more impressive is if this used only the C Core* frameworks (Core Foundation and Core Graphics in particular) to do the same thing.


Look at the makefile again. It builds five different binaries from the same source file, one of which uses a C compiler.

This is no different than writing a COM application in C. Both COM and objc runtimes provide similar dynamic OO services over a plain C API.


I just ran the code through clang (Apple LLVM version 5.1) with no problems. You don't have to rename the source - but perhaps there are issues in other configurations.

    clang -framework Foundation -framework OpenGL -framework Cocoa -o test test.c


So obviously it's impossible to write a linux app in anything pure but C because the kernel is written in C. That makes no sense.


I'm sure the Windows version would be just as bad. Most of the useful parts of the Windows API don't have a C interface, so you have to use COM, which essentially means editing vtable-like structures by hand.

https://en.wikipedia.org/wiki/Component_Object_Model


I've been programming Windows GUI apps for over a decade. Most of the Windows API, i.e. Win32, is not COM.

I have also used COM from C, and while it's not as pleasant as the pure Win32 API, it doesn't involve the massive amounts of function pointer casting shown here. vtables are involved, but they can be defined as C structures, whereas this example seems to show objc_msgSend() being cast to a very large number of different function pointer types.
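
For comparison, a sketch of a COM method call from plain C (IUnknown and its lpVtbl layout are real Windows SDK definitions; get_interface() is a made-up placeholder for however you obtained the pointer, e.g. CoCreateInstance or QueryInterface):

    #include <unknwn.h>   /* IUnknown */

    IUnknown *obj = get_interface();  /* hypothetical */
    obj->lpVtbl->AddRef(obj);         /* 'this' passed explicitly as first argument */
    obj->lpVtbl->Release(obj);        /* the vtable is just a C struct of pointers */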


You wouldn't have to use COM for this at all. You could use the Win32 API directly, which is a C API and would be much less code. In fact I'll link a comment from above that shows this: https://news.ycombinator.com/item?id=11641545


The two replies to my comment both need a response, so I'll make this a sibling.

Looks like I was wrong, and I was working harder when I wrote Windows C programs than I needed to. The Windows API documentation is sort of confusing and I got the impression that only the CRT functions have a C interface.


That's a lot of boilerplate code. Is there a way to make it smaller? For example, if all I need is to open and process a file with no GUI (maybe a progress bar).


Outstanding! I have been wanting to see this for a long time. I really want to build OS X apps, but I personally prefer to build GUIs programmatically.


So build programmatically. But use Objective C or Swift.


Well, that looks rather unpleasant.


Would love to see this annotated.


I don't buy the "plain C" claim, given the gratuitous use of objc_msgSend().


...objc_msgSend() is a plain C function that grovels around in various runtime data structures to call the correct function pointer for GUI and OS functionality.

Any alternate method of finding and calling those functions would amount to simply reimplementing objc_msgSend().


What else would it be? objc_msgSend() is called as a C function, to interface with ObjC code.


sorry, but mediocre at best, full of mistakes at worst.

i'm sure it was a good exercise in learning objective-c but this is not an amazing achievement. i'd expect much better from someone competent reading the god awful docs and reverse engineering by inspection.

don't give up though. doing things like this over and over, especially in the face of naysaying arses like me, is what makes great programmers. :)


As a beginner to Objective-C, I found reading the source here highly informative, because it gives you a rare peek at the actual machinery underneath everything that's going on.



