Hacker News
Is Objective-C BOOL a boolean type? It depends (jviotti.com)
63 points by ingve 8 months ago | hide | past | favorite | 22 comments



IMHO there wasn’t any reason to bother fiddling with the definition of BOOL.

It was well-understood that Obj-C BOOL and C++ bool are different types. Anybody working in Obj-C++ will deal with a ton of these differences, some minor (like BOOL) and some major (e.g. how NSString and std::string are not at all the same thing). Unifying a few bits here and there doesn’t deliver a lot of practical benefit.


Agreed. BOOL in Apple ObjC is a `signed char` though. That is a constant thorn in the side when you are doing slightly more sophisticated things with argument passing and return values, as you actually have to cast a lot (e.g. `BOOL (*)(id, SEL, ...)`).

As the implementer of an Objective-C dialect, I am considering going the other route and elevating BOOL to NSInteger (sizeof(NSInteger) == sizeof(id) in my case).

Conceivably the compiler could, just for @properties, combine all the BOOL fields into a bitfield, so there would be less grousing about wasted space. Maybe, maybe not.

* https://developer.apple.com/documentation/objectivec/bool?la...

* https://github.com/iterate-ch/rococoa/blob/master/ObjcMsgSen...


> IMHO there wasn’t any reason to bother fiddling with the definition of BOOL.

There were good reasons to change this to a real boolean. I think it was at a WWDC session touching on this topic that an Apple engineer said they did an audit of production Obj-C code and found multiple instances where BOOL values were something other than 0 or 1 (which was considered surprising and ultimately bugs).

This can lead to all sorts of subtle problems. One common one encountered by framework users is where something returns a BOOL and the user explicitly compares against == YES or == NO. If the value is, say, 2, then the comparison against YES fails and the code takes the wrong branch.


Author here! We write Objective-C++ to connect existing Objective-C code (that uses BOOL) to C++ (where we want to just expose `bool`), which got us into this fun rabbit hole :)


jviotti, the reason BOOL was not changed to bool sooner was that it would break ABI compatibility. So on the Mac, for example, they had to wait for the x86_64-to-arm64 transition to finally do it.


When I did Objective-C applications with C++ business logic, I recall our company's rule was to keep Objective-C code totally separate from the C++ code, in separate files, different headers, even different static libraries if possible--really quarantine them from each other. Then have a single, standalone, hideous Objective-C++ file that served as the smallest possible interface between the two. It's the only way we could keep sane. This was a trauma response from a previous project where no such discipline was maintained, and people just casually intermixed C++, Objective-C, C, and Objective-C++ willy-nilly all over the code, with now-predictable results.


I work on a decade-old codebase where we freely mix C++, Obj-C++ and Obj-C as it makes sense and I can't say it's ever been an issue. Obj-C++ has some oddities, but on the whole mixing obj-c and c++ features works totally fine and I've never encountered a reason to want to separate them.


This is the best way to have Obj-C and C++ in the same project.


As a historical note, this was also a thorny issue for language bridges to Objective-C, going beyond just Obj-C++. I'm thinking of bridges like PyObjC, RubyCocoa, CamelBones, and LuaCocoa.

BOOL was one of those things that would have been really nice to be a real bool, because when bridging to other languages that had a real boolean type, the idioms could be completely natural when writing in the bridged language. But since BOOL was a 'signed char', using only Obj-C runtime introspection it would by default bridge to something weird like an integer.

In Mac OS X 10.5, Apple officially supported PyObjC and RubyCocoa, and introduced "BridgeSupport", which provided additional metadata and dylibs containing symbols for inline functions, so any language could create a full bridge to Cocoa. The metadata could be used to determine whether a 'signed char' was actually meant to be a boolean. But BridgeSupport was not available for 3rd-party APIs (unless the authors decided to support it, which was almost never).

There were bug requests filed for Apple to properly redefine BOOL to a real boolean for the 64-bit Mac ABI, before Apple had finalized it, but Apple didn't fix this in time. My memory is hazy, but I think when the iPhone ABI came around, they didn't have time to address this either. So it would be another full architecture/ABI transition before there would be a chance to address it again.


I didn't comprehensively investigate the issue to learn whether this reflected a more systematic change in reported types, but FWIW, here's a code comment from some Lua bridging code:

  -- 2022-12-30: On macOS 12/x86_64 methods returning BOOL return integers
  -- from objc.msgsend as the type encoding uses the 'c' code for char rather
  -- than 'B' for _Bool (C) or bool (C++). But on macOS 11/arm64 the type
  -- encoding is 'B' and we get a boolean return type.
  local function itob(v, err)
          if isinteger(v) then
                  return v ~= 0
          elseif isboolean(v) then
                  return v
          else
                  return error(err or sformat("expected boolean or integer, got %s", type(v)), 3)
          end
  end


What an odd coincidence. One of my most popular answers on StackOverflow covers some of these differences, and I just got another upvote on it yesterday. For an answer from 2010 it keeps going.

https://stackoverflow.com/questions/3016846/is-there-any-dif...


I miss the chaos of Obj-C. Swift makes way too much sense.


Made me chuckle. Objective-C was my _first_ language. When I say first, I mean it was the first one I put any effort into learning when I was in 7th grade or so. It was chaos, and the only way I managed to get any apps out the door was basically pattern-matching what people did on Stack Overflow.

I wouldn't wish Obj-C as a first language on any soul. Swift on the other hand? Jokes aside, that's a beautiful language.


Sure, but that was intentional: the assumption was that you had already learned C. Objective-C is a great language if you know C and you want a SmallTalk-style message dispatch runtime.

It was better before ObjC 2.0; dot syntax makes it hard to tell out of context what a line of code is doing, which was never a problem with the original language.

It was also better before UIKit; the killer app is Interface Builder, which is unbelievably painful to use with UIKit compared to how it used to be with pre-CoreAnimation, pre-autolayout AppKit.


Funny, I'm the opposite. Objective-C was my first language, and I absolutely love it. It's so beautiful and chaos-free.

Swift on the other hand? Full of chaos and ugly (func, let vs var - really).


The only part about Obj-C I liked is that raw C is valid Obj-C. This made writing iOS apps in 2016 feel so weirdly antiquated.


Obj-C++ is the ultimate in glorious insanity.


Ran into Steve Naroff a few years ago and the first thing he said to me was "Wow, I still can't believe we managed to get Objective-C++ to work!"


Except for the Obj-C related parts of Swift!


They just left that in to make it fun.


@YES


It's been many years since Obj-C was relevant to my life, but I remember reading about some of the implementation details in the runtime and just absolutely boggling. Not in a bad way, a "wow, they've really done about as much as anyone could possibly do to get this performant" way. Distinct mental image of starting at a nice clean relativistic scale and then zooming down into quantum nuttery.

We need a Powers of Ten with programming languages.



