Apple starting to alert users that it will end 32-bit app support on the Mac (techcrunch.com)
186 points by bbrunner on April 12, 2018 | 244 comments



Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?

For example, something I encounter every day is Visual Studio and its helper processes being 32-bit. Because Visual Studio regularly, even on the latest 15.7 preview, shits the bed with OutOfMemoryExceptions on our large solution, I'm inclined to rage "why don't they just make it 64-bit? If it could just load more into memory it could get past this indexing hurdle and give me back the UI". But I also understand that if it was that simple they would have done it by now.

Something else, that I understand more, is the LabVIEW RT and FPGA modules only working on 32 bit LabVIEW. I would assume it's related to the compiling and deploying to the 32 bit ARM/x86 RT target.


The main issue would be that 64-bit may cause an app to use more memory. Every pointer doubles in size, the alignment of a structure containing a pointer may grow, etc. Basically, if you needed a lot of structures allocated and they all grew, your memory use becomes noticeably bigger.

Sometimes legacy code can make assumptions about pointer size. These hacks were more common in the days of porting older systems to 32-bit but it could still happen moving to 64-bit.

If there’s code that tries to manually populate the bytes of data structures, sometimes bugs appear when the target field size changes (e.g. somebody ends up not initializing 4 of the 8 bytes in a now-wider field).
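
To make those two points concrete, here's a minimal sketch (hypothetical struct; typical ILP32 vs. LP64 sizes assumed): the struct grows because of the wider pointer plus alignment padding, and legacy code that hand-fills "4 bytes of pointer" leaves half of the now-8-byte field dirty:

  #include <stdio.h>
  #include <string.h>

  struct node {
      int   id;       /* 4 bytes on 32- and 64-bit builds            */
      void *payload;  /* 4 bytes on ILP32; 8 on LP64, plus 4 padding */
  };

  int main(void) {
      /* Typically 8 bytes on a 32-bit build, 16 on a 64-bit build. */
      printf("sizeof(struct node) = %zu\n", sizeof(struct node));

      struct node n;
      memset(&n, 0xFF, sizeof n);
      memcpy(&n.payload, "\0\0\0\0", 4);    /* legacy code that "knows" a pointer is 4 bytes */
      printf("payload = %p\n", n.payload);  /* upper half still 0xFF on a 64-bit build */
      return 0;
  }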

In the case of Apple, the huge pain will be their decision not to port all of Carbon to 64-bit (despite Forstall getting on stage years ago and stating that Carbon was going to be 64-bit “top to bottom”, it never happened). This can mean addressing a whole pile of nothing-to-do-with-64-bit-whatsoever problems before even starting to solve 64-bit problems.


> despite Forstall getting on stage years ago and stating that Carbon was going to be 64-bit “top to bottom”

This is the second time I've seen you make that claim. Can you, by any chance, recall on which occasion he said this (or dig up the exact quote)? AFAIK the plan was always to use the 64 bit transition as an opportunity to shed deprecated APIs, so to imply otherwise in public would have been rather irresponsible.


I don't know about that particular quote, but this has a photo of the WWDC slide claiming 64-bit Carbon and Cocoa, so that was definitely the original plan:

https://www.wired.com/2007/06/leopard-won-t-support-64-bit-c...


Hmm, that is plausible. I stand corrected.


> despite Forstall getting on stage years ago

Well we all know what happened to Forstall ... and the entire software development roadmap after he left ...


Apple's 64 bit strategy was in place by 2007 (It would hardly make sense to omit some 64 bit Cocoa APIs in MacOS X 10.5, only to bring them back later, would it?). Forstall left in 2012. Your alternate history timeline may need some work ;-)


I don't think so. Was there a projected date for completion of the migration? We've only had pervasive 64-bit in the last 6 years, so it's doubtful it would have been before then.


As far as I recall, userland APIs had all their 64-bit support in place in 10.5 (2007). Kernel 64-bit support came in 10.6 (2009). Which 64-bit features are you thinking of in the last 6 years?

To clarify, I'm referring to macOS releases. On iOS, 64-bit support obviously came within the last 6 years, but Carbon was never in the picture on iOS anyway.


Carbon, as per GP ...


The increase in pointer size is trivial within today's memory limits. An application with a million pointers will only use 4 megabytes more memory.


CPU cache will not magically double in size to accommodate, so performance will suffer


I'm pretty okay with Visual Studio using significantly more memory than it does now, if it meant it would stop thrashing the disk constantly and locking up the text editor. My dev machines have between 16 and 32 GB of RAM, and even the wimpiest laptop I've used in the past five years had 8GB. But anytime it gets close to 2GB of memory used, it starts turning into "Microsoft Visual Studio (is not responding)"


Usually it's old code that assumes sizeof(int) == sizeof(void *) - you could stuff a pointer into an int on 32-bit platforms; you can't when an integer is 32 bits long and a pointer is 64 bits. In C (or Fortran, ...) it's pretty easy to make this mistake, especially in old pre-ANSI (K&R) C - so there's some work to be done porting over.
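
A minimal sketch of that mistake (hypothetical code; the intermediate casts only exist to silence the obvious compiler warning):

  #include <stdio.h>

  int main(void) {
      int value = 42;
      /* "Works" when int and void* are both 32 bits... */
      int stashed = (int)(long long)&value;
      /* ...but on a 64-bit build the upper half of the address is gone. */
      int *back = (int *)(long long)stashed;
      printf("original %p, round-tripped %p\n", (void *)&value, (void *)back);
      return 0;
  }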


I've always been confused by this: why aren't ints 64 bits on x86_64? They're 16 bits on 16-bit architectures, and 32 bits on 32-bit architectures, and that distinction was originally rather the point of using "int" as your type rather than short/long/etc., so... what happened?


"Word" only means 16 bits on x86; on 32-bit ARM, for example, it means 32 bits, and the architecture manual uses "doubleword" to refer to 64 bits. [edit: the parent's post was edited but originally asked about this]

Having int be 64-bits is known as ILP64 (int, long, and pointers 64-bit); some obscure systems handled the 64-bit transition that way (Cray), but Unix went with LP64 (long and pointers 64-bit), and Windows went with LLP64 (long long and pointers 64-bit). Here's an interesting document from 1997 comparing the approaches:

http://www.unix.org/version2/whatsnew/lp64_wp.html

Basically a matter of tradeoffs around compatibility, performance, and consistency.
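
For reference, a tiny sketch like this makes the three models concrete (typical sizes noted in the comment; exact values depend on the compiler and target):

  /* Typical sizes, in bytes:   ILP32   LP64 (Unix)   LLP64 (Win64)
       int                        4         4              4
       long                       4         8              4
       long long                  8         8              8
       void *                     4         8              8        */
  #include <stdio.h>

  int main(void) {
      printf("int %zu, long %zu, long long %zu, void* %zu\n",
             sizeof(int), sizeof(long), sizeof(long long), sizeof(void *));
      return 0;
  }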


Use "stdint.h" in all new code. I only use 'int' and 'long' for counters I know won't be that big, etc. I use stdint types for anything where I might care about the size.


When x86_64 was introduced, other 64-bit systems had already made that choice, too. It was natural to follow suit.

See http://www.unix.org/version2/whatsnew/lp64_wp.html and http://www.unix.org/version2/whatsnew/login_64bit.html.


A counter-example (the only one I’m aware of) is tru64 UNIX from Digital (later Compaq, later HP), now EOL. If you used “int” in C on that platform, it was 64-bit.


In C, you only have char, short, int, long and long long (signed and unsigned). Other integer types like uint32_t and size_t are typedefs to those. If int is 64 bits and char is 8 bits, you have to choose whether short is 16 or 32 bits, and the other size won't have a standard type at all.

long really should be 64 bits on 64-bit systems, and it is on Linux. Windows kept it at 32 bits to make porting existing code easier.


Then you're instead going to break apps that assume ints are exactly 32 bits. There's no easy solution here.


If I recall correctly, Win32 let you stuff a 32-bit value into a window. If you wanted to write an object-oriented application, you'd stuff your this-pointer in that 32-bit value. Since every little thing (button, label, checkbox) is a window, you'd have a lot of that. Visual Studio might be old enough that they wrote the pieces that are still 32-bit in straight-up Win32, or maybe MFC. If so, I'd bet no one wants to touch it to port it over to modernity...


You're probably referring to SetWindowLong. A 64-bit compatible version (SetWindowLongPtr) of that function is available since Windows 2000.

I'm pretty sure there's a guide on how to properly port applications to 64-bit on MSDN somewhere.
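
For reference, the usual fix looks roughly like the sketch below. SetWindowLongPtr/GetWindowLongPtr and GWLP_USERDATA are the real Win32 names; the per-window struct is just a hypothetical example:

  #include <windows.h>

  typedef struct { int clicks; } MyState;   /* hypothetical per-window data */

  void attach_state(HWND hwnd, MyState *state) {
      /* Old way, truncates the pointer on 64-bit:
         SetWindowLong(hwnd, GWL_USERDATA, (LONG)state); */
      SetWindowLongPtr(hwnd, GWLP_USERDATA, (LONG_PTR)state);
  }

  MyState *get_state(HWND hwnd) {
      return (MyState *)GetWindowLongPtr(hwnd, GWLP_USERDATA);
  }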


First, it could be as simple as using legacy proprietary software for which you don't have the sources, so you can't just recompile it as 64-bit. On macOS this is less of a problem because 32-bit was used for a relatively short transition period, but think of wanting to run 32-bit Windows software with Wine.

Another reason is that the program was not written with portability in mind, so it works well on 32-bit but shows strange behavior on 64-bit. This could be due to any number of causes, so if you want to use it on 64-bit you must not only recompile the program but debug and fix it.

And then it could have been done on purpose. Yes, Microsoft compiles Visual Studio as 32-bit on purpose: the main advantages of 64-bit are a bigger address space and a few more registers; otherwise, on the Intel architecture, the performance is about the same, but with 64-bit you consume significantly more memory, because every pointer inside your program is now twice as big. So if your program doesn't need an address space bigger than 32 bits and you want to save some RAM, it's not as stupid an idea as it might seem to keep compiling it as 32-bit, just as it's not a stupid idea to use a 32-bit OS on a PC with 2 GB of RAM or even less.


It could have been all of those things, but the real reason was to keep Visual Studio fast.

Here's a rather opinionated blog post on the subject, from one of the people originally responsible for making the call: https://blogs.msdn.microsoft.com/ricom/2015/12/29/revisiting...

In a nutshell, his position was basically, "If you can't do what Visual Studio needs to do in 4GB (four gigabytes!) of RAM, you really need to think a little bit harder about your data management."

I personally didn't have a terribly strong opinion either way, up until about a year ago when I switched to Java and started using IntelliJ on a daily basis. Now I have come to agree quite vehemently with Mariani's opinion on the subject.


And yet on large programs, unless you are very careful about which symbols get loaded, VS will crash deterministically while debugging. This gets worse and worse over time as Windows itself pulls in more modules for the same code with every update.

4GB simply isn't enough address space to keep every symbol in memory. I could manually manage them, but that should be the computer's job, not mine.


All the hacks that make assumptions about type sizes (like pointers) that change on 64-bit architectures will break your program.

Hacks are seen as "good" C programming, and encouraged, so this kind of thing happens.


Not even a case of hacks; before modern (C99) C, standard C didn't have fixed-width integer types at all, so behaviour can very easily change with architecture.


>>Hacks are seen as "good" C programming, and encouraged, so this kind of thing happens.

Perhaps among new developers. Us old-timers who have to maintain this stuff have learned to avoid "clever" code rather quickly and admonish those who write it. Elegant, to me, includes "easy to read and understand."


They've actually talked about why they haven't done this here[1]:

"So why not just move Visual Studio to be a 64-bit application? While we’ve seriously considered this porting effort, at this time we don’t believe the returns merit the investment and resultant complexity. We’d still need to ship a 32-bit version of the product for various use cases, so adding a 64-bit version of the product would double the size of our test matrix. In addition, there is an ecosystem of thousands of extensions for Visual Studio (https://visualstudiogallery.msdn.microsoft.com) which would need to also port to 64-bit. Lastly, moving to 64-bit isn’t a panacea – as others have noted (https://blogs.msdn.microsoft.com/ricom/2016/01/11/a-little-6...), unless the work doesn’t fit into a 32-bit address space, moving to 64-bit can actually degrade performance."

Also, a lot of people don't realize that there is a 64-bit version of the toolsets[2] (for C++ at least). I don't tend to have high memory use by Visual Studio itself but often run out of heap space using the compiler and linker so having access to those can be very helpful.

1. https://visualstudio.uservoice.com/forums/121579-visual-stud... 2. https://docs.microsoft.com/en-us/cpp/build/how-to-enable-a-6...


I always scoff when I read this. "I don't run out of memory, so that means nobody else does."

My team regularly (on a daily basis) runs into high-memory-usage VS issues, which inevitably end up with VS hanging and being force-killed and restarted. I'd gotten to the point that I restart VS in the morning and at lunch every day to work around the issue.


It's funny, Microsoft Office has the same problems to an even greater degree (vastly more users, more 3rd party extensions, an entire suite of apps each probably more complex than VS), and yet they've had a 64-bit edition available since 2013


They've always recommended the 32-bit version to users who require compatibility with those 3rd party extensions:

https://technet.microsoft.com/en-us/library/ee681792.aspx

https://support.office.com/en-us/article/choose-between-the-...


I suspect that being one of Microsoft's primary cash cows, they'd also have vastly more developers to throw at the problem.


Took them long enough, too. For the longest time they couldn’t switch because they were using Carbon instead of Cocoa.


The linker (especially when doing LTCG) can run out of space, even if it's the 64-bit one. There was some 2GB limit (or was it 4GB), but I don't remember. It happened some time ago (years?) when I tried LTCG on the web engine (Chromium/Blink?) that Qt bundles.


> Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?

Because the program uses a deprecated 32-bit API.

Once the deprecated 32-bit API is dropped, the program simply won't compile.

Also, the modern 64-bit API is different. And by different I mean it has a completely different design and set of interfaces. So to get the program to work again you have to port the program from using the old 32-bit API to using the non-deprecated 64-bit API. That's non-trivial work.

Edit: clarification


Are you talking about Carbon/Cocoa?


Because it is non trivial when using C or any derived language that has copy-paste semantics with C.

It starts with C not having fixed sizes for its data types.

Sure, there were always macros/typedefs with such fixed sizes, and C99 introduced the stdint.h header.

However not everyone actually uses them.

Then there are the bit fiddling algorithms, unions and casts that assume a specific memory layout.

Followed by code that might actually become UB when switching to another memory model.

All of that scattered across hundreds of files, not written by a single person, with a history of decades of code changes.


But Visual Studio is not C. It's C#, and even its C++ parts would be somewhat COM-ish, .NET-ish. I'm sure Microsoft is well prepared for the 32-bit -> 64-bit transition. My guess is that there is probably a massive number of plugins (not made by them) that are not prepared, especially ones that have to use native code. So Microsoft can't say no there. Just guessing, I don't know for sure.


Visual Studio is more C++ than you're giving it credit for, it's largely old C++, and making assumptions about how forward-looking that C++ is is probably pretty questionable.

Also, yes, plugins are a large concern.


> making assumptions about how forward-looking that C++ is is probably pretty questionable.

Especially since Visual Studio didn't even compile C++ correctly until fairly recently. Variables declared with `for (int i = 0; ...)` would remain in scope after the loop, the same as if it had been declared outside the loop. I don't know when they fixed that, but I think I ran into this problem as recently as 2010. Meanwhile, GCC and everyone else had been doing it correctly for years.


The last version of the compiler with that problem was VC6, which came out in 1998. This version was superseded in 2002.

I wouldn’t say 16 years ago was recent. If you really ran into this in 2010 it would have either been using an old toolset or the compiler flag that lets you enable the old behaviour for compatibility if you need it.


Nope, C++ is the undisputed king of coding, and I have no love for C++. Not only is Windows mostly written in C++ (and I would not be surprised if they use some C and assembly as well), but even external projects like Unity (a popular game engine) that are supposed to be used only as C# libraries are really just C++ projects. As long as Windows is by far the most popular OS on desktop, and Android on mobile devices, C++ remains the most essential, well-supported, mature, although infinitely ugly, programming language, because the OS is still at the center of software development.


I doubt that there's a lot of C++ code in Windows, at least in the lower level system libraries. It should be mostly C (and that's a good thing). The same for Linux/Android and the BSD parts of OSX/iOS.


There is plenty of it, especially since Windows 8, which was when C++ code was officially supported in the DDK.

Since they decided they are done with C, Microsoft has been cleaning up Windows code to make their code compliant with the C subset of C++.

Since the Windows Longhorn failure, with Vista, COM got the main API role for the components that were originally designed to be written in .NET.

Which led to the design of UWP as an improved COM, using the ideas they originally had for Ext-VOS but decided to create .NET instead.

https://www.reddit.com/r/cpp/comments/4oruo1/windows_10_code...


Ah alright, thanks for the clarification. Nothing in those direction changes convinces me that Windows will improve over WinXP and Win7 though ;)

I really wish the Visual Studio team would give C a bit more love. Getting at least full C99 support in is most likely less work than any random C++17 feature.


They are done with C; there is no more love to give beyond what ANSI C++ requires.

ANSI C++14 requires C99 library, while ANSI C++17 upgraded it to C11.

Microsoft has contributed to improve clang on Windows for those that still want to keep on using C, but then good luck accessing COM and .NET APIs from C, it is possible but I wouldn't do it for fun.

Even the new C runtime library is actually written in C++.


It would be fine with me if I could simply compile any valid C99 or even C11 code with Visual Studio C++ compiler, but C99 support is only half-assed, and C11 support probably will never happen.

D3D11's C API works fine (that's the only Windows API I care about apart from the usual Win32 windowing stuff), but already D3D12 let that rot and is only usable from C with workarounds (doesn't matter though since D3D12 is fairly niche).

All in all it's a shame though because the newer C features are much more sane and useful than anything in C++14 or C++17.


The name says it all, Visual Studio C++ compiler.

There is no written rule that C++ compilers are obliged to be C compilers as well.

Again, as C++ compiler, they are only required to be as compatible with C code as ANSI C++ requires.


> Because Visual Studio regularly, even on the latest 15.7 preview shits the bed with OutOfMemoryExceptions on our large solution, I'm inclined to rage "why don't they just make it 64 bit?

There are technical reasons alone, and there are decisions made based on other factors.

IIRC the product manager for Visual Studio was against blindly moving VS to 64-bit just because it uses a lot of memory. He considered that fixing the symptom and not the cause. He wanted the team instead to spend their time trying to identify inefficiencies and memory leaks.

Now... whether that decision has paid off or not is not for me to call, but I do appreciate the reasoning behind it.

If they can make it work well with a 32-bit constraint, that's clearly better than yet another app which uses 8GB memory for no reason.


>Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?

https://docs.microsoft.com/en-us/cpp/build/common-visual-cpp...

http://www.informit.com/articles/printerfriendly/2339636

BTW: The 32bit limit is per-process, not for the OS. You can easily have multiple 32 bit processes consuming more than 4GB memory in total. You don't need to port to 64bit to get that extra memory.


macOS makes the 64-bit transition a bit harder than that because a lot of the older user space APIs were purposely not ported to 64-bit. If your app was written against the Carbon APIs, you have to do a fair amount of work to change all the GUI code.


Yep. But developers have no excuse for being late to that show -- Carbon was always explicitly a transitional technology, for a transition which ended over 10 years ago. Warnings have been plastered all over the documentation for ages.


There's something to be said for the MSFT approach of your code will run forever. I understand there's tradeoffs, but there's something to be said for it.

Pointing a bunch of whiny users at devs -- some of whom gave code away for free -- is not awesome for the ecosystem. IMO.


Not awesome in the short term, but better in the long term.


Backwards compatibility is a very tangible benefit in the long term.


Truth.


Imagine if you have a dependency on a third party library, and they only offer 32-bit DLL, or the company is out of business and you are stuck with the 32-bit dynamic library. You can't just "compile against x86_64" in this case.


In many cases, for small DLLs, you can thunk them and provide a compatibility standard library.

Essentially what Wine does, but cheaper. The ease of doing it is related to the number of dependencies.


Assuming you know that those 3rd-party extensions are sticking to the published API and not scratching around in Visual Studio's internal APIs and data structures...

Because Windows programmers have such a great track record of sticking to the published API....


I think the long term plan is to keep adding stuff to VS code, keeping it modular so that it can be anything from a simple editor to the full IDE VS is, and then phase out Visual Studio. Achieves multiple goals in one go, and gives a smooth transition path without the second system syndrome pressure. Everybody wins.


Completely agree, and usually the answer I get is - well, plugin developers should just move their stuff out of process and deal with it. I went to GDC and talked to JetBrains about ReSharper C/C++ - which is a phenomenal product, yet limited by what else lives with it in the process space. So since last week I've had to kill (uninstall/disable) everything else but it, because I depend on it. I briefly had to re-enable the Qt plugin and the P4 one, but had to disable a lot of Microsoft's plugins (whatever I could) and Intel's, and had to leave Sony's SN-DBS, otherwise compile times are slow as hell... So it's manageable, and there was a way to have different environments set up (haven't looked deeply into it)... but really, moving to 64-bit would solve a lot of it... even if it would take more memory... (bigger pointers, that's usually the stupid excuse)


I gave up ReSharper in 2015. It's a great tool and I miss it, but our solution has nearly doubled in size over the past few years and even vanilla VS 2017 struggles with it. We have dev ops working on nugetizing some of the bloat and Microsoft premier support is helping us to understand why our solution is so stressful on Visual Studio.

The in process limitation is a real killer for ReSharper.


I recently didn't renew ReSharper for my team. It makes VS crawl, and I honestly don't understand why on earth it cannot utilize some of the other 15 cores available on our machines. It's a fantastic tool, but it doesn't scale with solution size at all.


> Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?

"In 32-bit programs, pointers and data types such as integers generally have the same length. This is not necessarily true on 64-bit machines. Mixing data types in programming languages such as C and its descendants such as C++ and Objective-C may thus work on 32-bit implementations but not on 64-bit implementations." [1]

It's non-trivial to ensure every developer writes portable code, multiplied by the varying degrees of skill among developers.

In a language like C++, assuming the size of an integer is 4 bytes, or that the sizes of all pointer types are the same, can introduce hard-to-track-down bugs, which may result in 64-bit versions of apps being unstable.

[1]: https://en.wikipedia.org/wiki/64-bit_computing#64-bit_data_m...


If people have done architecture-specific things they may not compile (or worse, crash inexplicably at runtime).

I tried doing more or less that (compiling a C++ Windows app for 64-bit) a few jobs ago and eventually discovered that way down in a library somewhere a dev (who had long since departed) did some trickery in string processing routines that relied on arguments being lined up on the stack, which is no longer the case with the x86_64 calling convention. That remains the worst piece of code I've ever worked with - it's innocuous to look at, and any developer who understands why that works should also have known better than to do it.

More generally, integer size issues can arise - if `int` remains 32 bits it's no longer enough to capture the difference between two pointers (and obviously should never have been used for that, but often these things happen).
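
A minimal sketch of that particular pitfall (hypothetical buffer; ptrdiff_t is the type guaranteed wide enough for a pointer difference on the platform):

  #include <stdio.h>
  #include <stddef.h>

  int main(void) {
      char buffer[100];
      char *start = buffer;
      char *end   = buffer + sizeof buffer;

      int       bad  = (int)(end - start);  /* fine here, truncates for multi-GB ranges */
      ptrdiff_t good = end - start;          /* always wide enough                       */

      printf("bad=%d good=%td\n", bad, good);
      return 0;
  }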


Two basic APIs are not available in 64 bits: Carbon and QuickTime. Steve Jobs promised to port Carbon to 64 bits when Mac OS X was introduced. So many of us developed for Carbon, and a few years later Carbon was deprecated, and Apple announced they would never port it to 64 bits. Even software from Apple such as iTunes was built on Carbon, and it took years for Apple to remove the Carbon code from iTunes...


You're forgetting the part where this was a long time ago. You've had at least a dozen years to migrate your code to Cocoa.



> Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?

The reason for that is quite similar to the Y2K problem.

https://en.wikipedia.org/wiki/Year_2000_problem


Dependencies can be an issue also. If you're dependent on some 32-bit libraries this can cause you problems moving to 64 bit.


If you assume longs or pointers are 32 bits then it's a problem.


I learned that when pointers were 16 bits wide. You should never assume the size of your words or addresses. Ever.


You never should but people do, and that's when there's a problem.


This is particularly heinous for those of us that do music production. A lot of plugins are orphaned at some point (meaning the developers no longer update them), and if you created a piece of music with those plugins (instruments, effects) and can't load them, you can't open the old files.

I often open up sketches from several years prior and consider working more on them. I keep a full 32-bit stack of music stuff left installed exactly for that reason. Usually, if I open them up, then I'll go ahead and move them over to 64-bit semi-equivalents, but that's difficult if you can't even hear what you were doing with the old one.


As an ambient composer myself, I understand your pain, but that is the unavoidable fate of all abandonware technologies.

Abandonware technologies, however, can be used through emulators. If I can use my first computer, which my father bought in 1988 (an Amstrad CPC 6128), with its magnificent 4 MHz CPU and mind-blowing 128 KB of memory, from any OS (including iOS and Android) via emulators, I am sure using 32-bit VSTs and AUs should be a walk in the park with VirtualBox.

All you will have to do is bounce those plugins to audio and move the audio to your 64-bit OS, or even (it's probably possible) make the two OSes talk to each other so you can use both 32- and 64-bit versions at the same time.


A friend of mine who also does music production keeps a Mac Pro (2012) on Snow Leopard for backward compatibility purposes (compatibility with old specialized hardware, PCI Express cards and software).


For plugins it might well be possible to create a 64 bit shim plugin that fires up the old 32 bit plugin in a virtualised 32 bit environment, with thunks to translate backwards and forwards.


While you will possibly have to upgrade your OS for some software sooner rather than later, MacOS High Sierra should continue to get security updates for a while, and you will be able to run it in a virtual machine like Virtualbox or Veertu, or something else built on Apple's native hypervisor for a very long time.


Yes, I've considered such. Unfortunately real-time audio processing is one of those areas where virtual machines are still very finicky.


I run into this problem all the time. I’ve gotten into the habit of bouncing all the individual tracks on a project I put down for a while; it’s not perfect, but at least I have something left if a plug-in breaks later.


Yep, this is the best real solution I think. A recording is a snapshot in time, and it makes sense to print tracks to stems after plugins so they can be remixed later if so desired.


I've been saying this for years, but Windows is genuinely a better platform than OS X for media production because Microsoft actually gives a crap about supporting legacy software.


I keep two versions of my DAW (Ableton) installed 32bit and 64bit. It's not just so I can open my older files but also because I find some of the older VSTs to simply sound better.


> A lot of plugins are orphaned at some point ... if you can't load those plugins, you can't open old files.

In other words, your music production business depends on old, unsupported, proprietary software. You may actually have to pony up and hire someone to fix your problem or help you switch to another platform. Welcome to the real world.


Yep. I'm not excited about the prospect of moving back to the PC for making music, but it looks like that's what Apple want me to do.


Apple started making this transition in OS X 10.5, which came out in late 2007. The 32-bit-only Carbon APIs have been deprecated since 10.8 (2012). It's not like this snuck up on developers; the writing was on the wall by the end of the last decade.

I'm not unsympathetic -- I have some 32-bit programs lurking around, including Dramatica Story Expert, an expensive one that's still theoretically being updated. But if that program stops working, I'm inclined to blame the developers, not Apple. Apple isn't forcing them to put out an app that feels like something from 20 years ago, or to consistently wait until beyond the last minute to update for system transitions that they've had literally years of warning about. (In Dramatica's case, they didn't even transition to Carbon until the classic environment went away, and didn't transition to Intel code instead of PowerPC code until Rosetta went away.)


Nope, I am afraid that won't protect you from the curse of abandonware. Only an emulator can. Your problem is not related to what type of hardware, software, OS you use but how old they are. Nothing is supported forever. With emulators we keep using technology that is more than 30 years old. This is a walk in the park.


This, or go with the idea that for some years ahead you can work with El Capitan on old hardware. Windows for me is a dead end also, until someone in the pro industry realizes that it's time to move to Linux. Apple is a mobile and entertainment company now. Microsoft wants AI/Cloud. The logical answer is Linux.


>until someone in pro industry realizes that its time to move to Linux

Some already are. When I bought my copy of Pianoteq 6 [1], I was surprised to see they also offer a Linux version (and even a Raspberry Pi version). The DAW I use called Renoise [2] has a Linux version, and the JUCE [3] development framework common to many VST plugins also has Linux support.

Listening to podcasts like Sonic Talk [4], I'm mostly hearing about studios switching from Mac back to Windows, but they have been talking a lot more about Linux too.

[1] https://www.pianoteq.com/

[2] https://www.renoise.com

[3] https://juce.com/

[4] https://sonicstate.com/sonictalk


Other notable big(ish) name DAWs with Linux versions include Bitwig (quite similar to Ableton Live) and Tracktion


Is the exact same software/plugins that will open your current files available on Windows?


Pretty much, yes.


A lot of apps never made the switch on iOS. I have some useful ones that were orphaned. I feel like only the original Intel Macs had the 32-bit (pre-Core 2) chips, and those lost OS support long ago.

It's weird that DOS (DOSBox) and old Windows programs are often still runnable, but somehow Mac applications just don't age nearly as well.

I really think if some linux distros can get it together and get some good application support, now is the time for them on the desktop/laptop.

Going to Apple Menu -> About This Mac -> System Report -> Software -> Applications

will show a list of applications, with the rightmost column indicating 64-bit support. About 95% of mine are currently 64-bit.


Microsoft and Apple have different approaches to backwards compatibility based on the markets they're in. Microsoft makes most of their money on enterprise, whereas Apple makes most on consumers. Enterprise values running old apps, consumers value running today's apps. Each makes sense for their market... and desktop Linux doesn't make sense for either.


Backwards-compatibility is an interesting story on the mac. The Objective-C ABI on 32-bit macOS is what is called the "fragile" runtime because it has the fragile base class problem. Originally Objective-C had the same design flaw as C++: the ivars of every class had to be in the header for all to see so the compiler could rely on the binary layout. A subclass' ivars would be offset by the compiler by examining the layout of every superclass, then baked into the binary.

In the modern (all iOS and 64-bit macOS) runtime the ivar offsets are patched up at runtime so the layout of classes (including base classes) can change without breaking compatibility.

Working around the limitations of the 32-bit Objective-C runtime imposes significant costs; it is far from a simple recompile with a different pointer size. It also adds a lot to the size of the installed OS.
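
A rough C analogy of the fragile-layout problem described above (hypothetical structs; the modern Objective-C runtime's actual fix is to resolve ivar offsets at run time instead of baking them into the app binary):

  #include <stdio.h>
  #include <stddef.h>

  /* Version 1 of a framework header the app was compiled against. */
  struct Base_v1 { int refcount; };

  /* The app's "subclass": the offset of 'title' gets baked into the app binary. */
  struct Window { struct Base_v1 base; char title[32]; };

  /* Version 2 of the framework adds an ivar to the base class... */
  struct Base_v2 { int refcount; void *delegate; };

  int main(void) {
      /* ...but the already-compiled app still reads 'title' at the old offset. */
      printf("app's baked-in offset of title: %zu\n", offsetof(struct Window, title));
      printf("framework v2 needs it at %zu or later\n", sizeof(struct Base_v2));
      return 0;
  }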


There are also a whole bunch of APIs left out of 64-bit macOS, including almost all of Carbon that dealt with UIs (HIToolbox, ATSUI, QuickDraw, Nav Services, Dialog/Menu/Font Manager, etc). None of these APIs have received anything more than bare maintenance in the last 10 years, and evolving the system while keeping them operational has not been easy.

In a way, this is the true last step of the Mac OS X transition that started in 2001--the removal of the transitional APIs that made it possible.


I find it interesting in light of the Swift language they're now pushing.

Theoretically, wouldn't programs written in pure Swift today be able to compile for any new hardware platform, thus avoiding the kind of issues the 32/64 switch faced?


Instead you get issues with updating from Swift 1.x to 7.x.

The educational story of this sort is the transition between Python 2 and 3... never finished for some packages.


Swift is a new language, so it’s sure to have source breakage in the first few versions. The compiler is now backwards compatible, though.


So much this. When you're a small team, this sort of thing can basically mean having a product vs not.


It's kind of sad we're in this state in 2018 because Apple launched 7 low end machines [1] back in 2006 that didn't have 64-bit support. Even though those are long gone, their software support legacy remains. Apple could have transitioned directly to x64 and saved themselves the headache (afaict, maybe these problems would have persisted?).

[1] http://lowendmac.com/2006/core-duo-macs/


Apple couldn't have gone straight to 64-bit apps on Intel--the OS and frameworks just weren't ready yet. The Intel machines shipped starting with 10.4.3, which only supported 64-bit command-line apps/daemons. Only with 10.5 (released in late 2007, 18 months after the first Intel machines) were full graphical 64-bit apps supported (on both ppc64 and x86_64).


Firmware was an issue, too. The first generation of Macs with 64-bit processors still used 32-bit EFI and booted a 32-bit kernel that could run 64-bit applications.


True, it wasn't until 10.6 that a 64-bit kernel was available, and only newer machines got to use it.

https://arstechnica.com/gadgets/2009/08/mac-os-x-10-6/5/


I had a blog article about the entire mess that was often confused (there were actually three modes): http://yuhongbao.blogspot.ca/2009/09/mac-os-xs-64-bit-modes....


The oldest Macs that support Sierra (10.12) or High Sierra (10.13) are from 2009. You may be able to bypass these restrictions which are often based on graphics chips through kernel extensions, but discontinued support for 32bit probably has much more to do with older software than older hardware.

If you want to find unusual problems with older Apple hardware for the sake of reusability, take a look at the firmware from around that time. It was just weird enough to make it similar to standards but often too difficult to support a decade later. In a decade, people are going to say the same about the not-quite NVMe SSD interfaces in current Macs.


What's worse is that they had already been transitioning to 64bit hardware with the PowerMac G5 and iMac G5 models since 2003.

Anyhow, I doubt they'd have had time to make a solid 64-bit x86_64 SDK even if they had delayed the hardware to late 2006. They had never stopped maintaining the x86 port of OS X (aka Rhapsody, OpenStep, NeXTSTEP), just stopped shipping it as a product, because their strategy was and still is to package their OS in hardware boxes. They started shipping x86 boxes to developers in the form of the x86 development box, which was a (32-bit) Pentium 4 shoehorned into a Power Mac G5 case. Its OS is also what leaked to the early x86 hackintosh scene, before Apple shipped their own x86 Macs. I ran it on a custom Athlon64 box and a ThinkPad T42.

Since they already had the x86 port, it was an easy switch at a time when all x86 hardware from Intel was still 32bit. If they knew Intel would adopt x86_64 from AMD, one option could've been to ship with AMD CPU's on the first models, but I'm not sure how well that'd have turned out logistically or whether it'd have made a huge difference.


They wanted to avoid NetBurst (including Prescott and Cedar Mill) and picked Yonah which was 32-bit only instead.


I never did understand why this couldn’t be backported to 32-bit. Was it something like a lack of bits for tagging, or something more mundane like nobody wanting to support 32-bit?


Is the Apple/consumer thing still true? Not questioning your general business-model assumptions, but anecdotally all the “consumers” I know with Macs are holding onto them for six years, while every dev shop I’ve worked in in this decade goes through new MBPs like commodity office supplies. I’m just curious if there are recent stats on who Apple is selling computers to (I haven’t seen any for a while.)


Apple doesn’t really have a product combo to replace AD+SCCM. Macs are popular in companies that are still small enough to have people manage their own machines.


Is that really true?

We have 20000 Mac users in a company of 90000, I don’t think this is a corner case for large technology companies.

My machine is managed in the sense some things are preinstalled and IT has a way to push software updates.

What more management is required? Anything more and IT would probably fuck it up like they did when our division was still on Windows.


Unified image on all machines, auditability of installed software, etc. End users absolutely do not have admin rights.

Not saying your company has to be like this. Just saying 1) this is where MS has very little competition, and 2) where they make absolute bank.


Our macs are integrated with AD and are remotely managed similarly to windows.


> in companies that are still small enough to have people manage their own machines.

Company size doesn't have much to do with this. Macs are popular at IBM.


Last time I read anything about that, those computers were still largely self-managed (though with an in-office wiki for collected solutions).


> Macs are popular in companies that are still small enough to have people manage their own machines.

140,000+ employees in this company. Macs outnumber Windows boxes. Pretty much the only places you see Windows machines are in the various accounting departments, and in the call centers where they're just used as terminals to big iron.


Frankly, I think consumers value backwards compatibility as much as, if not more than, companies do.

At least as one gets older and just wants to get that thing done the way they always have (a certain big-name author still writes using WordStar in DOS, I believe).


Enterprise values running old apps, consumers value running today's apps.

Do they really, or is it just that consumers are more likely to be persuaded by Apple's marketing?

Personally, I have some apps which I use regularly, and are at least 20 years old. I use mostly Windows for that reason too.


For development purposes I'll take a Ubuntu loaded Dell laptop over anything windows, and wouldn't even consider a MacBook.


It all depends what one understands by development.

My Ubuntu netbook is nice for travelling purposes, but for the kind of work I get paid for, it is Windows and occasionally macOS.


If I am developing for an Ubuntu server, I'd rather just use Ubuntu directly. Plus the user experience is just as good if not better than OS X or Windows, but that is a highly personal opinion. I've been using GNOME and KDE for two decades now. Windows just has nothing to offer me, and Macs have always been fairly "meh" in my opinion since I had to deal with classic "hard crashes" and those ugly jelly beans.


lol, nice little zinger there at the end for "the year of Linux on desktop".


Haha I mean we've been talking about it for two decades and in that time the market has reached maturity and is now in decline. It's always been a pipe dream.


Consumers like it when their 10 year old video games still work, but those users probably weren't buying Macs to begin with.


Someone else will know the details a lot better, but Apple has changed the licensing details on many of its older OS X versions to legally permit virtualization if the host is a Mac. Which OS version images I can download for free in the store right now appear to be linked to which ones I had ever used under my current Apple Store account.

Given that a lot of current Macs come with only 8GB of ram it is not ideal to run a lot of applications in a virtual machines, but applications that need much RAM have almost certainly moved to 64bit anyway.

Again, someone else will understand it better, but I think products like Veertu were built on Apple's built in hypervisor framework, and it would not be difficult for people to build additional free virtualization options in MacOS. My point is that the reasons for choosing Linux over MacOS are already sufficient without targeting users of older Mac software, and that people who are using older Mac software probably also use new Mac software, and won't have too difficult a time using the older stuff too.

A while ago I would have said that something like Wine would be great for using the best software I can ever remember which only ran on early Macs. Except that these days you can do that in Javascript in any browser, and while the software was good, I've already spent a few weekend afternoons playing with them, and I don't need to anymore.

I miss my 32bit iOS games more than I'd miss any 32bit MacOS applications, which I could run anyway through virtualization. I also wish they'd stop disappearing features from Apple software in general, but I think sunsetting 32bit apps or Carbon are generally better for the Macintosh ecosystem.


Re: iOS, yeah, I was looking at my purchased app history to get inspiration for a new project and a decent chunk of apps I bought around the time of the first iPad (as well as newer ones) are no longer available, which is a shame. I can’t really blame the authors for not going back and updating a £1.99 app 6 years after the fact, but it’s a shame and I hope there won’t be another “extinction event” like this. Am I right in understanding that Bitcode should make architecture changes much less of an issue for iOS in future?


Apple is pinning a lot on bitcode, but they really aren't willing to do what Linus does and take a firm stance to not break userspace. Microsoft does that to a fair degree, hence old WinXP apps still running on Windows 10. When you get back to the DOS era tho, better to just use DosBox!

Due to the closed ecosystem for iOS, this type of abandoned app preservation will be extremely hard to do :c


The only ones I have that aren't 64-bit are games, interestingly enough. And Calculator and DVD Player, which is odd; I'd expect those to have been ported by Apple by now. That said, the DVD Player app is not a big deal, as I realize I don't have any DVD drives any longer... oh wait, I bought a USB one, I'd have to find it.


Somewhere in the recesses of internal Apple email exchanges, there's probably a debate about whether they should just drop support for DVDs when they do the 32-bit transition. They're ruthless about old formats.

Calculator is 64-bit


Apple still sells a USB CD/DVD burner, I don't think they'll be dropping DVD playback any time soon. I think it's more likely that they've had an intern rewriting DVD Player.app in Cocoa and Swift, and it'll ship in 10.14.

https://www.apple.com/shop/product/MD564LL/A/apple-usb-super...


Do you suppose it'll support BluRay, by any chance?


I wonder if they whitelisted Apple 32 bit apps so that they won't trigger the new warning message. It would look a bit janky if they didn't.


The current version 1.21 of Kindle is only available in 32-bit.


My report shows Calculator as 64bit. (Version 10.13)


So the calculator I had as 32 bit turned out to be from some iOS app I compiled god knows when or how long ago, oops. :)

Calculator:

  Obtained from:	Unknown
  Last Modified:	12/27/16, 9:00 PM
  Kind:	Intel
  64-Bit (Intel):	No
  Location:	/Users/mitch/Library/Application Support/iPhone Simulator/6.0/Applications/610B1057-FF52-4D59-AE7D-DDD406E83F18/Calculator.app


Looks like you started the Stanford iTunes U course


Sounds about right actually, good call.


Ah, good old Android File Transfer isn't 64-bit. This should be fun.


Looking at the list I see: Steam stuff, Blizzard stuff, WebEx and something for the Logitech mouse. Sadly unsurprising; the biggest players are the slowest to move.


The only 32-bit apps on my machine are a bunch of Adobe helper apps...


I suppose dropping x86 support makes for one less architecture for Rosetta II to support.


Rosetta II being the x64 -> ARM translation layer for when they start putting the AX chips in Macbooks?

HP already has its Snapdragon convertible out that has Windows 10 x86 emulation. But surprisingly this is only _32-bit_, not 64. Looks like performance is understandably poor as well.

I'd expect Apple to take their usual strategy and wait a bit until chips are a little faster/they have a more mature emulator before they make the switch.

Interesting times though!


I actually expect Apple to implement some "emulator-acceleration instructions" into their own chips and get excellent (almost as good, as good or better) performance once they switch, as always, because otherwise people just won't switch en masse.


If Apple make the move, I kinda expect them to do nothing, and make their customers deal with incompatible software. They've been super hostile to their customers of late.


I don't think Intel will allow that to happen without some patent issues.


x86 is now 30-40 years old depending on how you measure it. Even "modern" x86 (post Pentium Pro) is over 20 years old. An awful lot of the patents have expired.


X86-64, that is the most important one.


It's not even by Intel; it's by AMD and they did nothing to prevent Intel from using it.


Intel did license it from AMD IIRC.


Microsoft has announced 64 bit emulation coming in a few months.

Snapdragon single-thread performance isn't even great, hence with the added emulation penalties it is going to be even slower.


Yeah Apple's "big cores" do have better single threaded performance over Snapdragon, but we're not talking an order of magnitude.

I think chips need to be a bit faster AND emulation overhead reduced before it gives a "can't really tell the difference between an Intel or ARM MacBook" type experience. At least for developers/creators.


Re: HP Envy

It's an ARM64 chip running ARM64 OS and ARM32/x86/ARM64 Win32 and UWP apps. Just doesn't handle amd64 > arm32/64 conversion.


Can't imagine why this push is necessary. One of the primary advantages of x86_64 as opposed to other 64-bit architectures is that normal 32-bit applications run natively with no performance hits. As others notice, this is likely going to serve no other purpose than to make abandoned 32-bit applications completely unusable.

Of course, it could also be related to Apple's intention to switch to ARM chips in the near future, and getting everything on consistent 64-bit to aid the porting effort. I can't imagine the developers are going to enjoy low-level mapping of 64-bit x86 instructions to 64-bit ARM though...


> I can't imagine the developers are going to enjoy low-level mapping of 64-bit x86 instructions to 64-bit ARM though...

Why would the average developer even have to care?


I meant Apple's developers.


I recently lost a bunch of iOS apps, some 8 years old, due to the ABI cutoff.

I don't have the ecosystem tie-in that I used to have. What good is owning a bunch of apps if you can't use them?


8 years is quite a bit of time in the technology space. There's no point holding back the rest of the ecosystem if some developers have abandoned their apps.


Does it mean macOS users won't be able to play 32-bit games in Wine anymore?


No, it means macOS will no longer have a 32-bit runtime, but this does not prevent the CPU from executing in 32-bit mode - just that all system calls have to happen in 64-bit mode.

Wine will have to be a 64-bit application linked to the 64-bit libraries and do a bunch of thunking and mode switching, but it already does this to support 16-bit applications.

I have no idea how easy this is, but the OS already does it to support 32-bit applications running on the 64-bit kernel.


When Wine runs 32-bit application, it links against 32-bit libraries, including OpenGL. So how would that work in pure 64-bit system? 16-bit is really some special case, I don't know how Wine is doing it, but I suppose it does some special translation.


The same way those 32-bit libraries call the kernel now - they marshal the call to 64-bit land. A lot of the user space is compiled separately for performance and to maintain ABI compatibility so this would have to be taken up by a third party. I would expect to see a slowdown but it depends on the exact nature of the changes.


You can only execute in 32-bit mode if the kernel gives you a compatibility-mode descriptor in the GDT, or lets you put one in the LDT.

(or you use virtualisation)


Maybe in the distant future. Last I checked, Wine could still run 16-bit programs on macOS.

I could see a 32-bit Wine needing to bring its own copy of more libraries that Apple wants to stop shipping two copies of, but I doubt Wine would be much affected by not getting access to new 64-bit only APIs.


On Linux, Wine requires 32-bit libraries to run 32-bit games, even if it's 64-bit Wine. I'd imagine same applies on macOS. So if let's say they'll drop support for 32-bit OpenGL, there isn't much Wine can do about it.


Right, but I doubt Apple will go that far, especially in the next few years. They have strong reasons for abandoning 32-bit (or at least breaking binary compatibility) for the stuff written in Objective-C, but they'll probably leave the basic Unix/C stuff alone.


The ‘basic Unix/C stuff’ includes most of the architecture-specific code, in both the kernel and userland, so keeping it around would have a large cost. And there wouldn’t be much point: command line tools are probably much more likely to be 64-bit than GUI apps, as they’re more likely to be open source (allowing recompiling even if the original developer is gone), and more likely to be portable between operating systems (reducing the chance of running into either architecture-specific dependencies, or macOS APIs that happened to be deprecated with the move to 64-bit).


They can write a 32-bit OpenGL API that calls the system’s 64-bit OpenGL, or does everything in software and renders to a memory range that they somehow make show up on the screen.


It doesn't, 32 bit apps still run, Wine and Wine-hosted apps included.


presumably they are going to scrub 32 bit support entirely, including from the kernel.


Eventually. Certainly not in this or the next major release, and the rate of adoption/updates will very likely influence the schedule. There are a lot of widely used apps out there that are 32-bit. Steam, for one, which probably matters more in that market segment than Wine.


I suppose for those who play games on macOS, it's not Steam itself that's a problem, but quite a big amount of actual 32-bit only games. Both macOS and Windows ones (that could run in Wine until now).

I hope when Linux distros will start dropping 32-bit support, they'll still keep multiarch around for such purposes.


> I hope when Linux distros will start dropping 32-bit support, they'll still keep multiarch around for such purposes.

A bit late for the comment there. Pretty much the only distros that still support x86-32 as an installation/host option are Debian, Slackware, and Gentoo. Everybody else (including Ubuntu) has gone 64bit-or-bust.

Mind that the latter half still applies. The kernel is still capable of running 32-bit applications in a 64-bit environment, and no distro has removed multiarch support, and likely won't.


> and no distro has removed multiarch support, and likely won't.

My only concern is that, with the general dropping of 32-bit, multiarch versions of packages will be bitten by bit rot, because no one will be maintaining them.


Well, if people are using them those people will report bugs, if no-one is using them then they will rot but it won't matter. It hurts if you're using one obscure package that no-one else is, but that's the software life.

/I was quite possibly the last gentoo/sparc32 user


Right, but that's the gist of what I mean - I don't think 'macOS update that suddenly and irrevocably kills most of your Steam library' is on the horizon.


Why not? On iOS they started warning when launching 32-bit apps in 10.3 and killed 32-bit app support in iOS 11.

I can imagine that they will be a bit more gentle with macOS, given its longer history, but I wouldn't be surprised if they yank all 32-bit system libraries in High Sierra + 2.

This is typical Apple and has its upsides (unlike Android/Windows, applications do not rely on ancient APIs) and downsides (breaks compatibility).

Also, part of the motivation here may be to kill Carbon as soon as possible. If Marzipan materializes, they probably do not want to maintain Carbon, AppKit, and Marzipan/UIKit simultaneously.


February 2015 was the deadline for apps to support 64 bits with any new updates in the App Store. So developers had 2.5 years to prepare for iOS 11.

Users had less warning, true, but the situation was different since they also had no way to acquire apps outside the App Store.


You can still have VMs for 32bit games, and if there's a big demand for them, bet on someone creating a gaming-oriented VM.


Apple has in fact stated that High Sierra is the last major release to run 32bit apps "without compromise" (their words). So probably a macOS 10.14 could show up in a couple of months at WWDC without 32bit app support, just like iOS 11 did last year.


> So probably a macOS 10.14 could show up in a couple of months at WWDC without 32bit app support

I don't understand your reasoning.

They're implicitly (but strongly implicitly) saying 10.14 will run 32-bit apps. What's uncertain is what "compromises" they'll entail.

Even Apple wouldn't drop all support for 32-bit apps with just a few months' warning. They're ruthless about moving forward, but not quite that cruel.


But that's exactly what they did in iOS 11?


They also announced exactly that. Not the case for this or the next major revision of macOS.


February 2015 was the deadline for apps to support 64 bits with any new updates in the App Store. So developers had 2.5 years to prepare for iOS 11.


Additional requirements for new apps or updates to apps is something completely different from disabling existing purchased apps and denying end users access to their documents contained within.


That I cannot deny, but neither were users forced to upgrade immediately.


If you want your device to be secure against published 0days / CVEs, you have to update iOS and when iOS 11 came around, that's the only branch available containing those updates. :-/


I’d imagine that’s the case.


There goes Ambrosia's back catalogue.


Well, at least this time they alert users. Support for Universal binaries was dropped silently.


> Support for Universal binaries was dropped silently.

Since when? I have universal binaries that run fine on my computer.
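
If you want to check what's actually inside one of those binaries, this is a quick way (the app path is just a placeholder):

    # list the architecture slices in an app's main executable
    lipo -info /Applications/SomeOldApp.app/Contents/MacOS/SomeOldApp
    file /Applications/SomeOldApp.app/Contents/MacOS/SomeOldApp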


This will be what finally moves me away from Adobe software.

I use one Adobe app: Illustrator CS5. My needs are fairly specialist and I haven't needed any new features introduced since CS4 (2008; multiple artboards, finally).

Adobe's only upgrade option is £240pa for a single-app subscription. Fortunately, alternative drawing programs have come a long way since CS5, so I'll almost certainly jump ship to one of those.


It seems like only yesterday Macs were 24-bit and we were all busy getting “32-bit clean”


For the past few years, Apple have been doing everything they possibly can to get creative professionals to leave their platform.


I'm a creative professional and I, along with other people I know, still prefer to remain on Apple platforms.

I guess now you have to argue about and form a consensus on the definition of "professional" huh?


So Apple is dropping 32 bit computing? Hmm ....

Gee, I'm building and configuring a server based on an Asus motherboard and the AMD FX-8350 processor: 8 cores, 64-bit addressing.

Surprise! I discovered that Windows XP 32 bit Professional SP2 (service pack 2) will install and run! It sees all 8 cores, and the version of Microsoft's TASKMGR plots the activity separately on each of all 8 cores. It also sees the full 16 GB of main memory and is willing to use 2 GB of it with 5 GB of paging space.

And I discovered that the Western Digital (WD) Data Lifeguard Tools CD, IIRC version 11.1, boots and runs! This is amazing since what boots is old DOS! The DOS part will boot from a CD/DVD USB (universal serial bus) drive, but then the WD software doesn't run. But if I boot from a SATA (serial advanced technology attachment) CD/DVD drive, then the WD software does run.

If I have the Windows version running and put the WD CD in the SATA drive, then the WD software appears to run as a Windows application!

My most important application is 32 bit editor KEdit, and I've discovered that it runs fine on Windows 10 64 bit Home Edition on an HP laptop with a 64 bit Intel processor with two cores and 4 threads.

So, lesson: With Windows, AMD, Intel, and ASUS, a lot of 32 bit computing still works! Sorry Apple!

My first intention in installing Windows XP was just to run some experiments on using the WD Tools to back up and restore a bootable partition, but I've since discovered that my trusty old copy of Nero for CD/DVD reading/writing, which I long used on XP, appears to install on Windows 10 on the HP laptop but, as far as I can tell, won't read or write CDs or DVDs. So, for routine reading/writing of CDs and DVDs, apparently I should keep a bootable partition with XP.

Sorry, Apple, 32 bit computing won't go away soon: The reason is standard and old in computing -- there is a lot of old software people very much still want to run.


Apple is a bit special, though, since their hardware and software go together. They don't sell 32-bit Macs and have never emphasized backwards compatibility the way Windows (or Linux) has; quite often they have emphasized the opposite. Supporting 32-bit is dead weight for them from a business perspective and has been for a while, so this shouldn't be a shock for their users.


> have never emphasized backwards compatibility

This is mildly misleading. Apple emphasizes backwards compatibility, temporarily. 68k apps ran on PowerPC systems. Classic MacOS apps ran on OS X systems. PowerPC apps ran on Intel systems.

Apple is a master of architecture shifts - they've had so many of them! But the compatibility is always temporary. Enough for customers to get their apps shifted over.


They also learned from their first architecture shifts, which didn't go all that well. Apple II to Apple III flopped early. Apple II to Lisa was nonexistent. Apple II to Mac could've been a 1984 to 1985 thing if executed well, but instead it took until 1993 for Apple II to be killed. Lisa to Mac was just recycling the old hardware for the new OS via MacWorks.


Don't forget that for a while you could buy an Apple-IIe-on-a-card that went inside your Mac and let you run Apple II software.

https://en.wikipedia.org/wiki/Apple_IIe_Card


That's why I qualified it with "like Windows or Linux", but yes they haven't totally disregarded backwards compatibility. They definitely have a history of dropping support for many things a couple years after they stop selling hardware that supports it, but they don't do so without warning.


> never emphasized backwards compatibility

As the latest debacle about DisplayLink shows: https://support.displaylink.com/forums/287786-displaylink-fe...


> Apple is a bit special though, since their hardware and software go together.

You mean that there are no application programs written by anyone other than Apple for Mac computers? So, there's no Apple version of Autocad, Adobe Acrobat, Distiller, or Premiere, no version of D. Knuth's TeX, no LaTeX, no way to compile and run LINPACK, no SPSS or SAS? Or for each of these, you have to get a 64-bit version? Maybe a 64-bit version doesn't exist? Maybe the user already has licenses for 32-bit versions and wants to continue using that software?

I don't know anything about Apple; I've been on mainframes, super-minis, DOS, OS/2, and Windows. So, I'm asking, trying to understand. Some of my most valued software is old, for 32 bit Windows, and I wouldn't want to be without it. So, I'd guess that no longer running 32 bit software would disappoint a lot of long time Apple customers, but I don't know anything about Apple.


> Some of my most valued software is old, for 32 bit Windows, and I wouldn't want to be without it.

But, as you've said, you're not an Apple customer, so this is irrelevant, isn't it? Apple customers in general don't work the way you do.

> I'd guess that no longer running 32 bit software would disappoint a lot of long time Apple customers

I don't know what to tell you except that your guess is wrong. The vast vast majority of Apple customers don't need to run old 32 bit software, and Apple optimise for the well being of the majority of their users.


> But, as you've said, you're not an Apple customer, so this is irrelevant, isn't it? Apple customers in general don't work the way you do.

Thank you, this was exactly my point. Buying Apple and expecting long-term backwards compatibility is a mistake. If that is what you need then buy something else. I'm sure some people are being screwed by it, but it's not like this came out of nowhere.


All of the applications you listed have had 64-bit versions for years. Several have been 64-bit only for a long time, eg. Adobe Premiere dropped 32-bit support in 2012. (Distiller and LINPACK have equivalents built in to the operating system.) You'll have to try harder to find a reason to be upset about this deprecation.


> Adobe Premiere dropped 32-bit support in 2012.

And if someone bought Premiere in, say, 2010 and have been using it since then, to buy a new Apple computer they will need to buy a new license for Premiere?

Look: (1) I don't know anything about Apple. (2) I'm trying to learn. (3) I'm not a Windows fanboy. (4) I'm not against Apple. (5) I'm not in a computer hardware, operating system, programming language war. (6) It's perfectly reasonable for a person with my background to think about software compatibility. (7) I'm not trying to criticize, catch, trip up, denigrate, talk down, etc. Apple.

Currently I have a sore spot about old software: my computer quit, and I had to rush out and just buy a computer. So, at Sam's Club I got an HP laptop with Windows 10 Home Edition. With that computer working for just simple, routine Internet access and little more, I'm building my new computer and first server for my business.

But I'm finding problems with Windows 10 right along. One of the big problems is that there is too much software that I had working fine on Windows XP SP3 32 bit Professional that (A) won't work on Windows 10 and (B) for which I know of no replacements unless I start paying money, without being sure the software would work even if I did pay, and even if it did work I'd have to discover and document the usual obscure nonsense, set lots of options, etc. before I could use it. So, as soon as I have my new computer working, the HP laptop will get its top closed, its Ethernet and AC power connections unplugged, be put away some place, and left to gather dust.

For my new computer, I won't have Windows 10 on it at all. I will install Windows 7 64 bit SP1 Professional. But I'm also installing Windows XP SP3 32 bit Professional to be sure I can run all my old software in case Windows 7 won't.

Software compatibility means a LOT to me. So, it was easy enough for me to guess that Apple made a big mistake dropping support for 32 bit applications software.

There is a lot of third party software out there, so much that it's tough for me to believe that somehow Apple has duplicated all of that. Then some of that software was written for 32 bit computing and was important for some users. And some of the companies writing that software went out of business and, thus, won't be writing 64 bit versions. So, I have to believe that some Apple users will still want to run some such old 32 bit software. But, maybe not.

I'm surprised. But I was trying to learn.

And, I'm not a Microsoft fanboy, e.g., I want nothing to do with Windows 10. Maybe I should try to buy a copy of 64 bit Windows XP Professional!!!! "If it ain't broke, then don't fix it.".


The way people use software has shifted over the last decade. Most people (obviously not you, but believe me most people) now licence and use software as a service. I pay yearly to use my IDE and word processor and things like that, and they keep updating it because I keep paying. I don't think I have any software that I have paid up front for. Maybe a couple of little apps and a disassembler I use. If they stopped being maintained I'd just move onto something else and not worry about it.

Almost nobody has an archive of old software they still want to use on macOS, or worries about using an old version of something like Photoshop. That's just not how almost anyone uses software any more.

> There is a lot of third party software out there

If you mean unmaintained software that won't be updated to 64 bit... well I really don't think there is in the Apple ecosystem. That's just not how people use Apple computers.

Software compatibility means a lot to you, but you're in the minority there. It doesn't mean much to most people. Everyone else has gotten used to being quick to adapt to changes to things like instruction set and pointer size.

And finally of course - people just have a lot less native software at all any more. Most people use web apps for the majority I would guess of their work now. As long as the browser is updated (which it will be, because Apple make it) most people will be never know the difference.


Do you have any examples of 32-bit only software you'd like to run that are actually relevant to Apple users? Listing Windows/DOS utilities providing functionality that is largely built-in to macOS or is specifically intended to work around limitations of Windows isn't a very compelling argument.


[flagged]


> curiously, seems to be possible to some extent on the products I mentioned

It sounds like you think you've somehow caught Apple out by showing that other systems can still run 32 bit. Apple know other systems can still run 32 bit. Nobody thinks it isn't possible any more. Apple's own systems can still run 32 bit at the moment.

What Apple are doing is voluntarily dropping support for 32 bit, in order to simplify their code base. There's nothing 'curious' about showing that it's voluntary. They know it is. We know it is.

> Again, I'm just saying that it appears that Apple is in some sense dropping support or some such for some old software

It doesn't 'appear' 'in some sense' that this is what they're doing. You haven't caught them out here. They are dropping support. That's what they've announced. It's deliberate and public.

> Standard in computing: Being able to run old software remains important.

But there isn't really any old software in the Apple ecosystem. That isn't how Apple developers work and it isn't what Apple users expect. There may be some edge cases but in general people developing for and using Apple computers don't care about old software.


> It sounds like you think you've somehow caught Apple out by showing that other systems can still run 32 bit.

No. I have no interest in having "caught Apple out". I'm not attacking Apple and am not trying to start a computer tribal blood bath.

Instead of a tribal blood bath, my interest is just the issue of continued compatibility with old applications software.

Easily, being able to run old software has long been just super important in computing. IMHO that goal has been one of the most important reasons for the success of Microsoft and even more important for IBM's mainframe business, virtual machine, etc.

E.g., long there have been really serious IBM mainframe shops that were betting big bucks and much of a big business on old software and looked at new software, including from IBM, as about as welcome as a sick cow in a flock of 200 prime milk cows. So, for any new release, they ran it on a virtual machine or on the "bare metal" on the side, far from production, for months, constantly testing, before their carefully planned switch over to the new code. Point: Some serious people put a lot of emphasis on old software that has been working very well and are darned reluctant to change over.

So, not knowing about Apple, naturally I wondered about their dropping 32-bit support. For contrast, I mentioned some facts I've observed on the computer I'm building now: the processor has 8 cores and 64-bit addressing, and the main memory size is 16 GB. Windows XP SP2 32 bit Professional sees all 16 GB of main memory and sees, and is willing to use, all 8 cores. And the thing, right away, booted and ran the old DOS program Western Digital Data Lifeguard Tools. I'm really happy about that, since it promises to mean that finally, after months of time wasted with NTBACKUP and years spent at risk of losing an installed operating system instance (which has happened several times), I may at last have a good means of backing up and restoring an installed operating system instance. It's big stuff for me, and that the Tools are just a DOS program looks terrific to me.

I'm not looking for a computer tribal blood bath. "I have no tribe, and no one [should be] offended".


I agree Microsoft excels at backwards compatibility, sometimes at a great cost. This often prevents them from performing aggressive innovations. But it's the right compromise for a company whose business clients represent a significant portion of their overall revenues.

Apple is mostly a consumer-focused company. Thus, backwards compatibility is sometimes a liability: if you are forced to keep old code working, you can't, for example, optimize for new instructions that may reduce energy usage.

Apple has historically done smooth transitions from one architecture to another, offering plenty of time for consumers to migrate, and providing emulation of older architectures in several forms.

If you like sane backwards compatibility, things like Nix and Docker are likely to provide great stuff in the future. Not applicable to Windows, though.


> Again, yet again, over again, once again, one more time,

wow


They're likely doing it so they can easily support current x64 apps, in a Rosetta-like fashion, on the ARM chips they're supposedly building.

You could go about it all sorts of ways, but if the underlying CPU changes it becomes emulation rather than virtualisation. The performance hit is likely enough of a cost for a large number of people to move on, even though the old ways never entirely go away.


I'm not installing a version of macOS that can't run QuickTime 7.


I work with video, particularly in "legacy" formats, regularly. There's nothing I used to do with QuickTime 7 that I can't now do with ffmpeg.
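
For anyone in the same boat, the workflow is roughly this (file names are placeholders; the right codecs depend on what the source actually contains):

    # inspect what's inside the old container
    ffprobe -hide_banner legacy_clip.mov
    # transcode to something modern players handle
    ffmpeg -i legacy_clip.mov -c:v libx264 -c:a aac modern_clip.mp4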


Why? If I may ask. Is there still a lot of video out there that only QT7 supports?


I haven't checked what version Visual Studio (not Code) is on the Mac, but good heavens it might be the only 64-bit Visual Studio nowadays :)


What is the point of an OS if not to run the software the user has purchased? I don't care, and more importantly don't want to care at all, about the OS beyond it being a launcher for software that allows me to make money. Such a bizarre approach from Apple, reducing the user to an open wallet willing to repurchase software that already works. Also, with Apple forcing the user onto the rolling-release treadmill, it's rather annoying that one can't simply stay on a stable version.


There’s nothing stopping you from keeping your machine running the same version of the OS you’re on now. It will continue to run fine indefinitely, and should run all the software you use now.

Surely Apple is allowed to release updates to their OS that eventually break old software at some point. Or do you expect Mac software you bought in 1984 to continue run on the current OS?


> do you expect Mac software you bought in 1984 to continue run on the current OS?

That is to some extent how it works in Windows land. It was not until 64-bit was a thing that they removed support for 16-bit programs, and there are still many ways to run them.

Everyone that cares about backwards compatibility just needs to read this http://ptgmedia.pearsoncmg.com/images/9780321440303/samplech... to get an idea of the history MS has of backwards compatibility.


Microsoft's 64 bit transition was more awkward with more compatibility breaks, compared to Apple.

Microsoft sold separate Windows 32 bit and 64 bit versions, under different SKUs. The 64 bit version broke compatibility with 32 bit drivers, but not all software would work on 32 bit, so it was important to know which version you had. Many computers advertised as 64 bit were sold with 32 bit Windows, so it was possible to download an incompatible app version. It was a confusing time.

Apple had only one SKU. Applications had 32 bit and 64 bit versions in the same binary, so there was only one version to download. The kernel supported 64 bit addressing and 32 bit drivers, so there was no hardware compatibility break. It was invisible to users.
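
As a sketch of what that looked like from the developer side (assuming an older toolchain that still ships the 32-bit slice):

    # build one Mach-O file containing both slices
    clang -arch i386 -arch x86_64 -o hello hello.c
    # -> Architectures in the fat file: hello are: i386 x86_64
    lipo -info hello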


I agree completely, but that doesn't change the fact that non-kernel software backwards compatibility is very high priority for Microsoft.

I find it a bit dishonest to claim macOS did so much better without mentioning that they have a fraction of the hardware to support compared to Windows. When the number of different hardware combinations is no greater than something you can set up in a warehouse, it's straightforward to test them all.


>Surely Apple is allowed to release updates to their OS that eventually break old software at some point.

And I am not allowed to complain?

>Or do you expect Mac software you bought in 1984 to continue run on the current OS?

That is a false comparison. There is no technical reason why 32bit software shouldn't work on a desktop 64bit intel processor.


No reason that you know of yet.

Apple has done things like this in the past, and people screamed about screwing over compiler publishers and so on, and then suddenly the Mac ran on Intel processors and everything they'd been doing clicked into place.

I don't know exactly what the reason here is, but I've been a user of Apple products for long enough to believe there probably is one, and that we'll know it at some point.


Yes, I am unable to use future information that I don't possess to form an opinion today.

>Apple has done things like this in the past, and people screamed about screwing over compiler publishers and so on, and then suddenly the Mac ran on Intel processors and everything they'd been doing clicked into place.

What "things like this". Can you be specific? Are you simply saying "People doubted Apple, and then those people were wrong". How about "People doubted Apple, and then they turned out to be right". Or just because Apple makes money, it automatically means they have never screwed over anyone?

>I don't know exactly what the reason here is, but I've been a user of Apple products for long enough to believe there probably is one, and that we'll know it at some point.

Oh there is always a reason. I've been a user of Apple's products for long enough to believe that the reason usually involves taking cash out of my pocket and putting it into Apple's pocket. I see no reason for me as a customer to empathize with the financial situation of a fabulously wealthy corporate entity.


I was thinking specifically of changes they made prior to switching from Power chips to Intel chips. They introduced a series of more onerous requirements for developers that made it harder and harder for developers using the leading third-party IDE/compiler from MetroWerks, and the criticism and complaints sounded eerily similar. Then they announced the switch to Intel processors, and it retroactively became clear how much an already-insanely-difficult transition would have been even more difficult if they had continued to support developers not using Xcode. It sucked for developers having to switch, and it sucked even more for MetroWerks, obviously, but in retrospect, it's hard to see it going down any other way.

Apple screws stuff up on a small and medium scale on a regular basis. Monthly, it seems like, and sometimes even more often. But they don't tend to screw up on the big stuff. People disagree with their focus, and that's fine. But I've been following them since the 1980s, and I've come to trust that they'll make good on the big stuff even while they screw up on the small things right and left.

Your mileage may vary.


>There’s nothing stopping you from keeping your machine running the same version of the OS you’re on now.

You can say that about anything. Is there anything stopping you from writing your own OS? Take a programming course, learn programming, convince people to help you, etc., etc.

>It will continue to run fine indefinitely, and should run all the software you use now.

If it was bug free, I wouldn't care. Apple as the sole distributor, introduces bugs in their software, and then ties the bug fixes to updates which change functionality of the underlying OS.


> There’s nothing stopping you from keeping your machine running the same version of the OS you’re on now

Only if you're OK with dismissing an update dialog through a drop-down every single day.

The only options are "Install Now", "Later" (install automatically tonight), "Remind me tomorrow".


Just go to the App Store and hide the update if it bothers you so much.
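
If I remember right there's also a Terminal route (the label string here is illustrative, and Apple has changed how well --ignore behaves across releases):

    # stop Software Update from offering a particular item
    sudo softwareupdate --ignore "macOS High Sierra"
    # check what's still being offered
    softwareupdate --list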


If backwards compatibility is important, Windows is the only sane choice for a desktop os.


FreeBSD has always had good binary backward compatibility.


I specifically said desktop OS for that very reason. I am aware that you can use Ubuntu and get a similar experience, but it's not in the same league as Windows.


Ubuntu is specifically a desktop OS (to the point where I'd say putting Ubuntu on a server is a bad idea), so I don't understand what distinction you're drawing.


My original comment says

> If backwards compatibility is important, Windows is the only sane choice for a desktop os.

So I'm not saying anything about a server, for which there are multiple choices with long backward compatibility.


So what was with your comment about Ubuntu? You seemed to be implying it wasn't a desktop OS but it is. And why are you dismissing FreeBSD, which can perfectly reasonably run on a desktop?


I've got an old Core Duo (32-bit) iMac which is still working fine despite its age (I think it's a late 2009). My little brother uses it to browse the web.

The sad thing is that it became useless with OS X. Neither Safari nor any other browser could be updated, eventually leading to an inability to browse the web because of HTTPS certificate incompatibilities.

I had to install Linux (with a lot of tricks) and it works flawlessly.

It's sad to see that a working computer has become obsolete in only 10 years, and while I will probably continue to use Apple products, I feel like there is something _wrong_ with this. That's one of the reasons I am worried about buying an Apple Watch. Obsolescence.

All in all I get it, and I know that it'll probably pay off for them, just like it did with removing the DVD player, but it'll take time (for me) to get used to the fact that, at least in the Apple ecosystem, things last longer than usual but also become useless sooner than usual.


On the other hand, it is a bit crazy and impressive that OS X/macOS has had to support 32-bit x86 binaries for >12 years because they sold 32-bit Core Duo machines from January 2006 (introduction of Core Duo iMac) until August 2007 (replacement of Core Duo Mac Mini with Core 2 Duo version). 12 years of binary support because they sold 32-bit x86 CPUs for 20 months.
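
(Incidentally, if you're ever unsure whether a given Mac's CPU is one of those 32-bit-only Core Duos, this sysctl should answer it:)

    # 1 = 64-bit capable, 0 = 32-bit only (e.g. the original Core Duo)
    sysctl hw.cpu64bit_capable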


Yes, indeed, and as I wrote, I can't blame their choice.


Safari can indeed be updated. I use a 2008 iMac sometimes on El Cap and it still gets regular security updates and updates for Safari/Pages/etc.


My iMac can't be upgraded to El Cap because it's not 64-bit :) If I recall, the OS was Snow Leopard. Maybe I wrote the wrong model date.


can confirm, I was referring to an early 2006. Quite an old piece :)



