In fact the first programming I did on the Macintosh was using Borland's Turbo Pascal for the Mac.
I took an introductory programming class at the University of Kansas. It taught us Pascal, which was a new language for me (having really only known BASIC and noodled with assembly). It was a bit surprising that the class used Macintoshes — these running a kind of "project-less" version of Turbo Pascal as I recall, where you simply fed in your source file with ReadLn() and WriteLn() calls sprinkled throughout and observed the result in some kind of console window.
I'm not sure if I was aware of THINK Pascal (or THINK C) at that time, but when I soon after bought my first Macintosh (a Mac Plus) I went looking for software to program it with, and it happened that a professor was selling his copy of Turbo Pascal for the Macintosh.
So perhaps it was serendipitous, but Turbo Pascal became my introduction to Mac programming. I stuck with it for perhaps a year or two before moving over to the richer THINK Pascal. (And later I migrated to THINK C to run with the big dogs — having to learn about pointers and C parameter passing was a right pain in the ass for this naive young coder).
(By chance, I happened to be on a mini-quest recently to try to recover these old programs that I started and abandoned back in the 1980's. And I have come across my original Turbo Pascal floppies as well as some of the crude apps I tried creating with the tool.)
As others have said, I suspect Borland jumped into the game early, gambling that the Mac might be the Next Big Thing, and bailed early too when it turned out that it was not (the Next Big Thing, of course, would quickly prove to be the IBM PC clone market).
Pascal was a first class citizen. Indeed, quite a lot of the original Mac was Pascal and 68k assembler. For example, strings were prefixed with a length byte, as is popular in the Pascal world.
Apple even shipped a Pascal compiler with their MPW Shell - a development environment.
In Portugal we had only a single distributor in Lisbon, Interlog, which we had to visit in person or order from via ads in computer magazines, only to buy super expensive computers compared with the PC market alternative.
For us, Apple only became relevant after the NeXT acquisition gave it a second life.
Even in the US, the Mac was very marginal early on. It wasn't until it established a foothold in the DTP market in the late '80s that it was clear the Mac would even survive as a platform. A ton of important business software never made it to the platform.
There's a lot of revisionism that stems from Apple having made a magnificent comeback over the past 25 years, but a lot of people are forgetting that Apple was not the dominant player it is today back then. They spent most of the '90s on the verge of bankruptcy, and were sustained only by a few niche markets -- the Mac was not regarded as a serious platform for business computing back then. If things had gone slightly differently, the Mac might have shared the same fate as the Amiga and the ST.
The products (both hardware and software) were also struggling. My first laptop, a PowerBook 5300c, had terrible production / build problems. I can’t remember what they were, but even though I quite liked mine, it was a bad computer. And Copland, the next-gen OS that would be on par with WinNT, kept getting delayed and never shipped.
Until Apple bought NeXT’s tech and talent, it was very bleak. I still loved my Mac, don’t get me wrong. It was clearly “better” (I’ll never understand how our antialiased text didn’t make that obvious), but it was also dying. Getting rid of John Sculley and letting Steve Jobs micromanage everything and make unpopular cuts was a miracle save.
Apple also had a good market share in education. The first Mac I used was at the university, and many schools had Macs even when it was viewed as a very niche system.
The Apple II was very dominant in education, but Apple never managed to retain that dominance through the migration to the Mac. Schools started moving to Wintel boxes in the mid-90s as the Apple II was becoming obsolete -- Macs were present in schools, but had stiff competition by that point.
I recall my elementary school (1985-1991) exclusively having Apple IIs, my middle school (1991-1994) having a mix of Mac LCs and IBM PS/2s (along with lingering Apple IIs), and my high school (1994-1998) having all of the computer labs stocked entirely with Tandy 486es, with a handful of Macs in specific classrooms for particular use cases.
I remember learning how to type and how to use the Macintosh version of Paint on some Macs in the 1st/2nd grade computer lab, somewhere around 1995-97. We even had crusty old Apple IIs or the like hiding in corners.
We had a lot of Macs (Apple IIe and iMac) in school. Like everywhere.
I only knew one person who owned one at home. He was a diehard Mac person, as that's how people who owned Macs acted, as I remember. They refused to own a Windows machine whatsoever.
Same in Eastern Europe. You'd only see Macs in professional sound, video, graphic design and publishing houses, which was only a drop in the ocean compared to Windows PC market share. Their super high cost and inability to run most popular SW like games made them a no-go for consumers and businesses, especially in a low-wage market, given that Macs here were more expensive than in the US due to VAT and import duties while wages were 20x lower.
From what I remember in Hungary thirty-ish years ago: Professional AV was Commodore Amiga. DTP used Mac because of QuarkXPress. Professional design used SGI Indigo.
IBM PC was office, and in this period it became the gaming machine. Even before Commodore went bankrupt, the Gravis Ultrasound and Wolfenstein 3D dethroned the Amiga. https://youtu.be/wsADJa-23Sg has an excellent explanation of why Wolfenstein 3D couldn't be done on the Amiga. The GUS was key in moving the demoscene to the PC, and while that's obviously extremely niche, it literally demonstrated what the PC was capable of and had a huge effect on game creators. The first Assembly was organized in 1992 by two Amiga groups and Future Crew, but just one year later the latter released Second Reality and the Amiga was no more.
And it is also important to note that the SGI machines were again a factor more expensive than the Macs. If Macs were already barely feasible in Eastern Europe, the SGIs certainly would not have been.
Oh yeah I'm sure because they were pretty much the only game in town for that.
But what I mean is: they were purely professional workstations. A Mac was something that a wealthy private person could have, easily. You wouldn't buy an Indy; they were over $20k or something. I was really referring to private use.
We had some at university, but even those were donated. Though we were mainly an HP-UX shop.
Same in the United States. Outside of schools who got the machines for a substantial discount, they were bought for specific professional use cases like electronic publishing.
Steve Jobs saw that it wasn't just the Mac but the entire PC industry that ignored the consumer market - Windows and the vast majority of software (excluding games) were built for the professional market, and the same beige boxes and components were rehashed for the consumer market with the idea that people would "want what they use at work". Recognizing that was what restarted legitimate consumer market sales, and gave them a ton of ways to meaningfully differentiate their products (Microsoft didn't really start noticeably responding until the Zune and Vista).
In the early 2010s I stumbled upon a defunct Macintosh LC II or III in some office at my Russian technical university and was very surprised by that. I guess it was being used for CAD in the 90s, since the office belonged to someone from the mechanical engineering department.
It was that way in the US too until the iPod and then iPhone took off and Mac OS X became viable. If it wasn't for Apple's legacy hold in education (a lot of universities in the '90s sold Macs in their bookstores), Apple might have gone the way of the Amiga before that happened. They were seen as the expensive, pretty computer used by creative types or by students.
Famously, Apple was the stock everyone expected to fail, for over a decade.
You couldn't short them because they kept not dying. You couldn't buy'm because they kept not making money.
IMO, Apple was only able to stay alive because Microsoft invested so heavily in them. Microsoft was worried about being seen as a monopoly, so investing in Apple ensured that Microsoft could both /not/ be a monopoly and benefit no matter how well Apple did.
IIRC they sold their stake after the iPod was released and Apple was back on its feet.
Widespread adoption of Macs by students really went hand-in-hand with the transition to Intel, likely because of a boost to battery life that allowed students to take notes for at least two lectures before needing to plug in.
>> You'd only see Macs in professional sound, video, graphic design and publishing houses...
> It was that way in the US too until the iPod and then iPhone took off and MacOS X became viable.
No. As someone who had experience with existing in the 90s, I can say you're dead wrong. Like I don't even know where to start. Macs weren't as popular as PCs, but you'd see them all over in the consumer space:
You'd see Macs in department store computer displays (Best Buy, Sam's Club, Officemax, etc.).
The iMac was released in 1998 (and was so influential you had PC makers copying its style). That was the real beginning of Apple's turnaround. The iPod wasn't released until 2001, and was pretty niche for a long time.
Several families I know had Macs as their family computer during the 90s (including my own).
Growing up, I didn't know a single person personally who owned a Mac. I only knew Macs from schools and the handful of teachers I knew who'd bought the new PowerBooks ca 2002.
I also had experience existing in the 90s. They were extremely niche.
And going into the 2000s, I knew plenty of people with an iPod, but no Mac.
>And going into the 2000s, I knew plenty of people with an iPod, but no Mac.
Which took some doing as iPods were all Firewire up until ~2004 when Apple transitioned to USB.
Firewire was pretty unusual in the PC ecosystem - usually requiring purchasing a PCI card.
First gen version also didn't have Windows software. I'm thinking more once they were convenient for PC users to use, I guess 2005 onwards? I don't think I saw any iPods before then.
> I can say you're dead wrong. Like I don't even know where to start.
This kind of aggressive riposte is never called for.
It's true that the Bondi Blue iMac arrived three years before the iPod, and was influential. It bought Apple some breathing room, and took it off life support. A success by any measure.
It's also true that prior to this, and for a considerable time after, Macs were largely found in exactly the niches described by the post you're responding to. I would add education as well, as did, let's note, the direct parent post to your own. Sure, you'd find them in other places, like <checks notes> less than 5% of household computers, including your household it turns out. I can see why that would distort your impression of its ubiquity.
The iPod was an immediate sensation. It didn't start to sell in numbers for a few years, but everyone knew what it was (speaking from the US perspective) and craved it. There's no question in my mind that it was the touchstone product which gave life to the entire company, and the halo effect it produced gave a crucial leg up to the iPhone, which is what made Apple the multi-trillion-dollar company it is today.
Leaving off a few details and compressing a timeline doesn't make someone wrong, let alone 'dead wrong'.
I suppose it's possible to think that, if you jump straight from the title of the Fine Article to my post, without reading the thread which leads to it.
This. In Europe, Macs were for people around arts, printed press, audio/video and such. CMYK people.
For Gen Zers, think of it as THE platform to run Adobe software fast and reliably, plus audio tools.
These people must be either young or delusional. No one used Macs at home. Even under Mac OS X for PowerPC, if you weren't a media producer, your interest in Macs was zero.
3Com made an Ethernet card for the Mac in the late 80's. 3+File/Print/Share existed for the Mac as well. I knew people who worked on it. Claris Software was initially an independent company selling software for the Mac.
PowerPoint was initially Mac-only (1987). It got bought by Microsoft so they could port it to Windows.
Claris was originally spun out from Apple so that MacWrite/MacDraw etc would not be seen as competing with third-party developers with an unfair first-party advantage.
> So yeah, Macs were around, but hardly ubiquitous.
And I'm not claiming they were ubiquitous, just that they weren't so niche you'd only see them in schools or on the desk of a graphic designer. They had something like a 5-10% market-share in the US. Anecdotally, that played out in my community as far as I can tell (e.g. 1 in 20 kids having a Mac at home sounds about right). You didn't have to go anywhere special to buy them, but the store might have 20-30 PC models on display with 2-3 Macs.
The teacher who was put in charge of updating and designing our new computer lab and educational material, back in the late 80s, was heavily into Amigas. The Amiga had a pretty large market share in Northern Europe, but our teacher wasn't naive or blinded by his own preference, so he opted to equip the school's lab with Macs. Now that was not common at all; the obvious choice would have been PCs running Windows 3.0, but I think that might have come out just too late, so he would have been looking at Windows 2 or 2.1 when starting out, and coming from the Amiga, that would probably have been unacceptable.
Still, the Mac wasn't big in Denmark at that point, but in the late 80s and early 90s our school had an insane number of Macintosh computers. Had the teacher in charge been a DOS guy, that lab might have looked very different.
In the early 90s, my primary school in "rural" Texas (45 minutes outside of Houston) got its first computer lab. It had 30 Macs. Every classroom had an ancient Apple (not sure which model at this point - IIe or III?).
Apple had BIG BIG discounts for education that IBM did not. Even being a town outside of Houston, we never got Compaq PCs.
That said, once Win95 hit, EVERYTHING was swapped out for PC district wide. I remember my parents complaining that a new school tax was getting levied on our town to upgrade technology just a couple years after a previous one had already hit.
Education was the niche that kept Apple afloat back then -- they'd managed to make the Apple II the de facto standard for school computing, and when they wanted to transition schools to the Mac in the early '90s, they had to go so far as to design an Apple IIe on a card [1] to allow the Mac models they were offering to schools to remain compatible with the huge library of Apple II educational software.
They never succeeded in actually turning the Mac itself into the standard platform for school computing, and as you point out, once the Apple II platform was long in the tooth, schools began migrating in droves to Wintel boxes, and Apple's finances took a major hit.
Apple barely made it out of the '90s intact. They had a massive turnaround after Jobs returned, and are a major powerhouse today, but people forget just how marginal the Mac was in its early years.
… limited and crazy overpriced. The first Mac was an awesome demo but not able to do much because it only had 128k; within a few years people had shoehorned that demo into much cheaper machines (there was even something like Plan 9's UI for OS-9 on the TRS-80 Color Computer).
Meanwhile you had the Atari ST, Amiga, and Sinclair QL on the low end: color-graphics 68k machines against the still-monochrome Mac. These were affordable, good for games and other media, and in principle more scalable than the PC and AT architectures of the time. A little later you got very powerful 68k machines running Unix from vendors like Sun Microsystems. I first saw a Mac II in college, after I had used a Sun cluster with huge, mostly monochrome, monitors, and was blown away by the refinement of the desktop (the small monitor had something to do with it), but the price tag was insane.
Some people swore by mac for desktop publishing, but the Suns had great software for that too.
I did own a 286-based computer which was tremendous value in terms of compute power for the cost, much better than the minicomputer machines I was using, able to emulate the Z80 at 3 times the speed of any real Z80, etc.
The graphics sucked at first but improved as we went through EGA, VGA and then various Super VGA standards. In 1993 I got a 486 machine and ran Linux and X Windows, and it stomped both the Mac and Sun in terms of value.
> Meanwhile you had the Atari ST, Amiga, Sinclair QL on the low end
I don't know that you would call some of those (I'm familiar with the Amiga) "low end" compared to the Mac. Same CPU, more memory (the Amiga could go up to 138MB, even then), higher-capacity floppy drives. A 4096-color display at higher resolutions than the monochrome Mac. Better sound quality too: 4-channel stereo versus mono.
Yeah, definitely not sure how the Amiga was low-end versus the Original Mac.
People forgot the Ashton-Tate merger, which destroyed Borland's internal culture and saddled it with products that were almost impossible to move to a WYSIWYG model.
They were targeting both Windows and OS/2 for the GUI implementations they did deliver with much reduced budgets.
Microsoft buying FoxPro and introducing predatory pricing also didn't help.
Neither did MS sniping the Quattro team.
Philippe Kahn and the Borland board fighting over direction didn't help either.
The company simply was troubled and didn't have the resources to port anything besides Pascal, which as one of the teaching languages of the day, was easier to justify.
Speaking as a person that worked on the original Mac OS, the "story" that the OS was originally based on Pascal is not really correct. It was written in Assembly, with the belief that later developers would be using Pascal, and so Apple wrote the developer facing Inside Macintosh with Pascal as the language used for the examples. However, as far as I know no major software houses seriously worked in Pascal on the Mac; C and Assembly were the kings during the 80's and continued with C/C++ during the 90's.
Mac OS used Pascal strings rather than C's null-terminated strings (Inside Macintosh volume 1, p90 -- Using assembly language / Calling conventions -- "Pointer to string (first byte pointed to is length byte)" https://vintageapple.org/inside_o/pdf/Inside_Macintosh_Volum...
They were an absolute pain to switch between when writing C code for Mac OS, and are evidence of the Mac OS designers having Pascal on the brain when designing the OS, not just their choice of example language in the documentation.
> as far as I know no major software houses seriously worked in Pascal on the Mac
Adobe Photoshop for the Macintosh was written in Pascal:
Yeah, I remember. The string issue was easily fixed by simply placing a null at the end of one's strings and maintaining a Pascal string pointer for OS traps and using the C string pointer for everything expecting C strings.
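For anyone who never wrote classic Toolbox code, here's a minimal sketch of that dual-pointer trick in C (my own illustration, not from the parent post): one buffer serves as both string types, with the Pascal length byte up front and a trailing NUL. Str255 is the usual Toolbox length-prefixed type; MakeDualString and the commented-out SomeToolboxTrap are made-up names for illustration only.

    #include <stdio.h>
    #include <string.h>

    typedef unsigned char Str255[256];   /* [0] = length byte, [1..] = text */

    /* Build one buffer usable both as a Pascal string (from &buf[0]) and
       as a C string (from &buf[1]). */
    static void MakeDualString(const char *src, Str255 buf)
    {
        size_t len = strlen(src);
        if (len > 254) len = 254;        /* leave room for the trailing NUL */
        buf[0] = (unsigned char)len;     /* length byte for OS traps        */
        memcpy(&buf[1], src, len);
        buf[1 + len] = '\0';             /* NUL so &buf[1] is a C string    */
    }

    int main(void)
    {
        Str255 name;
        MakeDualString("My Document", name);
        /* SomeToolboxTrap(name);  hypothetical trap expecting a Pascal string */
        printf("length=%u text=%s\n", (unsigned)name[0], (char *)&name[1]);
        return 0;
    }

Same bytes either way; you just hand out whichever pointer the callee expects.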
I still have a work-in-progress mimeographed and hand written copy of Inside Macintosh that was used by the original 3rd party developers, back when a good 2/3rds of the OS was still being completed.
As far as I understood from several interviews and podcasts that I cannot now point to, MPW and the whole migration from Object Pascal and the App Toolbox to C and C++ was a kind of submarine project from a couple of folks who weren't into Object Pascal.
As they succeeded, MPW came to be, followed later by the partnership with Metrowerks and their PowerPlant C++ framework.
The point is, back then you didn't end up writing nice software because your language was awesomer than other languages. You wrote nice software by escaping to assembly, so support for that was important for state of the art tools.
Yeah, and while both ecosystems allowed for external and inline Assembly tools, C only gained an inline assembly keyword (and nothing beyond that) as part of ISO C89, well after 1972; yet it is usually only pointed out as a negative when Pascal-based applications reach for it.
Another thing that usually escapes notice is that actually writing cool games on any 16-bit home computer required the full deck of Assembly programming tricks; neither C nor any Pascal dialect was up to the job during the first two decades of their existence.
Even arcade systems like Midway Games units used mostly Assembly, despite their TMS34010 having a C SDK available.
I had the vague sense at the time -- and, to be clear, I wasn't a Mac person until the late 1990s so it's before I knew anything about the platform -- that for reasons I never understood it was more difficult to write code for the pre-OS X Macs actually ON those Macs.
The other thing that may escape a modern reader is that the Mac timeline is really two eras: The original classic Mac era, with its Chicago-and-sepia look, and then a huge and stark transition to what was initially called OS X and is now called MacOS.
They are entirely different systems, but people outside the Mac world often mistake the transition to something more like Win95 -> NT/XP. It's not that. It's way more.
OS X is when the Mac got shiny to technical people, because it was (and is) based on FreeBSD, and even shipped (and ships) with a host of the sorts of tools you'd expect from a Linux distro. And it was introduced in a time when LAMP development was huge, so if you were in that world writing an interpreted language targeting MySQL and Apache it was EASIER to work on a Mac than it would've been to stay on Windows.
So, sure, today we see lots of devs and even whole software companies standardizing on the Mac, but in 1995 this would have been unthinkable.
Back when I was taking programming classes in the early 90s, everything was done on "IBM compatible" PCs because of the job market. I remember one professor talking about how everyone that learned to program in high school did it on Apple computers, but said they had to switch because there were no professional programming jobs for Apple.
The school had Macs, but those were for graphics-oriented classes, and they had Apples, but those were for word processing.
I think the hurdle from a programming language to actually creating Mac-like software was too great for Borland and for casual programmers. MS-DOS became where you could write and share ugly but useful programs.
HyperCard changed that, but it was too little too late and Apple didn't have its heart in supporting casual programming. Professional devs don't realize what a big deal Visual Basic was.
Professional devs at the time certainly did. Some thoroughly eye-watering percentage of business applications were written in VB. I'm still not sure that any modern tech has replaced the sheer "it doesn't need to be beautiful it just needs to be functional and inexpensive to build and maintain" power that we had with tools like Visual Basic.
Nowadays I'm accustomed to multi-person teams needing months to build what a single skilled VB6 developer could bodge together in a couple weeks, but I still sometimes marvel at how we got from there to here without enterprise and B2B development shops calling foul along the way. Perhaps it's because they know they can cover any increase in development cost with the larger amount of money they can squeeze out of customers using modern SaaS subscription pricing schemes.
I used all the major Mac tools: THINK C, CodeWarrior, and MPW. I think the money just wasn't there. The PC market was the corporate market, and they were the ones who had lots of developers.
Both lots of developers and willingness to provide regular revenue. The indie developers would hang onto an expensive purchase as long as they could between upgrades since it was often their second greatest business expense behind the comparatively much more expensive hardware.
They weren’t dominant, but it shouldn’t even be newsworthy to remember that entire industries like publishing, education, audio/video, science & medicine, etc. existed where the Mac was the standard.
Since PCs were starting from so far behind, the industry had something of a chip on its shoulder, and it was common to see print ad comparisons showing how a PC was just as good for less money, where they’d list optional components like a sound card and mouse, arguing that they didn’t add up to as much as the base price difference.
Were you there? From the late 80s to mid-late 90s all my classrooms had Apple computers and Macs. Late 90s seemed to be the transition to "IBM compatible" PCs.
I was there (and then) living across different European countries in the 90s and IBM compatibles were the de facto computing standard followed by niche products like the Amiga which had a dedicated following, mostly hobbyists. But for all intents and purposes the mac was just a zombie platform over here. The first Mac that I had ever seen was the original blue iMac.
I was. There were lots of Apple computers, yes. There were not very many Macs, though. The labs full of IIe and IIgs gave way to PCs, with no era for Macs in between. There were never more than a handful of Macs around at all, and they were the cheaper ones (LC and low-end Performa). There was a small resurgence in schools with the first iMacs, but by then PC was solidly dominant. The iMacs weren't that popular with students, though, because they lacked floppy drives. Flash drives were still small and expensive, the cloud didn't exist yet, and CD-Rs weren't a good fit for small, short-lived, often-changed files.
In my high school (late 1980s, early 1990s) we had mostly Apple IIe systems for student use and a couple PCs in the school office. When I was copy editor for the school newspaper, articles were done in AppleWorks, printed on an ImageWriter in NLQ in columns and literally pasted up on a master for layout which we photocopied.
AFAIK, the Macs at the time were all point and click. Building GUI applications is not easy. Borland's IDE was text-based, and making it a GUI may have been too expensive relative to the amount of $ that could be made.
Also, most Mac people considered themselves Artists, not developers. Most individual developers (practically all) were hacking on DOS at the time. So that may have played into it too. To me, it is all about potential revenue vs expense.
I wonder how much games played a part. In my case, deep down all I wanted to do is just play games. Ataris and Amigas had nice games but at one point the PC just blew their doors off. The Mac was a sad non-contender you used in your print shop to design calendars.
My suspicion from early in my career is it was people that came from IT who just assumed that 'MAC' was an acronym for something, like pretty much everything else they dealt with, as opposed to the shortening of the proper noun 'Macintosh.'
Great company. Long, long ago I bought their C++ compiler, which came in those thick binders with a very well-written guidebook on the language, OOP concepts, etc.
Fun to read their "no nonsense licensing" legalese.
If I recall, it was mainly due to the market size, and to a lesser extent Apple's xenophobic posturing toward 3rd party developer control.
Information appliances like the Mac are a streamlined workflow for Desktop Publishing, but are not intended to be repurposed in an arbitrary manner.
For example, you would see a dozen choices for CAD or IDE solutions on PC, but there would only be 1 option that cost 1200% more on Mac (OS 7.5.3).
Eventually the MacOS application/compiler options users had would slowly diversify under OSX/intel-cpu, but the M1/M2 architecture shift would drop again to burn down any progress people made opening the platform. The easy to cross-port GNU applications still cling to the underbelly of modern posix based MacOS, but you can guarantee it is an uphill battle to get anything deployed outside the Apple App store ecosystem build tree.
Apple's irrational need to monopolize their own ecosystem meant they always locked down their platform through obscurity and/or DRM security. It is unfortunate that Microsoft Windows 11 and Google Android still copied Apple's worst design choices, as users had their control slowly stripped from hardware they purchased.
Thus, most modern computer user interface design is just a sales funnel for App stores, and gentle racketeering. =3
> Eventually the MacOS application/compiler options users had would slowly diversify under OSX/intel-cpu, but the M1/M2 architecture shift would drop again to burn down any progress people made opening the platform. The easy to cross-port GNU applications still cling to the underbelly of modern posix based MacOS, but you can guarantee it is an uphill battle to get anything deployed outside the Apple App store ecosystem build tree.
This entire paragraph is completely mystifying to me. What are you trying to say here? What uphill battle? What burned down progress?
Any CTO foolish enough to rely on Apple's partner offerings had better plan on mitigating chaotic shifts in their architecture/legal/DRM policies... often primarily focused on locking out general-purpose small-firm software/hardware. i.e., the thread asks why Borland (or any sane company) would avoid Apple's walled-garden ecosystem over the years.
"What uphill battle?"
Try to publish anything on their platform without signed software DRM, or integrated 3rd party hardware. Essentially your group will end up paying Apple a tithe and accepting legal encumbrances, as it is essentially a closed ecosystem from the user's perspective.
"What burned down progress?"
The generic hardware around Intel CPUs meant better support for standardized game engine GPU drivers, easier compiler ports, and alternate software-sourcing ecosystems. The M2 was nice silicon... but few folks are going to invest years porting to Apple's whimsical unicorn product trajectory for lower market share. lol
"This entire paragraph is completely mystifying to me"
That is because you don't yet understand that most successful commercial software companies make money by reselling the same software, not by staying mired in the perpetual cost/liability of porting software to product-run-specific chips.
I was not specifically picking on Apple here if that was your concern. You would likely have needed to experience the OS8 and Desktop PowerPC deprecation to understand why people avoided Apple for years after. =3
> to a lesser extent Apple's xenophobic posturing toward 3rd party developer control.
Classic MacOS didn't even have a first-party IDE. Everything was third-party.
> but you can guarantee it is an uphill battle to get anything deployed outside the Apple App store ecosystem build tree.
It's hard to even parse this sentence, to be frank, but any possible reading of it is nuts. It's trivial to install Nix on macOS and have basically every package available to you. Same with Brew, though it's less nice that way.
It is OK, people wouldn't know what is missing if they never saw the alternative.
"It's hard to even parse this sentence"
Try to cross-compile something on modern MacOS, and handle the DRM signing on another platform. For example, you are still going to need an active Apple Developer account to run https://quasar.dev for MacOS/iOS targets, or users are going to incur a bit of hassle running your code.
It is ok, most people have trouble understanding each other at first. =3
Indeed, App stores must sign the software on their platform to deploy, and offline binaries now require code signed with a valid registered Developer Account in order to run easily. A bodged-on ecosystem like Brew doesn't practically count here, as a signature-check block also functionally prevents most users from running unapproved code. It is marketed as a security feature, which is why it sounds odd to people inside the ecosystem.
"Is there even meaningful overlap between these two things?"
If one wants to target every platform, then one is ultimately forced to use a Mac with a Developer Account subscription. Accordingly, there is no practical guarantee your project will reach market (app store rejections are common)... or worse... some dimwit chooses a platform-ecosystem-specific language to really double down on a bad investment.
Hence, my opinion on the answer to the thread's question: "Why did Borland ignore the Macintosh market?"
> App stores must sign the software on their platform to deploy
This is true, or rather "developers must sign software in order to sell it on the App Store" is true, and I believe this is what you meant.
That's attestation. It isn't DRM.
> offline binaries now require a valid registered Developer Account signed code in order to easily run
"Easily" here means one click. "Not easily", then, is three clicks. The dialog boxes tell you exactly what to do. This is needed the first time you open a program, after which it just opens.
> A bodged on ecosystem like Brew doesn't practically count here
Why don't Homebrew, Mac Ports, and Nix, count here? Practically, I mean.
> as a signed-check block is functionally also preventing most users running unapproved code
This just isn't true though, it isn't even in the same neighborhood as the truth. Most users are, in fact, able to: read a dialog which says "go to Privacy and Security", go to Privacy and Security, and click the button which lets them run unsigned code.
> forced to use a Mac with a Developer Account subscription
Yes, it's true, you do need to have a Mac (or borrow one, cloud code signing does exist) and pay $99 a year, to sign code. Or you can just release it. Weren't we talking about Digital Rights Management? I thought you were going to explain how code attestation is DRM. I haven't seen you do that yet, did you want to?
> there is no practical guarantee your project will reach market (app store rejections are common)
You can just sell software for the Mac. The App Store is completely optional. The checks involved in the attestation process are quite minimal and focus on whether your program is malware. Failing that, you can sell your mal^H^H^H software directly, and users will have to endure three clicks, instead of one, to open it for the first time.
True, but only if the validation code is vulnerable to an unpatchable vulnerability discovered in Apple M1, M2 and M3 chips. Otherwise your hardware, drivers, and/or software likely still needs to be approved by Apple.
"> Why don't Homebrew, Mac Ports, and Nix, count here? Practically, I mean."
Generally, most users will never touch CLI, and the ones that do often know how to deal with nag-ware in the OS.
"This just isn't true though, "
Right, you try to publish some kernel level driver that touches the hardware signatures. These modern machines will usually brick into a lock-screen on most platforms now. Thus, no one will be developing 3rd party hardware/drivers/Software/Firmware inside that box. You must pay Apple to play... even to replace many broken components.
"> The checks involved in the attestation process are quite minimal and focus on whether your program is malware"
Arguably, modern Win11 and MacOS are already technically Malware collecting user telemetry, content, and metadata... in my opinion they arrive broken out of the box. Perhaps you are arguing some corporation is ethically superior to regular thieves. =3
Do you realize that Borland were out of business by the time code signing for basically any platform became a thing most anyone was vaguely concerned with?
Not much of a standard on the PC, other than being the way to produce Palm, EPOC and Symbian apps, until Nokia replaced it with Eclipse-based tooling, Carbide.
Metrowerks stuck around a long time in the embedded space.
Eclipse was more of a phenomenon in the Java/OO paradigm, but was often slow on older platforms. Still, many folks embraced the open ecosystem when it became popular.
Symbian was 10 years ahead of its time, and brought a lot of new paradigms to mobile. Leaving the master signing key on a reset device was a teachable moment for most folks. =3
Borland's tools were mostly used by business professionals. Sure, they sold at a discount to students/hobbyists etc., but professionals were the core of their market. It wasn't cheap software.
The Mac market (until very recently) was predominantly for home not business use.
>Borland's tools were mostly used by business professionals. Sure, they sold at a discount to students/hobbyists etc., but professionals were the core of their market. It wasn't cheap software.
No, the early Borland of 1980s was the opposite of what you describe.
Borland Turbo C with the lower price of $99 was also advertised to hobbyists compared to competitors such as Microsoft Professional C Compiler costing $299. (E.g. https://archive.org/details/PC-Mag-1987-05-12/mode/2up) ; Microsoft responded to Turbo C's pricing with lower-end products such as "Microsoft Quick C".
It's the later years of Borland trying to go up-market with more expensive "enterprisey" products such as Interbase and subsequently Embarcadero, etc.
The main reason Borland didn't create much software for Macintosh was that they were a small company and didn't have the manpower to build tools for the tiny Apple customer base.
And qualifying the Mac market as “for home use” is pretty wild; during Borland’s heyday, design houses were pretty much Mac-only unless they needed SGI’s prowess. Borland had started tripping over its own feet before Photoshop was even ported to Windows.
Photoshop 1.0 was 75% pascal by LOC incidentally (the rest was 68k assembly).
Before 2005 I only ever saw Macs in offices, studios and universities. There was one guy I knew who had a Mac at home, but he was a designer. For context, I'm in the EU.
One of my first jobs was working with printing/scanning software and my company would install Macs in some large enterprises because of AppleTalk's printer sharing capabilities.
In late 2001 (in the US) I switched off of desktop Linux to a used iBook G3, and have never really gone back.
We were a small company and I switched to using that iBook for some work (C++ development for *nix-based systems). I felt I was an early adopter there by a few years. I converted my workstation (which for reasons was way more powerful than the ones the rest of the development group had) into a build server and a local RedHat/Debian cache for the team. That poor iBook did not have the horsepower to build the software locally in a reasonable amount of time :-)
Apple basically created their own retail stores to accelerate their consumer sales channel, and the first one in the EU was in 2004.
Before 2000 that was also my experience (I was in the EU then too). Between 2000-2005 there was a steady increase of iBook/Powerbook owners thanks to being the only laptop with a reasonable battery that could run a Unix.
Design software for Wintel and DOS was actually pretty good around 1993-1994 time frame. Aldus PageMaker for PC was released in 1991 and QuarkXPress in 1992.
On the other hand, Mac hardware was super expensive. Today Mac hardware commands a small premium, but pre-OS 9 the difference in price was huge. So Mac owners could not have been very price sensitive in general.
It fluctuated. During the first few years of the Mac era they were stupid expensive. In the later Sculley & Amelio years they dropped in price a lot as component prices in the 68k Macs became way cheaper, and they became fairly affordable, especially if you could get an education (teacher or student) discount, which they were aggressive about.
Still was not at all a common machine in people's homes.
A baseline Mac wasn’t any more expensive than the competition (IBM), it just wasn’t inexpensive like Commodore, Atari, and PC clones. Apple also had essentially the same price structure for decades, while PC clones raced to the bottom on both price and quality.
And the Mac II was actually priced better than most competitive systems—because those competitive systems were 16MHz 68020/68881-based workstations from Apollo, HP, Tektronix, Sun, et al. In early 1987, a name-brand 16MHz 80386 system with 80387 was comparably priced, which is why most people buying PC clones didn’t get a 386 until 1990-91 or so, around when the 80486 (and 68040) came out.
As jasode said. Borland's low pricing was a revolution and suddenly made it possible for hobbyists to get a real compiler without paying a fortune. This created an explosion of activity back then. We all jumped on Turbo Pascal for CP/M and DOS at the time.
It was much later that things changed, but then it was another world already.
My recollection is the Mac market was certainly not home use in the late-80s, 90s. They were relatively pricey and in fact not common in homes, but mostly in offices that did DTP, schools (we had 1 in my primary school office for staff to use, and then a handful in a "visual communications" lab at my high school), and, later, "multimedia" type shops. I worked in a shop that was full of them in 96, used by web developers doing photoshop stuff. There were a few in my university computer lab.
Later, in the early-mid 90s, this changed a bit as the Mac II series dropped in price a lot and became affordable at home. My mother (a teacher) bought one as her home machine because she got a good educational discount. This was on the tail end of the 68k era and just as the PowerPC transition was starting.
Borland sold by mail order for $99, at a time when "respectable, business-class software" was only sold in stores or direct, and for $299 or more. dBase sold for $699, as I recall.
I went to a talk by Philippe Kahn in the early 80's, and he was very much a rebel for going so much against conventional wisdom.
You used Apple Pascal on the Apple II, not the Mac, and it wasn’t actually free but a couple hundred dollars a seat. It was based on UCSD Pascal, as were many implementations at the time, but it was a commercial product and one used by many Apple II developers. Your school either licensed or pirated it for you to use.
Apple didn’t even ship self-hosted assembly tools for the Mac until the Macintosh Development System later in 1984, and when Apple did ship Macintosh Pascal it was a learning environment with a (non-UCSD) bytecode interpreter rather than a native compiler with Toolbox access. That was still something most people used a Lisa for until after both the Mac 512 and the HD20 came out.
I definitely used ucsd pascal version II (not apple pascal, which came a bit later) on Apple II. Looking at this source[1], it looks like it both existed for Apple II and wasn't free so I must have assumed that incorrectly.
I also realise I was talking about the Apple II, not the Mac. My assumption/point was that the market for Apple Pascal devs had been captured previously by the "power" of the UCSD p-System before the Mac came along, so Borland figured they didn't have a chance.
Personally I wasn't that much of a fan of UCSD Pascal vs Turbo Pascal, for reasons I can't remotely remember. I think with UCSD Pascal you could only do things in the "p-System" bytecode thing, which meant it had a slightly more restrictive/pure Pascal variant, whereas Turbo Pascal had some language extensions like being able to do dynamic memory allocation, so you could make trees and linked lists and stuff that IIRC you couldn't do very easily in vanilla/UCSD Pascal. It's been a while so I may be misremembering.
Borland would definitely have had a chance with Turbo Pascal on the Apple II (if there had actually been a 6502 version available), for a couple of reasons:
- Speed
- Size
Speed: I used Apple Pascal as well as Turbo Pascal for the same purposes (steering satellite dishes, and also multi-tasking data collection) on dual-CPU Apple II clones (6502, z80). Using Turbo Pascal was a different world w.r.t. speed - way, way faster.
Size: When I developed my multi-tasking data collection system in Apple Pascal I had to use four floppy disk drives, set up for "swapping" (the UCSD/Apple Pascal system had that ability, it could segment itself) simply so that there would be a tiny bit of RAM available for the Apple Pascal editor. No such problem when using Turbo Pascal on the z80 system, with equal amounts of RAM.
That said, UCSD Pascal and Turbo Pascal weren't that dissimilar as far as Pascals were concerned - Wirth's Pascal wasn't very practical, so every useful Pascal version had its own extensions. UCSD and Turbo had some commonality there which made it easy to port between them.
IIRC the Wizardry games were all written in UCSD Pascal. I wouldn’t be surprised if a Mac port existed, it got ported to damn near every platform out there.
I like to imagine they couldn't make a dent in the Mac market because their products were too ugly to appeal to Mac users.
Sometimes I think I'm only half joking on that.
Back then, making software run on more than one platform was much more painful than it is now. Not only were the range of features provided by the OS vastly different, it was common to write performance critical parts in assembly - which made them need to be almost completely rewritten (there were macro assemblers that made that slightly easier). Unix was heralded as "open" (not to be confused with open source) because it was less painful to port stuff and make it interoperate with other Unixes.
As many people have said, it's only an interesting question if you don't have the context of Apple market share in the PC market, which spent over two decades between 1980 and the early 2000s in a state of almost constant decline.
To me, that's the more interesting part here. Yes, everyone knows Apple is huge today, but many people apparently don't know how far from inevitable that was. If you had asked me in 2000 who would be the bigger company in 2024: Apple, Blockbuster, Toys "R" Us, or Enron, I'm not sure how I would have answered except that I'm positive I wouldn't have said Apple.
This is such a great point. Blockbuster stores were so ubiquitous, and I never thought streaming would be a thing considering the state of the internet back in the late 90's and early aughts. Toys? Who thought a toy store would ever go bankrupt? And Enron was heralded as a pioneer in the idea of buying commodities on demand, and this was going to change all kinds of different industries with their business model of how they sold their wares. Enron was considered light years ahead of companies in the energy sector.
At the same point, Apple seemed to be, like you said, constantly failing, moving backwards, and essentially looked like it already had one foot in the grave when every other company was seemingly destined for a lifetime of success.
Having experienced the rise and fall of all the companies you listed, I think a lot of people are unaware of how destabilizing the internet has been for companies who were unable to grasp how seismic the coming changes were and didn't move fast enough to adapt.
I think the real question is: if someone other than Steve Jobs had been running Apple, would they have gone the same way as the companies you listed?
And Apple's success arguably isn't just a result of suddenly starting to execute well. It also took Microsoft giving them a golden opportunity by executing rather impressively poorly for a good long while. 1998 - 2009 was a tough time to be a Windows user.
I switched from PC to Mac for home use in about 2003. I liked OS X from day 1 (I worked the help desk at my college when it was released, so I had to learn it fairly deeply), but that alone probably wouldn't have been enough to entice me to make the switch, especially in light of how big the Apple tax was back then. I had also spent several years living with Windows Me and Windows XP, and I was probably switching away from them as much as I was switching toward a Mac.
> 1998 - 2009 was a tough time to be a Windows user
Is that true? I was too young to really have an opinion but to me most people cite XP and especially Windows 7(ignoring Vista which was bad) as the height of Windows. Of course outside of Windows, like mobile, it really was bad but if we’re just talking Windows then I can’t help but disagree.
XP had a very, very long run, and I think that people tend to mostly remember the time period when Vista was out but they were sticking to XP, which had stabilized pretty well by then. Recency bias and all that. Also, at this point it's easy to just be nostalgic for when Microsoft regarded Windows as an operating system rather than a vehicle for delivering advertisements to a captive audience.
The initial rollout, though, was frustrating for users who were beset by hardware and software compatibility issues, confused by a significantly altered user interface, and still experiencing the blue screens they had been told that XP would banish.
I was a bit young to have an opinion myself, but in the early part of that you had Windows 98, which had a pretty solid reputation, and Windows XP, which brought the stability of NT to home users. Windows Me was poorly received, but it wasn’t on the market for long before Windows XP came out.
Yeah, I think the decline of the Windows platform came a little later. Windows Me and Vista were crap, but 98SE was perfectly usable until XP came out, and XP was usable until 7 arrived. But, OS X and the Linux desktop made huge advances during that time period, vastly increasing the number of business software and gaming titles that would run well, so that by the 2010s, switching operating systems became a lot more viable for a lot more people. Especially with the rise of web-based applications in that time period, meaning many home users were no longer locked into a desktop client for things like email and office software.
I was a Mac user through those years and it wasn't roses during much of that time. In 1998, Microsoft had Windows 95/98 which didn't have the nice polish I loved from my Mac, but it was a modern OS running on much faster and cheaper hardware. Mac OS 8 was really good looking, but it wasn't a modern OS - cooperative multi-tasking and no separation between processes. Mac OS 9 didn't change that.
At the same time, Intel was just demolishing the PowerPC. You could get a much faster Windows PC for a third of the price. Yes, Apple has a price premium today, but it's marginal and you're usually getting better stuff. Back then the price differential was huge.
Things didn't get better fast. Windows XP gave home users Microsoft's NT OS while Mac OS X was so slow it was basically unusable. Windows apps would start up instantly while OS X would let you watch an icon keep bouncing in the dock. Intel kept pummeling PowerPC both on price and performance.
You started using OS X in 2003 so you never used 10.0 and 10.1, but it was painful. Even in 2003, performance was still slow and compatibility could be hard, but it was getting substantially better. More apps were fitting in with the OS by this time. A lot of the early ports from Mac OS Classic weren't very good and the UI elements didn't always look right.
OS X did give Apple a big advantage: it was a Unix with good laptop support. This brought a lot of techies to the platform. But it's hard to say that Microsoft wasn't doing well for most users at this time. Windows XP was so fast and the hardware was so cheap.
Microsoft did have some fumbles. Windows Me should never have been released. Windows XP had a lot of security issues for a while. But most people weren't even looking at Apple - until the iPod.
I think Apple's resurgence was their own doing. The iPod got people interested in Apple again. I think delivering a laptop-friendly Unix brought a ton of techies and developers to the platform. I think the move to Intel processors closed a huge performance gap that had been plaguing Apple.
Even today, while Apple's Mac business is doing great, how much of that is attributable to the impact that iOS had? Even then, Windows is still the vast majority of the market (85-90%). Most people never saw a reason to leave Windows.
If Apple hadn't gotten hit products like the iPod and iPhone, would we have seen the same huge resurgence in the Mac? Or would people continue buying faster Windows machines at a third of the price?
I'm a big Mac fan, but there were some painful days in there and 1998-2005 could be pretty painful. It wasn't all bad. Mac OS X was getting better by 2003-2004. But there was a lot that wasn't so great. Still, I know people who did switch then. Windows could be annoying as hell - but it didn't start being annoying in 1998. Windows 3.1 was very basic and Windows 95 would have all the same complaints as Windows 98. But Windows 95 was so much better than the Classic Mac OS. Windows XP was more usable than the early Mac OS versions. But OS X started showing promise and it was getting fans - and many were pissed off with Windows.
But I don't think that's what gave the Mac its resurgence. The iPod and iPhone brought users who wanted Apple's experience on their computer.
>Blockbuster should have eliminated late fees and offered DVDs by subscription in-store, but they made too much money off the late fees.
FWIW they did do exactly that. It was a monthly subscription fee; you could have three movies or games out at a time, no late fees. I don't remember what the plan was named, but I think it was Movie Pass; to start a new membership you typed "pass" into the terminal. It was pushed somewhat strongly in store; I remember having a quota of new sign-ups per shift.
So strange that this somehow became a relevant tidbit in a post about Borland ignoring Macs.
This made me go search the wikipedia page. From what I can tell they had a dvd-by-mail service introduced in 2004 via Blockbuster.com. And between 2005-2010 they had a phony "no late fees" policy that got them sued for deceptive advertising.
I was young at the time but pretty sure subscribers in my store did not get late fees. I could only find two sources below to corroborate my memory. I would not be surprised that they were doing something shady to get sued though.
>I think the real question is if someone other than Steve Jobs was running Apple, would they have gone the same way the companies you listed go as well?
I think the true lesson to learn from the CEOs who weren’t Steve Jobs is that one of them had the foresight to get Steve Jobs back, after he had proven he was a keen leader with NeXT and especially with Pixar.
Clearly you're not as keen business-wise as noted terrible CEO Gil Amelio, who correctly saw that NeXT being in the dumpster was a good thing, since Apple needed its software, not its revenues.
I'm not sure I understand your point. Yes, likely if NeXT had been doing better financially Apple couldn't have afforded it. You can check NeXT's aborted S-1 here.[1] They had an accumulated deficit of $273 million as of a few months earlier, were almost out of cash, and were losing money.
Apple's purchase price of $400 million was not exactly a bonanza to their investors...
I've been a Mac fan since it was released. I was 8. My dad wouldn't let me get one because he disliked them pretty intensely, said you couldn't tell what they were doing and couldn't fix them, and they were overpriced. And I mean, he wasn't wrong. I remember once, probably 1985 or 1986, we were at a computer show where they were raffling off, I think, a Mac 128k or Plus. My dad told me if either of us won we would sell it and get an Amiga. In retrospect that would have been cool, TBH.
The only thing that ever got my attention instead was when NeXT came out. It was just so badass, this ominous black cube with the cool 3D grayscale UI. But it was even more out of reach financially, so I just hung out at the local university computer store that had a bunch. I think the guys in the store viewed me as a kind of mascot. But TBH NeXT struggled just as hard as Apple and didn't even have their legacy brand equity.
Those of us who suffered through the bad years are still the most loyal, I find, even though they are stagnating, and even though the modern OS and hardware, though unquestionably vastly better in functional terms, are just not quite as interesting and unique.* In the late 90s, when Apple was at its nadir, I had to reluctantly mostly abandon the Mac. I still had one, but most of my time was in the unix/Linux/FreeBSD world. So when NeXT reverse-acquired Apple, and the classic look and some of the classic feel of the Mac married the unix foundations of NeXTStep, it was game over, and the first chance I got I convinced my boss to let me switch, and I've never looked back.
Similar story for me. After using Macs for years, I drifted into Linux-land, and it took me a while to realize “Oh, wait, Macs are UNIX now” and jump back in. Been very happy since.
"They didn't" pretty easy answer. Also the idea that they lost "even though their products were technically superior to Microsoft's offerings" is sorta unsupported and is obviously subjective.
Having used both MS and Borland products during the late 90s and early 2000s, I found Borland tools definitely far superior from the standpoint of ease of use, the time it takes to develop serious applications, etc.
The primary advantage of Borland's tools, apart from their ease of use, was their very interesting component architecture, which allowed for very easy development of third-party components; as a result, many high-quality third-party components were available free or at low cost (a rough sketch of what such a component looks like follows below). While this was happening, MS got lost in the bushes trying to get ActiveX, COM, etc. to be the bridge for component interoperability, but it didn't come close to the ease of developing components in Delphi, CppBuilder, etc.
The issue with Borland was the poor quality of management after their founder left. They tried to get into the Application Development Lifecycle space, bought up a lot of companies in that space, and increased the cost of their dev tools to the point that they were no longer affordable to the smaller dev shops who were their primary customers.
This problem hasn't gone away. Embarcadero still has ridiculous pricing for its products, even though it now offers a very stripped-down IDE which can be used for free until you hit the USD 5000/year revenue limit.
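To make the "easy third-party components" point above concrete, here is a minimal sketch of what a VCL component roughly looks like on the C++Builder side. Everything specific here (TBlinkLabel, the Blinking property, the "Samples" palette page) is invented for illustration, but the shape is the standard pattern: derive from a stock control, mark properties __published so the Object Inspector picks them up, and make one RegisterComponents call to put it on the palette.

    // BlinkLabel.h -- hypothetical third-party VCL component (C++Builder)
    #include <System.Classes.hpp>
    #include <Vcl.StdCtrls.hpp>

    class PACKAGE TBlinkLabel : public TLabel      // extend a stock control
    {
    private:
        bool FBlinking;
        void __fastcall SetBlinking(bool Value)
        {
            FBlinking = Value;   // a real component would start/stop a TTimer here
        }
    public:
        __fastcall TBlinkLabel(TComponent* Owner)
            : TLabel(Owner), FBlinking(false) {}
    __published:
        // __published members show up in the Object Inspector automatically
        __property bool Blinking = {read=FBlinking, write=SetBlinking, default=false};
    };

    // BlinkLabel.cpp -- one call makes the component installable on the palette
    namespace Blinklabel
    {
        void __fastcall PACKAGE Register()
        {
            TComponentClass classes[1] = {__classid(TBlinkLabel)};
            RegisterComponents(L"Samples", classes, 0);
        }
    }

Compare that with what shipping an equivalent ActiveX control took in the same era (IDL, type libraries, COM registration, reference counting) and the point about ease of component development is easier to appreciate.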
We used both at university; for designing a UI and wiring up the corresponding events, Borland's dev studio was easily 10x faster if your UI was a bit more complex, especially if you were not very familiar with the whole ecosystem.
MS's design of their stuff in those years was often... shitty on multiple levels, to be polite; MFC comes to mind, over-complicated for no good benefit. People jumped to literally anything else if they could, be it Borland for C/C++ or Java, which had a much saner object-oriented design model too (and could be compiled to native code with a native UI if needed, since its default widgets didn't look the best).
Borland made the best dev tools in their day, but they sorta lost the edge early in the Builder run and other tools became more desirable. At least, that's why I stopped using them.
I don't know about Mac support. I did Apple development during those days as well, but I didn't use Borland tools for that, I used CodeWarrior.
LightSpeed C, later renamed THINK C, had the Mac market sewn up before CodeWarrior. They had an excellent hypertext help built in, before web browsers. They were bought by Symantec, added C++, and then got beat out by Metrowerks.
I think CodeWarrior was just the superior IDE for anything Mac back in the day. I wasn't a mac guy, but I recall it basically being the only option my buddies would consider back then.
Their products were, and still are in many ways, technically superior to Microsoft's offerings.
Back in the day, Visual C++ 6.0 was when we finally migrated into Microsoft development tools, and I used Borland tools for ages before that, starting with Turbo Basic in 1990.
Additionally, to this day Microsoft doesn't have anything on the C++ front that can compete with C++ Builder for RAD GUI development; MFC is a fossil, while WinUI with C++/WinRT is a bad joke.
People at Microsoft would probably agree. In 1996 they hired Borland's chief engineer Anders Hejlsberg, who designed Turbo Pascal and Delphi at Borland, and C# and TypeScript at Microsoft.
Actually it is a bit more nuanced than that, Anders Hejlsberg was so pissed with Borland's management that he finally accepted the occasional invites from ex-Borland people working at Microsoft.
He tells the story in this interview:
"Anders Hejlsberg: A craftsman of computer language"
By the way, he also contributed to J++; that's where P/Invoke, events, Windows Forms, and properties initially came from (yes, Delphi also had events and properties by then).
Well, he adapted Clascal & Object Pascal from the Lisa & Macintosh to the DOS/Windows PC world, added a couple of features from CLOS to turn it into Delphi, and married that with Sun's C++-syntax, bytecode-compiled variant of Objective-C to produce C#.
He certainly deserves credit for what he did, but not what those whose shoulders he stood on did.
Actually, to produce J++, followed by a lawsuit, which made Cool from MSR become C#, and made J# come into existence to ease porting J++ code to C#.
Ironically, 20 years later, Microsoft is again a Java vendor and an OpenJDK contributor.
Was and still is. I don't really grasp what kind of Stockholm syndrome goes on at WinDev: they finally had something that could compete with C++ Builder (UWP with C++/CX), only to kill it via a mini-riot, replacing it with a development experience akin to doing ATL with Visual C++ 6.0.
If only the IDL tooling and the related C++ code generation weren't frozen in time, just as they first shipped in Visual Studio almost 30 years ago (apart from updating the actual language to MIDL 3.0, that is).
And now, while the WinUI 3.0 folks tell the story that you can use C++ with WinUI / WinAppSDK, what they don't tell you is the catastrophic state of C++/WinRT tooling in Visual Studio, that there are no plans to improve it past C++17, and that the only thing happening on its repo is bug fixes.
Stay away from it as much as possible. Using it with C# and CsWinRT is not much better, as many errors surface as HRESULT exceptions, and you need to single-step into the C++/WinRT code to find out the real cause.
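For anyone who hasn't hit this, here is a minimal sketch (mine, not from the thread) of what that failure mode looks like on the consuming side. The E_ACCESSDENIED below is just a stand-in failure code; in real WinUI / WinAppSDK code the throw comes from somewhere deep inside the generated C++/WinRT projection.

    // hresult_sketch.cpp -- illustration only, not tied to any real WinUI API
    #include <windows.h>
    #include <winrt/base.h>
    #include <iostream>

    int main()
    {
        try
        {
            // Stand-in for a projected call failing inside generated code.
            winrt::check_hresult(E_ACCESSDENIED);
        }
        catch (winrt::hresult_error const& e)
        {
            // All the caller gets is an HRESULT and a canned message; finding
            // which call actually failed, and why, usually means single-stepping
            // into the generated headers.
            std::wcout << L"0x" << std::hex << static_cast<uint32_t>(e.code())
                       << L": " << e.message().c_str() << std::endl;
        }
    }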