
Ah, memories. Used to use BeOS as my primary OS for a year or two. I think it's the only OS I ever found to be truly intuitive and pleasant to work with.

That said, I don't think the world would be in a better place had Apple chosen Be over NeXT. The elephant in the room is security: NeXTSTEP, being Unix-based, has some amount of security baked in from the ground up. BeOS didn't; it was more akin to Windows 95 or classic Mac OS on that front. Consequently, I doubt it could have made it far into the 21st century. It would have died unceremoniously in a steaming pile of AYBABTU. Taking Apple with it, presumably.




I used to work for Be in Menlo Park in a previous life and I can confirm that the code base quality would have made for a very bad outcome for Apple. Security was the least of the numerous serious issues. That said, BeOS still somewhat exists in spirit, as a lot of folks from Be went to build/contribute to Android.


> a lot of folks from Be went to build/contribute to Android.

Does that include the quality and security perspective as well? ;-) j/k

Having never crossed paths with a former Be employee before, __thank you so much__ for your contribution. BeOS was so instrumental to my perspective on computing and operating systems (and potentially the conception of my disdain for what Microsoft did to the world of operating systems around the turn of the century).

From a user perspective, BeOS was nearly perfect. Great UI and utilities, POSIX command line, so fast and responsive. The "install to Windows" option was amazing for trying things out. BeFS was wonderful (it's nice to see Mr. Giampaolo's work continue in macOS).


> a lot of folks from Be went to build/contribute to Android.

That's correct. The IPC in AOSP, Binder, is basically borrowed from BeOS.


I too used to work at Be (Hi!) as well as developed applications for BeOS. I also worked at Apple on various releases of OS X. NextStep was far ahead of BeOS on multiple fronts. BeOS was a lot of fun to work on, but only scratched the surface of what was needed for a truly commercial general-purpose OS. If Apple had acquired Be instead of NeXT, who knows what the world would be like today. Apple ended up with a large number of former Be employees as well (some directly and others from Eazel.)


I can never let a thread about BeOS go by without adding my two cents, because I also worked at Be in Menlo Park, back in the day. (I went down with the ship and got laid off as they went out of business.)

I was sore about it at the time, but I agree that Apple made the right decision by choosing NextStep over BeOS. If for no other reason than that it's what brought Jobs back. It's hard to imagine Apple making their stunning comeback without him.


Thanks a lot! I ran BeOS full-time for a few years (R3/4/5) and I'm looking at a BeOS "the Media OS" poster on my wall here. Fond memories!


Care to share where you got the poster?


It was not in the box. Back then, it was still quite difficult to get hold of an actual R3 box here in Europe. There was one official reseller here in the Netherlands and I actually bought their official demo machine: the famous first dual-processor Abit BP6 with 2x 400MHz Celeron processors. When picking it up in their office I spotted the poster and asked if I could have it. Still got a T-shirt and a hat too ;-).


I vaguely remember it being in the box (bought R4, R4.5, and R5).


And apparently a couple Amiga gurus made their way to Be (see: Fred Fish).

I’d always heard that after Amiga (and Be) many decided to opt for Linux for philosophical reasons.


Which is ironic, given that I have yet to see a GNU/Linux-based hardware setup that matches the experience, which is why I went back to macOS/Windows: a much closer multimedia experience.


Wow! Thanks so much for working on BeOS. This was a super fun OS to use.


I'm curious what sort of issues you have in mind. I was never very familiar with BeOS, but from what I understood, the issue was that its responsiveness came from very heavy use of multi-threading, which also made it very hard to write robust apps, as, in effect, all app code had to be thread safe. App devs found that condition too hard to handle.

Can I assume that the quality issues were somewhat related to that? BeOS devs found it no easier to write thread safe code in C++ than app devs did?
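
As I understand it, the pattern looked roughly like this - a sketch from memory of the Be C++ API (BWindow, BStringView, and Lock()/Unlock() are as I recall them; untested):

    // Each BWindow runs its own message-loop thread once Show() is called,
    // so any other thread must take the window's lock before touching it.
    #include <Window.h>
    #include <StringView.h>

    void UpdateStatus(BWindow* win, BStringView* label, const char* text)
    {
        if (win->Lock()) {         // blocks until the window thread yields
            label->SetText(text);  // safe: we hold the window's looper lock
            win->Unlock();
        }
    }

Every helper like this, in every app, had to get that locking right; forget one Lock() and two threads are scribbling on the same UI at once.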


I’m the guy who left a case of champagne at the office one weekend, to celebrate an early release.

Thanks for the memories.


“That said, I don't think the world would be in a better place had Apple chosen Be over NeXT.”

Yes. Except that it wasn’t acquiring NeXTSTEP that saved Apple’s skin; it was acquiring Steven P Jobs.

True, version 1 had been rough and flakey as hell, and honestly really didn’t work all that well.

But Steve 2.0? Damn, that one could sell.


NeXTSTEP pretty directly evolved into iOS, though, so it was certainly a significant asset in the acquisition, too.


True, but a technology is only a means to an end, not an end itself. What sells is product.

You may have the finest tech on the planet—and that means precisely squat. What counts is putting bums on seats. Your seats. And keeping them there. Lumps of tech are just a vehicle for that; to be used, abused, chewed up, and/or discarded on the road(s) to that end.

Apple could have done better; they certainly did plenty worse (Copland, Taligent, the first Mac OS).

As it turned out, NeXTSTEP proved it was indeed “good enough” to fit a pressing need at the time; and the rest was just hammering till it looked lickable enough for consumers to bite. All it needed was a salesman to shift it—and Steve 2.0 proved to be one of the greatest salesmen in modern marketing history.

That’s what made the difference between selling a tech to a million dyed-in-the-wirewool nerds, and selling tech to a billion everyday consumers. And then up-selling all of those customers to completely new worlds of products and services invented just for the purpose.

..

Want to create a whole new device? Hire Steve Wozniak.

Want to create a whole new world? Oh, but that is the real trick.

And Steve Jobs gave us the masterclass.

..

Had Steve started Be and Jean-Louis built NeXT, we would still be in the exact same situation today, and the only difference would be chunks of BeOS as the iPhone’s bones instead. Funny old world, eh? :)


I'm not sure I've ever encountered someone so invested in the "great man" theory of history.

Jobs was obviously talented, but assuming he would have had the same level of success no matter where he went discounts a lot of luck in how everything lined up, and who was available to help bring to market all the things Jobs is famous for. There's no guarantee the hundreds or thousands of people who were also essential to the major successes of Apple would have been around Jobs had he stayed at NeXT. Those people deserve respect and recognition too.


You forgot that his family became the largest shareholder of Disney, and not because Steve got Apple. He was VERY successful, to the point that he gave up taking anything but a private jet. That is billions, of course, but that is not success. What is?

And unlike v1, v2 seems better on a human level as well. We do not need a saint. He still parked in spaces for the handicapped only, I guess. But let us admit, it is not just one for all. But all for one.


ISTR a tale of Legal keeping a large slush fund from which to pay off all the ex-Apple employees that Steve 2.0 would tell, straight to their face, to fuck off. Just because that is what worked best for him†. :)

“But let us admit, it is not just one for all. But all for one.”

Damn straight. Epically focused leadership.

--

(† For all others who aspire to build their own businesses, there is HR procedure and askamanager.org—and do not for the life of you ever bypass either!)


> Epically focused leadership.

Just to support that, I remember hearing a story told by Larry Ellison (they were apparently neighbours for a while), where he would pop over to see Steve and be subjected to the 100th viewing of Toy Story, with Jobs obsessively pointing out every tiny new improvement they'd made in the story or graphics.

Epically focused indeed.


“Those people deserve respect and recognition too.”

ORLY? Name them.

--

Not “great man”. Great vision.

Geeks tend massively to overrate the importance of technical aptitude, which is what they’re good at, and underrate everything else—business experience, sales skills, market savvy, and other soft skills—which they’re not.

Contrast someone like Jobs, who understood the technical side well enough to surround himself with high-quality technical people and communicate effectively with them. But make no mistake: they were there to deliver his vision, not their own.

Tech-exclusive geeks are a useful resource, but they have to be kept on a zero-length leash lest they start thinking that they should be the ones in charge since they know more about tech than anyone else. The moment they’re allowed to get away with it, you end up with the tails-wagging-the-dog internecine malfunction that plagued Sculley’s Apple in the 90s and has to some extent resurfaced under Cook.

Lots of things happened under Jobs 2.0. That was NEVER one of them.

..

Case in point: just take the endless gushing geek love for Cook-Apple’s Swift language. And then look at how little the iOS platform itself has moved forward over the ten years it’s taken to [partly] replace ObjC with the only incrementally improved Swift. When NeXT created what is now AppKit, it was 20 years ahead of its time. Now it’s a good ten behind, and massively devalued to boot by the rotten impedance mismatch between ObjC/Cocoa’s Smalltalk-inspired model and Swift’s C++-like semantics.

Had Jobs not passed, I seriously doubt Lattner’s pet project would ever have advanced to the point of daylight. Steve would’ve looked at it and asked: How can it add to Apple’s existing investments? And then told Lattner to chuck it, and create an “Objective-C 3.0”; that is, the smallest delta between what they already had (ObjC 2.0) and the modern, safe, easy-to-use (type-inferred, no-nonsense) language they so pressingly needed.

..

Look, I don’t doubt eventually Apple will migrate all but the large legacy productivity apps like Office and CC away from AppKit and ObjC and onto Swift and SwiftUI. But whose interest does that really serve? The ten million geeks who get paid for writing and rewriting all that code, and have huge fun squandering millions of development-hours doing so? Or the billion users, who for years see minimal progress or improvement in their iOS app experience?

Not to put too fine a point on it: if Google Android is failing to capitalize on iPhone’s Swift-induced stall-out by charging ahead in that time, it’s only because it has the same geek-serving internal dysfunction undermining its own ability to innovate and advance the USER product experience.

--

TL;DR: I’ve launched a tech startup, [mis]run it, and cratered it. And that was with a genuinely unique, groundbreaking, and already-working tech with the product potential to revolutionize a major chunk of a trillion-dollar global industry, saving and generating customers billions of dollars a year.

It’s an experience that has given me a whole new appreciation for what another nobody, starting out of his garage with his own false starts and failures, was ultimately able to build.

And I would trade 20 years of programming prowess for just one day of salesmanship from Steve Jobs’ left toe, and know I’d got the best deal by far. Like I say, this is not about a person. It is about having the larger vision and having the skills to deliver it.


Jobs was far more of a "tech guy" than either Sculley or Cook. He understood the technology very well, even if he wasn't writing code.

I would also say, Jobs had a far, far higher regard for technical talent than you do. He was absolutely obsessed with finding the absolute best engineering and technical people to work for him so he could deliver his vision. He recognized the value of Woz's talents more than Woz himself. He gathered the original Mac team. If he had had, say, a random group of Microsoft or IBM developers, the Mac never would have happened. Same with NeXT, many of whose people were still around to deliver iOS and the iPhone.

Your take is like a professional sports manager saying having good athletes isn't important, the quality of the manager's managing is the only thing that matters.


“Your take is like a professional sports manager saying having good athletes isn't important, the quality of the manager's managing is the only thing that matters.”

Postscript: You misread me. I understand where Jobs was coming from better than you think. But maybe I’m not explaining myself well.

..

When my old man retired, he was an executive manager for a national power company, overseeing the distribution network. Senior leadership. But he started out as a junior line engineer freshly qualified from EE school, and over the following three decades worked his way up from that.

(I still remember those early Christmas callouts: all the lights’d go out; and off into the night he would go, like Batman.:)

And as he later always said to engineers under him, his job was to know enough engineering to manage them effectively, and their job was to be the experts at all the details and to always keep him right. And his engineers loved him for it. Not least ’cos that was a job where mistakes don’t just upset business and shut down chunks of the country, they cause closed-coffin funerals and legal inquests too.

--

i.e. My old man was a bloody great manager because he was a damn good engineer to begin with. And while he could’ve been a happy engineer doing happy engineering things all his life he was determined to be far more, and worked his arse off to achieve it too.

And that’s the kind of geek Steve Jobs was. Someone who could’ve easily lived within comfortable geeky limitations, but utterly refused to do so.

’Cos he wanted to shape the world.

I doff my cap at that.


“Jobs was far more of a "tech guy" than either Sculley or Cook.”

Very true. “Renaissance Man” is such a cliché, but Steve Jobs really was one. Having those tech skills and interests under his belt is what made him such a fabulous tech leader and tech salesman; and without that mix he’d have just been one more Swiss Tony bullshit artist in an ocean of bums. (Like many here I’ve worked with that sort, and the old joke about the salesman, the developer, and the bear is frighteningly on the nose.)

But whereas someone like Woz loved and built tech for its own sake, and was perfectly happy doing that and nothing else all his life, Jobs always saw tech as just the means to his own ends: which wasn’t even inventing revolutionary new products so much as inventing revolutionary new markets to sell those products into. The idea that personal computers should be Consumer Devices that “Just Work”; that was absolutely Jobs.

And yeah, Jobs always used the very best tech talent he could find, because the man’s own standards started far above the level that most geeks declare “utterly impossible; can’t be done”, and he had ZERO tolerance for that. And of course, with the very best tools in hand, he wrangled that “impossible” right out of them; and the rest is history.

Woz made tech. Jobs made markets.

As for Sculley, he made a hash. And while Cook may be raking in cash right now, he’s really made a hash of it too: for he’s not made a single new market† in a decade, while Apple’s rivals—Amazon and Google—are stealing the long-term lead that Jobs’s pre-Cook Apple had worked so hard to build up.

--

(† And no, things like earpods and TV programming do not count, because they’re only add-ons, not standalone products, and so can only sell as well as the iPhone sells. And the moment iPhone sales drop off a cliff, Cook’s whole undiversified house of cards collapses, and they might as well shut up shop and give the money back to the shareholders.)


I hear you, I do, but here's another perspective: Jobs without Wozniak wound up being California's third-best Mercedes salesman.

And neither of them would've mattered a jot if they were born in the Democratic Republic of the Congo, or if they were medieval peasants, or if Jobs hadn't been adopted, or or or ...

Luck is enormously influential. There are thousands of Jobsalikes per Jobs. Necessity isn't sufficiency.


I think Steve Jobs The Marketing and Sales Genius is an incorrect myth.

Jobs was an outstanding product manager who sweated all the details for his products. And in contrast to Tim Cook, Jobs was a passionate user of actual desktop and laptop computers. He sweated the details of the iPhone too, but his daily driver was a mac, not an iPad. Cook is less into the product aspect, and it really really shows. Cook is a numbers and logistics guy, but not really into the product.

That's a thing I think Apple has fixed recently with some reshuffling and putting a product person (Jeff Williams) in the COO role. The COO role is also a signal that he'll be the next CEO when Tim Cook retires.

To be clear, I don't disagree that Jobs was a great marketer. But that stemmed from his own personal involvement with the product design of the mac--and later the iOS devices--rather than some weirdly prodigious knack for marketing.


> You may have the finest tech on the planet—and that means precisely squat.

You shouldn't talk about Sun like that.


NeXTSTEP appears to have first gotten incorporated thoroughly into the OS X codebase. Browse through the Foundation library for the Mac - https://developer.apple.com/documentation/foundation/ . Everything that starts with NS was part of NextStep.


It didn't get 'incorporated'.

OSX/macOS/iOS is the latest evolution of NeXTStep/Mach which originated in the Aleph (and other) academic kernels.

(of course OS's evolve pretty far in a few decades...)

(https://en.wikipedia.org/wiki/Mach_(kernel))


My understanding was always that NeXTSTEP served as the foundation of OS X, and while it certainly got a new desktop environment and compatibility with MacOS's legacy Carbon APIs, it was essentially still NeXTSTEP under the hood.


Yes. That is what all those NS... prefixes meant.


I always thought that, too.

It's wrong.

Original NeXT classes were prefixed NX_. Then NeXT worked with Sun to make a portable version of the GUI that could run on top of other OSes -- primarily targeting Solaris, of course, but also Windows NT.

That was called OpenStep and it is the source of classes with the prefix NS_ -- standing for Next/Sun.

https://en.wikipedia.org/wiki/OpenStep#History

This is why Sun bought NeXT software vendor Lighthouse, whose CEO, Jonathan Schwartz, later became Sun's CEO.

Unfortunately for NeXT (and ultimately for Sun), right after this, Sun changed course and backed Java instead.


> Everything that starts with NS was part of NextStep.

Not quite. Everything in Foundation gets the NS prefix because it's in Foundation; only a fraction of it came directly from NeXT.


Yeah, Rhapsody > Mac OS X Server 1.0 > Mac OS X > iOS, which was literally described as being "OS X" when it first launched.


Security from what? Do user accounts really provide much benefit in the personal computing space? Where the median user count is 1?

Neither OS had the kind of security that is really useful today for this usecase, which is per-application.


But a bunch of the methods we have for securing, say, mobile phones, grew out of user accounts.

Personally I don't know Android innards deeply, but when I was trying to backup and restore a rooted phone I did notice that every app's files have a different owner uid/gid and the apps typically won't launch without that set up correctly. So it would seem they implemented per-app separation in this instance by having a uid per app.

Imagine a world where Google had chosen to build on a kernel that had spent many decades with no filesystem permissions at all. Perhaps they'd have to pay the same app compatibility costs that Microsoft did going from 9x to NT kernel, or changing the default filesystem to ACL'd-down NTFS.
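
You can see the effect with a couple of stat() calls. A toy sketch (the com.example.* package names are made up, and you'd need root to stat other apps' directories):

    #include <sys/stat.h>
    #include <cstdio>

    int main()
    {
        struct stat a, b;
        if (stat("/data/data/com.example.app1", &a) != 0 ||
            stat("/data/data/com.example.app2", &b) != 0)
            return 1;  // needs root; paths are hypothetical packages
        // Android assigns each installed app its own uid, so these differ.
        std::printf("app1 uid=%d  app2 uid=%d\n", (int)a.st_uid, (int)b.st_uid);
        return 0;
    }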


Then you'd maybe get something like iOS, where the POSIX uid practically does not matter at all, and the strong security and separation is provided by other mechanisms like entitlements...

Someone else pointed out that BeOS allegedly had "quality and security" problems in general (I myself have no idea), so that may indeed have led to problems down the line, whereas BSD was pretty solid. But I agree with the OP and don't think POSIX security in particular is much of a factor today.


Yeah. Funny enough, if Apple had skipped OS X and gone directly to iOS, BeOS would have been a superior foundation. No uselessly mismatched security model or crusty legacy API baggage to clog up the new revolution in single-user always-online low-powered mobile devices.

Of course, that was in back in the days when an entire platform from hardware to userland could be exclusively optimized to utterly and comprehensively smash it in just one very specific and precisely targeted market. Which is, of course, exactly what the iPhone was.

Just as the first Apple Macintosh a decade earlier eschewed not only multi-user and multi-process but even a kernel; every single bit and cycle of its being exclusively dedicated to delivering a revolutionary consumer UI experience instead!

In comparison, NeXTSTEP, which ultimately became iOS, is just one great huge glorious bodge. “Worse is Better” indeed!

..

Honestly, poor Be was just really unlucky in timing: a few years too late to usurp SGI; a few too early to take the vast online rich-content-streaming world all for its own. Just imagine… a BeOS-based smartphone hitting the global market in 2000, complete with live streaming AV media and conferencing from launch! And oh, how the Mac OS and Windows neckbeards would’ve screamed at that! :)


On a similar note, I've often wondered what Commodore's OS would have turned into. Not out of some misplaced nostalgia, just curiosity about the Could Have Been.

My guess is that by now in 2020, it would at some point have had an OSX moment where Commodore would have had to chuck it out, since both Apple and Microsoft have effectively done exactly that since then. Still, I'd love to peek at an Amiga OS 9 descended from continual usage.


I think AmigaOS 3 could be a nice kernel as it is. And to make it more Unix-y, memory protection could be introduced, but only for new userland processes with more traditional syscalls.

It's a bit like what DragonflyBSD is slowly converging toward.


Amiga OS 9 would have looked very different from the Amiga OS that we know (I am talking from a developer's point of view, not about the GUI).

Since inter-process communication in Amiga OS was based on message passing with memory-sharing, it was impossible to add MMU-based memory protection later. As far as I know, even Amiga OS 4 (which runs on PowerPC platforms) is not able to provide full memory protection.

There was also only minimal support for resource tracking (although it was originally planned for the user interface). If a process crashed, its windows etc. would stay open. And nobody prevented a process from passing pointers to allocated system resources (e.g. a window) to other processes.

The API was incomplete and tied to the hardware, especially for everything concerning graphics. This encouraged programmers to directly access the hardware and the internal data structures of the OS. This situation was greatly improved in Amiga OS 3, which of course came far too late - basically two or three years too late. As far as I know, Apple provided much cleaner APIs, which later greatly simplified the evolution of their OS without breaking all existing programs.

Finally, the entire OS was designed for single-core CPUs. At several places in the OS, it is assumed that only one process can run at a time. This doesn't sound like a big issue (could be fixed, right?) but so far nobody has managed to port Amiga OS to multi-core CPUs (Amiga OS4 runs on multi-core CPUs, but it can only use one core).

I have been the owner of an Amiga 500 and Amiga 1200, but to be brutally honest, I see Amiga as a one-hit wonder. After the initial design in the mid-1980s, development of the OS and the hardware basically stopped.


> Since inter-process communication in Amiga OS was based on message passing with memory-sharing, it was impossible to add MMU-based memory protection later.

Why can't you do shared memory message passing with MMU protection? There is no reason an application in a modern memory protected OS can't voluntarily share pages when the use case is appropriate. This happens today. You can mmap the same pages, you can use posix shm, X has the shm extension...
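
A minimal sketch of the POSIX shm variant (error handling mostly omitted; "/demo" is an arbitrary name, and on older Linux you'd link with -lrt):

    #include <sys/mman.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <cstring>

    int main()
    {
        // Any cooperating process that shm_open()s "/demo" maps the same
        // physical pages; the rest of its address space stays MMU-protected.
        int fd = shm_open("/demo", O_CREAT | O_RDWR, 0600);
        if (fd < 0) return 1;
        ftruncate(fd, 4096);
        char* msg = static_cast<char*>(
            mmap(nullptr, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));
        std::strcpy(msg, "message passed via shared pages");
        munmap(msg, 4096);
        close(fd);
        return 0;
    }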


Or just take a Docker-like approach, where each app thinks it is the only user and inter-app communication is where you put the security functionality.


But the predecessors to containers were features like having daemons chroot into somewhere else and drop their uid to something that can't do much. That very much grew out of the Unix solutions. If Unix daemons had been written for decades assuming all processes have equal privilege, maybe we wouldn't see that.
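
The classic dance looked roughly like this (a sketch; "nobody" and /var/empty are just the conventional choices, and it must run as root):

    #include <unistd.h>
    #include <pwd.h>

    bool drop_privileges()
    {
        struct passwd* pw = getpwnam("nobody");       // low-privilege account
        if (!pw) return false;
        if (chroot("/var/empty") != 0) return false;  // confine filesystem view
        if (chdir("/") != 0) return false;            // no cwd outside the jail
        // Order matters: drop the group first, then the user; once setuid()
        // succeeds the process can never get root back.
        if (setgid(pw->pw_gid) != 0) return false;
        if (setuid(pw->pw_uid) != 0) return false;
        return true;
    }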


I think this sort of thing is a capabilities ladder in an arms race.

If you never evolved account based security, you never built the infra for even evaluating application permissions in the first place.


“Security” is a bit of a misnomer in this context: I think what you actually meant was “multi-user architecture” which, as remarked elsewhere, undergirds the whole notion of keeping processes from promiscuously sharing any and all resources.


Yeah, I think of it more as multi-tenant safety.


Yes, in short - users & groups serve as a rudimentary implementation of capabilities. Best example is Android. But there's more to it.

Separating admin user from non admin user always has advantages and I do it even on Windows.


Best counterexample to their point is iOS, though, where POSIX permissions don't play much of a role in securing the system and separating applications.


I do like that you have to “sudo” a program to allow it to access certain files. Even if I am the only user, it stops malicious programs from modifying certain files without me noticing.


Obligatory related xkcd: https://xkcd.com/1200/


Posting links to XKCD like this is generally considered to be a low quality post, hence the downvotes. I’m not one of the downvoters, but thought I’d share the reason as nobody else did.

Edit: gotta love HN! I try to be helpful to someone else who was downvoted to heck with an explanation of why that was the case (based on past replies I’ve seen) and now my post is the one with a negative score. Cheers y’all!


First rule about downvotes is we don't talk about downvotes.


Under the hood, though, there are multiple accounts that different applications use; the user might only log in with one, but applications are isolated from each other and from the system because of it.


Security from malicious programs or exploits, from accidentally altering system files, and from other device users.

We used to be able to trust intentionally installed programs not to exfiltrate data. It's sad that we still can't.


Wouldn't the median need to be over 1? I get your point but am feeling pedantic today.


If more than 50% of personal computers have 1 or 0 users, then the median would be 1, assuming 0 users is less common than 1, regardless of how many users the remaining computers had.


If more than (or equal to) half of computers are used by only one person, then the median user count is 1, no?


If you have 3 PCs in the world, one with 0 users, one with 1 user, and one with 23 users, the median is 1.

Median is literally the middle, like a highway.


They're just stating that having more than half of all computers with just 1 user guarantees that the median is 1.


No.

For example, suppose five computers have 1 user, 1 user, 1 user, 3 users, 300 users. The median is 1 user.

The claim of "median 1 user" just means more than half of computers have a single user.
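
Or mechanically, as a toy sketch:

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main()
    {
        std::vector<int> users = {1, 1, 1, 3, 300};  // the five computers above
        std::sort(users.begin(), users.end());
        // The middle element of the sorted list is the median.
        std::printf("median = %d\n", users[users.size() / 2]);  // median = 1
        return 0;
    }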


> Used to use BeOS as my primary OS for a year or two. I think it's the only OS I ever found to be truly intuitive and pleasant to work with.

I love everything I've read about BeOS, but to be honest I must mention I couldn't understand how to use Haiku (I've never used the original BeOS) once I tried it - it didn't feel intuitive at all. And I'm not really stupid; I've been using different flavors of Linux as a primary OS for over a decade.

> That said, I don't think the world would be in a better place had Apple chosen Be over NeXT. The elephant in the room is security: NeXTSTEP, being Unix-based, has some amount of security baked in from the ground up. BeOS didn't; it was more akin to Windows 95 or classic Mac OS on that front.

Sometimes I miss the days of Windows 95 so much. I wish desktop OSes could be more simple, i.e. without multi-user and file access rights. When it's my own personal computer, all I want of it from the security perspective is to prevent others from unlocking it or recovering data from it, and to prevent any network communication except what I authorized. Sadly, Linux still doesn't even have a decent implementation of the latter (Mac has LittleSnitch).

Windows 9x did pretty well for me - I've never caught a virus, never corrupted a system file and it was easy to fix for others who did.


> I wish desktop OSes could be more simple, i.e. without multi-user and file access rights.

Have a look into Oberon and its successor A2/Bluebottle.

http://ignorethecode.net/blog/2009/04/22/oberon/

https://liam-on-linux.livejournal.com/46523.html


Security, networking, multi-user, i18n, printing (don’t even begin to underestimate this, or Quartz and Display PostScript). BeOS was an RTOS with a neat UI. It was still fun, but there was a gigantic pile of work to do before it could match what System 7 did, let alone what NeXT did.


Additionally, NeXTStep had been in use in production on investment bank trading floors, in scientific institutions, and in military/intelligence agencies. It wasn't widely used, but it was used.

So while it might not have been quite ready for the median System 7 user's expectations, it was pretty solid.


Maybe so. But I have Mac OS 1.0 running on my MacBook. It is so slow and really not that workable, unlike Mac OS 9. It is not that smooth. Luckily he found the iPod ... even the colour one is very slow.


Also, the familial relations of macOS and Linux made it possible to share code fairly seamlessly between both (provided we're not talking about hardware integration). In a world where there were 3 separate universes (Windows, BeOS, and Linux), it's possible Linux would've become more isolated.


BeOS had a regular Unix-like (even POSIX, IIRC) dev environment.

I was able to do most of the CS coursework projects normally done on my University's Sun workstations on BeOS instead. Most of these were data structures, algorithms, compilers, etc. projects in C, and not things that required platform-specific APIs.

But arguably, BeOS' overall model - a single-user desktop OS built on top of, but hiding, its modern OS underpinnings like memory protection and preemptive multitasking - is far more similar to what eventually became MacOSX than to Linux. Which isn't so surprising, since it was built by ex-Apple folks. Remember that consumer OSes before this point had no memory protection or preemptive multitasking.

Linux, though it had the same modern OS features, was far more closely aligned in spirit with the timeshared, modern, multi-user Unix OSes like what ran the aforementioned Sun workstations (it's "Linus' Unix", after all).


BeOS had a POSIX-compliant layer, but under the hood it was totally different from a UNIX.

Also, let’s keep in mind that Windows 95 (released that same year) featured preemptive multitasking on a desktop user OS (albeit not a strong memory protection model), and Windows NT had been available for a couple of years by then (having first shipped in 1993, if memory serves) and was a fully ‘modern’ OS (indeed it serves as the basis for later Windows), albeit with a comparatively large footprint.

I was an avid BeOS user (and coincidentally a NeXT user too) and I was enthralled by its capabilities, but in terms of system architecture it was a dead end.


IIRC the Unix compatibility layer had some pretty grotty warts. Porting Unix applications virtually always required fiddling to get them working, especially the network code.

Unfortunately this meant BeOS was perpetually behind the curve on stuff like the World Wide Web. I had a native FreeBSD build of Netscape long before Be managed to get a decent browser.


The Amiga had preemptive multitasking in the 80's. (No memory protection though.)


So did the Lisa even earlier (and Xenix, which was a derivative of Unix Version 7, anecdotally also seen on the Lisa).


Is that true? I see contradictory information about Lisa OS. Some posts claim it was cooperative, like the original Mac System. Example: https://macintoshgarden.org/apps/lisa-os-2-and-3


(A bit of research later:) It's actually a bit of a mixed bag. The "Operating System Reference Manual for the Lisa" [0] reads on pp. 1-3/1-4:

> Several processes can exist at one time, and they appear to run simultaneously because the CPU is multiplexed among them. The scheduler decides what process should use the CPU at any one time. It uses a generally non-preemptive scheduling algorithm. This means that a process will not lose the CPU unless it blocks. (…)

> A process can lose the CPU when one of the following happens:

> • The process calls an Operating System procedure or function.

> • The process references one of its code segments that is not currently in memory.

> If neither of these occur, the process will not lose the CPU.

In other words, non-preemptive, unless the OS becomes the foreground process, in which case it may block the active process in favor of another one currently in ready or blocked state.

[0] https://lisa.sunder.net/LOS_Reference.pdf


BeOS was as UNIX-like as Amiga was.

Sure, it had a CLI, UNIX-like directory navigation, and a couple of UNIX-like command-line utilities.

But good luck porting UNIX CLI software expecting a full POSIX environment.

If I am not mistaken, Haiku has done most of the work regarding POSIX support.


It had a bash shell, and used glibc, and partially implemented POSIX.

I was also able to get most of my CS homework done in BeOS. But I definitely needed to keep FreeBSD around for when I hit a wall.


It was OK. Back when I ran BeOS as my primary OS (2001 or so), I built half a C++ web application on BeOS and the other half on an HP-UX server, logged in through an X terminal, using ftp to sync between the two. There wasn't much support in the wider *nix ecosystem though, so anything big would often fail to build.

I regretted having to move away from BeOS, it was by far the most pleasant OS I’ve used, but the lack of hardware and software support killed it.


In college I wrote a web server on BeOS and ported it back to Linux, learning pthreads along the way. The bonus achievement was making it multithreaded, which I got for free, since BeOS makes you think architecturally as multithreaded-first.


AmigaOS was not UNIX-like in the least. Amiga UNIX, which shipped on a couple models, was directly System V UNIX, though.


That was the point I was trying to convey regarding BeOS.

Having a shell that looks like UNIX, and a couple of command line utilities similar to the UNIX ones, does not make an OS UNIX.


Ah. I gotcha now.


Hmm, possibly. But it could also have been to Linux's benefit. It would be alone among these in having the advantage of Unix heritage.


Yes, I remember so many software developers switching from Linux to OSX in the 2000's because "it's a Unix too, but it's shiny".


bounced between windows and os/2, never really used beos as an os, mostly just as a toy for fun. the one thing I remember is that I could play a video that for the time looked amazing without issue. I want to say I even played Quake on it, in a window!


Funny you should mention Windows 95. The company that sold that ended up doing pretty well.


Sure, but at the time Windows 95 was released, they already had a couple of Windows NT releases (3.1, 3.5, and 3.51). Windows NT was a different, more modern operating system than the Windows 95/98/ME line. So, they did not have to evolve Windows 95 into a modern operating system. After ME, they 'just' switched their user base to another operating system and made this possible through API/ABI compatibility (which is quite a feat by itself).


The company that sold classic Mac OS did, too.

But you have to consider what else was going on at the time: Microsoft was actively moving away from the DOS lineage. OS/2 had been in development since the mid-1980s, and, while that project came to an ugly end, they had also released the first version of Windows NT in the early '90s, and, by the late '90s, they were purposefully moving toward building their next-gen consumer OS on top of it.

Apple needed to be making similarly strong moves toward a multi-user OS with concerns like security baked in deeply. BeOS had the memory protection and the pre-emptive multitasking, which were definitely steps forward, but I don't think they would have taken Apple far enough to allow them to keep up with Microsoft. Which, in turn, would have allowed Microsoft to rest on its laurels, probably to the detriment of the Windows ecosystem.


Really? Most people I talk with these days seem to agree that the proprietary OS is a liability.


I’ve never heard anyone say Windows is a problem because it’s proprietary. I have heard that having to pay to upgrade is a pain because you (the company) have to budget for it. Even then, you would also need to budget for the downtime and time to verify that it works before deploying the update, and both those have to be done on Linux too (it’s why LTS releases are a thing).

Anyways, Windows 10 may have its problems, but Microsoft the company is doing pretty well. Their stock is up about 50% this year (200% over the past 5). And that’s not to mention the fact that they’ve open sourced .NET among many other things.


I interpreted them as saying it was a liability to Microsoft.


Outside HN and Reddit talks, most people I know don't even care about FOSS OSes existence, they just want something that they buy at the shopping mall and can use right away.


In fairness, I don't think most people care about the OS at all, FOSS or otherwise; they care that the UI is something they can use, and that their apps work. If you perfected WINE overnight, I'll bet you could sit 80% of the population down at a lightly-skinned FreeBSD box and they'd never know.


I don't even think you'd need that for most of the population: it's been quite some time since the median user cared about desktop software[1]. I switched my parents over to a Linux Mint install a decade ago when I went away to college, and it lowered my over-the-phone tech support burden to zero overnight.

I also had (non-CS but very smart) friends who switched to (ie dual-booted) Linux on their own after seeing how much better my system was than a Windows box. A decade later, one of them is getting her PhD in veterinary pathology and still dual boots, firing Windows up only when she feels like gaming.

[1] My impression is that committed PC gamers aren't a large portion of the desktop user population, but I may be wrong.


I know a decent number of people who have That One Program that they've been using for 20 years and can't/won't leave. It probably varies by population group.


It didn't kill them, though, which was my only point. I guess HN didn't think it was as funny as I did.


AYBABTU = All Your Base Are Belong To Us, a broken English translation of a Japanese phrase from the Japanese game `Zero Wing` [1]

[1] https://en.wikipedia.org/wiki/All_your_base_are_belong_to_us

Edit: removed the extra A in the acronym


Got an extra A in there


The anti competitive business practices of Apple make it hard to imagine the world could be worse.

Instead of competition, Apple survives off marketing medium quality products at high prices.

I'm not sure how that's good for anyone unless they own Apple stock.


You don't get to where Apple is (large market cap, high customer satisfaction scores, high reviews in the tech press, etc.) because of marketing. If it were that easy, companies would just copy their marketing or load up on marketing and they would be successful.

And a huge part of Apple's current success is based on the tech and expertise they got from NeXT. That work underpins not just laptops and desktops but phones, tablets, set-top boxes, and more.


Perhaps you only get to where Apple is with world-class marketing.

Apple's iPod wasn't the first mp3 player, and it for damn sure wasn't technically superior.

The iPhone was not the first smartphone, nor the first phone with a touchscreen, nor the first phone with a web browser, nor the first phone with an App Store. It arguably had a better UX than incumbents, but better UX doesn't win markets just by dint of existing.

The iMac was a cute computer that couldn't run prevalent Windows software and didn't have a floppy drive.

Recent MacBook Pros have an awful keyboard, not just aesthetically but with known hardware problems. I understand at long last they're reverting to an older, better design.

Tech and expertise don't win just because they exist.


You've left out the part where Apple makes products that have user experiences that are miles ahead of whatever existed at the time.


I'm as reflexively inclined as many technical people to be dismissive of marketing, but I don't think you're right here. You can't "just copy" marketing, in the same way you can't "just copy" anything else a company is world-class at, and good marketing can indeed build market dominance (do you think Coca-Cola is really a vastly superior technical innovation over Pepsi?)

The fact that it isn't a net good for users in most cases doesn't mean that it's trivial to do.


> If it were that easy, companies would just copy their marketing or load up on marketing and they would be successful.

Maybe good marketing is really hard and you can't just "copy Apple"?


If people willingly exchange currency for products from a company and are satisfied with the value that they get out of it to the point that they become repeat customers, then how can you judge that no one except stockholders are benefitting?


Because Apple obviously sucks. I don't understand how hard it is for all their happy customers to understand that they suck. /s


Network/lock-in effects and negative externalities can easily have that result.


> negative externalities

This is very true. macOS and the iPhone, for me, went from being "obviously the very best of the best" to "the lesser of all evils".

When my 2015 rMBP finally gives up the ghost and / or when 10.13 loses compatibility with the applications I use, I have no idea what I'm going to do - probably buy another working 2015 rMBP used and pray that the Linux drivers are livable by then.

I know it's ridiculous, but it helps me fall asleep at night sometimes.


You don’t agree on the 16” being the spiritual successor of the mid 2015 15”?


I feel like it's a huge step in the right direction, but for my own personal use:

- I still have mostly USB 2.0 peripherals. I don't see that changing anytime soon.

- I'm still hung up on the MagSafe adapter.

- I love the form factor. The 13" display is the perfect size, for me. I could've switched to a 15" 2015 rMBP with better specs, but I hated how big it was.

- I have no interest in using any version of macOS beyond 10.13, at present.

I'm really glad that they brought the Esc key back, especially as a pretty serious vim user. I don't know, maybe I'm stuck in the past. I'm certain that many, many people are really enjoying the new Macbook Pro 16; I just really, really like this laptop. It's the best computer I've ever owned.


I'm in the same boat as the sibling poster (albeit with a 15" machine) and I'll add this:

- The TouchBar is terrible

I hope they'll bring back a non-TouchBar configuration when they release the "new" keyboard on a 13" MacBook Pro. I could live with both a 13" or 15" laptop, but right now the list of drawbacks is still 1-2 items too long.


Can? Sure. I would commend anyone that can make the case that this is the best explanation for Apple’s success as a whole though.


> make it hard to imagine the world could be worse

This seems like a failure of imagination.

I'm not a huge Apple fan, but I lived through the Bad Old Microsoft of the '90s, and grew up on stories of IBM of the '80s.

Apple is nothing like them.



