Ask HN: What Happened to Borland?
213 points by tiffanyh on Dec 10, 2021 | 307 comments
I recall that early in my software development career, Borland had a strong hold in the developer tool space (and was largely loved by developers). What happened to them? They were the “JetBrains” of their day.



This might provide some hints (or not): https://en.wikipedia.org/wiki/Anders_Hejlsberg#At_Microsoft

"In 1996, Hejlsberg left Borland and joined Microsoft. One of his first achievements was the J++ programming language and the Windows Foundation Classes; he also became a Microsoft Distinguished Engineer and Technical Fellow. Since 2000, he has been the lead architect of the team developing the C# language. In 2012 Hejlsberg announced a new Microsoft project, TypeScript, a superset of JavaScript."

I can only speculate that lots of skilled Borland developers followed Hejlsberg and participated in the creation of C# and later TypeScript.


The story I heard (I worked at Borland briefly in 1999): Microsoft would send a limo to the Borland HQ to pick up engineers for interviews on their lunch breaks. Borland sued, Microsoft settled for many millions, but basically instead of buying their rival outright (for assimilation into the Borg, lol), they just bought the talent. Last I knew Borland had changed names at least twice (Inprise, Embarcadero) and still existed, in some remnant form.


Amusingly Steve Ballmer got the hump when MS engineers started leaving for Google


As did Google when Google engineers started leaving for Facebook, and FB when their engineers started leaving for Uber/Lyft/AirBnB/Stripe/Coinbase/etc, and so on. It's pretty much a revolving door now, where many engineers have worked at all these companies and sometimes even come back to their home base.

CA's prohibition against non-competes and the DoJ's lawsuit against anti-poaching agreements are basically what make Silicon Valley work.


Oh that definitely happened, but we don't have quotes from Google like this:

    Prior to joining Google, I set up a meeting on or about November 11, 2004 with Microsoft’s CEO Steve Ballmer to discuss my planned departure….At some point in the conversation Mr. Ballmer said: “Just tell me it’s not Google.” I told him it was Google.

    At that point, Mr. Ballmer picked up a chair and threw it across the room hitting a table in his office. Mr. Ballmer then said: “F---ing Eric Schmidt is a f---ing p--sy. I’m going to f---ing bury that guy, I have done it before, and I will do it again. I’m going to f---ing kill Google.”


Not that the giants haven't tried to be a cartel when it comes to labor. Including Jobs at Apple.


Borland sold their software division to Embarcadero, which appears to have been a big payday for the top execs on their way out. However, Embarcadero did a decent job of keeping the ship afloat and running things, though Delphi got very expensive. Idera then bought Embarcadero, but appears to allow it a high level of autonomy.


Another maybe interesting detail:

At some point there was an attempted pivot as well, or maybe it was just what Embarcadero had always focused on.

I don't think I was working in software yet, but there was an interview or paid article or something where someone said that the future of software lay not in languages and IDEs but in Software Lifecycle Management.

In a way they were right:

Today all major languages have free and open source implementations, and Atlassian and a few others seem to have found larger or smaller sweet spots in what I think is Software Lifecycle Management or something.

That said, what could Borland do at that point? It probably felt worse for them to bet the farm, but in my opinion it absolutely isn't the most bone-headed move I have seen.

That said: the ads not long after for "Delphi con" or something similar, with a large "No toothbrush required", didn't exactly seem smart to me. I think by then everyone who used their products was a grown-up, serious business programmer.


> At some point there was an attempted pivot as well, or maybe it was just what Embarcadero had always focused on. [...] that the future of software lay not in languages and IDEs but in Software Lifecycle Management.

That was after they'd changed their name from Borland to Inprise, before they sold out to Embarcadero. I think that, in contrast to this, Embarcadero still bought them mainly for the IDEs.


> I think that, in contrast to this, Embarcadero still bought them mainly for the IDEs.

Indeed, seems it was the dev tools division Emba bought. Maybe the ALM div is still around. Hm, have I actually heard somewhere that they even switched the name back from Inprise to Borland, or am I imagining it? Naah, can't be -- 't'would be both too pathetic and too absurd.


I remember I started getting a bunch of emails from an "Embarcadero." I didn't remember subscribing, and the unsubscribe didn't work, so I just wrote a filter to skip the inbox and send them all to spam. I must have subscribed to something from Borland at some point.


Interesting, do they make a similar effort to hire IntelliJ staff? I mean, they could offer a much higher salary than what they would get in Russia. Or is it all due to having the top guy, who is in the know on who's who?


What was the basis of the lawsuit?


"In the past 30 months, Microsoft has hired at least 34 of Borland's top software architects, engineers, and marketing managers", according to a complaint prepared by Wilson Sonsini Goodrich & Rosati. These actions have been undertaken “for wrongful purposes: to acquire Borland confidential information and to inhibit Borland's competitive position,” the filing states.

Borland's lawsuit seeks unspecified financial damages and an immediate end to Microsoft's unfair practice of targeting Borland employees in order to hamper the company's ability to compete. The suit claims that Microsoft's activities are illegal under California Business & Professions Code Section 17200.

https://www.eetimes.com/borland-sues-microsoft-for-unfair-co...


> Saying that he "just wants Microsoft to leave us alone," Borland International (BORL) CEO Delbert Yocam today filed a lawsuit against Microsoft (MSFT), claiming that the software giant is hiring away Borland's key employees to put it out of business.

https://www.cnet.com/tech/services-and-software/borland-sues...


Ruthless. Just pure evil to target one specific company like that!


Not really. Borland could have issued attractive stock based retention packages to the employees they wanted to keep, and forced Microsoft to acquire the company or go away.

This was on Borland for not adequately valuing their staff.


Borland was losing over a hundred million in revenue while Microsoft was offering seven figure signing bonuses. There's no way they could have paid more than what MS was, since MS was using their war chest to kill the company.


If they were losing a hundred million in revenue before their best employees were poached, then we should thank Microsoft for saving these people's careers.


As the OP says, they could have issued stock instead of real money.


I call equity "Bison dollars": https://youtu.be/Shxiy7l5b_4

It's only worth anything if the world-domination plans go off without a hitch.


"Stock Instead of Real Money" is my new band name. So much meaning in just five words :)


Equity schmequity, and we don't know what Borland did or didn't offer to keep people around. We just know MS offered more.


We know for sure that Borland didn't offer enough. And we can reasonably speculate that they didn't offer a significant ownership stake.


Anders got a $3M signing bonus. Hard to counter that.


Easy - give him 4 mil in equity.

On the other hand if the company doesn’t have good prospects, then indeed there is nothing you can do.


Personally I would rather take cash.


You would likely think differently if you cared about the product and had invested time in developing it.

Having said that, there is no evidence their developers felt differently to you.


I do almost nothing but develop products. However, as a product becomes mature, I start looking for new things to do. Also, maintenance of a mature product requires a different type of people, not inventors.


Obviously this isn’t how Anders Hejlsberg, who invented and then ‘maintained’ C# for a decade thinks about things.


A million-dollar signing bonus? Is there documentation of this?


I remember hearing rumors that Microsoft would pay some developers $1 million a year and tell them to just take a vacation instead of working at Borland.


Man, 1999 sounds wild.


Long term, this kind of practice is bad for engineers in the same way that Wal-Mart driving other retailers out of town with size-enabled low prices was bad for small businesses and small towns in the '90s and '00s.


how do I upvote a comment more than once?


You're acting like the employees were forced into the limos at gunpoint. People have free volition. Offering someone a better opportunity is not remotely 'evil'.


"What's wrong with this country? Can't a man walk down the street without being offered a job‽" https://www.youtube.com/watch?v=yDbvVFffWV4


In this case it's more like, "can't a CEO walk down the street knowing his employees are safely shielded from better work opportunities"


It's anti-competitive.

There's a difference between hiring talent because you want talent, and hiring talent to undermine a smaller competitor.

It's analogous to dumping.

That is evil.


Anti-dumping law is basically not enforced, even though dumping is illegal.

Monopolies and cartels are against the law too, but those laws go unenforced as well.

It is a sad reality of the modern economy, and one of the biggest indicators of who actually runs America.

At least in this case workers made money.

If Borland was losing money, why didn't the execs negotiate a merger if they had so much desirable talent?

Hmmmmm, I bet the execs couldn't negotiate a big enough reward for themselves in an acquisition. The limo pickup at lunch strikes me as a big middle finger to Borland's management.

Of all of Borland's products that I liked, did I like them because of the software devs or the management? I guess what I want is the Borland devs back.

I miss Turbo Pascal, whether for DOS or Windows.


Again...you are acting as if the people being hired are...what? Not humans with free will?


Shoppers buying lower-priced products at a giant retailer moving into a new market are also humans with free will. It's the rational choice.

Once the store drives local businesses out, prices go up.

That's why this kind of behavior is unlawful.


I don't understand your analogy. Are you saying once Microsoft drives Borland out it will...cut salaries?...raise prices?


It's not ruthless -- it's business.

If the fault lies with anyone, it's the employees who accepted the offers. If they really thought it was "evil", they would have declined the offers on moral grounds or out of loyalty to their employer.

Do you not frequently get offers for more money than you are currently making at your employer? I would be a massive asshole if I accepted and left a job every time I got one of those -- especially in this market!

Since Microsoft succeeded in hiring so much of the company away, it seems none of them felt particularly attached to Borland or their work there, compared to a salary.

The only "evil" in the situation is how easily some (most?) people will abandon you the moment they get a better opportunity.

I suppose Borland could have matched salaries or tried to keep their employees in whatever way (maybe they did, who knows?) but at the end of the day either they didn't, or it wasn't enough for those engineers.


>The only "evil" in the situation is how easily some (most?) people will abandon you the moment they get a better opportunity.

As if your company wouldn't fire you the moment it was more lucrative to do so.


> It's not ruthless -- it's business.

It's not like these things are mutually exclusive.


It's a dick move, and I wouldn't do it, but I am also not beholden to a board of investors/shareholders that expect to see positive ROE at the end of the day.


This is part of Microsoft's core culture. Bill Gates championed the philosophy of doing anything it takes to get ahead, as long as there's some argument that it might be legal.


And to their credit, they've been fantastically successful. I enjoy many things in my day-to-day life provided by Microsoft "free" of charge.

Take that for what you will.


Thank Linus and RMS, not Microsoft. They were forced to play catch-up with the free/open source movement.


So what I left out of my comment are the ethical implications of doing whatever you want, as long as there's no law against it. Many behaviors are unethical, while still (for the time being) legal.


Wait till you hear how they compete for suppliers, customers, and regulatory changes. Business is about gaining an advantage over a rival. Scooping up rival employees is a 2x activity: you get talent, and a competitor has less.


> It's not ruthless -- it's business.

Targeting all employees of a smaller company to destroy it is considered an unfair business practice in some countries (legitimately, IMHO). It's similar to selling at a loss until your smaller competitor is out of business.


  > It's similar to selling at a loss until your smaller competitor is out of business.
Why is this considered illegal or unethical? This seems like a fairly legitimate tactical move to me.

It's like a war of attrition -- you allow yourself to suffer losses for the sake of ultimately winning. At least in this scenario, the main player is also slightly fucking themselves over, instead of just you.


> Why is this considered illegal or unethical?

After competitors are knocked out of the market, the survivor can raise prices to above-market levels.

https://www.ftc.gov/tips-advice/competition-guidance/guide-a...


Ah this is true, hadn't considered this.

It seems like there's a lot of this all the time though?

Everywhere I've lived, my only choice of ISP was Comcast. Whatever Comcast told me I needed to pay for internet, that's what I was going to pay lol.

For mobile phone providers, in the US your options are generally AT&T, Verizon, and T-Mobile, which I'm sure collude on prices and fix them. Same with Cloud services providers, etc


You can take a job and leave if they pay you more. That is fine. But Microsoft isn't really only trying to gain talent. They want to drain the lifeblood out of their competition so they can get ahead. That intention is evil.


It's a tactic. A dark one, for sure. But corporations aren't known for being philanthropists anyway, even if they spend millions on PR to mask that image.

As long as there's no enforcement (and in a free capitalist economy it's hard to enforce this, and I personally think it shouldn't be enforced anyway), these things will happen. The best thing smaller companies can do is adapt and play by the rules if they can't change them.

Not saying it's good or bad. It just is what it is.


>"The only "evil" in the situation is how easily some (most?) people will abandon you the moment they get a better opportunity."

To keep feeding the bosses while losing a potential raise? Thanks, but no thanks.


Why do you think they only targeted Borland like that?


MS was trying to pivot away from their 90s platforms, and Borland was a potential destination for customers jumping ship from stuff like VB.

It was a different time. Even dinosaurs like IBM were still competitive in some verticals.


Because the objective of Microsoft's recruitment was not just to acquire talent, it was to diminish their leading competitor.


In a nutshell: Because Borland, more than anyone else, had hugely superior development tools (compilers and IDEs) for Windows.


No. One man's departure did not bring down Borland. Years after this guy left, JBuilder was a great product that made them tons of money in the early 2000s. Java's popularity explosion (think J2EE) came years after the J++ debacle, and JBuilder cashed in.

They disappeared because they weren't able to compete with the commoditization of Java IDEs (Eclipse) and Microsoft's integrated sales channel on Windows (Visual Studio). These two things killed their two biggest products.


Borland staff also disappeared because M$ made them offers that they couldn't refuse. Departing engineers were offered megabucks salaries which lasted only a year or two, but were enough to decimate the ranks of Borland's talent and wipe out the company's skillbase. Of course, Borland wasn't the only competitor to receive this kind of attention from M$.

In the 1999 federal prosecution of M$ for antitrust, Judge Thomas Penfield Jackson found that 'Microsoft used its "market power" to unlawfully "maintain its monopoly in the operating system market," violating the Sherman Antitrust Act. Microsoft, the Appeals Court found, unfairly used its monopoly power to strongarm computer manufacturers, Internet access providers, Internet content providers, independent software vendors, and companies like AOL, Apple, Intel, and Sun Microsystems.'

https://www.newyorker.com/magazine/2001/07/09/the-microsoft-...


They weren't able to compete with those things in the early 2000s, yet JetBrains was founded in 2000 and has had nothing but growth and success ever since.

The quality of leadership at Borland fell off, and the organization lost its vision and ability to execute. Simple as that.


I suspect JetBrains' location in Eastern Europe helped a great deal talent-wise.


Maybe, but I really doubt it. "Our developers are cheaper" is rarely a winning strategy.

JetBrains did - and still does - execute well. They expanded their IDE to many languages and caught the Ruby, Node, and TypeScript waves. Borland did JBuilder, yes, but it wasn't category-winning. Maybe Delphi could have dominated with more investment and more imagination, but it seems to have risen and fallen with Win32.


Their pricing strategy always sucked. I mean, $4000 for a Java IDE? JetBrains, on the other hand, did the right thing from the beginning by charging money but keeping the price realistic.


If I recall correctly, Borland had a very strong team in Saint Petersburg that moved entirely to JetBrains.


> They disappeared because they weren't able to compete with the commoditization of Java IDEs (Eclipse) and Microsoft's integrated sales channel on Windows (Visual Studio).

That, plus their weird and (IMO, obviously even at the time) misguided pivot to emphasise SLM systems over dev tools. Well, it may be the same thing: it seems likely this pivot was what led to them not being able to keep up with Microsoft on dev tools.


It's important to understand that in the 90s Microsoft was one of the few software companies that took "software talent" seriously and would aggressively poach talent from competitors. They offered better pay and better working environments (private offices instead of cube farms). Oftentimes their competitors wouldn't realize this until it was too late.


I would die for a private office :(


Hell even a cubicle would be a step up from the last few office environments I've had.


Yeah, exactly. Now that I work in an open space, I realize that a cubicle is actually not bad.


What an incredible impact.


Still, Delphi was better:

- DSL for UI (Forms)
- fast native compiler that produced self-sufficient binaries
- great component library and many open source libraries
- Object Pascal extended to fit perfectly the needs of UI programming
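
For anyone who never used it, a minimal sketch of what that looked like (identifiers hypothetical): the form's layout lived in a declarative .dfm text file that the IDE maintained, and the behaviour sat next to it in plain Object Pascal.

    object Form1: TForm1
      Caption = 'Hello'
      object Button1: TButton
        Caption = 'Click me'
        OnClick = Button1Click
      end
    end

    procedure TForm1.Button1Click(Sender: TObject);
    begin
      ShowMessage('Hello from Delphi');
    end;

Double-clicking the button in the designer dropped you straight into that handler, and the compiler wired the two files together into a single native executable.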


Building things that have no real-world constraints does often result in great beauty. Unfortunately, the web and all its ugliness became the dominant platform because it enabled no-download, no-install information & application interaction.


It was probably 20 or so years ago when I played for some time with Delphi. It was extraordinarily easy to make program interfaces. Fast forward to today and we are in this clusterfuck, where everyone keeps reinventing the wheel and complexity just keeps growing.


Yeah, I feel that in a way the best time to be a developer was back in the 1980s/90s. Your tools were limited, but those constraints took away a lot of the "overhead" thinking about what frameworks to use, and you could just focus on functionality. You didn't have Google or StackOverflow, but you had a few books on your desk that covered pretty much everything you needed to know. Or if you were working on Unix, you had man pages, K&R, and Kernighan and Pike's The UNIX Programming Environment.


The reason was you weren't competing with the world's best. Programming was great because you could be a local hero.

Now everyone strives to make websites as beautiful as <insert big blue with their genius web framework>


We also didn't have security issues, which helped.


In the same way that Visual Basic offered simple, drag & drop interface builders.

What also happened is changing hardware: UIs that need to adapt to changing screen sizes, different needs, theming, and much more.


Making resolution-agnostic applications in C# WinForms wasn't hard; it was a simple flag to tell the OS how to scale the GUI. And if you used the native widgets and set tab indexes, you'd be all set for changing sizes, blind users, OS re-theming, etc. A good UI framework should handle that stuff internally... even a bad one should do that (bad, like how WinForms set the wrong default font).

Imho, the real reason we don't see stuff like this for the Web is that the web isn't designed for modularity. CSS, Javascript, and HTML IDs are all global.

Programming 101 lesson 1 is "don't use globals" and the Web is the perfect object-lesson in why not.


CSS - yes. JS - not since 2015; I'll admit that is somewhat recent. HTML IDs - IDs are only good if they're unique. Since HTML had no notion of scope, they became global. Shadow DOM is the web platform's answer for modularization; however, any JS framework will allow you to slice your CSS and HTML into components.


> if you used the native widgets and set tab-indexes you'd be all set for changing sizes, blind users, OS re-theming, etc

> Imho, the real reason we don't see stuff like this for the Web

Where are you looking on the web? tab-indexes and extending native web components gives you responsiveness and accessibility. The browsers provide theming capabilities for light and dark mode, and OS level color preferences (I use "red" for selected on Mac) easily show themselves on CSS `outline` etc

> Globals

No one uses globals on the web. This isn't 2000, or even 2013.


Everyone seemingly unaware that this still exists and you can still do it in Hejlsberg's C#? Drag and drop WPF components and program in code-behind? With themes and responsive design?

(Big asterisk: mostly Windows-only and has recently lost product direction coherency)


My biggest problem with the drag and drop builders was how it played with version control. The last I used was nearly 20 years ago - Windows Visual Studio C++ with MFC. Even if the builder produced C++ code, it was hard to know exactly where and how the changes were made. CVS (and maybe svn?) didn’t exactly like tracking those changes.

Is that any better now?


One thing I really liked about Delphi was its strict separation between generated and manually-written code. Delphi had a dedicated text file format that represented the form built in the GUI builder. So, VC would just pick up changes in the form file and save them like anything else.
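
Roughly, the split looked like this (a minimal sketch, names made up): the IDE owned the top, published section of the form class and regenerated it as you edited the form, while everything you typed below stayed untouched.

    type
      TForm1 = class(TForm)
        Button1: TButton;                         { maintained by the form designer }
        procedure Button1Click(Sender: TObject);  { handler stub added by the IDE }
      private
        FClickCount: Integer;                     { manually written code lives here }
      end;

And because the .dfm itself was plain text, a diff of a layout change read just like a diff of code.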


Also Android Studio, Xcode, all the React builder websites, etc. Basically every GUI platform + IDE has some form of a drag & drop interface builder.

The problem is that outside of the iOS ecosystem, there are too many subtle differences in behavior to really trust the results. And because a lot of software gets worldwide distribution these days, it's economically worth it to squeeze that last bit of performance & user friendliness out of the framework. So most professional programmers learn how to do things programmatically and only use the interface builder if they're doing a quick internal tool.


WinForms still exists and is fine for the odd quick utility. Doing WPF without ever having to drop into the XAML is... unlikely, to say the least. You're going to end up in there at one point, because styles are fucking you over or for some other reason (databinding? lol). But yes, C# is still keeping the drag & drop alive. So are plenty of tools (Lazarus; hell, even Android Studio has a drag & drop designer).


Windows Forms are great, whilst WPF is a bit odd, to say the least. Here's an example:

https://docs.microsoft.com/en-us/visualstudio/get-started/cs...


I would almost like WPF if component styling wasn't one of the most miserable clusterfucks of XML boilerplate I've used.


Delphi still exists and you can still do it in Delphi. Responsive. Cross-platform on Android, iOS, macOS, Windows, and Linux. Single codebase single UI.


Can you point to any Delphi app on the android playstore that doesn't look and feel like shit? Or a Linux app? I used to be a Delphi fan, but the results I've come across are not exactly convincing. Compare this to Flutter, which looks and feels gorgeous, at least on Android.


> UIs that need to adapt to changing screen sizes, different needs, theming, and many more.

People keep saying that as if that is this new thing that wasn't ever heard of before the web or smartphones.

Open any desktop app and resize it. Boom, you've got "changing screen sizes".

And yeah, "needs are different, and we need theming is this new thing that never existed before the iPhone ".


I think forced aspect ratio was very common in the era of these oldschool UI toolkits. Additionally, handling multiple classes of pixel density and input (touch vs mouse) was unheard of.


> Fast forward to today and we are in this clusterfuck, where everyone keeps reinventing the wheel and complexities just keep growing

Yes, it is quite funny to see how no toolkit exists to simply produce Web UIs in a meaningful way.

However, the complexity has grown largely from externalities that didn't exist during the time of effortless interface builders: screen sizes, aspect ratios, and pixel densities of all sorts. To handle this, you need to have some lower-level primitives, and of course any time you have to go to a lower level you surface more complexity.

Final point - as a person who has been developing web UIs professionally for 7 years now, I think that the "reinventing the wheel" has actually been quite beneficial in taming this complexity. Previously, untyped JS had to be bent into surfacing type-style error messages, and good luck with boundary-crossing data. Now, TypeScript lets you describe every key in your application and have incredible confidence that a fully-typed piece of UI or logic (which of course must avoid `any`) will deliver exactly what you intended. GraphQL & codegen have given us the ability to type our boundary-crossing data straight from our DB or resolvers without any runtime reflection. Runtime reflection tools like io-ts also bridge that gap admirably, letting you program defensively in the situations where it's needed. It's obviously been accompanied by a lot of churn, but with strictly typed component libraries, a bit of reusable layout logic, and Hasura, I can make sexy fully-themable UIs strictly typed all the way to and from the data source without significant effort. The complexity in my new paradigm is entirely in application-level tricks like UIs visually informing users of all the async actions, animations / transitions, avoiding dynamic content causing bad layout blips, and ensuring user input is never lost. I think this kind of thing wouldn't have been easy in any oldschool toolkit because it inherently requires some wiring that isn't easy to surface.


> where everyone keeps reinventing the wheel and complexities just keep growing

Cue XKCD "Standards" comic. People look at an existing framework and declare "this is total shit, I can build a better, easier to use version!" They then start building the better-easier and realize why the old version is so hard to use--because it's a difficult fucking problem begetting awful complexity + shitty code.

This repeats itself every 18-24 months, giving us the current clusterfuck of JavaScript libraries. Lather, rinse, repeat for the past 30 years (n.b. XWindows Athena -> Xt -> Motif/Lesstif -> ...<aeons pass>... -> Qt -> Electron)


The user downloads a browser that downloads the site-specific javascript application.


But in today's reality, there is a lot of Javascript that is getting "downloaded and installed" into your browser's cache. It's all being managed for you and it mostly works.


Please, you must not be pedantic in this way, it's clear that I mean there's no install wizard and OS-native interactions the user must go thru, they just provide a string URL and immediately begin application-style interactions after a brief load


It's not really pedantic, since any application (not just a browser) could provide the same download-interpreted-code-then-run-it functionality.

It's probably not this particular feature of the web browser that matters, but the fact that web browsers come preinstalled on new computers.


> It's not really pedantic

I mean, if not pedantic, then perhaps grinding an axe - he said 'there is a lot of Javascript that is getting "downloaded and installed"' as if that has anything to do with the user experience I was referring to

> any application (not just a browser) could provide the same download-interpreted-code-then-run-it functionality

I have never heard of such a platform besides the web. That is why the web became the dominant platform. I do not know what theoretical could've would've should've has to do with the reality of technological evolution

> It's probably not this particular feature of the web browser

... yes, it's not the ease of use that makes users like web, it's that computers in 2021 come with browsers installed that makes users prefer web over bloated desktop applications without good accessibility anyways!

I have no clue what your comment is meaning to get at.


>I have never heard of such a platform besides the web.

Steam comes to mind.

> ... yes, it's not the ease of use that makes users like web, it's that computers in 2021 come with browsers installed that makes users prefer web over bloated desktop applications

Wasn't comparing to "desktop applications." I was saying that the reason that the download-code-then-run-it functionality was shoved into the browser rather than coming in some other form, was that the browsers were already on all the desktops.


Compare and contrast today's Javascript-in-browser dependent applications versus a pure server-side-rendering model. With the latter, there is truly no software downloaded, only static assets such as images.


It was the pinnacle of an era that was gone with the rise of the Internet.


That does not diminish the accomplishment in my eyes.


It doesn't, but that doesn't help us now.

See Flash.


Shit, I never thought I’d ever be a fan of any given programmer, but if one guy made both C# and Typescript I’m prepared to change my mind.


Several things happened at the same time:

1. At peak popularity, Borland products were easily available. Borland decided to turn to enterprise and raised the price considerably, so individuals and small companies started looking elsewhere. By the time they realised the mistake it was too late. In my opinion this was the biggest mistake.

2. Internet and Linux came, and with them Perl, PHP, Python and others. Borland missed the boat, and again by the time they realised that, it was too late.

3. Sun came with Java and Microsoft with C#, both seen as the future of the enterprise, and available for free or at very low cost. Java was extremely popular in the education sector, pushing out Pascal and other competitors. Both made Object Pascal obsolete.

So bad decisions and being late to the party. Also it was hard to compete with Microsoft in the long term.

As an unrelated sidenote, at a time when the world was turning towards agile, they were building and marketing software for managing waterfall projects. That just shows how disconnected they were from the reality of their customers.


> 1. At peak popularity, Borland products were easily available. Borland decided to turn to enterprise and raised the price considerably, so individuals and small companies started looking elsewhere. By the time they realised the mistake it was too late. In my opinion this was the biggest mistake.

Exactly this. Turbo C was the first real C compiler for DOS that I could afford, where "real" means that it supported the large memory model. Before then, the cheap compilers were always stripped down to only allow the small model (64K instructions/64K data). Turbo C was also downright fast, and it cost far less than what Microsoft wanted for their compiler. But once we actually started writing 32-bit applications, its day in the sun had already passed.


> Internet and Linux came, and with them Perl, PHP, Python and others. Borland missed the boat

Actually, they were looking at new (by then) technologies. C++ Builder was a reality in the late 90s, Kylix (essentially Delphi for Linux) was released in 2000; I attended one demonstration by Marco Cantù himself. The problem with Borland/Inprise/Embarcadero was that they didn't give a damn about small developers and wanted to play only in the enterprise world, an ill choice that backfired spectacularly. If they could exchange a few ideas with the developers behind Lazarus, and could help them in some way while guaranteeing it stays free, no strings attached, both worlds could benefit. Lazarus needs some more work and a name to become well known outside the world of nostalgic Delphi developers, and newer Delphi and related tools need a much cheaper product to attract developers in a long-ignored market segment.


> At peak popularity, Borland products were easily available. Borland decided to turn to enterprise and raised the price considerably, so individuals and small companies started looking elsewhere

There is a lot to be said for being easy to get and affordable. I gladly give IntelliJ a couple of hundred dollars a year, because $10-$20 a month for a superior developer experience is worth it to me.

If I was paying their enterprise rate, though ... not a chance.


On the other hand the "we're professional products, you'll eat the cost and like it" approach worked fine for Adobe.


That works as long as:

1: someone other than the user is footing the bill (bulk licensing)

2: there are no suitable alternatives

Adobe will fade (or be forced to change) over the next 20 years as different tools pick them apart one niche at a time. It's already happening: the Affinity suite, Procreate (and similar indie-focused tools), and DaVinci Resolve, among others already serve huge niches within the Adobe suite quite well. Capture One, which predates Lightroom, is getting better as it expands out of the portrait studio.

This is essentially what happened to Microsoft. Macs got good (for definitions of good that matter to non-enthusiasts), mobile swept up almost all casual computer usage, and the web took much of the rest by way of Linux and open source. It happened a niche at a time until the world outside was too big to fully EEE.


> On the other hand the "we're professional products, you'll eat the cost and like it" approach worked fine for Adobe.

I know several designers from various areas who are making serious attempts to free themselves from Adobe's bondage (in particular, the running costs of the forced subscriptions) and are looking for (and have partially found) alternatives for doing their work.


A staggering percentage of Adobe installations in the 90s were pirated, even in the US.


My understanding is that was their model.

If you let teenagers pirate your product, they're already familiar with it by the time they get hired. When they get hired, the cost to re-train them on something else is more expensive than the license (at least, in the short term).


I know. Pirated copies of Adobe Flash Creator created an entire generation of animators. A shame that no real worthy successor has appeared.


I won't defend Adobe, but Photoshop is (or was) free for students, and I recall a friend buying a licence for 10 dollars or so when I was studying (this was the licence only; you had to get the software "somehow"), so at least they are covering this ground. Maybe it's different now.


Yeah, this is it.

BTW, I'm still a moderator on https://www.clubdelphi.com (despite not working in Delphi anymore, like many there!) and I've seen how much mind-share Borland and its successors lost on the small-business/solo-developer side.

You can't imagine how much people in the official forums asked them not to ruin it and to add real features, and yet the things that got implemented were so out of touch.

Also, it's pretty similar to how MS ruined Fox/VB (except they didn't even sell it off after): burn the goodwill at the low end... in the hope people move to costly products (SQL Server, for example)... and then lose those people to the rise of open source.


The Delphi version we used cost $3000 per developer ~ 2003-2005.


Death by bean counting and enterprise bs, that's what happened.

I spent 13 years writing Object Pascal in Delphi full time, starting from Delphi 4 and ending right about when Embarcadero entered the scene.

They seriously dropped the ball by focusing on ticking enterprise boxes and charging insane amounts for it rather than evolving and fixing stuff.


Exactly this.

Pascal was a great language, and I wonder how much we've lost by having it go by the wayside. It was strongly-typed, easy to read, cross-platform, produced native executables, and was lightning fast to compile and execute.

I say "was" a great language because it isn't widely used today. I miss it.


Also it had sane strings. Strings with sizes, without the whole null-terminated madness of C which still haunts us nowadays.
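
For reference, this is the classic Turbo Pascal ShortString scheme, where byte 0 of the buffer holds the length (a minimal sketch; compiles with Free Pascal):

    program StrDemo;
    var
      S: ShortString;            { up to 255 chars; the length lives in S[0] }
    begin
      S := 'Borland';
      WriteLn(Length(S));        { O(1): reads the stored count, no scan for a NUL }
      WriteLn(Ord(S[0]));        { the same number: the length byte itself }
    end.

The 255-character ceiling that prefix imposes is exactly what a sibling comment below complains about; later Delphi AnsiStrings moved the length into a hidden header to lift it.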


Languages 10 years older than C had proper strings. In a certain sense, there are some design decisions common to Go and C regarding the adoption of common features.


> Also it had sane strings. Strings with sizes

Except they were prefixes on the buffer, which was bad and not sane, and greatly limited the size of the original Pascal strings.

And technically they were UCSD strings, not standard Pascal, and other implementations used e.g. padded strings.


"Sane" strings that may waste 4 bytes of length field for unnecessary count, or may have a 1 or 2 bytes length field that proves to be insufficient. Or "sane" strings that alloc and dealloc and refcount like mad, bringing the application to a stall. "Sane" strings that discourage the developer from just coming up with a simple memory management scheme that fits the situation at hand.

"Sane" strings that lead to incredible bloat and incompatibility, because there is no one true "sane" string type, so every module that doesn't know better forces their own way onto the user.


If the length field is 4 bytes, then only 3 bytes are "wasted" compared to C with its null-terminated strings and 1-byte chars. The difference drops if you have wider character types. Not to mention the time saved by not having to scan every string to determine its length.

I always find it weird when people fret about bytes but not cycles, especially cycles that have to be spent waiting for memory reads.


You are thinking as a developer in the 2020s, not one in the 1990s (or earlier). Memory was incredibly precious: 16-bit x86 had 64KB segments, so if your data didn't fit in one, everything got a lot slower. People used nibbles (4 bits) because the extra bit-twiddling instructions were worth the cost.

Basically, no sane programmer in the 90s would be happy with a string type that wasted three bytes per object.


I think one of the other factors (I fear calling it "mitigating") is that fixed-width strings were a lot more common back in the day. Outside of serialization they're pretty much gone now, but we can see their mark in various oddball "string" functions of the C standard library which were never designed to operate on "C strings" (though string.h also has a lot of functions which are just plain garbage with no redeeming features).

For instance strncmp and (especially) strncpy make very little sense with C strings, but make sense for NUL-padded strings.


CPU cycles were also incredibly precious. It's a tradeoff. In the 80s and 90s you also had smaller caches so iterating over a string to determine its length was more expensive (more likely to hit RAM) than "just" reading its length parameter and carrying on with your life.


Sure everything was more expensive, but not by the same factor. Main memory was smaller but also relatively faster compared to CPU. Search for "386 simm memory" and you'll see 60ns modules. Considering that 386 debuted with 12MHz clock, 60 ns is faster than one CPU clock cycle!

In other words, "reading the whole string from memory" could be a performance problem, but a less serious problem for machines of those days, compared to using a few more bytes to store the length.


> Basically, no sane programmer in the 90s would be happy with a string type that wasted three bytes per object.

In general, maybe, but we are talking about Borland here, so business logic apps mostly. String size is not a problem there.


> Basically, no sane programmer in the 90s would be happy with a string type that wasted three bytes per object.

Though it was eventually eclipsed by C/C++/Objective-C on Apple platforms, I believe Pascal was the original application programming language for the Apple Lisa and Macintosh, and produced some revolutionary software in the 1980s.

Object Pascal/Delphi certainly enjoyed a fair amount of success in the 1990s.


Yet JOVIAL, NEWP, PL/I, PL/S, PL.8 among other Algol dialects managed it.


> JOVIAL, NEWP, PL/I, PL/S, PL.8 among other Algol dialects managed it.

… with all of them having been developed and run on 32-bit IBM mainframes and other big iron with «lots» of memory (e.g. 512 kB of RAM would have been considered huge in the early 70s).

C, on the other hand, was developed with the limitations of early PDP-11s in mind, which were often equipped with 56kB of RAM, so null-terminated strings in C were a rationalised design decision and/or a trade-off. Besides, both UNIX and C started out as research and professional hobby projects, not fully fledged commercial products.

Since internetworking was nonexistent slightly less than entirely, remote code execution stemming from buffer overruns was not an issue, either.


You can start with the IBM 704 used for Fortran and Lisp in 1954, the TT 465L Strategic Air Command Control System in 1960, the B5000 in 1961, and the CDC 6600 in 1964, and then compare with the capabilities of a 1964 PDP-7.

Read the DoD security assessment on Multics, https://multicians.org/b2.html

Afterwards you can read Dennis's own words,

> Although we entertained occasional thoughts about implementing one of the major languages of the time like Fortran, PL/I, or Algol 68, such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for. All these languages influenced our work, but it was more fun to do things on our own.

Taken from https://www.bell-labs.com/usr/dmr/www/chist.html

So thanks to their fun, the world now suffers from C strings.


> Although we entertained occasional thoughts about implementing one of the major languages of the time like Fortran, PL/I, or Algol 68, such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for […]

Precisely my point. The definition of fun is up for a personal interpretation.


Scanning for the string length (e.g. strlen()) is asymptotically worse than reading a fixed-size integer, so obviously don't do that unless it's a good memory/speed tradeoff (i.e. when you know the string is at most, say, 16 bytes long).

Overall, it seems you didn't read my comment either. Or was I _that_ unclear?


Obviously C-style strings can still remain an option where they are needed, but in most cases using the 4 bytes for a length field is a sane default. How many buffer overflow attacks have been enabled by that four byte savings over the years?

> "Sane" strings that discourage the developer from just coming up with a simple memory management scheme that fits the situation at hand.

"Sane" compilers that discourage the developer from considering the machine level instructions. It's turtles all the way down.

There's a reason that Python is so popular and it's not performance


This is not Python so that's a strawman.


By that I meant that there is obviously value in abstracting away tasks and details that the programmer would otherwise need to manage. That is why compilers exist. The value-add for abstracting a string is certainly more than the cost of 4 bytes of memory in the typical case.

Put another way: optimizing the management of strings in memory is almost never the best use of time to make progress toward an organization's objectives, and doubly so when that kind of micro-tuning can actually introduce security risks


This never, ever bit me in the Pascal days. I suspect this was primarily because the stack I was using was either "provided by the Borland Pascal standard library" pieces, or it was my own Pascal or assembler code.

I had a limited number of calls into a library and a need to do a few things that escape me with regard to interacting with -- I think -- a 16550 UART[0] and its driver, but I don't recall them being particularly nasty to deal with. I mean, all things relative -- I was expecting these to be nasty to deal with because they often involved inline assembler, so the problem of "making it behave with the string" wasn't quite as pressing as "what the hell am I actually doing here?" :)

[0] My huge project was a bulletin board system in the 90s.


I hardly consider four bytes to track the length of a string "waste".

I also don't really know why you assume that "sane" means "doesn't let you manage memory effectively".

That said, how many applications really are bottlenecked by string processing in the first place? I don't care if processing Unicode graphemes is slow, as long as it's correct and doesn't mangle users' names.


Well, I for one optimized an authorization module of an enterprise application written in Delphi by getting rid of standard library strings. Speedups were 100x-1000x, accelerating application startup from minutes to maybe 3 seconds.


He is probably thinking about the context of the 1980s, when a large amount of 3-byte waste (the 1-byte null char is a form of 'waste' itself) might have been a problem.


Packed structs with char fields of 8 bytes or so are still common.


I don't know the details, but I doubt that 4 bytes would be used for the length of a string in the 90s. 2 bytes would allow 64k characters, which is surely more than you'd need for the average string.

And surely if you malloc(..) a block of memory in C and store a string in it, the memory allocation system is going to store how large the block is anyway (even if it isn't visible to either C or Pascal). I know not all strings will be malloc'd, but a lot will be. And we seemed to deal with this overhead fine in the 90s?


That you have to weigh 2- vs. 4-byte lengths is exactly my point. There is also a case for 1-byte lengths, and for string implementations other than length + pointer, like rope data structure implementations. My point is that there is not one sensible string implementation, and acting like there is, in many situations, trades short-term convenience for long-term pain in larger projects.

> I know not all strings will be malloc'd but a lot will be. And we seemed to deal with this overhead fine in the 90s?

It's a common but deeply flawed assumption that allocations and lifetimes are so random that every little object should be individually allocated and later deallocated with malloc()/free() or with another generic allocator. I don't use malloc() to allocate string buffers - except in rare situations of laziness, knowing it will come back to bite me later. Not only performance concerns but also the practical impossibility of matching each malloc() with a free() forbids that. Systems like RAII come to help solve the latter issue, but I prefer to take the difficulty of matching everything up as an indication that the general approach is too complicated.

Instead I recommend allocating strings using a fixed field (eg. char buf[16]) on a stack frame, or using a member in a struct, in order not to add any management overhead. Alternatively, for unbounded and dynamically sized strings, it's often a good idea to allocate them using linear allocators. For example, in a GUI renderer, it's a good idea to have a per-frame allocator that collects all allocations in a list of larger chunks, to minimize the allocation overhead to a few chunks (KBs to MBs) that were individually allocated from the system. With this, everything is freed using a single central function call after the frame was rendered and the data is no longer needed.
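
A minimal sketch of such a per-frame linear allocator, in Pascal to match the thread (all names invented; compiles with Free Pascal; no overflow checks, for brevity):

    type
      TArena = record
        Buf: array[0..$FFFF] of Byte;   { one fixed chunk; a real one grows a chunk list }
        Used: SizeInt;
      end;

    { Hand out the next Size bytes; no per-allocation header, no individual free. }
    function ArenaAlloc(var A: TArena; Size: SizeInt): Pointer;
    begin
      Result := @A.Buf[A.Used];
      Inc(A.Used, Size);
    end;

    { The single central call after the frame is rendered: everything gone at once. }
    procedure ArenaReset(var A: TArena);
    begin
      A.Used := 0;
    end;

Allocation is a pointer bump, deallocation is one assignment, and nothing has to be matched up.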


How is iterating over the string each time you want to do anything meaningful better? Also, there is the short string optimization, where you can store the string inside the pointer; e.g. C++ does just that.


You didn't read my comment right. My statement is there isn't one true string type. I didn't say you shouldn't use a length field.

Zero terminated strings still make some sense of course - ease of reading when looking at byte level representation, and moderate cost savings in packed structs (4, 8, or 16 byte strings). The former is why I zero terminate by default where possible, even when using a separate length field stored somewhere else (almost always).


The thing is, not every developer wants to care about memory management. A lot of us just want to solve user problems, and we don’t mind too much if we spend a couple of extra bytes to do so.


It's not primarily about the bytes. Heap allocating RAII style strings can absolutely kill performance. And they are baaaad for modularity. It's all in my OP, why do I even repeat?


> strongly-typed, easy to read, cross-platform, produced native executables, and was lightning fast to compile and execute.

Sounds like Go


Except that Go is for command line programs and backend code; its standard library doesn't include a cross-platform GUI API for desktop applications.


What was the stdlib's cross platform GUI like in Pascal?


Probably could have been awesome, the Windows flavor certainly was; but they dropped Kylix before it got a chance to go anywhere.


You might want to look at Delphi's VLC and FireMonkey. With Free Pascal/Lazarus, they have the LCL (open source equivalent to VLC).

Free Pascal/Lazarus has no problem with Linux or macOS. It was Delphi that stumbled around for some years on this, but it presently can be installed on both. Of course, Delphi costs the big bucks.


VCL, not VLC. Visual Component Library.


LOL, you are right, such a silly typo. I was actually playing around with VLC player at that time, so must have got "crossed wires".


I usually get it wrong the other way around instead.


On the other hand, it was verbose, and most of the original Pascal ideas were unsound and were replaced in Object Pascal with ones coming from C.

"Break loop" being a function rather than keyword is serious PHP land.

It had better OO than C++, though. I admit it, they managed to have a sane and compact statically checked compilable OO.


Seems like you are a bit mistaken, as Object Pascal borrowed very little from C, nor did it need to. Object Pascal is just an extension of Pascal. Objects and Classes are just another type, just like Records. The relationship between Pascal and Object Pascal is extremely close, far more than say between C and C++. Which is why many get confused, thinking there is such a similar gap.

OO in Object Pascal is a lot more sane than in C++, because the concept of an Object was more aligned to the existing Record type. It would be something like taking C and empowering its structs with methods. There wasn't an attempt to make radical changes, but simply extend the language with OO. Where with C++, in comparison to C, much more fundamental changes were made that created more conflicts in how the languages are used.
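
Concretely, the `object` type that Turbo Pascal 5.5 introduced was just that: a record that carries its own procedures (a minimal sketch):

    type
      TPoint = object
        X, Y: Integer;
        procedure MoveBy(DX, DY: Integer);
      end;

    procedure TPoint.MoveBy(DX, DY: Integer);
    begin
      Inc(X, DX);
      Inc(Y, DY);
    end;

Nothing about the surrounding language had to change for this to work, which is the contrast with C++ being drawn above.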


I used Pascal between 1992 and 2000; one thing I missed a lot was dynamic lists and dictionaries. It was so hard to develop everything with pointers and allocations, so fragile.
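
For those who never had to, a minimal sketch of the manual pointer plumbing being described (compiles with Free Pascal): every node is allocated and disposed by hand, and one missed Dispose is a leak.

    program ListDemo;
    type
      PNode = ^TNode;
      TNode = record
        Value: Integer;
        Next: PNode;
      end;
    var
      Head, N: PNode;
    begin
      New(N);                    { allocate each node yourself... }
      N^.Value := 42;
      N^.Next := nil;
      Head := N;
      while Head <> nil do       { ...and walk the list to free it yourself }
      begin
        N := Head;
        Head := Head^.Next;
        Dispose(N);
      end;
    end.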


At least in Germany it is still used enough to have a presence in some magazines, and there is a yearly conference.


Yep, they brutally tried to trade community goodwill for enterprise-y contracts and quickly lost both.


I was under the impression that a lot of the original team was hired by MS, and that was the end of the Borland that created Delphi.


I mean, they even had their own CORBA broker!


Unpopular opinion: CORBA was underrated. For its time, CORBA was a decent way of stitching together heterogeneous systems. Tediously hand-coding REST APIs was a big step backwards, and gRPC is really just catching up to where we were before.


I remember learning about CORBA in undergrad too. Then a few months into my first job, "new guy, go do this integration with SOAP/WSDL." Crushed me so hard I went into sales.


The 'never again' part of working with SOAP for me was that Microsoft's and Java's SOAP libraries were more or less incompatible, for a variety of reasons.

I could handle the XML madness, but at least make sure the standard is a standard.

I'm pretty happy with REST APIs on that front. I can always make them work.


Yeah I remember those happy days.

There were some configuration behaviours on the standard that had opposite defaults.


And there's Cap'n Proto, which is essentially a reinvention of CORBA.

Unlike gRPC, where RPC calls are just functions that take pure data arguments and return pure data structs, Cap'n Proto allows RPCs to return references to objects. The client can hold onto the reference and call methods on it, which invoke RPC calls, while the client and server runtimes keep track of what the references refer to. So you can treat remote objects as if they're local to your process.

This "location transparency" feature is at the heart of CORBA, and later, Microsoft DCOM and Java RMI. I've never used Cap'n Proto, but with CORBA/DCOM/RMI, this is a really powerful feature which allows you to work with APIs as if they're just in-process libraries. The downside is that if you pretend there's no network overhead, you might end up designing very inefficient applications, with each method call becoming a network roundtrip. It also means a client can, if you're not careful, "hog" a remote object and keep it from being deallocated, resulting in leaks or excessive memory usage.

Basic DCE-style RPC like gRPC is simpler and has more predictable performance, since you're forced to consider that you're talking to a remote API, just like REST.


Having worked with some of those; I'm going to say thanks, but no thanks.

Hiding network access and pretending it's all local is a recipe for disaster.

If you're writing a networked application, that's right up there on the list of things you need to focus on and never forget.


Thank you!! I did my undergrad thesis around CORBA (showing my age…) and you are spot on that the whole world proceeded to pile on a whole lot of work in order to make everything run over http.


Agreed, but unfortunately so is fashion in this industry.


The Pascal community has to be thankful for the creation of the open source Free Pascal/Lazarus project and, to a degree, the Russian-created PascalABC. There are some others, but these are the ones still here and viable. The open source Object Pascal projects are what kept the language alive.

Delphi/Embarcadero went all out chasing enterprise dollars. The prices they charge burn holes in the pockets of any little guys. Only in the last few years have they come to their senses, with the free Community Edition and a renewed focus on making school/academic licenses easier to get and more accessible.


Embarcadero was a version of Delphi?

I googled it, but it wasn't clear if it was a buy-out, a rebrand, or something else.


from wikipedia, "On May 7, 2008 Borland Software Corporation announced that its software development tools division, CodeGear, was to be sold to Embarcadero Technologies for an expected $23 million price and $7 million in CodeGear accounts receivables retained by Borland.[5] The acquisition closed on June 30, 2008 for approximately $24.5 million.[6]"


I was in the Borland Languages division as a test manager and staff metrics engineer until I was recruited away at the start of 1995. At that time we were beginning to struggle to compete with Microsoft.

We had been winning for a long time because we were close-knit and highly motivated. We were scrappy competitors. The first real blow was that we moved into a new office complex that literally forced the team to sit separately instead of close to each other. That was a big hit on morale and productivity.

Then Microsoft seemed to get its shit together and spent a lot of money on R&D. A lot more than they could have been getting in as revenue. Just after I left, this forced Borland into an unsustainable cadence of delivery. Soon afterward Microsoft just started wildly hiring our best people.

The thing is, for the period I worked at Borland, I instigated or participated in many of the innovations in process and testing that even today are coming as a surprise to people I teach… We had a fantastic team! Jothy Rosenberg, for instance, whom you can find on LinkedIn as founder of his Nth successful company, was my counterpart in development. He’s probably the most gifted technical leader I have encountered in my whole career.


I checked out Jothy Rosenberg. Deeply moved by his perseverance, will power and technical sophistication. Wish I knew him in person.


I can think of several things that may have contributed. In no particular order.

* Their application business lost out to Microsoft many times over the years. (Sidekick, Quattro, Sprint, etc.)

* The availability of free and open source development tools went way up. (This undermined the ability to make money selling development tools, even as they became more expensive to develop.)

* They lost Anders Hejlsberg to Microsoft. (His Microsoft resume is a testimonial to his skills, technical and otherwise, but prior to that he was the driving force for the Turbo Pascal line through Delphi. They did diversify, but Turbo Pascal really was Borland's core asset.)

* Developer mindshare pivoted away from client apps to web apps.


It was also a very distinctly DOS/Windows-first product family. The Microsoft OS's slide to irrelevance sealed their fate.


> Microsoft OS slide to irrelevance

I wouldn't call 75% of desktop share "irrelevant". Tablets are still 20 times less common than PCs. *

* data from https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...


But people don’t build desktop applications that much anymore. Windows, macOS, and Linux as GUI platforms have all faded with the Web and smartphones.

It was a huge downgrade in interactive functionality at the time, but the Web has slowly built up over the past two decades so it is almost as functional.


> But people don’t build desktop applications that much anymore.

This might be outside of your echo chamber, but lots of companies still very actively build desktop applications (also completely new ones) for their own line-of-business purposes.


Agree, I should have specified “for the mass market”.

Totally different inside the enterprise, one reason Borland & Embarcadero focused on it.



What happened is that the web replaced most native desktop apps, not mobile/tablet UIs. There is a reason why web devs are still the most common type of dev, despite mobile devices being by far the most common type of device used today.


Doesn't matter what you run your browser on. The lion's share of the job market for developers is distinctly not Windows native applications.


I specifically recall Microsoft engineers, circa 1990, driving up to Borland headquarters in Scotts Valley en masse in expensive cars and inviting engineers out for lunch "on them". It was a calculated PR and intimidation move by someone at Microsoft to acquire talent and to destroy Borland. Borland and its quirky leader were well regarded by many engineers in Silicon Valley, while Microsoft was busy making a reputation as a scorched-earth competitor. It seemed that winning a market segment was not enough for Microsoft; based on this event, there was a clear intent to destroy competing companies, and Borland was near the top of the list.


Is there something wrong with hiring engineers from a competitor? Should they have had a no-poaching agreement like Apple and Google had a few years ago? That was deemed illegal if I remember correctly.


In and of itself no, but courts have ruled that there's a distinction between hiring people to make use of their talent vs. hiring people to deny said talent to the competition.

If Microsoft was targeting Borland employees because it wanted to make use of their talent for its own products, that's legitimate. However, if Microsoft was hiring Borland employees for the purpose of keeping those employees from working at Borland, that's potentially predatory and violates antitrust laws.

Note that predatory hiring isn't even in the employees' best interest. Much like with predatory pricing, once Borland goes out of business as a result of the practice, Microsoft is unlikely to keep retaining many of those employees or paying lucrative salaries, and the overall pool of talent, as well as salaries, is likely to shrink in the long run.

It's the nature of predatory actions that they hurt the actor in the short run but benefit them in the long run whereas the public gains in the short run but is damaged in the long run.


Hmmm.. this seems like a very grey area in hiring practices and competition. Almost impossible to do anything about it… sucks for small companies.


Back in the late 80's or so, Borland did a lot of direct mail advertising that would start out with "Dear Friend XXXX", where XXXX was the target's name. I was in Philippe Kahn's office once talking with him, and noticed that tacked to the wall was one of those letters. Across it was written in heavy red ink:

"Dear Phillipe, I am not your fucking friend. Got it, Phil baby?"

How could you not like a man who'd stick that on his office wall?


I did learn one technique from this. I finally managed to put a stop to Capital One sending me credit card offers in the mail practically every day. I'd write a profanity-laced diatribe over the offer letter in heavy red crayon and mail it back in their business reply envelope. It took several tries, but the Capital One junk mail finally ceased.


I sorta had the feeling he fell victim to "cult of personality" eventually. Everybody telling him he was so great he lost touch with reality. Or was he never actually connected in the first place?


I didn't know him well enough to know if he emanated a "Reality Distortion Field" like Jobs and Holmes did.

But he was very charismatic.


I checked out this guy's resume; even after Borland he has sold two companies for $300M each. A fantastic engineer and entrepreneur, I think.


I still use Delphi (it is being actively developed) for my Windows desktop products, and Lazarus (Delphi's open-source clone) for Windows/Linux desktop clients.

I do not think anything comes close to its practical feature set, ease of use, power, and long-term stability for GUI creation. Well, Qt does, but at what expense.

Meanwhile, HTML/JavaScript-based frontends are a pitiful clusterfuck comparatively. Modern computers have more than enough power for an HTML/JavaScript front end to match the power of Delphi. Why oh why did web tool creators have to come up with abominations like React instead? The end result is that with Delphi the tool works for you; with the popular web GUI frameworks, it is the other way around.


Have you checked out Jon Lennart Aasenden's compile-to-JavaScript Object Pascal IDE/compiler?

I don't know how far he's got with it, but it might be worth looking up.


There are actually a few Pascal-to-JavaScript transpilers. Free Pascal/Lazarus has one, and then there is Smart Pascal, which comes from DWScript. For that matter, there are some other languages that have transpilers to JavaScript or WebAssembly.

Aasenden's new transpiler and IDE is a commercial product. It's supposedly not just a transpiler for Object Pascal, but for a few other languages too. The problem that I see with that route is that if you make an HTML5 app with it, you're still going to need Cordova/PhoneGap to make it run on the various OSes.

If you are going to make people pay, I'd rather see that finishing piece of the puzzle included too, because JavaScript and the various dialects of Object Pascal are free.


It is not practical for me personally, even if it is perfect, for two reasons:

1) Unless it comes with a completely ported VCL, I do not see how it would be useful.

2) Doing it for the language's sake - I am not a language zealot and am fine using whatever suits me better for a particular situation. JS as the front end for my servers works fine for me. For example, even though Delphi/Lazarus is fine for developing servers, I prefer doing it in C++. The backend libraries and performance are superior in my opinion.


1) It pretty much does, AIUI.

2) "Doing it for the language's sake" -- Not necessarily the language itself, but the clarity it affords you. I could never quite get my head around the "half-typed, half-not; half-OO, half-not" nature of JavaScript; it just feels so... messy, to me. Any other language that avoids those pitfalls as well as Pascal does would suit me equally well. (Hm, what does that leave? BASIC, C, COBOL, and FORTRAN...?)

2 B) "C++, superior backend libraries and performance" -- Maybe... but not by all that much, AIUI. Compensated for by the clarity and simplicity (leading to better safety and productivity) on Pascal's side, IMO.


I'm starting a Delphi job in the new year (engineering software), and it's good to see it's still around. I'm looking forward to no more HTML :-)


The answer is well-documented in Merrill R. Chapman’s book “In Search of Stupidity: Over Twenty Years of High Tech Marketing Disasters”

Terrible title but great book.

Basically Microsoft dominated every software category by waiting for the #1 company to make a dumb mistake. They then swooped in and won.

This book is an excellent but biased history of that era.


After Anders' departure, they created a promising project to bring Delphi to Linux (what became Kylix), and it was making progress.

At the time Borland had sued Microsoft over some big IP feud.

Then suddenly an arrangement was made. Delphi for Linux was rushed, released unfinished, and flopped. Borland got $30M and access to all the .NET documentation. But Delphi.NET was never very popular, because it was never as good as VS.NET.

.NET modules were added to native Windows Delphi and slowed down the IDE.

The IDE price skyrocketed, users fled, and after a dance of acquisitions Delphi is owned by IDERA... not sure how it's doing now, because they closed the developers' forums years ago.


Borland fell for Microsoft's OS/2 fiasco, and devoted a lot of energy to porting products there, to no economic benefit.

At that time I was using Borland C++, and the quality really went downhill. Errors that never should have gotten past QA. Either they weren't checking, or they were intentionally shipping with killer bugs.

One release, the license said you couldn't use it to make a long list of products that would compete against Borland, like spreadsheets or databases. They rolled back that provision a little later.

Then came releases every couple of months.

I just gave up on them; they had burned the tremendous goodwill originally generated by Turbo Pascal, which, it should be mentioned, was originally created by Anders Hejlsberg.


OS/2 was not a problem for Borland? Why? Because IBM paid us to do that! We didn’t lose money on that. Or not MUCH money, anyway.

The license agreement thing was a failure of imagination. The project team didn’t ever think we needed to review legal language. But after that fiasco we made sure that we did. I tell that story in my classes.

You are right about slipping quality. It was just an unsustainable rate of development. I was ringing alarm bells inside the team, and management was hearing them, but they said “look, either we ship or we start laying off the team.”

So you are mostly right.


Philippe Kahn seemed bitter over OS/2. Perhaps because it used up the company's intellectual resources when they could have been better used elsewhere, money from IBM or no money. But that's just a guess as to why.


I don’t know if any other part of the company worked on OS/2. But in the languages group it was one of those death march projects that helped forge the team into a better working unit. The project was bad but I missed the people I worked with when it was over.

And like I said, IBM footed a lot of the bill for it.

I think the secret to Borland’s fall is mostly that we had to earn our way forward, whereas Microsoft could write huge checks to itself. It’s like we were playing a video game against someone using cheats.


> Microsoft . . . huge checks

E.g., any company smaller than MSFT would have been destroyed by the endless Word For Windows effort.


This story is mid-80s, not later on. I was at Analytica, which made Reflex. It debuted in 1985 and sank, to good notices and dismal sales.

Borland bought the company for less than the VCs had invested, cut the price to $99, advertised it, sold it by mail order, and made it a hit for a couple years.

Several of the key execs at Analytica went to Borland and then to Microsoft; some of them are fairly famous now. I don't know anything about the limos or the lawsuits.

I heard Kahn talk in early 1985, and the Analytica founders made fun of him for selling his product through mail order, when everyone knew you had to go through BusinessLand and ComputerLand.


I started software dev in the mid-2010s and had never heard of Borland, until a former colleague mentioned it having great built-in debugging tools and said they missed it.

The same colleague had some vocal criticism of `gdb` as a debugging tool, and of the state of Linux-based debugging tools as a whole, claiming that Borland's were much better and that Visual Studio (not VS Code) is one of the few development environments with a quality debugger.

I'm not sure how fair that assessment is; I've found `gdb` to be a helpful tool, though I've never used Visual Studio.


I'd say Delphi was one notch better than Visual Studio. The Pascal language was more highly optimized and easier to parse, so the debugger was much snappier about popping up the tooltips to show you values of variables, or letting you hit F1 to jump into the help pages. I remember being especially disappointed with VS because it would spend hundreds of MB of your precious disk space installing the entire Microsoft Knowledge Base, and then hitting F1 on a code statement would bring up a mishmash of Visual Basic examples (while developing C++) for almost, but not quite, the right class.

In Delphi, it always knew exactly what class you were dealing with and would open the correct help page before your finger was fully off the function key. The help pages had been expertly written to show you the most important details first, and were easy to browse. Really, I'm surprised I don't see more people reminiscing about those help files. There was probably as much effort put into them as into the frameworks or compiler or IDE.


Agreed. The help pages were really top notch. Other software's help pages were basically useless, to the point that no one read them. In Delphi you had not only to-the-point explanations but also examples that frequently solved the very problem one had in the first place. Sadly, as someone pointed out, most of this art seems to have been forgotten with time.


You could say the only fault with those help pages is that they were too good: some time after v5 they stopped including those lovely paper manuals, probably because they figured the help pages were good enough to replace them.


You should try the Visual Studio debugger one day. It's kinda the yardstick for graphical debuggers.

gdb is a fine tool, but I think the VS debugger is reasonably described as "next level".

Many people don't know about Microsoft's other debugger, WinDbg. It's actually more capable than the VS debugger, but the UI is closer to that of gdb.


I would be keen to try, but AFAIK there's no way to do it on Linux, which is my main dev/deploy environment.

It feels like Linux debugging is stuck in a vicious cycle: since few people are putting capital into a decent debugger UI, few people are using debugging UIs (and are thus stuck with printf debugging, or the gdb CLI). Folks might not realise how much better it could be.


It’s night and day. The debuggers in those two tools were very easy to use, very visual, but surprisingly powerful.

I cry a little on the inside when I see developers using Visual Studio and resorting to printf statements (or the equivalent) because they’ve never even tried to use the debugger, ever.


"Apparently, JBuiler used to be their #1 cash cow. Well, until Eclipse came along: within 18 months, JBuilder license sales dropped to essentially zero." quoting from a previous employee.


It might have saved them to go with the JetBrains model of a free "Community" edition along with paid supported options but that's probably not a jump that the uninspired management would have been willing to make.


They did, later... Too late :-(


We used JBuilder and it was a great product, simpler to use than Eclipse, but it's hard to compete with free and good enough.


Yet JetBrains survives.


Which is fascinating to watch, as it is quite uncommon in today's world.


That's an understatement. JetBrains has shown strong, continual growth.


Yeah, as a kid I remember staring at the JBuilder box in a "computer store". I wanted it so bad, but never had the money.


For me, the only thing I truly remember about Borland is Philippe Kahn (https://en.wikipedia.org/wiki/Philippe_Kahn). Kahn is a giant: supreme intelligence, accomplished musician, worldwide competitive sailor. He is also a big and tall man, and the person who sent the first photo from a cell phone. I first met him in '95 when I invited him to speak at a Wharton meeting. His vision of the internet blew me away. Shortly after meeting with him I quit my job to start a dotcom company. Like a lot of companies run by founders, Borland had a face; for me it was Kahn.


As far as I can tell, Philippe Kahn was the reason Borland failed.


I have heard a different story (but from Kahn's side): that the board did not back him up when he presented his internet strategy. He left after that, arguing that the board did not have the vision.


Kahn was a poor visionary overall. He didn't have a vision of the web at all. His vision, articulated to us in '93 and '94, was "client/server." Close, but really not the same thing.

He drove constant reorgs that followed the pattern of centralize, then decentralize, then centralize, then decentralize… He bought Ashton-Tate, which was a big mistake.

I don't think Kahn was good at leading a big company, and it seemed to me and my friends there at the time that he was floundering.


> [Philippe Kahn] bought Ashton-Tate, which was a big mistake.

Was it really? It brought in dBase, to make simple desktop (as opposed to client-server) database apps super-easy to develop in Delphi, and IIRC InterBase, to make client-server DB apps easy and cheap to deploy (thanks to the license terms of the CS version) too. I always thought that had they stuck to that, and not veered into .NET and that whole weird ALM software pivot, they could have done much better.

I don't know how much you are at liberty to tell, but if there is anything more you can say about it: why was the Ashton-Tate acquisition a "big mistake"?


Man, this reminds me of how much I loved Borland's OWL in school. When I entered the workforce, I was forced to use MFC and I was probably the worst kind of coworker to be around at that time.


The Turbo Vision libraries for DOS were an absolute delight. And I clearly remember a Turbo Pascal demo with a breakout game clone that was a lesson in object-oriented development.


I made a good bit of money in undergrad and grad school building software in Turbo Vision in the early 90s.

The biggest project was writing all the software included with this college textbook: https://www.betterworldbooks.com/product/detail/quality-cont....

I found out later that the software in that book ended up being used in industry, including Jim Beam (the whiskey brand) using it in their distillery.


Boy, I'd forgotten about OWL; I loved that. I remember the reworked version, without the custom C++ language extension for event handlers, wasn't quite as magical.


They threw the amateur scene overboard and changed the price from $100 to $3000 per seat. People moved to cheaper options. When companies needed new people they couldn't find anyone with Delphi experience, so they were forced to switch. Now nobody uses Delphi and it is impossible to find people who know it.


I would say that is an oversimplification. People use Delphi, but it's not aimed at hobbyists, independent developers, or arguably even small businesses. Embarcadero (owner of Delphi) goes for the deep pockets and the enterprise. The very obvious problem with that tactic is that you dry up your grassroots. Which was a problem for some years.

To compensate, Embarcadero now leans on its free Community Edition of Delphi and cheap school/academic licenses to create a developer talent pool. The other quiet thing they do is lean on the open source Free Pascal/Lazarus to help build an Object Pascal community and more interest.

For those that don't know, Delphi is a dialect of Object Pascal. Lazarus allows importing of Delphi projects, and is a pretty similar dialect, though there are differences.

Another element of the game played by Embarcadero (and Borland before it) is that a lot of people don't realize they have a foot in the C++ world too, with their C++ Builder and the open-source Dev-C++. They have other products from which they get revenue, too, or which get them attention.


> Now nobody uses Delphi and it is impossible to find people who knows it.

Hey, if you need somebody who knows Delphi, I do.


I don't know what happened to Borland.

In 1982-ish in Germany I was programming my Apple II in Applesoft Basic and UCSD Pascal. UCSD was 3 floppies; I had a 2-floppy system, so for certain steps one had to physically swap floppies.

I attended an Apple User Group in Frankfurt, and somebody demoed Turbo Pascal 1.0 on their Apple with the Z80 add-in soft card under CP/M. Everybody was amazed by the speed and integration. I bought a copy on the spot and received it maybe 3 months later, as it had to be shipped from the US. By that time it was on version 2.0. I had bought the Z80 card in the meantime, and switched all software development to Turbo Pascal.


I really don't know what happened, but Borland C++ 3.1 was the very best IDE I have ever used; no amount of Emacs can replace it in my heart.

Delphi was super good as well.

We have only gone backwards since those days…


Hey, apparently someone made a DOSBox distro to play with Borland C++: https://developerinsider.co/download-turbo-c-for-windows-7-8...


I'm wondering if it's possible to retain the editor but use a modern toolchain such as GCC? I guess it's definitely possible, but how can I achieve that from WITHIN the editor?


That's the whole issue: people mentioned Borland C++ 3 being nicer than Emacs - until you need to tweak it :)


Agreed, we need an editor that is as flexible as possible. From that perspective VSCode strikes a pretty good middle ground. It's programmable (albeit in TypeScript, which I don't like) and has tons of plugins, but it also has nice GUIs and such.


We used 3.1 as our internal benchmark of quality.



Similarly, I'm trying to find high-quality retrospectives on Pascal vs C. (I'm still trying to understand today if there is good reason [other than Lazarus and GUI programming] to use FreePascal over C or C++.) Any links you know of are welcome!


The main issue is (and always has been, in my opinion) that "Standard" Pascal sucks in that it doesn't have standard libraries, and thus all the implementations were just a little too different from each other.

Delphi (and Lazarus/Free Pascal) have a largely compatible and sane set of libraries that include quite sane string handling (you never allocate or deallocate strings, they have a length, and they can contain nulls without issue).

The latest versions include generics, so you can make lists, trees, etc. of your defined record type with little to no trouble (a quick sketch below).

Separate compilation of units means that I almost never wait for a compile... it is always sub-second from hitting F9 to seeing the thing run.

The two-way tools for editing forms are among the most productive I've ever seen. Borland C++ built a lot of boilerplate to make up for C++'s impedance mismatch with their VCL libraries, which you couldn't tinker with or things broke. Delphi/Lazarus don't have that issue.

Lazarus doesn't support Git integration directly yet, as far as I know. If you are doing personal projects, there's no reason not to use it.
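
To illustrate the strings-plus-generics point, a minimal sketch, assuming FPC 3.x (or a recent Delphi), where the Generics.Collections unit ships with the compiler:

    program GenericsDemo;
    {$mode delphi}

    uses
      Generics.Collections;  // Delphi-compatible generic containers in FPC 3.x

    var
      Names: TList<string>;  // a typed list: no pointer casts needed
      S: string;
    begin
      Names := TList<string>.Create;
      try
        // Strings are managed: no manual allocation, and they know their length.
        Names.Add('Turbo Pascal');
        Names.Add('Delphi');
        Names.Sort;
        for S in Names do
          WriteLn(S, ' (', Length(S), ' chars)');
      finally
        Names.Free;
      end;
    end.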


IIRC, Borland literally had to add closures to C++ (the __closure extension) to make C++ Builder work and be as simple as Delphi in the UI editor.


A random comment won't qualify as high-quality, but I have a fond relationship with Pascal, since I learnt programming from a Turbo Pascal book that was on the shelf in my local library. And since Turbo Pascal gave weird error messages (due to clock speeds, as it turns out), I found Free Pascal, which I could get from the internet via a dial-up connection.

Free Pascal's Object Pascal is a nice language: Pascal alone gives you an environment similar to C, and with Object Pascal you get some polymorphism (classes and methods). What's of course missing in comparison to C++ is templates, so as soon as you build containers you will find yourself casting again (a quick sketch of what that looks like follows below).

If you want a slightly-higher-level language than C, it may work out. Especially the unit system allows for better modularisation, and it is definitely an improvement over include headers.

For most applications nowadays I would like to have a GC in place, which kind of makes me wonder whether Modula/Oberon would be the more natural choices nowadays for a Pascal-esque language. But then I quickly end up picking SML/OCaml, because once you have a GC you quickly want a couple of other niceties, unless you really want a bare-metal language.
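
For anyone who hasn't touched pre-generics Pascal, here is roughly what that casting looks like with the classic pointer-based TList from the Classes unit (a minimal sketch; note that modern FPC also ships generics, as mentioned elsewhere in the thread):

    program CastDemo;
    {$mode objfpc}

    uses
      Classes;  // provides the classic pointer-based TList

    type
      TPt = record
        X, Y: Integer;
      end;
      PPt = ^TPt;

    var
      L: TList;
      P: PPt;
    begin
      L := TList.Create;
      try
        New(P);
        P^.X := 1;
        P^.Y := 2;
        L.Add(P);          // TList stores untyped pointers...
        P := PPt(L[0]);    // ...so every read needs a cast back
        WriteLn(P^.X, ',', P^.Y);
        Dispose(P);
      finally
        L.Free;
      end;
    end.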


I'm thinking, perhaps, of the non-visual bits of the LCL: data modules, and everything you can place on them. The (Borland-clone) data set abstractions and swappable "drivers" still make database-centric apps a joy to develop (IMO, from what little I've tested).


I used Borland C++ in the late 90s. It had a quirky UI. I remember that many windows would just float around instead of being docked like in Visual C++. I read in a magazine (probably Byte) that there were layoffs at Borland and that a famous engineer got fired and drove to Seattle to work for Microsoft. Looking at other comments, it looks like Borland fell on hard times in the late 90s.


I miss floating tool palettes.


Why did Borland fail? (quora.com) - June 13, 2015, 263 comments

https://news.ycombinator.com/item?id=9712267


Good find. Yes, earlier this year someone shared it again, and the post from a former QA engineer at Borland jumped out.

https://news.ycombinator.com/item?id=26018033


Borland spent quite a tidy sum buying dBase (from Ashton-Tate, IIRC), something like half a billion dollars, back when that was money. This put them under quite a bit of pressure.

Another thing happened around that time, as the memory size, speed, and number of desktop computers multiplied: software broke out of the 640 KB limit, so feature sets went bonkers. This made it extremely challenging to deliver products with a very low price and very low support cost, two pillars of Borland's success. C++ came along, and Microsoft could afford a nine-figure sum to develop its implementation.

All of this put a lot of pressure on Borland and everyone else. Total profits for the software industry were lower than total profits for Microsoft. Borland and many others could not find a way forward and ran for the exits.


I'm not sure how much of a factor this was, but Microsoft kept making changes to their Windows header files that broke Borland's tooling.


They never really made the jump to Windows, did they?

I know they had Windows IDEs, but let's be honest: just like WordStar, Lotus 1-2-3, and a bunch of others, they were way too slow to move to Windows in the industry shake-up that was Windows 95.


Delphi was by far the most productive programming environment available for Windows, right up until the bean counters took over, and they started raising the prices out of the reach of casual developers.

You can do things in Lazarus (the open source IDE based on Free Pascal) in minutes that take far, far too long to fiddle with in Python/WxBuilder, for example. I wasted weeks of time trying to use a "more modern" platform.


Delphi was a leading development tool for Windows in that era.


Nope, you've got it totally wrong. From the introduction of Delphi 1 for Windows 3.x in 1995, they were huge on Windows.


Maybe not directly related to the question, but this project makes Vim look like Turbo C; the author mentions that he has a configuration that does so: https://github.com/skywind3000/vim-quickui


As a teenager in the UK I bought an issue of a PC magazine that had a cover CD (remember?) with Borland C++ Builder. That's what got me started with programming.

A while later the same mag gave away a copy of Delphi. That really opened things up. I found it more accessible and was quickly making all kinds of stupid Windows forms apps and sharing them with friends.

So, no insight into what went wrong but the name Borland has very positive associations for me, and it's safe to say their products played a role in the course my life took.


> A while later the same mag gave away a copy of Delphi.

Funny you mention that - was it PC Pro? I bought the same magazine a long time ago; I remember the CD. I really wanted to make windows appear and things happen on screen, but I unfortunately couldn't figure out the programming. It only came to me much later, long after Delphi (and WinForms) had fallen.


Either PC Pro or PC Plus! Hard to remember exactly all these years later. I wrote more about how it fit into my journey as a dev in this essay: https://www.michaelbromley.co.uk/blog/confessions-of-an-inte...


PC Plus, I'd guess. They had a long-standing series of columns by Huw Collingbourne on developing with it.

Another quirky memory of 1990s British PC magazines: PC Shopper had a column by then-pharmacist-turned-computer-guy, now not-wholly-unknown SF author Charlie Stross.


Just finished giving it a read, thanks for sharing. Maybe some day I'll have done enough interesting things with programming to write a similar tale.


Acquired by Micro Focus in 2009. Apparently some of the products live on: https://www.microfocus.com/en-us/products/borland/overview

I also wonder what Philippe Kahn is up to: https://en.wikipedia.org/wiki/Philippe_Kahn



Not specific to Borland, but this book makes for entertaining reading: https://www.powells.com/book/in-search-of-stupidity-over-20-...

He does get a few things wrong, though, like open source. The best bits focus more on the '80s.


I believe most of the cream moved away to other companies (e.g. to Microsoft). The company itself was merged into Micro Focus a few years ago.


Micro Focus is a seriously weird company. I'd love to know what their story is.

They just seem to acquire defunct software companies. The only company that I know they sold on again is SuSE. Everything else just gets merged into Micro Focus. We never hear from the hordes of developers that presumably work there, nor does anyone claim to be using their products.

It's just a black hole for aging, failing software companies.


They're using the age old 'roll up' strategy: buy a bunch of past their prime products/companies with an installed base, bean count them to profitability and then milk them dry. This tends to work pretty well with enterprise customers who will pay obscene amounts of money for years/decades to maintain the status quo.

Once products/companies enter these roll up black holes, they are rarely heard from again by anyone other than legacy customers.


Their website seems to advertise the fact that they're in "9/10 top investment companies, 10/10 telecommunications, 10/10 pharmaceuticals, 10/10 aerospace and defense companies." Not sure how they define topness, but maybe they've made a business out of buying up companies with lots of contracts in risk-averse fields and doing maintenance/collecting renewal fees?


Legacy software really is their stated core business. Maybe that is why they sold SuSE again.


We found who's maintaining all of this COBOL code after all. They must have a lot of legacy-systems developers and can probably market themselves for expensive consultancy.


They are betting on inventing a time-traveling machine. They'd carry today's stocks into the past and sell them there, reinvest the proceeds in Apple, and then return. That's what I think.


My employer had Micro Focus COBOL, acquired when we used PeopleSoft. That is long gone, but I'm not sure how long gone.


> [Micro Focus] just seem to acquire defunct software companies. [...] It's just a black hole for aging, failing software companies.

Oh, so a bit like Computer Associates, then?


The Borland logo was prominently displayed on a high-profile building in Atlanta until a few years ago. Surprised it lasted so long.


I think competing directly against Microsoft in the 90s was pretty tough, even if you were largely loved by developers.


The Firebird database engine is open source, but it came from Borland's InterBase: http://mc-computing.com/Databases/Firebird/index.html


I remember going to trade shows (e.g. SD West) back in the day when Borland was big.

The thing I remember most: They had some guy (short, with very long hair) that did demos for them. Live and without a net. He was good, and a good crowd pleaser. I hope he's doing well today.


The ill-conceived idea of rebranding as "Inprise"[1] certainly didn't help.

[1]: https://en.wikipedia.org/wiki/Borland#Inprise_Corporation_er...


I used to love working with Borland C++ Builder many moons ago.


I'm working with it at my job. It's called Embarcadero now. They name their versions after world cities.


> What happened to them?

Microsoft poached dozens of key staff with seven-figure signing bonuses.

Also, ISTR some anticompetitive thing with Windows APIs, but the poaching was decisive.


The same thing that happened to Lotus and WordPerfect: Microsoft drove them out of business.


Factually not true. I suggest you grab a copy of "In Search of Stupidity" by Merrill Chapman:

https://www.amazon.co.uk/Search-Stupidity-Twenty-Marketing-D...

Most of these companies from the 80's and 90's foot-gunned themselves into oblivion or were bought by companies that ruined their products.


> Factually not true.

No one who was active in the computer industry in the 90s will say that. Don't care what that book says.


I've been active in the computer industry as a professional since the mid 80's, and can tell you that most of these companies screwed themselves up through complacency, idiotic business decisions, and being acquired by larger players that ballsed up their products. Their products got worse, releases were delayed, etc. etc.

I can cite one of many concrete examples, which is Ashton-Tate. Some young upstart with a compiler called Clipper, by a company called Nantucket, came and ate their lunch because dBase IV was a pile of crap. Sadly Nantucket got bought out by CA, and that was the death knell. Microsoft didn't kill Clipper; CA did.

Sure, in the (late) 90's Microsoft may have been a ruthless operator and stolen many and various cheeses, but those were low-hanging fruits (apologies for the mixed metaphors), and you can't blame Gates and co. for taking advantage of idiotic commercial choices made by people who were asleep at the wheel.

Sure, Microsoft abused their monopoly position during the browser wars, but when it came to the old-school productivity suites, these older companies just weren't agile enough to protect their positions.

But don't let facts get in the way of a chance to bash Microsoft. Read that book, I'll even buy you a copy.


> Ashton-Tate. Some young upstart with a compiler called Clipper, by a company called Nantucket, came and ate their lunch because dBase IV was a pile of crap. Sadly Nantucket got bought out by CA, and that was the death knell.

While Ashton-Tate itself was, perhaps somewhat ironically in this context, bought up by... Borland.


What happened to Lotus is really just sad: they sold out to IBM, who sucked every dime out, then tossed the desiccated husk to HCL to keep it on life support and crank out an enterprise security patch from time to time.


I 100% agree. I was a Notes evangelist for 20+ years. Now I'm using Nextcloud and hope it will have a long and good life.


Lotus was not really killed by Microsoft, but by IBM...


You can absolutely still buy WordPerfect. It’s alive and well at Corel, updated regularly, and is something of a cult favorite in the legal field.

Lotus isn’t exactly dead. But it’s not exactly alive and well either.


They flew too close to the sun.


First tool I ever used for converting class diagrams to code. Felt pretty cool!


Turbo Pascal was a beautiful piece of software in the 80s.


Oh gawd, those CDs were everywhere when I was in uni.


On that note, what happened to Bloodshed Dev-C++?


Embarcadero sponsored its upgrade to support Windows 10/8/7.

https://github.com/Embarcadero/Dev-Cpp


They underestimated Richard Stallman



