Hacker News
Apple Reveals ‘Lisa’, Its $50M Gamble (1983) (newsweek.com)
158 points by rbanffy on June 12, 2019 | 179 comments



My favorite part:

Instead of typing a sequence of commands on the computer keyboard, the user merely points to tiny "icons" or commands on the screen by sliding the "mouse" (a plastic control box the size of a cigarette pack) on the desktop beside the computer. As the mouse rolls, an arrow called a cursor moves across the screen. To erase obsolete information, for example, the user moves the mouse to point first at whatever is to be thrown away, and then at an icon in the shape of a tiny trash can...


And just a few years later in 1986, the "Scotty uses a Mac" scene from Star Trek IV: https://youtu.be/LkqiDu1BQXY?t=64


Fun fact: "transparent aluminum" is a reality, albeit insanely expensive: https://en.wikipedia.org/wiki/Aluminium_oxynitride


Didn't anyone else wonder why they needed transparent aluminium? Wouldn't some waterproof cameras have been sufficient to keep an eye on the whales?


Corundum is also transparent aluminum and it's insanely cheap.


Speaking of which, where's our sapphire phone screens already?


Sapphire is too brittle to be of much use in a phone screen. It would increase scratch resistance but be more prone to breaking than chemically strengthened glass. It works for watches because they can make it thicker to compensate.


You could say he tried to use Siri before Siri was developed: he talked into the mouse, "Hello, computer".


I guess that means people were more used to holding a pack of cigarettes than a pack of cards! Never thought about that before.


I remember trying to teach someone to use a mouse, and having to move their arm back down onto the table. See, you can't hold it in mid-air, you need to slide it across the mousepad. (And in this case the pad was specifically required; I believe it was a Mouse Systems mouse with an LED and a patterned metal mousepad.)


Those doing customer support for consumers have some of the best stories. In the '80s, I heard one about a bleeding-edge grandmother who was having trouble operating her newly purchased PC. One of her troubles turned out to be that she had placed the mouse on the floor and was trying to operate it with her foot. Why would she do that?! The mouse looked a bit like the foot-control pedal on her Singer sewing machine :o


There is actually a peripheral used that way: https://en.wikipedia.org/wiki/Footmouse

I've heard them referred to as "rats", but that term seems to have disappeared into obscurity.


I feel like this has somehow passed into the realm of urban legend, as I've since heard a few similar stories repeated back to me. But I used to work in phone support quite a ways back, and we had a recorded call that we used to pass around to all new employees. It was a new computer user trying to figure out how to change the background of a document (I believe it was a web page he was trying to build in FrontPage, but it could have been a PowerPoint).

The call went something like this:

[Call center introduction and customer response]

Employee: "So what seems to be the problem, sir?"

Customer: "I can't find where to change the background of my page. I've looked at all the buttons but I don't see a 'background' or 'color' button."

Employee: "I completely understand. Sometimes these types of things seem like they're hidden but it's pretty easy."

Customer: "It can't be that easy. I've been looking for over an hour."

Employee: "Well, it's in a context menu so they're kind of hidden. You can access those by going anywhere on an empty are of the screen with the cursor and then just right click on the mouse. It should pop up a little menu where the cursor is."

Customer: "Ok. Just give me one second. I want to make sure I do this right. I'm going to go get a pencil."

Employee: "Of course, sir. I'll repeat it again when we've done it once so you can make sure you write it down correctly."

Customer: "Thanks." [A few moments of silence] "Ok... so I move the cursor to a spot on the screen... [shoulder brushes receiver as he writes] ...C-L-I-C-K and then the computer will understand that as a command and give me a menu? Is that right?"

Employee: "That's correct. After you right click, you should see a little box pop up next to the cursor with a list of items. One of them will be 'Properties'."

Customer: "Do I have to wait? I did that and nothing has come up yet. How long do I need to wait for this menu to come up?"

Employee: "It should come up right away after you right click. Did you write click on the mouse already?"

Customer: "Yes, I did. I mean... my handwriting is not the greatest but hopefully the computer can still understand it."

The customer had used the pencil to literally write the word "CLICK" on the mouse. He didn't know that the left and right mouse buttons did something different. He thought they were for left-handed or right-handed use.


The mouse has a useful feature, in that it can be used while connected to an external power source.


What's a cigarette pack? /s


It's a drug, shipped in a pack roughly the size of a desktop mouse :P


Desktop mouse? Is that some kind of pet?


It's kind of like marijuana, except it will soon be illegal.


Alternately it's about the same size as a double cassette tape box.


Uh, Magic or Standard?


That depends on what jurisdiction you're in.


The Lisa was a good machine, but there was one big problem. Motorola had come out with the M68000, but not the MMU for it. 680x0 MMUs were years late, and the first one was terrible. The Lisa had a real OS, and needed an MMU. That had to be built out of smaller parts, which increased the cost enormously.

There was also a major bug in the M68000: instruction backout didn't work. That was fixed in the 68010. But on the 68000, a page fault could not be handled properly if the faulting instruction used one of the auto-increment addressing modes. So the Lisa compiler had to be dumbed down to avoid that feature, slowing down execution somewhat.
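
To make the addressing-mode point concrete: in C terms, the feature in question is roughly what a 68K compiler emits for pointer post-increment. A minimal illustrative sketch (the mnemonic in the comment is the usual mapping, not something from the article):

    /* On the 68000, a copy loop like this would naturally compile to the
       post-increment addressing mode, e.g. MOVE.B (A0)+,(A1)+. If such an
       instruction page-faulted partway through, the already-incremented
       address registers could not be rolled back, so the instruction
       could not simply be restarted. */
    void copy_bytes(char *dst, const char *src, unsigned n) {
        while (n--)
            *dst++ = *src++;  /* the pattern the Lisa compiler had to avoid */
    }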

If Motorola had fixed those problems sooner, the history of personal computing might have been very different. Intel's x86 machines, with their 16-bit address spaces, might have gone nowhere on the desktop.

Hence the cost-reduced Macintosh - no MMU, no memory protection, no CPU dispatcher. Also no hard drive. The original 128K Mac was a flop commercially. Not until memory cost came down and Apple got a hard drive into the product did it sell successfully. The IBM PC had a hard drive earlier, which got them going in business use. The floppy-only Macs were incredibly slow.


Apple wasn't visible at all in Europe during those days; Atari, Amiga, and the IBM PC ruled this side of the Atlantic.

Had Compaq not been so lucky, the PC might indeed have gone nowhere.


Certainly in the UK (special case?) that wasn't so. We had Apple ][s at school and sixth form as the main machines, later replaced by BBC Model Bs, and back then there were summer camps doing computing for enthusiastic youngsters. Apple was again the computer of choice in the early days, to be replaced later by the ZX Spectrum.


Really? Throughout the 80s it was all Sinclairs, Commodores and Amstrads (certainly in the home), and even before the ZX Spectrum there was the ZX81, which at £100 (or thereabouts) was the first home computer of choice for many, many people. I knew a number of people with CBM VIC-20s and a handful of people with Acorn Electrons.

Of course these came after the Apple 2, so maybe you're correct, but I don't think I saw my first Apple computer until 1989, and that wasn't even in the UK (I emigrated to the US for a short while that year).

My secondary school had a bunch of Research Machines, until they were replaced by BBC micros.


My UK knowledge from those days was thanks to Crash, Your Sinclair, BBC Micro, Input Magazine and Computer Shopper, so I might be a bit biased regarding any sort of widespread Apple adoption.


Not sure I agree. Just my anecdotal data of course, but I don't remember seeing an Apple II in UK schools in the 80s, but definitely BBC Micro/Master and a few Archimedes later. This was in 2-3 North West state schools, so there might have been regional differences.

Edit - oh yes, and about the ZX Spectrum... I can't imagine anyone willingly changing from something else to that rubber keyboard! Yuck.


> oh yes, and about the ZX Spectrum... I can't imagine anyone willingly changing from something else to that rubber keyboard! Yuck.

The ZX Spectrum had tons of games (many of them quite creative, many weird), all kinds of software (limited, of course, but even a Lisp) and several magazines dedicated to it. Many of my friends owned one, so it was much more fun than owning a more obscure computer, no matter how much better its keyboard was. Remember, there was no internet.

I still have nostalgic conversations with friends from that time about how magic the ZX Spectrum world was.


Likewise here too. In fact all through the 80s and 90s I only knew one place that owned an Apple computer. Everyone else was running BBC / Acorn, Sinclair, Amstrad, Commodore 64, Atari, Amiga, Dragon, and, much later, IBM-compatible PCs.

For reference: south east of England.


The 380Z was the overall 'king of the hill' pre-BBC, yes, some schools had Apple II's, some even IIe's but everywhere had at least one 380Z.

Had the 'PC' not won, I suspect some of the DOS compatibles that the UK produced would have been more popular - specifically, Apricot was doing fairly well with the F1 and XEN in the 83-87 era, when the PC hadn't quite dominated yet.


Of course this is always going to be anecdotal, but I never saw them in the north-east in the 80s either. The predecessor to the BBC Micro that I remember was the RML 380Z. I was messing with computers at school and elsewhere throughout the 80s, and to this day I've never actually seen a working Apple II.

I do remember reading about the Lisa and thinking it looked amazing.


Again anecdotal, but my experience was that state schools had BBC Micros, but private schools and many technology-focused colleges had Apple IIs. Apples were very expensive. I remember helping out at a technology college night class where people were being taught Logo programming (and later Pascal). It was the first time I'd seen more than one Apple II in a room at a time (this class had a dozen or so, two people to a machine).


I went to a private school (1986-89) and we had nothing but BBC Micros. I don't think I saw an Apple until well into my working life (~2000ish).


I could have typed that better, for sure. I meant to say that Apples were not common anywhere in the UK, but the places I had seen some were tech colleges and private schools, which seems to fit with your experience too.


Hmmm, this would have been before the BBC launch. This was a North London comprehensive which received its first computer, an Apple II, the month I finished my GCEs, so that would have been summer '81. When I went to a separate sixth-form college they also had Apple IIs, with BBC Bs following soon after.

The BBC Micro didn't really get into mass production until 1982.


Our school had an Apple II in 1982, but they were quite expensive, so people mostly got Sinclair machines or Commodore 64s for home use. When I went to sixth form they had a few BBCs and an old Commodore PET. I don't think the Apple IIs were common, but they were certainly around.


The Apple ][ was very popular in the UK for business - it never took hold in educational markets like it did in the USA, as it was too expensive.


>Apple wasn't visible at all in Europe during those days, Atari, Amiga and IBM PC ruled this side of the Atlantic.

If you mean early eighties, yes. If you mean the whole of eighties, I'd say no. I remember lots of businesses with Macs in the late eighties/early nineties, especially for print businesses.


The first time I saw anyone using a Mac live (actually an LC), instead of having to drive down to Lisbon or Porto to see the only importer in the country, was in 1994.

Around here print business were using Amigas and Ataris.


I heard Apple was popular in Switzerland.


Might be, but in the Iberian Peninsula, the UK and Germany, the countries I can speak about from those days, it was hardly seen around.


Yep, I remember (Spain) Amstrads, Amigas, Spectrums, all sort of PCs and compatibles, and an Atari and MSX or two.


Apple was visible in the UK during that time.

I used Apple ][s and had a demo of the Lisa when it was being launched.


Small nitpick but the 8086 and 8088 had a 20-bit address space.


Not really. 16-bit addresses with a kludge to get to 20 bits, but not in a straightforward way.


I don't think it matters how it's achieved.


Oh well... This brings me back memories.

The way it's achieved may not matter much with a 4 GHz multi-core CPU running a multitasking OS, but having to deal with 16-bit pointers and segmented memory on a 4.77 MHz 8086/8 was a huge pain I felt in the flesh.

Most compilers wouldn't even deal with that mess.
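
For anyone who never had the pleasure: a real-mode 8086 address is formed as segment * 16 + offset, packing a 20-bit physical address into two 16-bit values, so many different segment:offset pairs name the same byte. A minimal C sketch of the arithmetic (illustrative only):

    #include <stdint.h>
    #include <stdio.h>

    /* 8086 real mode: physical address = segment * 16 + offset (20 bits). */
    uint32_t phys(uint16_t seg, uint16_t off) {
        return ((uint32_t)seg << 4) + off;
    }

    int main(void) {
        /* Two different segment:offset pairs, one physical byte - part of
           why comparing "far" pointers was such a mess. */
        printf("%05X\n", (unsigned)phys(0x1234, 0x0010)); /* 12350 */
        printf("%05X\n", (unsigned)phys(0x1235, 0x0000)); /* 12350 */
        return 0;
    }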


Has anyone tried retrofitting a Lisa with a real MMU? Seems like it would be a fun project to build "the Lisa that could have been".


You would also need to update the compiler to use the MMU and handle instruction backout properly, and recompile all the software to use it. It would probably also require some modifications to the OS and application code to get much advantage.


Wow, I wonder why Apple kept using their processors; it seems it was more trouble than it was worth.


The 68000 and 68010 were 32-bit CPUs, released in 1979 and 1982 respectively. Intel didn't have a 32-bit x86 CPU until 1985, when it released the 386.

At the time, though, the 68K was regarded as the best general-purpose CPU you could get and was sort of a "default choice" for building any reasonably high-performance system. Unix workstation manufacturers like Sun and SGI originally built their systems on M68K platforms before their respective RISC architectures developed and matured.

With that said, it's totally reasonable to make the hindsight argument that Apple should have developed the IIgs line further, with its 16-bit 65C816, and then reevaluated the CPU landscape in the mid 1980s, when you had more mature 68K chips, 32-bit x86 CPUs, and a plethora of RISC options. In that world, I could very easily see the Macintosh making its debut on a RISC or 386/486 platform.

Speaking of the IIgs: it's rumored that Steve Jobs requested they downclock the CPU (it only ran at 2.8 MHz) to prevent it from seeming faster than the Macintosh.


I worked on a IIgs at the very beginning of my career, porting a word processor that had originally been developed for the Amiga and subsequently ported to DOS and Mac. Having worked on the Mac previously, I was a bit surprised that the IIgs OS was actually more advanced in some ways. Color, for one, but hierarchical menus are another one that stick in my mind. It was slow and weird, but the IIgs was a remarkably modern update to the already-venerable Apple II series.


Because the 8088/8086/80286 had their own bugs, and 68K was a better architecture than x86. No segmentation, orthogonal registers and instruction sets, etc. This superiority probably peaked around the 68020, which started losing out to the 80386 the next year, and after that Motorola never quite recovered. The Pentium, which I imagine most here think of as a starting point, was really more of an endpoint in a complex battle between Intel, Motorola, NatSemi, and a whole bunch of RISC alternatives.


Well, yeah, but if you're buying a 32-bit CPU with weird bugs, then why bother?

At this point, I'm thinking the x86 memory segmentation sucked less than a 32-bit CPU with no MMU and bugs requiring the compiler to pull the parking brake.

I am familiar with how bad it was in the x86 world, but then maybe that explains why Mac OS before X was so bad.


This makes the new Mac Pro look like a bargain... especially after you adjust for inflation.


My dad and I found one at a garage sale when I was a kid. I bought, repaired, and resold mostly 8-bit machines, but found a big bunch of Apple Lisa stuff for not a lot of money, maybe $150. I didn't really know much about it, but it was interesting and I figured I could get my money back out of it. It had the computer, three external hard disks (2x5MB and 1x10MB), and some software and other accessories.

This would have been around 1988-1989, and the Mac had been out for a while; this seemed like a quaint old computer at the time. Since the Lisa wasn't quite Mac compatible, there wasn't really anything you could do with it...the software it had (early variants of stuff like Mac Write and Paint and such, called, I think Lisa Write, etc.) was what you got.

When I unboxed everything I found the receipts from when the original owner bought it all new. He'd spent something like $20,000+ on the whole setup. Computer was $10k, and each of the hard disks was several thousand dollars.

I sold the whole setup for about twice what I paid for it (I seem to recall about $300, but it's been a long time, and it wasn't super memorable...I bought and sold a lot of weird old computers back then), after cleaning it up and testing everything and tinkering with it until I was bored (I was a Commodore kid with a C128D and saving up for my first Amiga...Apple stuff was just a curiosity, not anything I wanted for myself).

Though I wasn't super into it at the time, it's one of the few things I kinda wish I still had all these years later. It has real historical significance that I didn't really appreciate at the time.


I think the difference is the productivity gain. In those early days, you bought a computer or you had to use a calculator and paper. It's the difference between automating a dentist's or realtor's office and not automating at all.

Now, buying a new computer results in marginal productivity gains because likely it's just replacing another computer and maybe improving things or maybe just adding a touch-bar.


You mean adding fresh bugs and anti-usable features.

There are a few things you can do by adding expensive hardware, but they're generally limited to advanced media processing, like video editing and raytracing. Maybe a few medical tricks.

(Gaming does not count, it scales to generally available hardware.)


Coincidentally, I just looked these up the other day:

Apple Lisa 1983 $9,995 ($25,143 in 2018 dollars)

Apple Macintosh 128k 1/24/84 $2,495 ($6,000 in 2018 dollars)

Macintosh II 3/2/87 $5,498 ($12,125 in 2018)

NeXT Cube 9/18/90 $10,000 ($19,177 in 2018 dollars)


I bought a 25 MHz NeXTstation in 1990 for the low, low price of $3,995. $7,500 in 2018 dollars.

In hindsight I should have bought Apple stock. Oh well.


Had to look it up. Would be worth ~$0.5M today


I was probably happier not knowing that...


Now do SGI workstations!


I remember $25k SGI workstations back then.


A Xerox Star makes the Mac look cheap at $16,000 in 1981 dollars, which would be roughly $40,000 in today's money.


The Lisa was also much, much nicer to use.


$25k after inflation. I had no idea it was that expensive!


The IBM PC in 1981 cost $4,500 to $8,500 in today's money. The original Mac would cost $6,000 if released today. Computers have become a great deal cheaper in the ensuing decades.


Bill Machrone, long-time editor of PC Magazine, coined a "law" that the computer you wanted always cost $5,000. Which held reasonably true for a rather long time. It broke down maybe about 10 years ago although, as with the latest Mac, you can still get to $5K without too many gymnastics for pro video work or high end gaming.



You can configure a desktop for way cheaper than that, and it can run 90% of modern games. Just don't get the latest-generation top-of-the-line Nvidia/AMD card; either get the previous top-of-the-line one or a decent mid-range one (check benchmarks when choosing).

Don't get the most expensive RAM or CPU; you lose 10% real performance but you halve your budget.


Yeah, knock off $1000 or more if it's a tower.


Is it really the computer you want if you can't run the software you want on it?


Given the various gaming peripherals the parent lists, I'm guessing that they may prefer a Windows system.


What software do you want to run?

Windows is fantastically supported: Microsoft Office, Adobe suite, AAA games, Active Directory, databases, browser and browser plugins, and a thousand others.


My Apple II cost around $1k in 1978, about $4k today. That was without a disk drive.

Today, a Raspberry Pi costs $5.

Holy freakin' cow.


The Raspberry Pi is a killer deal.

When I was in college in the mid-2000s, my senior project needed a single-board computer. We spent $500 on a small PC/104 board with a 100 MHz National Semiconductor Geode (x86-compatible) processor. That didn't include any RAM or storage. I really wish the Raspberry Pi had been available back then, especially the Raspberry Pi ecosystem, which makes interfacing with stuff so much easier. We interfaced a small LCD I found at a surplus shop and it took us weeks to get it working. Now you can buy a touch-screen LCD for $30 and there are libraries to get it working in about 20 minutes.


> Computers have become a great deal cheaper in the ensuing decades.

By the way, it turns out people think "automation is replacing jobs in manufacturing" because they've misread the data showing them just how much cheaper computers have gotten since then, and therefore how much more computer one factory employee can make.


I am really curious about how anyone could afford computers at these prices in the 1980s. I guess my family was really poor then. We had a C64 and later an Amiga.


Most people couldn't. And businesses which could afford them wouldn't. Which is why Lisa bombed.

But Lisa's pricing made the Mac more appealing. It wasn't exactly affordable, but you were getting maybe a third of a Lisa - including the new GUI, which was obviously the future [tm] compared to the Apple II - at less than a third of the price.

Even at the prices being charged, the Mac and PC were competitive with previous standards, and affordable enough to have a sizeable market among affluent middle class users.

They offered more than the old S-100 boxes did, for the same or less money. And they were much cheaper and more "personal" than industrial minis like the PDP-11 and the VAX.


Fun fact: Lisa sales were much better than the projections in the marketing requirements document (MRD) [1]. However, the MRD had assumed a much lower price ("end user price under $5000"), and now the Lisa had to recoup exploding development costs. What really bombed (or flexed) was the Apple III motherboard, resulting in Apple having to reduce its line of products – and Lisa was the first to go.

Regarding the price, mind that the Lisa came with an integrated office suite (long before the success of MS Office) and was targeted at offices and professionals (think dentists). Which rendered it a somewhat curious "workstation for office work", at least from today's perspective, where office machines eventually became the epitome of cheap, bare-bones boxes.

The MRD mentions as potential users secretaries, managers, and executives (of Fortune 1500 businesses), as well as bookkeepers in general.

[1] https://archive.org/details/Apple_Lisa_MRD_Marketing_Require...


They didn't, unless they were well off. Only 8% of households had a computer in 1984, 15% by the end of the decade. The C64 was $595 new; if you had one, I doubt your family was poor.


I bought a TS1000 in 1981 or 82 for $99, and a C64 in probably spring 1983 for $299 at Circuit City. I was in 9th grade and used paper route money to get it (parents thought they were glorified Atari 2600's). Hehe, remember newspapers and paperboys?


I'm surprised it was only 8%. My memory of growing up in the UK in the early 80's is that ZX81s and ZX Spectrums were super common. I bought my ZX81 in 1982 for £50 new, which looks like it was maybe $80.


One of the reasons the Spectrum was so common was the low price. And of course using tapes for games meant that most weren't paid for, but copied and shared around at school.

I grew up with the ZX Spectrum, like so many others, and it is what started me programming/developing/being interested in computers.


The UK actually had the highest level of computer ownership in the world at around that point.

You're probably looking at it from a biased point of view though. When I was growing up everyone had a SNES. Of course, it was just all teenage boys that had one.


My family was pretty poor, but my parents found a way somehow.


It would have been closer to $200 around the time we got it!


I'm not even sure you can get a working real C64 for $200 now.


There seem to be a few C64s, including breadbin and C64C models, on eBay most days that I look. Granted, some may need recapping or other work done, but there are plenty of others in working condition or only needing minimal repairs. I think you could get one for not much more than US$200 plus shipping.


So funny that you could get a Raspberry Pi for like $5 that is orders of magnitude more powerful, and could even emulate a complete C64.

Everyone loved the C64 though, so fond memories are worth something.


As others have noted, individuals mostly didn't own PC clones or Macs in the early 80s and they weren't even all that common in businesses.

I did buy a dual floppy PC clone in about 1983. I don't remember how much it was--wish I still had my receipts from that far back--but it was a big purchase for me at the time. [ADDED: I probably dithered over it for something like a year, during which time it became obvious that PC clones were the future rather than the S100 etc. systems running CP/M.] Based on ads from the time, it was probably about $2500 but a printer and software would have added to that.

And when I went to business school about a year later, I was one of very few people in my class who had their own computer. (There was a small computer lab in the school--that actually had a Lisa among other things as I recall. At some point when I was there they added a bigger lab with a bunch of IBM AT clones (80286s)).


If your family had a personal computer at all you were very well off indeed.


My dad bought a 386 with a whopping 40 MB hard disk in 1991. I remember him mentioning it cost a small fortune and that he could have purchased a car for the same price.

I remember one of my friends had a Tandy; the 386 ran rings around it (it supported VGA graphics, for one - I can remember playing the original Civilization and Prince of Persia on the 386).


My parents' Mac II cost over $6000 at the time we got it. Compared to the IBM PCs of the time, the 640x480 hi-res display was stunningly gorgeous!


My first Mac (512K) and Imagewriter cost north of $5,000. I still remember paying for it with cash from anything I could sell, a cheque with the money I'd saved for months, and the rest was on my newly acquired VISA card.

When I think about that investment as a ratio to my disposable income at the time, I was crazy! But, as a retired IT exec, things did work out in the end.


The Mac II was released in 1987. On the PC side, VGA cards were introduced a year earlier and could do those resolutions.


VGA 640x480 was 16 colors, wasn’t it? The Mac II did 256 colors at that resolution.

The software side also helped. Windows (mostly?) ran a fixed palette, while the Palette Manager on the Mac made it easy for applications to pick whatever color (out of 16M) they wanted.

On the hardware side, most PCs still had monochrome monitors, and even those with color monitors didn’t typically have ones as good as the Trinitrons that Macintosh IIs often had.


So are we talking about what was available or what people could actually afford? Because if it's the latter, the Mac II was much more expensive than even the highest-end PCs.

You're right though on the color palette. I never understood why neither Windows nor its applications changed it to something less garish than the default CGA palette.


I think we’re talking about what people saw. If you saw a Mac II, chances were quite good that it either had a Trinitron display or that monochrome vertical one.

If you saw a PC with a color monitor, chances were that monitor was moved over from a DOS machine, and dated from the 320x200 era.


Do you have any idea what a VGA card cost at the time, or what a VGA CRT would run you?

Your comparison is hardly apt.


Compared to a Mac II, it was pretty cheap.


EGA (640×350) came out in late 1984 and was the first reasonably decent color display on the IBM PC. Prior to that you mostly had a choice between mono text, really crappy CGA color graphics, or something proprietary like the Hercules mono graphics.


It was cheap, compared to the Xerox Star it was "inspired" by...


True. But consider about 2.5 years later, you could've got an Amiga for 1/8th the price.


My dad paid $6k+ for a IIci in 89. I had no idea at 9 years old and really no appreciation for what kind of money that was. In 2000 I bought a Power Mac G4 for around $2k. Being young with lots of disposable income, I didn't appreciate then either what kind of money that was. It's only recently that I've started to understand what a computer costs as I've bought computers while paying living expenses.


In about the same year, my dad brought home a Macintosh SE for the family. I was very young and remember only a few things: it ran the game Dark Castle in b&w from a hard drive, and my father had pulled the entire system out of a dumpster at work(!). The machine had been used in some way on a defense or NRO contract, and when the project was cancelled, the entire thing was simply discarded. If I recall, one of the back corners of the case was slightly dented from its unceremonious toss into the dumpster.

As a kid I took it all for granted, but we never would have been able to afford such hardware on my dad's engineering salary at the time. The price for a new one adjusted for inflation today is jaw dropping.


In 2000, you were 30 years old. That's hardly young :)


30 is still pretty young, but your math is off.


I was 20, actually :)


Bargain for whom?

According to the article Lisa was a $50 million gamble.

I just read somewhere about the 2,000 engineers Google took on from HTC for the Pixel phone. Keeping the lights on in the buildings they occupy is a $50 million gamble, and that's without paying them or allowing for inflation. But you get the idea: $50 million was cheap compared to what hardware tech costs to develop today, particularly if it has an operating system to be written from the ground up.

Incidentally, regarding the threat from IBM mentioned in the article: Peanut turned out to be the ill-fated PCjr, which had good graphics and a bad price point for the home market. Popcorn turned out to be the first PC-based luggable computer from IBM.


In addition to inflation (which other commenters have already mentioned), don't forget to take into account the relative sizes of 1983 Apple vs today's Google.


>the threat from IBM mentioned in the article. Peanuts turned out to be the ill-fated PCjr that had good graphics and a bad price point for the home market.

And a really lousy "chiclet" keyboard. I don't remember all the details of the PCjr flop but, as I recall, it was still quite a bit of money for a system that had a whole big bunch of compromises.


Keep in mind that $50M then is about $130M now, adjusted for inflation.


Trying to understand Steve Jobs is an effort in futility, I know, but I've never understood what he was thinking around this time. He refused to accept responsibility for his daughter when she was born in 1978, and even five years later he was still questioning the accuracy of DNA paternity tests in Time magazine and, despite being a multi-millionaire, only grudgingly paying $500 a month in child support. But then he went and named this computer after her, yet claimed for years it wasn't named for her. Sure, he was pretty young (28), but that's hardly an excuse. Jobs wasn't just a pathological narcissist; he was just plain weird.


He was doing a lot of drugs and had mental health issues.


It was out of his control, foisted upon him. He didn't want the responsibility at the time, didn't want to focus his time on it. So, as an immature twenty-something, he took the route of deny, deny, deny to evade it. Lisa has noted with emphasis his pattern of having less than no patience for things that took his time against his wishes.

Combined with his emotional issues, rage, temper - it resulted in a manic back & forth. When he wanted to be magnanimous or kind, he could be, and it always had to be strictly on his terms and only if he didn't feel forced into it in any manner (to the far end of that spectrum, like he'd do it only when it was least expected, as an amplification device). If anything attempted to force his hand, or he perceived such, he very aggressively rebelled against it in all cases (you see that pattern over and over again throughout his history with people and business). It had to be on his terms, or there would be no terms at all.

When he could reorient the context of Lisa (his daughter) to his terms, as and when he saw fit, then it became acceptable. You can see some of that behavior in action in Lisa's description of what it was like to live with Steve when she was younger (a my-way-or-the-highway atmosphere; he had to feel in control).

He seemingly struggled to control his emotions for most of his life, which must have been wildly frustrating for someone like him. That lack of personal emotional control might explain an over-compensation directed at other things in his life, the need to control everything else (and perhaps a behavior where, if he felt in control of everything else around him, he could keep other things from setting off the emotions he couldn't control properly).

That's my read on it anyway.


"It was out of his control, foisted upon him. He didn't want the responsibility at the time, didn't want to focus his time on it. So for an immature 20 something, he took the route of deny deny deny to evade it."

???

'20 something' is well into adultland.

You want a good job? You want to vote? You want to be treated as an adult?

Then there's no such thing as 'out of his control', really.

For someone who's ostensibly responsible enough to run an entire company, being responsible for one's children is well within reason.

Everyone has challenges in their personal lives, it's 'never a cakewalk' - ok - but there are really no excuses for Jobs here. Point blank.


You misunderstood what adventured wrote. It isn't that he (adventured) believes it was out of Jobs' control, but that Jobs himself saw something happening (his daughter's birth) that he didn't control. Jobs wanted to be in control.

This is stuff Lisa herself has written about.

(also 20somethings can act very immature, as can 30somethings, 40somethings, etc and the opposite where teenagers act more mature than expected - it is that "expected" part that sometimes fails with people, not everyone is the same)


Plus he lied to her about it when she asked him years later. It’s in her fairly recently published memoir.


Maybe she was named for the same muse that the computer was named for?


Not mentioned in the article, but one of the major contributors to its failure is that the Lisa was a very restricted environment that made it difficult to develop for:

https://en.wikipedia.org/wiki/Apple_Lisa#Third-party_softwar...

Contrast this to the IBM PC, where people could easily get started programming in BASIC (included in ROM!) or in assembly (MS-DOS DEBUG), for which many magazines of the time had listings. Of course not every user did, but certainly a lot of them started that way and eventually helped greatly grow the amount of software available.

The PC had a learning curve but the user was in full control, whereas the Lisa didn't have much of one but had many impediments that prevented users from becoming developers. This attitude persists in Apple today.


Did much of Lisa even survive to become the Mac, or was it just in spirit?

Considering you needed a Lisa to develop for the Mac for the first few years, it’s good that they made it.


Spirit. The Lisa was a MUCH more powerful computer than the first Mac - which is why you needed a Lisa in the first place. The Mac wasn't able to run much of anything developed for the Lisa. Mac software was written in machine language; as I recall, the Lisa made more use of higher-level languages, so even the toolchain would be problematic.

Check folklore.org for more stories of that time.


Specifically, Apple Pascal: http://pascal-central.com/images/pascalposter.jpg (I want to recreate this in SVG and print a real poster out of it at some point).

Although AFAIK the original Mac was also programmable in Pascal (folklore.org mentions that the calculator, etc. were initially written in Pascal as examples for the API).


The reason I went into Pascal was that when I was younger I found out my favorite Apple II game, Wizardry, had been written in it. I think they used UCSD Pascal.

A while back I was able to find recreated source of the game:

[1] ftp://ftp.apple.asimov.net/pub/apple_II/images/games/rpg/wizardry/wizardry_I/

Spreadsheet of game maps, etc.

[2] https://docs.google.com/spreadsheets/d/1Mo1EoWXSnR5gNn5kAt9f...

Wizardry III re engineered

[3] https://groups.google.com/forum/#!searchin/comp.sys.apple2/w...


QuickDraw, the 2D graphics library, is probably the biggest part of the Lisa that lived on in the Mac.


Additionally, the Lisa Desktop Library contains many routines named similarly to those in the Mac Toolbox. The Lisa Desktop Library was never directly exposed to third-party applications, though. (The Lisa Toolkit exposed it indirectly.)

Lisa Desktop Library: https://www.apple.asimov.net/documentation/applelisa/AppleLi...

As an example, the Lisa Desktop Library has:

    PROCEDURE HiLiteMenu(menuId: INTEGER);

The Xcode 11 beta that just came out this year still has <Menus.h>, which includes:

    extern void HiliteMenu(MenuID menuID) AVAILABLE_MAC_OS_X_VERSION_10_0_AND_LATER;


That function is inside an #if 32-bit, which means it no longer actually exists in 64-bit only macOS.

I think AppleScript/Apple Events is the oldest technology still standing, but maybe printing and scanning is older.


Good point! It is only usable from 32-bit apps, which don't run anymore.

But -- if we want to be technical, the function still actually exists, since macOS menus are still driven by Carbon code.

    $ nm /System/Library/Frameworks/Carbon.framework/Frameworks/HIToolbox.framework/HIToolbox | grep HiliteMenu
    0000000000077d86 T _HiliteMenu

That's kind of amazing to me!


In addition to the other features mentioned: originally, the file system, though without the auto-repair abilities the Lisa had. (This was soon to be replaced by HFS.)


According to the Wikipedia page for the Lisa, Jobs went to the team developing the Mac and changed the focus from a command-line machine to a cheaper Lisa competitor, releasing the Mac one year after the first Lisa. How could Jobs get away with trying to cannibalize the sales of their flagship machine? It seems traitorous if true.


Haha! They've made several movies and written books about this drama. Whether or not it was traitorous depends on who you view as being a traitor first, Steve Jobs or John Sculley.

I recommend reading Walter Isaacson's book on Steve Jobs, or watching the movie "Pirates of Silicon Valley" for a good take on it.


folklore.org is another good source. It is a collection of stories written by the team that created the original Mac (most of it written by Andy Hertzfeld).


V1 Jobs was known for being a selfish asshole.

V2 Jobs (post-97) was much the same, but with a bit of humility that allowed him to empathize enough to be able to make successful products.


I really, really, really wanted one of these when they first came out and my stepdad laughed in my face when I asked to borrow that much money for a computer "for college" when we had a perfectly functional Apple IIe at home.


It’s interesting to me how well-reported this seems, or maybe how it seems informed by ensuing decades of literature rather than being reported in the moment. Apple’s mythos, Jobs’s showmanship, and the importance of Xerox PARC and Apple’s deal with Xerox were all established lore very early on. I guess it’s just bias on my part that it seems like things would have been less clear or known at the time.


I think it was a distribution problem. Now we’re accustomed to being able to easily find any public information we want, but 35 years ago you had to work at it.


    > Jobs personally heads the MacIntosh development team
MacIntosh :)


That (or McIntosh) is the correct spelling for the fruit breed (though not the clan name from which it descends).

I don't know at what point Apple settled on its non-standard spelling for the product, but it might well have been after this article was written.


Steve Jobs could not get a trademark on McIntosh but was able to negotiate for Macintosh.

https://www.soundandvision.com/content/flashback-1984-apple-...


I don't see a reference to it at that link but it's because audio equipment manufacturer McIntosh Labs had the trademark:

https://www.mcintoshlabs.com/

Edited to add from wikipedia:

“Apple Inc. employee Jef Raskin named the Macintosh line of personal computers after the McIntosh. He deliberately misspelled the name to avoid conflict with the hi-fi equipment manufacturer McIntosh Laboratory. Apple's attempt in 1982 to trademark the name Macintosh was nevertheless denied due to the phonetic similarity between Apple's product and the name of the hi-fi manufacturer. Apple licensed the rights to the name in 1983, and bought the trademark in 1986.”

https://en.wikipedia.org/wiki/McIntosh_(apple)#Cultural_sign...


I just lost ten minutes to drooling over the absurdly over-the-top McIntosh Reference System. Now I need to win the lottery.


How much does it cost?

Unsurprisingly the price doesn't seem to be listed on their site.



> MacIntosh :)

They'll probably program it in "C", and then in the 1990s move on to JAVA, with some software in PERL.


I think the Lisa was programmed in Pascal


I guess most people really can't distinguish between "C" and C.


Perhaps you should tell us the difference


Apple platforms were mostly developed in Object Pascal and Assembly, until they added support for C and C++, with MPW and Metrowerks compilers.

Then it was mostly in C++.

Apple platforms never were big C fans.


> Apple platforms never were big C fans.

Ah, but were they "C" fans?

Never mind. My joke apparently went over everyone's head.


Was C ever really named "C" with the quotes? Was it supposed to be a reference—were "C-style strings" a key innovation of C over BCPL or something?


> Was C ever really named "C" with the quotes?

No, it wasn't, but it gets called that often enough for me to have made a joke about it.


> To erase obsolete information, for example, the user moves the mouse to point first at whatever is to be thrown away, and then at an icon in the shape of a tiny trash can; at the press of a button on the mouse, the information vanishes

I imagine most readers of a 1983 Newsweek magazine would have read this and been turned off from computers for life.


I was surprised seeing Steve Jobs wearing a suit in that picture.


He had an Armani suit phase for a number of years.


Losing only 5% market share 2 years after early 1980s IBM enters said market seems pretty good to me, actually.


Last line: "But win or lose, the age of "friendly" computing has begun."

Feels bittersweet to recall the age of "friendly" computing in these dark days of "antisocial" computing.



If you convert the retail price of a usable configuration of the Lisa into 2019 currency, it is pretty nuts.

It debuted at $9995, which is about $26,500 in today's money.
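
For reference, the usual adjustment is price * (CPI_now / CPI_then). A small sketch using approximate annual-average CPI-U values (assumed figures; different sources and months give slightly different results, which is why estimates like the $26,500 above vary):

    #include <stdio.h>

    int main(void) {
        /* Approximate CPI-U annual averages (assumed for illustration). */
        const double cpi_1983 = 99.6;
        const double cpi_2019 = 255.7;
        /* Lisa's $9,995 debut price in ~2019 dollars. */
        printf("$%.0f\n", 9995.0 * cpi_2019 / cpi_1983); /* ~$25,700 */
        return 0;
    }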


I recall Macs being AppleTalked together to move anything anywhere—it was magic.


Watching early Apple presentations and modern presentations shows the company prioritizing sales over innovation. Now we see a phone that is marginally better than the previous year's, with taglines like "the best iPhone ever". The future is AR and gestures, in my opinion; if keyboard input speed is ever an issue, then a glove Swype interface would work. Many smaller companies have already set the stage nicely, such as the currently monocular Vuzix Blade (https://www.vuzix.com/products/blade-smart-glasses).

Edit: Guess HN thinks we're going to be carrying screens in our pockets forever.


While we are definitely going to have more AR, I'm not convinced that it's the future at all. But I guess that's the case with most people and most technologies until they become ubiquitous.

AR still seems like a solution looking for a problem.


Funny, I've felt the same thing about VR. With AR, I can think of a million practical applications. With any kind of VR that is short of full sensory immersion, I can't think of a single practical use.


> With any kind of VR that is short of full sensory immersion, I can't think of a single practical use.

If you've ever had to use a hardware "simulator" to train on Big Hardware like a plane or submarine, using VR as a replacement is an "obvious win." Rather than dedicated rooms with all sorts of custom fake hardware that only a few people can use at a time, you can buy one classroom full of VR equipment, and then every student can do their simulator runs in parallel, allowing each student far more total simulator-time. As well, to switch to a simulation of newer-model hardware, you just need a new piece of VR software, rather than entire new rooms full of molded plastic and slapshod wiring.


Wouldn't you still need the fake controls? It reduces the problem, but doesn't get rid of it entirely.


Has VR ever been sold as "practical"? I've never seen it used for anything than gaming. That's a large enough market to justify the tech IMO.


The only games that really appealed to me as a use case for VR are "Keep Talking and Nobody Explodes" and using it as a virtual cockpit for mech/dogfighting sims; otherwise it just seems like a thing I have to have on my head for not much gain.

I think it is used for pornography a fair bit, though.


If the future is AR, it's a good thing Apple is one of the leading companies in this area.


I would not say even their software AR is leading AR in general (assuming you are referring to ARKit). They have no hardware AR devices, correct me if I'm wrong. What are they leading?


Every recent iPhone (6s and newer) is an AR device. You can load up an Ikea app and check how the furniture you're interested in looks in your room, or whether that end table will actually fit between the wall and your couch. There was an estimate two years ago that there were 700 million iPhones in active use. Let's assume that the current number isn't any larger and that 6/7 are recent enough to use ARKit, so let's say 600 million AR devices. I'm willing to bet that is more than all other AR companies combined.


What's a "hardware AR device"? Anything with a screen and a camera. You don't need immersion for AR. Pokemon Go is AR.


Isn't every phone a potential AR device? The Minecraft demo at WWDC comes to mind.


I guess you're confusing AR and VR. VR needs dedicated hardware. AR doesn't. And Apple is one of the leaders in AR because they already ship state-of-the art AR devices in the millions.


They aren't selling any AR devices. That doesn't mean they aren't developing and testing them internally.


The idea of having to wear a glove to type is horrid.

As with the iPod, iPad, and iPhone, I fully expect Apple to wait for things like smart glasses to evolve a bit before jumping into the fray with a mass-market application. Being one of the first-to-market has never been their thing.


They are pretty far from first to market. I'd say Google was first to market with Glass, along with several competitors including Vuzix. Apple developed ARKit, which would work nicely with glasses technology. I agree a glove is a little too complex, but an eye-tracking Swype would be a useful innovation.


I think eye tracking Swype sounds horrific.

I think I'd feel sick if I had to type anything of any length, and I don't even get motion sickness usually.


Gestures are not the future for one simple reason: physical fatigue.



