101 Ways to Save Apple (1997) (wired.com)
105 points by leorocky on June 11, 2014 | hide | past | favorite | 85 comments



Some people say that the advice is stupid, but there are actually a lot of sensible suggestions that they followed at some point or another, some even being distinctly Apple-y:

> 10. Get a great image campaign

> 14. Do something creative with the design of the box and separate yourselves from the pack.

> 18. Stop being buttoned-down corporate and appeal to the fanatic feeling that still exists for the Mac

> 23. Create a new logo

> 25. Portables, portables, portables.

> 31. Build a PDA for less than $250 that actually does something

People usually pay less than $250 upfront for an iPhone.

> 34. Port the OS to the Intel platform

(Identical to 76)

> 37. Take advantage of NeXT's easy and powerful OpenStep programming tools to entice a new generation of Mac software developers

Especially true for iOS

> 39. Build a laptop that weighs 2 pounds

> 50. Give Steve Jobs as much authority as he wants in new product development

> 51. Speak to the consumer.

> 54. Sell off the laser printer business

> 70. Simplify your PC product line

> 72. Try the industry-standard serial port plug.

Apple switched to USB pretty quickly if I remember correctly

> 101. Don't worry. You'll survive


The problem with such advice is that some of it is always right. But you just don't know which part. And you can't do it all.


Reminds me of the advertising executive who acknowledged that 50% of his advertising budget was wasted. The trouble was, he added, he didn't know which half.


Another interesting one:

> 59. Invest heavily in Newton technology, which is one area where Microsoft can't touch you. Build voice recognition and better gesture recognition into Newton, making a new environment for desktop, laptop, and palmtop Macs. Newton can also be the basis of a new generation of embedded systems, from cash registers to kiosks.


They also suggest dumping Newton, etc., which is what Jobs actually did.


Exactly, which was the smart thing to do. As much as the idea of Newton had a lot of potential, and as much as we can draw retrospective parallels between Newton's potential and that of iOS, Newton was simply wrong for its era. It was ahead of its time in many respects -- particularly in that telephony would become the central driver behind getting everyone on the planet to carry a device around in their pockets. Newton predated ubiquitous cell phone usage (and cellular infrastructure). In other respects, Newton was not necessarily too "late," but perhaps insufficient. It was not a viable alternative to the productivity of laptops, nor could it have been, given the ecosystem that existed at the time. Could it have powered embedded devices? Sure, but with questionable profit-margin potential for Apple. The embedded-device marketplace was extremely fragmented back in Newton's day. (It still is today, in many respects.)

It's important to keep in mind that the iPhone worked because the world the iPhone depended upon was ready for it. The world that Newton would have depended upon did not exist when Newton debuted, and would not exist for the better part of a decade.


At least Apple didn't divest from ARM after the closure of Newton.


Most of them are very generic, pragmatic, terse statements. It depends on which side you are looking at them from.

> 14. Do something creative with the design of the box and separate yourselves from the pack.

99% of companies would benefit from this.

> 10. Get a great image campaign

Sure, why not? That's true for every company: Intel, BMW, Microsoft, you name it. All of them want a great image campaign. Ultimately it's the product that makes the image campaign 'exclusive'.

> 34. Port the OS to the Intel platform

That was an obvious one.

>> Apple switched to USB pretty quickly if I remember correctly

The 30-pin dock connector & then the Lightning connector. Not exactly industry standard at either end.


>> The 30-pin dock & then the lightning connector. Not exactly industry standard at both ends

This was before the iPhone. They switched to USB on Macs, at least. At that time they were using other proprietary connectors like ADB and FireWire.


FireWire (IEEE 1394) didn't show up on Macs until 1999, well after they had begun to support USB.


>>> 34. Port the OS to the Intel platform

>> That was an obvious one.

In hindsight, yes. At the time, many people thought it was crazy.


It was crazy, but it was also the right way forward. People had speculated about & desired this for a long time.


Back in '97 they probably just meant the pre-USB serial plug, which was good for mice and modems.


They probably meant ADB (Apple Desktop Bus)

http://en.wikipedia.org/wiki/Apple_Desktop_Bus


Did anybody still use that old RS-232 thing in 1997? I would have guessed the author meant PS/2.


Stuff I work with still uses RS-232. In fact, RS-232 is still being used in new systems today :-)

Edit: before anyone thinks I'm stupid: it pays well, I get to work in interesting places, and I don't have to wrangle app store guidelines, Java GC, Android, etc. I do have to wrestle some Windows systems, though.


I use old-fashioned serial cables for kernel development in 2014. It would be hard to debug without them. Sometimes you have bugs on physical systems that aren't reproducible in a VM, and the only way to get any output is via a physical serial cable.

For what it's worth, I also use serial-over-LAN to administer remote machines and virtual serial devices in VMs for security purposes.
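(Aside: the target-writes, host-reads flow of a serial console like the one described above can be mimicked without any hardware. This is just a toy sketch using Python's standard pty module, Unix-only; the log message is made up for illustration, and on real hardware you'd open the actual device node instead.)

```python
import os
import pty

# Open a pseudo-terminal pair to stand in for the two ends of a
# physical serial link (on real hardware you'd open /dev/ttyS0 or
# a USB-serial adapter instead).
master_fd, slave_fd = pty.openpty()

# The "target" side writes its console output to the line...
os.write(slave_fd, b"[    0.000000] console: serial0 enabled\n")

# ...and the debugging host reads it from the other end.
data = os.read(master_fd, 1024)
print(data.decode())
```

(Note the tty line discipline may translate the newline to CR+LF on the way through, just as a real serial console often does.)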


People did tend to use those "modem" things back in the day.


Macs of that vintage generally came with 9-pin RS-232 serial, SCSI, and Apple Desktop Bus (ADB) ports.


I recall that Macs had RS-422.


Well, I was mostly thinking of consumer machines; I totally forgot about industrial and lab stuff.

Sorry all.


54. never happened. They simply stopped selling laser printers.

72. never happened. Apple instead championed USB.


> 72

The point was "industry standard", and to be pedantic, USB is a serial protocol. Also, USB wasn't "championed" by Apple; in fact, they weren't even in the original group of designers (IBM, Intel, and Microsoft were, though). It had broad industry support from the start.


USB wasn't industry standard when Apple adopted it, and Apple's adopting it (and going "all in") created a large enough market for USB peripherals that USB took off (it may well have taken off anyway, we'll never know). For a couple of years, most USB products were made of iMac-inspired translucent plastics because it was people with iMacs who bought them.


"5. Straighten out the naming convention. Link model numbers to processor speed. When buying a 3400 laptop computer, what, exactly, are you getting? Unless you study the brochures, you don't know how it compares with its competition. On the other hand, Wintel talks explicitly about processor speed. It's a Pentium 200-MHz box."

Urgh. And while we're at it, why on earth did Apple never plaster their hardware with "Intel Inside" and "NVIDIA" stickers? How else am I supposed to know that my laptop has an Intel chip and an NVIDIA chip?!?! Surely everyone wants to know this!


Back in 1997 CPU clock speeds actually mattered; there was a substantial difference between what could be done with a 100 MHz Pentium and a 166 MHz Pentium (with MMX!).

In 2014 it is screen size / resolution or storage that matters.


$$$ - Intel and Nvidia paid for the placement. The same reason why Dell, Toshiba, et al bundle a bunch of crapware with your default install: it's paid placement.

Of course, that doesn't mean it's a good idea - these companies continue to sacrifice their own brand and image in exchange for not a lot of money. You know your company has no product focus when it's willing to plaster shit all over its own products in exchange for a few bucks in extra margin.


I think Apple has embraced the philosophy of this suggestion pretty well. They don't use mystery meat model numbers, and just label a product something like "iMac 27-inch: 3.4GHz."


I think the important thing is clarity about what you're getting (and how to compare), not specifically detailing parts. I still don't understand how this isn't industry standard. The process of buying a (non-Mac) computer these days is still as confusing as it's ever been without doing hours of research.

For instance, how do I compare two PCs:

A: Intel Core i7-4770 3.4 GHz and NVIDIA GeForce GT 620
B: AMD FX-Series Eight-Core FX-8350 4 GHz and NVIDIA GeForce GTX 760

Even when you clarify some of the parts, it's frustrating. What Apple does well (and frankly Dell and some other companies within their own ecosystems) is to add a bit more clarity, even if it's still a bit opaque (iPhone 5 > iPhone 4, and the fact that all Mac parts are relatively similar, just with stair-stepped upgrades).


You mock, but meaningless model numbers are awful, and they were banished from Apple a long time ago.


I'm not mocking the loss of meaningless model numbers. I'm mocking the suggestion to switch one lot of meaningless numbers for another.


Well, they did straighten out their naming convention, but instead of relating processor speed and model number they simply classified everything by device type. Their most consumer-friendly options only allow you to customize storage. The less consumer-friendly options have lots of customization options. I think it's great, because consumers aren't really confused as to which machine is which.

Compare that to other tech companies like Garmin, which has something like 20 different Nuvis that all do basically the same thing.


I think the storage size thing is just as silly as anything else. I often wonder why they bother. My best guess is it allows them to advertise a lower entry level price while pushing people to go for the mid or higher end when they get to making a decision. Ever notice how the low-storage Apple model is always just a little bit too low?


Apple ended up doing the following almost exactly as stated: 6, 7, 10, 11, 13, 14, 15, 16, 17, 19, 23, 25, 26, 33, 34, 37, 39, 50, 51, 62, 63, 70, 76, 83, 85, 87, 94, 95 and 98.

And they pretty much did these in spirit: 5 (but even simpler), 4 (Steve did it), 8 (iTunes), 9, 12, 18, 22 (iPad), 31 (iPod Touch), 40, 41 (App store), 44 (Siri), 46, 52 (briefly), 54 (dumped not sold), 72 (USB instead), and 100 (either Final Cut or Pixlet).

The remaining are either terrible ideas or intended only as humour.


"1. Admit it. You're out of the hardware game. Outsource your hardware production, or scrap it entirely, to compete more directly with Microsoft without the liability of manufacturing boxes."

Good thing they didn't 'admit it'.


"1. Admit it. You're out of the hardware game. Outsource your hardware production, or scrap it entirely, to compete more directly with Microsoft without the liability of manufacturing boxes."

Good thing they didn't 'admit it'.

Isn't this essentially what they've done with Foxconn and the other China-based producers they work with?


No. What they meant is to be a software-only company, i.e. compete with MSFT without the liability of hardware.


But plenty of the other suggestions are about building hardware.


They're competing ideas from different industry-folk. There's a ton of contradiction in the article, and that was kind of the point.


> No. What they meant is to be a software-only company, i.e. compete with MSFT without the liability of hardware.

Disagree. Apple had half-tried and experimented with this via the "official" clones programme, e.g. Power Computing:

http://en.wikipedia.org/wiki/Macintosh_clone#Official_Macint...

http://en.wikipedia.org/wiki/Power_Computing


Right, they half-tried it. This item is telling them to go all in on it.


They did outsource their hardware production... like mostly everyone else in the industry.


You're talking about components and assembly, but that's not what's being suggested here. This was from a time when there was almost universal consensus that Microsoft's horizontal business model (sell the OS and let other companies create hardware to run it on) had utterly and completely demolished Apple's vertical business model (create and sell both the hardware and the software that runs on it) for all time. The narrative was that Microsoft had beaten Apple in large part because the horizontal model was always going to beat the vertical model.

In hindsight, it obviously isn't that simple: there's room for both models in the industry, and each has its own set of risks, rewards, strengths, and weaknesses, all of which are constantly changing as the industry evolves. But yeah, that particular item isn't telling Apple to outsource assembly and components; it's telling Apple to stop even thinking about hardware and to try to become Microsoft circa 1997.


> It's telling Apple to stop even thinking about hardware and to try to become Microsoft circa 1997.

Ironic that Microsoft is now trying to become like Apple circa 2014. Then there is Google, which has commoditized software in favor of services.


> 101. Don't worry. You'll survive. It's Netscape we should really worry about.

Why did no one listen?


Wow, he was spot on. The memory of Netscape lives on through Mozilla.


So they went to the middle of the field and picked option 50?

> 50. Give Steve Jobs as much authority as he wants in new product development

Worked out for them.


This one too:

> 76. Make damn sure that Rhapsody runs on an Intel chip. Write a Windows NT emulator for Rhapsody's Intel version.

(although Parallels & VMware did the latter part)


The first Rhapsody ran on Intel and you could compile Cocoa apps to run on (bare) Windows NT.


"7. Don't disappear from the retail chains. Rent space in a computer store, flood it with Apple products (especially software), staff it with Apple salespeople, and display everything like you're a living, breathing company and not a remote, dusty concept."

They did that with CompUSA, if my memory serves... and they are still doing it in France, in FNACs for instance.


In the whole of Europe (or at least the Netherlands), where there are not a lot of Apple Stores, they are very actively doing this. Many large electronics chains have an Apple section in their stores, often with an Apple salesperson on busy days.


They still do this at Best Buy.


I remember reading this (on paper) in the 90s in college.

It came across as worthless idiot blather then, as it does now with the benefit of hindsight.

    1. Admit it. You're out of the hardware game. 
    Outsource your hardware production, or scrap 
    it entirely, to compete more directly with 
    Microsoft without the liability of manufacturing
    boxes.
Wronger words than that have rarely been spoken.


Apple has definitely outsourced hardware production (as has the rest of the industry, mostly). I don't understand what your point is.


In the context of that article, 'outsource' meant letting other companies (such as the short-lived Mac clone maker Power Computing) manufacture hardware while Apple just licensed the software.

(But also, the very best Macs, which command the highest prices, are currently made at Apple's facility in Austin, TX.)


The Mac Pros are assembled in Austin because they are high-value and low-volume (just like Dell still assembles servers nearby in Round Rock). At any rate, 2014 is very different from 1997, and outsourcing has become ubiquitous, just not exactly in the way that was foreseen.


> currently made at Apple's facility in Austin, TX

Assembled in the US, not made. It's cheaper to ship boxes of components that stack from China, where they were made, than it is to ship assembled products that leave a bunch of empty space.

And under US regulations you still get to slap a 'Made in the USA' sticker on it and charge even more.


I think this item means stop designing, building, and selling hardware. This article is from a time when all PCs were third-party clones of the IBM PC, yet there was breakneck innovation happening. Apple had toyed with licensing its hardware.

Regardless, I think they did realize that they were competitively out of the hardware game, and eventually doubled down instead of folding.


"97. Have Pixar make 3001, A Space Odyssey, with HAL replaced by a Mac." They did and it was called "WALL-E"


How am I missing WALL-E being anything like 2001? I still don't see the comparison.


In general, they're very different movies, but both of them share the striking fact that you don't hear a single bit of spoken dialogue until ~15+ minutes into the movie.


When WALL-E restarts, the sound you hear is the Mac boot chime.


Because space.


From http://2001.wikia.com/wiki/HAL_9000 :

In the 2008 Pixar animated film WALL-E, the starship Axiom's Autopilot ("Auto"), which is also the main villain of the film, has a glowing red camera, a low electronic voice and a hidden directive - deliberately reminiscent of HAL. WALL-E's pet cockroach is also named Hal, but it is also a reference to Hal Roach.


Like number 31: the price is a bit off, but telco bundle pricing makes it pretty much correct.

31. Build a PDA for less than $250 that actually does something: a) cellular email b) 56-channel TV c) Internet phone.


> Like number 31: the price is a bit off, but telco bundle pricing makes it pretty much correct.

In 2014 dollars, that $250 is nearly $400.
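A rough CPI-based sanity check of that adjustment (the CPI-U annual averages below are approximate figures I'm supplying for illustration, not from the comment):

```python
# Approximate US CPI-U annual averages (assumed values for illustration).
cpi_1997 = 160.5
cpi_2014 = 236.7

price_1997 = 250.00
price_2014 = price_1997 * (cpi_2014 / cpi_1997)

print(f"${price_2014:.0f} in 2014 dollars")  # roughly $369
```

Depending on the CPI series used, the figure lands somewhere in the mid-to-high $300s, in the same ballpark as the "nearly $400" above.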


I like how everyone always wanted a computer-tv or tv-computer back then. It's the faster horse thing, but for computers.


"60. Abandon the Mach operating system you just acquired and run Windows NT kernel instead. This would let Mac run existing PC programs. (Microsoft actually has Windows NT working on Mac hardware. It also has emulation of Mac programs with NT running on both Power PC and x86.)"


Although some were (retrospectively) bad ideas, I like 31 c: an internet phone.


Not all advice given was that bad. From the article, towards the end:

"Make a lightweight, portable, palmtop Mac. Ideally, it should be a wearable, with a private eye screen and some sort of half-keyboard. If Apple can't manufacture this, it could make a deal with another hardware maker. Wearables are the future."

- Marvin Minsky, AI pioneer


"64. Team up with Sony, which wants to get into the computer business in a big way - think Sony MacMan"

That almost happened, sort of.[1]

[1] http://www.folklore.org/StoryView.py?project=Macintosh&story...


"Abandon the Mach operating system you just acquired"

No, don't do that!


They trashed Mach and Rhapsody and bought out NeXT to get Steve back full time. OS X is mostly updated NeXTSTEP, right down to Objective-C and API names.


Rhapsody (with the underlying Mach kernel) was the NeXTSTEP stuff in a preliminary stage of Mac-ification. The strategy of what to do with the NeXT-based OS vs. what became known as "Classic" was in flux when Rhapsody saw its first developer previews, but ultimately Rhapsody was transformed into Mac OS X. The biggest additions, IIRC, were the Carbon APIs and the Aqua UI.


Exactly. And back then, getting "Steve back full time" wasn't what they were betting the farm on. More than anything, Apple was desperate for a next generation OS. Their choice was between NeXT's OpenStep and the BeOS. Obviously, they chose NeXT.

Now in hindsight, due to Apple's success post Steve's return, many claim that he was the play all along. At the time, though, that wasn't the consensus of people I knew in and around the Apple sphere in Cupertino.


The writer was even money overall, but 2 for 10 in the top 10... not great.

If you are running a startup, I don't know of a better way to convince yourself not to listen to the haters.


This was explicitly not written by "the haters," though.

>> So we surveyed a cross section of hardcore Mac fans and came up with 101 ways to get you back on the path to salvation.


Thinking about this period, I think they once got the Blue Box running on NuKernel, which is what Copland should have been in the first place.


This was a Wired magazine classic. They should have run a "101 Ways to Save Microsoft" sequel before Ballmer's retirement.


I wonder what James is up to these days.


James just edited it. The contributions were from (ahem) Mark R. Anderson, Ronald P. Andring Sr., Andrew Anker, Carla Barros, Dave Barry, David Batstone, John Battelle, Michael Behar, Jackie Bennion, Gareth Branwyn, Van Burnham, Seth Chandler, Tom Claburn, Christine Comaford, Peter Corbett, John Couch, Douglas Coupland, S. Russel Craig, Mark Dery, David Diamond, Dennis Dimos, Nikki Echler, Laura Fredrickson, Jesse Freund, Simson Garfinkel, Steve Gibson, Tim Goeke, Jeff Greenwald, Jacquard W. Guenon, Joseph Haddon, David Hakala, Russell Hires, Rex Ishibashi, Dave Jenne, Amy Johns, Richard Kadrey, Philippe Kahn, Kristine Kern, Indra Lowenstein, Regis McKenna, Warren Michelsen, Russ Mitchell, Eugene Mosier, Nicholas Negroponte, Eduardo Parra, Lisa Picarille, John Plunkett, Gary Andrew Poole, Spencer Reiss, Jack Rickard, Louis Rossetto, Peter Rutten, Winn Schwartau, Kristian Schwartz, Brian Slesinsky, Richard Stallman, Carl Steadman, Don Steinberg, Julie Sullivan, Kathy Tafel, Ruth Tooker, Joel Truher, Watts Wacker, Michael Wise.


One very insightful person Daly is.


If Apple did what the article suggests, it wouldn't be Apple.


Well, yeah, that's the point. In 1997, "being Apple" meant "going out of business". So it was absolutely telling them to stop being Apple so they could live to see 1998. It didn't pan out that way, but no one could have anticipated that at the time and it's kind of miraculous that they made it through that period with the essentials of the company intact.


Maybe Wired should consider taking their archive offline.



