Interview with Steve Wozniak (theconversation.com)
158 points by sohkamyung on Aug 26, 2016 | 61 comments



One of my favorite interviews of Steve Wozniak is still the one by jl in Founders at Work.

http://www.foundersatwork.com/steve-wozniak.html

> When I got done, I'm looking at these 2 floppies that look just the same. And I decided that I might have written onto the good one from the bad, and I did. So I had lost it all. I went back to my hotel room. I slept for a while. I got up about 10:00 a.m. or so. I sat down and, out of my head and my listings, recreated everything, got it working again, and we showed it at the show. It was a huge hit. Everybody was saying, "Oh my God, Apple has a floppy!" It just looked beautiful, plugged into a slot on our computer. We were able to say "run color math," and it just runs instantly. It was a change in time.

> But the real eureka moment for me was the very first time I ever read data back. I wrote it on the floppy, which was easy, but read it back, got it right. I just died.


This is insanely great reading. I love how old-school technical it is. I haven't finished reading yet, but my favorite quote so far:

> "What advice would you give to hackers who are thinking about starting a company or making something on their own?"

> "Wozniak: First of all, try to have the highest of ethics and to be open and truthful about things, not hiding. If you have to hide something for company reasons, at least explain what you're doing. Don't mislead people. Know in your heart that you are a good person with good goals because that will carry over to your own self-confidence and your belief in your engineering abilities. Always seek excellence: make your product better than the average person would."


> have the highest of ethics

This summer I read a couple of stories about Woz in Programmers at Work. Around 1980, when Apple was about to go public, it became clear that there were two kinds of employees: the engineers who were focused on work, and others who spent all their energy gaming stock options. When Woz found out that the shares had ended up so unbalanced, he made a large pool of his own shares available to employees. The "scheming" employees still ended up taking advantage of the others (and Woz) by gaming that arrangement too. This is not altogether shocking, since other stories indicate that Woz spent his own spare time on things like seeing how many digits of `e` he could fit into one of their computers (edit: "He had to use every single piece of memory, including the memory on the display screen, to hold this big number. And he didn't have any intermediate results because all the memory was holding just one number. This program took fourteen days to run.")[0]

I have such high respect for Woz. I believe he really does put "being a good person in his heart" above everything else.

[0] This was from the interview with a programmer who worked for Apple at the time (edit: Andy Hertzfeld). Sorry for the bad retelling; I don't have the book handy.
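
For the curious, one spigot-style way to squeeze many digits of e out of a fixed block of memory is to keep the whole approximation as a single "big number" in factorial base (an array of small digits) and pull out one decimal digit at a time. This is just a rough sketch of the general idea in Swift, certainly not Woz's actual program (the function name is mine):

    // Hold e - 2 as an all-ones number in the mixed-radix system with
    // radices 2, 3, 4, ... then repeatedly multiply by 10, reading
    // decimal digits off the final carry.
    func eDigits(_ n: Int) -> String {
        let terms = n + 10                  // a few guard terms for precision
        var a = [Int](repeating: 1, count: terms)
        var out = "2."
        for _ in 0..<n {
            var carry = 0
            for i in stride(from: terms - 1, through: 0, by: -1) {
                let t = a[i] * 10 + carry
                a[i] = t % (i + 2)          // radix at position i is i + 2
                carry = t / (i + 2)
            }
            out += String(carry)            // carry out is the next decimal digit
        }
        return out
    }

    print(eDigits(30))  // 2.718281828459045235360287471352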


I love that quote too. To add to "always seek excellence", here is his "how":

> Livingston: What is the key to excellence for an engineer?

> Wozniak: You have to be very diligent. You have to check every little detail. You have to be so careful that you haven't left something out. You have to think harder and deeper than you normally would. It's hard with today's large, huge programs.

How many times on HN do we hear about actual engineering excellence vs. the hack, the weekend MVP, the "if your product doesn't embarrass you, you've released too late" dogma?


> if your product doesn't embarrass you, you've released too late

In the age of shallow work, this is the dogma that justifies shitty things being released under the banner of continuous improvement.

In my opinion, no lasting and great product can be built that way, only more shitty things that flood the market. You collect feedback on your shitty app and implement optimizations? How does that differ from design by committee?

To be timeless and produce the highest possible quality, you have to go deep, understand the problem you are trying to solve, and solve it once and for all.

Solve it to, and above, the best ability you can bring to the table at exactly that moment in time.


You ever see the first few versions of Facebook and Twitter? http://makers.crew.co/wp-content/uploads/2016/03/OG-Twitter....

The point is, if you have a good idea, regardless of how complete it is, get it out there and see if it warrants going further.


I don't think you can defend Facebook or Twitter as timeless or high quality. Even Google is kind of meh at a technical level.


I was directly responding to "In my opinion no lasting and great product can be built that way." Facebook, Twitter, and certainly Google have been proven to be long-lasting. Google is ubiquitous at this point!


Android, for example, is good enough, but far from Great IMO. Granted, this is all subjective, but an overhead sprinkler whose sensor is a part that melts/deforms with heat is a timeless solution that just works (1879: https://www.google.com/patents/US218564). An electronic sensor could work, but has far more failure modes. Now compare this with, say, Facebook, whose design regularly undergoes significant revision.


Excellent comment.

I'm not sure that many software companies today are aware of how corrosive it is to the company's culture not to strive for excellence and integrity, and not to encourage the same from every employee. What exactly does it accomplish to ship a product that is buggy or half-completed, just so that you can hit a deadline? Do these companies think that the customer isn't going to notice? It's much better to be honest with a customer about issues when you become aware of them, and allow the customer to plan accordingly.


And striving for excellence might produce something that lives on way beyond its time.

Just to name an example, consider the designs[0] of (or inspired by) Dieter Rams[1].

[0]: http://www.cultofmac.com/188753/the-braun-products-that-insp... [1]: https://en.wikipedia.org/wiki/Dieter_Rams


Finally finished reading and that's an amazing interview. I thought I had read all the histories of Apple, but I still learned something new. Love the technical content.


> He has previously talked enthusiastically about the investment of the Queensland Government of A$405 million in the startup scene.

While the Advance Queensland Summit was interesting, I've been in Brisbane long enough to know that I shouldn't believe it until I see it. At the end of the day, the number of "non-startups" that claim these grants, and the struggle of those who actually _do_ run startups to get access to this sort of funding, have left me rather cynical...


I'm involved with an organisation that recently tried to get an Advance Queensland grant for non-profit outreach/education events relating to startups/entrepreneurship (Young Starters' Fund).

We were considered ineligible to apply on the grounds that we weren't GST-registered. Now, we're a registered charity that operates on a budget that is far, far, far under the threshold where GST registration is mandated by federal law. In fact, we're so far under the threshold that all our accounting is done by our volunteer committee - being registered for GST would massively increase our administrative workload, which is why we don't do it.

Seems well intentioned but... a bit bureaucratic.


Your story is so similar to those I've both experienced and watched play out over and over again...

> Seems well intentioned but... a bit bureaucratic.

QLD government tech investment in a nutshell, really.


Here's hoping that Woz pops in at the invocation of his name!

https://news.ycombinator.com/threads?id=stevewoz

(I'm referencing this memorable AskFilter thread: http://ask.metafilter.com/47835/Woz-More-like-Was-am-I-rite#...)


It's interesting that for every public interview/article about Steve Wozniak, there are several people quick to label him a "has-been".

A further irony being that they themselves are "never-even-was-in-the-first-place-and-will-probably-never-be" (which I'd argue is worse than being a "has been").


By that standard nobody would ever be allowed to criticize anybody who got in a position of success or power, though, just because "they achieved more than you, sucker".

Although I really think Woz is a great guy and that we historically owe him A LOT, one can't ignore the fact that he's been operatively absent from the industry for a long time. His comments on Apple come from insights that are quite stale. He might have his own good friends inside the company, but he's certainly not an insider in Cupertino anymore. He often lets journalists paint him as such, though.

I respect his insights on the industry at large, because he has certainly seen a lot and done a lot. Even those comments are a bit tarnished by his shallow consulting jobs, which in the last 10-15 years never really led to anything relevant, since they were probably well-paid stunts by companies who wanted to enroll a big name to wow investors at pitches.

Edit - I want to be clearer: this is a friendly critique coming from someone who still loves Woz and has him on the personal list of "IT celebrities who would be amazing to hang out with". I also want to stress that I don't really blame him for this management of his image of late. I just won't give a huge amount of weight to his sharpest statements, as publications often do.


>By that standard nobody would ever be allowed to criticize anybody who got in a position of success or power, though, just because "they achieved more than you, sucker".

I'm not sure how someone can even begin to get that from my comment.

By the standard that I wrote, nobody would be allowed to criticise someone as a "has been". And especially if they haven't achieved much themselves either. Period.

I didn't say anything about people not "being allowed to criticize anybody who got in a position of success or power" in general.

I believe that one can criticize Woz (or anyone else in power/success/etc.) all you want. Just not as a "has been", which doesn't really have any content as a critique anyway, and is basically an ad hominem. Want to criticize Woz? Find something specific to criticize.

>Although I really think Woz is a great guy and that we historically owe him A LOT, one can't ignore the fact that he's been operatively absent from the industry for a long time.

So? How does that make his arguments any less valid than anybody else's (e.g. in this thread)? I don't know, and don't care, whether you or others in this thread have been "operatively present" in the industry recently or not -- I just care about the arguments.

If, rather, you mean "just because he is Woz, we shouldn't give more credit to what he says over anybody else", then that kind of goes without saying; it's the content that really matters, not the person.


To be cynically fair, almost no one is given serious attention when criticizing those in positions of greater success and power.


> Asked about Apple’s new app Swift Playgrounds, aimed at teaching entry level programming in the language Swift, he thinks that there may have been better languages to start with.

I haven't had a chance to play with Swift Playgrounds on iPad, but Swift as a language itself is a lot of fun. I wonder what languages he would favor over Swift for kids who might be interested in programming.


I agree with Woz: Swift is too advanced and complicated for kids. They are better off starting out with MIT Scratch or something like that.


I really dislike the whole "start kids with a toy language" thing. My first language was 6502 assembly, and I genuinely think it's a good place to start. I'm not saying that to try and elevate myself, none of the "I walked uphill both ways to school in the snow" type shit.

An assembly language that is fairly simple teaches so many fundamentals and at the same time is about as easy to learn as it gets. No crazy syntax, no OOP bullshit, no ceremony and boilerplate bullshit. It's as simple as it gets: instructions and data.
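
To make that concrete, here's a toy sketch of the mental model, written in Swift rather than real assembly (the opcodes are made up, not actual 6502 encodings): one array of bytes holds both the program and its data, plus an accumulator and a program counter. That's the whole machine.

    // A toy "machine": instructions and data live in the same memory.
    var memory: [UInt8] = [
        0x01, 0x05,   // LDA #5  -> load 5 into the accumulator
        0x02, 0x03,   // ADC #3  -> add 3
        0x03, 0x10,   // STA $10 -> store the result at address 0x10
        0x00,         // BRK     -> halt
    ]
    memory += [UInt8](repeating: 0, count: 32)  // some blank RAM after the program

    var acc: UInt8 = 0   // accumulator
    var pc = 0           // program counter

    run: while true {
        switch memory[pc] {
        case 0x00: break run                                   // BRK: stop
        case 0x01: acc = memory[pc + 1]; pc += 2               // LDA #imm
        case 0x02: acc = acc &+ memory[pc + 1]; pc += 2        // ADC #imm (wrapping)
        case 0x03: memory[Int(memory[pc + 1])] = acc; pc += 2  // STA addr
        default: fatalError("unknown opcode")
        }
    }

    print(memory[0x10])  // 8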

I know so many people who learned Python and Java in early CS classes and then had their fucking minds blown and experienced so much pain just understanding the concept of a pointer. I remember my packed high school "Intro to Programming" class where we started with VB, and that was such a struggle that they switched to one of those "visual programming for kids" tools, which became even more of a hassle for everyone.

When I try to learn anything new, I want to distill it down to the most basic, core concepts and build from there. Doesn't matter if it's learning a framework, learning bass, or woodworking. Whatever happened to "keep it simple, stupid"?


totally agree, for the exact same reasons


I disagree with the notion that Swift is too advanced or complicated. I think people who have prior programming experience might find Swift complicated due to some of its unique approaches (like handling null objects with optionals) or its lack of expected features (C-style for loops or the ++/-- operators).
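
For anyone seeing them for the first time, here's a minimal sketch (my own, not from the thread or the article) of both of those points:

    // Optionals: a String? either holds a value or is nil; you unwrap it
    // explicitly instead of risking a null dereference.
    let maybeName: String? = "Woz"

    if let name = maybeName {
        print("Hello, \(name)!")
    } else {
        print("No name given.")
    }

    // And what replaced the C-style for (i = 0; i < n; i++) loop:
    for i in 0..<3 {
        print("iteration \(i)")
    }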

But if you're seeing it all for the first time, I don't think the concepts are overly complicated or confusing.

The only current downside to Swift is the fact that it keeps getting updated, sometimes with drastic changes. Besides that, it's an alright language to start with.


Kids are smarter than you think.


Absolutely! Every time programming education comes up, someone brings up Scratch. Its advocacy is -- possibly detrimentally -- overused. It may be appropriate in some cases, but it is not a universal tool for introducing people to programming. There is certainly something inauthentic about using blocks. They may not be appropriate for people with motor-control issues. And the reliance on imperative constructs (assignment-based statements) as essential for understanding computation is arguably wrong and possibly harmful.

I only have a sample of 1 (my six-year-old son, turning seven today), but he is perfectly capable of typing, understanding function definition and composition, data structures, and working with reactive I/O. He's not some super-genius. He's a regular kid who likes to swim, ride his bike, play Pokemon, and watch Netflix. But he can also make things by "real" programming. I'm sure other kids could as well.


Certainly they can. I did at that age, in Applesoft BASIC - not at the level your son does, of course, because BASIC is, well, basic, but what capabilities it did have, I understood and used. If I'd been confronted instead with Scratch, I rather doubt I'd have stuck with it.

The thing about Scratch is that it's designed to be easy for kids, and that's not a good thing but only seems to be one. In presenting Scratch to a child one makes an implicit statement along the lines of "Here is what you can do. This is your level." There's something intensely limiting about that. It's like being given toy plastic versions of Dad's power tools and told you can use them to actually make things. Kids are ignorant, but that's not the same as dumb. No one is going to fall for this, and it will earn you the enduring contempt of those on whom you try it.

Whereas when you sit a kid down in front of something like Applesoft BASIC or I guess a Ruby or Python interpreter, a Lisp or Node REPL, whatever - the implicit statement is instead along the lines of "Here is a thing that does things. Find your own level." And there's something intensely liberating about that. It's a vote of confidence, and that counts for a lot to a kid just like it counts for a lot to anyone - perhaps quite a bit more.

And by all means feel free to tell your son that some random goober on Hacker News wished him a happy birthday!


I agree, but when they want to try something a little more advanced Swift Playgrounds is a great upgrade. I don't think Scratch/SP is an either/or.


Sorry to be pedantic but he didn't say Swift was too advanced, he simply said:

> he thinks that there may have been better languages to start with.

Which could be taken several ways.


Eh, not sure I agree there. It's not like they're throwing kids head first into generics.


"Steve – it is Steve, right? You say this gadget of yours is for ordinary people. What on earth would ordinary people want with computers?"


This talk[1] by Woz at Google is one of my favorites and motivated me a lot to learn engineering in a diligent way. His love for electronics and engineering is evident from the way he talks about it.

[1] https://www.youtube.com/watch?v=ctGch5ejjT4


>inventor of the home computer

hissssss

https://en.wikipedia.org/wiki/Altair_8800

EDIT: The earlier title on HN said that Steven Wozniak was the 'inventor of the home computer' with no qualifications, the title has now been changed but HN won't let me delete the comment.


Did you mean to say the 1950 Simon?

http://www.blinkenlights.com/pc.shtml


They meant "Inventor of the home computer," not "Inventor of one of several obscure, complex devices that might be referred to as a 'computer,' found in the basements of a negligible number of professional engineers, scientists, and extremely advanced hobbyists."

By that definition, Woz is the best candidate for the job description. I could publish an article tomorrow describing how to build an inflatable sheep that runs TensorFlow, but that won't entitle me to claim I invented the first sentient sexbot.


So discussing earlier personal computers is equivalent to the absurdity of inflatable sheep. Brilliant commentary.


It saddens me when people don't get the point of a metaphor, and instead lean on the metaphor to discard the argument behind it.

In this case, the point of the metaphor was to say that something obscure and barely sold/marketed is not a valid candidate for the "first X", where X, like "home computer", implies adoption.

Whether the parent used inflatable sheep or VCRs or thermometers as their metaphor is beside the point -- the core of the analogy (the argument) remains the same.

It's also obvious that he used the sheep analogy for extra color/fun.


I also dislike this, and I've often wondered if there is a word to describe this occurrence.


[flagged]


You guys are really lowering the discussion level here with your insults and innuendo and meta commentary.


Perhaps you missed the irony that you're furthering the meta-commentary yourself? Why is it OK for you to go meta on my comment, but not for me to respond to an actual exchange in this thread?

I saw the parent completely sidestep the content of the grandparent's argument and focus on trivial form issues, and I found this in bad taste (and an all too common occurrence), and wrote so. Simple as that. And while the quote might imply the parent is an idiot, it was simply meant as a quote that fits the practice -- as asked by Teever.


"Forget it, Jake. It's Chinatown." Or HN's moderation policy, same difference.


To be clear, I don't mean to dismiss the work of the real pioneers of the small-computer world. Just pointing out that there was only one Prometheus who brought the fire down the mountain.

To stretch the analogy in a different direction, Altairs and the like were never going to grow into anything that people would recognize as a computer today. They were important from an evolutionary standpoint because they inspired and helped educate a generation of technology developers and advocates, but they were also dead ends.

On the other hand, if you pull the covers off of any desktop PC chassis today, almost every major component will be recognizable to anyone who ever owned a piece of Woz's handiwork. The IBM PC engineers weren't inspired by Altairs or Simons, but by the Apple II. (Which, itself, was the product of the last fight anyone ever won against Steve Jobs. Woz gets credit for that if nothing else.)


The Altair was very much like a modern PC in that it used the de facto standard S-100 bus and other standard peripheral connections, and was based on the Intel 8080 CPU, which evolved into the 8086 used in PCs. It had the first commercial software. It was mass-produced. It was able to run CP/M, which has much more in common with DOS than Apple DOS does.

The Apple II was more proprietary in nature, and the line died out after the Apple III failed. Intel x86-based PCs are still made today. Edit - sorry, forgot about the IIGS.


We'll have to agree to disagree. There's nothing even remotely proprietary about the original Apple II -- seriously, have you ever seen a Red Book? And IMHO, nothing that didn't ship with a keyboard and a ready-to-use video display port can be considered either a "personal computer," or a progenitor of them.


It had an RS232 port that could be connected to a serial terminal with a keyboard. This is an emulator, but you get the idea: https://www.youtube.com/watch?v=qxcFNOpnIIs

Sure looks a lot like any other early personal computer. Having a "video display port" seems like an arbitrary distinction.

Also, if you want to talk about ubiquity: the TRS-80, which was released in '77 a bit later than the Apple II, massively outsold it.


True, the TRS-80 often gets short shrift in these arguments. What was revolutionary about it was the price, about $1K cheaper than an Apple II. It foreshadowed the day when almost everybody would be able to afford to own their own computer. I agree that the TRS-80 was an important milestone in its own right, even though it lost the race to call itself the "first recognizable PC," or whatever.


I think Apple's big win was in mindshare and getting the education market, maybe being more professional and establishing vendor relationships. You didn't see Altairs; I only knew about them from magazines. When I was in early grade school, all computers were Apple IIs as far as I knew.


The article was excellent, but you could have brought it up in a less smug way.


The Altair is pretty widely regarded as the first 'real' home computer. It's perfectly poised on that dividing line between "hobby electronics" and "consumer product".


I agree. I was just kidding GP.

I've never seen one in person though unlike Apple IIs, TRS-80s, TI-99/4As, etc.


I have to admit I've never seen those in person either. I have played with Amstrads, Amigas and Commodore 64s, though, and I have a BBC Micro in a box under my desk. :)


I see your Altair and raise a Micral:

https://en.wikipedia.org/wiki/Micral

On the other hand, this never gets old:

http://www.blinkenlights.com/pc.shtml

(Edit: blinkenlights-link already posted as I now notice …)


Am I going to be downvoted if I puncture the largely laudatory bubble around Woz at all?


Not at all. You'll be downvoted because this comment doesn't substantiate your claim and doesn't add anything to the discussion.


Sorry, I thought that I was asking a question and not promoting a claim. In any case, thank you for answering it, though perhaps not in the manner intended.


You asked a question that implicitly advances a claim, and it's disingenuous to pretend you don't know that.


Surely, in that case, you can vocalize what that claim was.


Of course I can. But no one needs me to.

Is it really worthy of you to play this kind of game?


I do.


See my comment elsewhere in this thread.



