Mossberg Final Column: The Disappearing Computer (recode.net)
266 points by andrewl on May 30, 2017 | 61 comments



Mossberg did a lot of good work, but so did a lot of other columnists who didn't attain nearly the status he did.

Mossberg mostly has one guy to thank for the unique place he occupied in the tech media scene: Steve Jobs. For whatever reason, Jobs decided that Mossberg's take on Apple products really mattered, a lot. Mossberg was Jobs's stand-in for Mr. Everyman, and Jobs seemed to believe that if Mossberg couldn't connect with a product, then it needed to be re-thought.

His status as anointed deliverer of the final verdict on Apple's product line gave Mossberg a massive amount of influence in the tech world all by itself, but added to this was the fact that, as with so many other things, the rest of the tech industry slavishly followed Apple's lead -- at least, the rest of the tech industry's PR departments followed it. For a corporate PR flack, having Jobs's own personal oracle say nice things about your product was the ultimate win. It was the fat Harvard admissions envelope, by which I mean something like, "this achievement, while important, has a way outsized significance to a certain segment of the population who compete for it because they've all decided that getting this particular thing means You Won and have a higher status than all of your peers who haven't gotten it."

Interestingly enough, the passing of Jobs was followed by the passing of the positive Mossberg review as the ultimate prize in the PR world, and Mossberg's departure from the WSJ didn't help.

Am I ripping on Mossberg? Not really; it's more that I'm ripping corporate PR, though Mossberg certainly cultivated this situation (who wouldn't, though?). I will say that it was a source of eternal frustration (and envy) among the rest of the tech punditry that Mossberg's reviews had this bizarre status with the PR departments of the companies we covered, sort of like the way "Harvard as the agreed-upon brass ring that all yuppie parents have decided to compete for" no doubt occasions much eye-rolling at Stanford, Brown, Yale, and everywhere else. Nobody is sad to see that era pass.

As for Mossberg himself, Godspeed, dude. May your amulet never tickle.


Thank you for articulating what was behind the Mossberg phenomenon in a positive way. An anecdote:

About a decade ago, I was working at a startup. The VP of Marketing got it into his head that we just had to get Mossberg to write about us and it'd be great for user acquisition. We had a bunch of meetings and shuffled and reprioritized the product roadmap around this. I'd actually not heard of Mossberg before, as my prior experience was at a BigCo selling to enterprise, a market where Mossberg was essentially irrelevant. So I went and read a few of his past columns, and felt that the quality of insights was all over the place, running the whole gamut from good to middling to bad. At this point I was wondering, "Why is this guy so special? Who crowned him this way? Why are we doing contortions to attempt to please him?"

Our strategy did work, inasmuch as Mossberg wound up writing a glowing column about us. However, it did nothing to move the needle with regards to user acquisition, and was likely a distraction and misallocation of resources that negatively impacted finding market fit. This left me with a bad taste about Mossberg, as clearly his tastes, whatever they were, didn't fit the market very well. The review might have helped us if we were raising a Series A or B, but we were in C territory by then, which is when investors actually look to see if you have a viable business rather than responding to hype and FOMO.

In retrospect now, after reading your comment, the bad taste I felt was really about our VP of Marketing, who was chasing a prize that in the end didn't help the business. Said VP was ineffective in other ways, and this was just one symptom of his ineffectiveness. Mossberg was just playing the cards he was dealt in life effectively, and as you said, who wouldn't?


I think you may have cause and effect flip-flopped here - Mossberg was the everyman's tech columnist long before Apple became the hot thing to cover.


I think he got it right. Mossberg was one amongst a number of everyman tech columnists for a while, but Jobs really elevated him once Apple got popular.


This is exactly right. Most major (and many regional and local) newspapers had an Everyman tech columnist. Mossberg just happened to be that guy for the WSJ.

The other thing I maybe should've mentioned, though, was that Mossberg was a very early and emphatic booster of the Jobs 2.0-era Apple products. Back when Jobs had just returned after the NeXT acquisition, Mossberg got on that bandwagon immediately and began talking up their products.

So in its initial stages, the Jobs/Mossberg love-fest was a bit of a symbiosis. Recall that this was in a bygone era when the powerhouse WSJ was at least as big a deal as the struggling Apple Computer company. In later years, when Apple was the giant we now know and Jobs had ascended to the pantheon of industrial greats, the relationship probably did a lot more for Mossberg than it did for Jobs.

(It's hard to recall, but there was once a time when Apple was a niche, struggling little tech company with a minuscule market share, and big newspapers were still a Big Deal. The Jobs/Mossberg relationship had its roots in that era.)


Nope, tech columnists pre-Mossberg were largely targeted at enthusiasts, he basically invented the everyman tech columnist gig.

https://daringfireball.net/linked/2017/04/12/mossberg-cjr


What has always been fascinating to me is this: if critics' opinions matter that much, why can't they make more money just offering those opinions directly to the company by way of high-priced consulting?


This made me inexplicably sad. I wasn't a huge Mossberg fan or anything and haven't read more than a handful of his articles (nothing against the man, of course). But I think it's hard to hear about this and not be affected, even if only a tiny bit. We're some 35 years into the era of the personal computer, and it's basically at that point where an entire generation of great names in the industry have spent a full and productive career in the service of technology and are now stepping down... and in some ways, the industry itself is retiring (as he mentions). Times are a-changing and we must change with them; adapt or die.

So long, Walt.


I teared up at the end. Completely unexpected.


"Mossberg out" got to me too.


"Ubiquitous computing (or 'ubicomp') is a concept in software engineering and computer science where computing is made to appear anytime and everywhere. In contrast to desktop computing, ubiquitous computing can occur using any device, in any location, and in any format. [..] This paradigm is also described as pervasive computing. [..] Mark Weiser coined the phrase "ubiquitous computing" around 1988, during his tenure as Chief Technologist of the Xerox Palo Alto Research Center (PARC)" https://en.wikipedia.org/wiki/Ubiquitous_computing

I find it interesting that the concepts dreamt up in the '80s are becoming common, now that technology has caught up.


Before he was at PARC, Mark Weiser was a professor at the University of Maryland. I took one of his classes and I always found him brilliant. I'm very sad that he did not live to see the arrival of the Internet of Things and the Maker movement and super-cheap computing, all of which touch on his ideas of Ubiquitous Computing. He'd have loved it.


It is a common theme -- the first time big ideas are tried, they often fizzle. The second time, usually after 10+ years, is when the kinks get worked out, the underlying technology needs are better understood, and the tech can expand (or explode) in the consumer markets. AI, video calls (AT&T -> Skype), GUIs, digital cameras, etc.

I hope the recent bio boom will return in 10-15 years with real breakthroughs (that is, ones visible and useful to an average person).


I think privacy concerns could seriously diminish or alter the AI developments. I sometimes wonder if the internet doesn't need a new abstraction, i.e. the academic underpinnings of network protocols are allowing governments and corporations to overstep. Maybe in the future having an ISP without a "privacy provider" will be like having a car without insurance?

Technically, I'm thinking of something like thousands of VPN connections distributing packets across randomly chosen data paths. Some trade off of bandwidth for abstracting away "which IP is doing what". (bleary-eyed thought: some sort of probability field applied to TCP/IP)
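
(A minimal sketch of what that per-packet random pathing could look like; the relay names are invented, and plain dicts stand in for the per-hop encryption a real system would need:)

    import random

    RELAYS = [f"relay-{i}.example.net" for i in range(1000)]  # toy relay pool

    def build_onion(data, dest, path_len=3):
        # Pick a fresh random chain per packet; each layer names only the
        # next hop, so no single relay sees source and destination together.
        path = random.sample(RELAYS, path_len)
        packet = {"to": dest, "payload": data}
        for hop in reversed(path):
            packet = {"to": hop, "payload": packet}
        return packet

    def route(packet):
        # Simulate each hop peeling its own layer and forwarding the rest.
        hops = []
        while isinstance(packet["payload"], dict):
            hops.append(packet["to"])
            packet = packet["payload"]
        return hops, packet["to"], packet["payload"]

    hops, dest, data = route(build_onion(b"hello", "example.org"))
    print(" -> ".join(hops), "->", dest, data)

The bandwidth trade-off is exactly as described: every packet crosses several extra hops before reaching its destination.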


That's more or less how Tor works.


I see that, but I was hoping for something lower level.


From the article: "and robotics are in their infancy, a niche, with too few practical uses as yet."

Why does it seem like robotics is always niche and not practical yet? It seems like robotics is perennially on the cusp of being the next big thing, but never really is.

(I'd qualify that robotics are the old big thing in industrial manufacturing, though.)


AI that works is just software; a robot that works is just a machine.

A modern tractor is a phenomenally complex machine, capable of fully autonomous operation with millimetre precision. The system is guided by a complex set of 3D soil maps based on moisture holding capacity, compaction and texture, root formation and a multitude of other factors. Each square foot of field receives precisely controlled amounts of water, fertiliser and pesticide. Nobody calls it robotics or AI, because it works.
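
(A toy sketch of the kind of map-driven dosing such a system performs; the cell attributes and the linear dosing rule here are invented for illustration, and real systems fuse many more soil layers:)

    # Per-cell water dosing from a soil map: cells that already hold
    # moisture well get less water, on a simple made-up linear rule.
    def water_dose(moisture_capacity, base_litres=2.0):
        return base_litres * (1.0 - moisture_capacity)

    soil_map = {(0, 0): 0.8, (0, 1): 0.3}  # (row, col) -> moisture capacity
    for cell, capacity in soil_map.items():
        print(cell, round(water_dose(capacity), 2), "litres")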


"We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run." -- Roy Amara

disclaimer: I have a Ph.D. in Robotics from CMU.

AI is similar: in the 1960s, it seemed machine translation was "just around the corner," but it turned out to not be practical for a long time. Until it was, in the 2000s or so.


I think it's worth noting that the approach to translation has changed significantly since the 1960s.

AFAIK the real breakthrough wasn't that the technology just got better, but that the whole paradigm shifted to statistical machine translation in the early '90s, and it is now shifting again, from phrase-based statistical translation to neural network-based translation.

The shift to neural nets can be somewhat attributed to increased compute capacity, but it's not an automatic result of it.

I have no idea what the state of the art is in robotics, but I expect there will be some fundamental changes to how we do robotics before it becomes a break-out success.


It was really not reasonable to do this stuff using '60s-era computing technology. So it's more a case of new tech enabling new (and less efficient) approaches than is generally talked about.


That's definitely true, but you need both; having access to more compute doesn't automatically mean you will find new approaches. Someone actually has to put in the hard work.


True enough. But, to be clear, many of these ideas are 20+ years old; they simply did not work back then.


And robotics in industrial manufacturing are a REALLY BIG thing. It seems silly to discount that, unless by "robotics" he means "robotics + AI".


Given his focus on consumer electronics, it seems probable that he's referring to consumer robots. In that space we have roombas, pool cleaners, and a handful of selfie-taking quadcopters. Definitely niche.


Industrial automation isn't particularly visible to the average consumer on a day-to-day basis.


Average consumers have no clue about the incredibly complex industrial infrastructure that keeps food on the table, the lights on, and transport, communications, and logistics running.

I think this stuff should be taught in schools.

From Mossberg's POV industrial computing and industrial robotics are already ambient and invisible. Not everyone needs a professional understanding of how they work, but some basic clue wouldn't be bad.


Is my dishwasher in the kitchen a robot? Justify your answer (either is acceptable, I'm looking for your reasoning).


I think a robot has to have an arm. If it doesn't have an arm, it's not a robot -- it's just a machine. The whole robot could just be a single arm or the arm can just be one part. But it has to have an arm.


Haha. There is an anthropomorphic element to it.

When most people think 'robot', they think of something with human or animal-like elements. So for example a rolling delivery or security drone would be a robot, but an automated car would not.


Too human-like and it's an Android. Not creature-like enough and it's just a machine.


Yes. The dishwasher (DW) does not complete the entire dishwashing circuit/task, but certainly a major portion. That is, the DW generally cannot load or unload itself, but it can sanitize, wash, and dry, among other tasks. Thus it meets the general definition of a robot: carrying out a complex set of instructions/actions automatically once programmed.


Not the OP, but I think the answer is definitely no.

Robots are not well defined, but I think at the minimum they imply the ability to be reconfigured for a large number of physical tasks that involve interactions with an environment.


My feeling is that the "robot" concept requires feedback loops and flexibility to adapt to a wide range of situations. As technology progresses, the amount of magic required to warrant the title is a moving target.


In that case, is my clothes dryer a robot? In addition to its timed dry setting, it also has a feedback setting in which it runs until the clothes inside have reached a (configurable) specified level of dryness, at which point it turns off.
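
(That dryness setting is a textbook feedback loop; a minimal sketch, with the sensor read invented for illustration:)

    import random

    TARGET_DRYNESS = 0.95  # the configurable setpoint on the dial
    dryness = 0.0

    def read_dryness():
        # Hypothetical sensor; real dryers infer moisture from the
        # electrical conductivity of the tumbling load.
        global dryness
        dryness = min(1.0, dryness + random.uniform(0.05, 0.15))
        return dryness

    intervals = 0
    while read_dryness() < TARGET_DRYNESS:  # run until dry, not for a fixed time
        intervals += 1                       # heat and tumble another interval
    print("shut off after", intervals, "intervals")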


At some point in the past, such automatic intelligence would have stunned the populace - but nowadays nothing short of sorting, folding, and piling onto the clothing shelves would let the punters cry "robot!"


Define "robot"...

Would you consider the relay in your internet connected smart bulb a robot?

Maybe it needs to also move to be considered...

Could the remote controlled computer and motor that pushes your car window up and down be considered a robot?

Maybe it needs some AI then...

What about the Nest brain of your house that switches all manner of things on and off to your liking?

Semi autonomous teslas, vending machines, drones, roombas, parking garage entries, energy efficient elevators....

Robots are fundamentally control systems, and they are absolutely everywhere. They just don't look like a T-800 (o;


Mossberg was a professional technology critic and pundit, one of the first of his kind. He paved the way for many others who followed his lead, and he certainly deserves credit for his pioneering efforts. I enjoy reading critical reviews of new products and software, but I'm generally not a fan of tech punditry. I believe that we need fewer pundits and prognosticators, and more hard-hitting investigative reporters along the lines of John Carreyrou.


There were others before Mossberg, and IMO they did a better job. Infoworld used to be a weekly read for a lot of us in the tech community, and they had a number of experienced columnists (e.g. Dvorak, Cringely) that could have easily slid into the job that Mossberg was handed at the WSJ in the late 1980s (after nearly two decades of non-tech related coverage for the newspaper).

If you read his earliest work, you get the feeling that he sometimes had no idea what he was talking about. But he kissed up to the CEOs of the major companies, which dovetailed nicely with the editorial stance of the WSJ and the PR needs of the respective companies.


Don't forget Jerry Pournelle; if anybody holds that title, it should be him.


Absolutely. Sorry I missed that name.


Carefully chosen words, full of visions of the future. Thank you. We will see whether things indeed go in the direction he suggests, since one usually cannot foresee what will disrupt the world next. It could be something completely new that has nothing to do with computers.


Except most of these profound visions from Mossberg are things that have been discussed in the tech community for decades.

Don Norman wrote The Invisible Computer 19 years ago.

https://mitpress.mit.edu/books/invisible-computer


Good column but this ongoing idea that the smartphone is the new personal computer is wrong. It's a different type of device. Smartphones are not replacing computers in terms of a device to get work done, other than email and phone calls. Doing real work for just about any job on a smartphone, even a big one like my iPhone 7 Plus, is not really viable. You still need a laptop or desktop to work in a spreadsheet, design a full page ad layout or do serious programming.

We need to let go of this idea that new technology is always a version of some other one, or replacing some other one. That's actually rare. Most of the time new inventions are just that: new.


> Smartphones are not replacing computers in terms of a device to get work done, other than email and phone calls.

That is A LOT of work (real work) being replaced right there.

We also need to stop thinking people bought personal computers to work or be more productive. Part of the personal computer market was sustained by families needing a device to be connected, browse the web, do homework, etc... those use cases are now being replaced by the phone or a tablet.

Nobody is saying that smartphones are replacing computers completely, just that for a big part of the market the computer was a complex tool that was underused. That market is now finding smartphones/tablets are simpler to use and more apt for the tasks they want. We will always have specialized devices for certain jobs, but the main computing device is (for the majority of the population) the smartphone.


It is the interface (small, touchscreen) that is new, along with the wireless network connectivity.

Other than that, though, a smartphone is most definitely a personal computer. There are many tasks, less bound to keyboard and mouse, that people regularly did on desktop PCs 10-15 years ago and that we now do on our phones.


Is an ATM a personal computer? Is a laser printer a personal computer? Is a flight control system a personal computer?

Not every device that is made out of a computer is a personal computer. In fact, most are not.


The mobile tablet is not just a personal computer but a downright intimate one.


Most people don't have your or my kind of work. And even those who do will be able to use their phone on any screen at some point.

So I think it's fair to say that the smartphone was a continuation of the laptop, not the feature phone.


My wife is an assistant. She does a lot of her work on email. But not all of it. She can't do her job on her smartphone.

Almost nobody can do their entire job through their smartphone, and that's not going to change. They are not replacing the personal computer, they're just an additional device that has a different role.


Of course it can change. There are three things inhibiting the use of the smartphone for productivity: most of the software available at the moment has not been designed for that role, it will take years for the software to catch up in terms of functionality, and it will take years for the market to adopt software that is up to the task.

Notice how there is no mention of the hardware above. Processors, memory, and I/O capabilities are already up to the task for many environments. The only unsolved problem in terms of interfacing to suitable I/O devices (i.e. screen and keyboard) is having a universal standard.

Of course, the real question is whether it is desirable to invest in bringing the software up to par. Perhaps it is worth it to have a universal platform. Perhaps it is not worth it because the market is not interested in having that universal platform.

Note: I am not arguing that the smartphone can replace all conventional computers. Some markets are simply too specialized to make investing in the transition viable. Other markets have no need for portability. Yet other markets will find that the trade-off in capacity or performance is too large. On the other hand, there may be a place for people who want a single device in their life that can adapt to many different situations.


Sorry, but you are simply wrong. You keep citing people who use personal computers as their work tool, but that's missing the point. Engineers, designers, and people who need to work in Excel all day are still a minority compared to those who own a smartphone.

The point is that many people don't own a computer but use their smartphone to do everything with.

You see that once you have to build solutions for the masses.


I used my wife, who is an assistant, not a developer or engineer. Exactly what jobs that currently utilize a personal computer do you see being able to move entirely to a smartphone? If you're so insistent they exist, please supply some examples.

> You see that once you have to build solutions for the masses.

I have been a professional software developer for 20 years, my friend. I have worked on products at every level. Please stop being so patronizing.


And I have been working with this for 25 years, so what? That's not relevant, as this is a fairly recent development.

You keep missing the larger point and you are mixing two discussions.

You wrote, and I quote:

"Good column but this ongoing idea that the smartphone is the new personal computer is wrong. It's a different type of device."

What you are forgetting in all this is that people who never used computers before for their work are now using smartphones and tablets instead, what they are replacing is mostly pen and paper. Construction sites, drivers, inspectors etc. They skipped the personal computer and moved directly to smartphone.

No one is claiming that people don't use computers, what is being claimed and what is right is that people are increasingly using smartphones for things they used to use computers for and that some people are going directly to smartphones instead.

Which means that in the overall economy, which is what Mossberg's article was about, the phone is the primary computation device for most people, and that even in areas where the computer is a primary device, the mobile takes over more and more of that work.

That's the point, and it isn't negated by the fact that your wife uses a computer for some of her work. As for whether the smartphone is a different device, you are arguing against yourself, since you just said your wife uses it for things she also does on her computer.

Not sure where you get the patronizing tone from. The numbers speak for themselves both when it comes to webstats, sales stats, usage etc.


> Not sure where you get the patronizing tone from.

This would be a start:

> Most people don't have your or my kind of work. And even those who do will be able to use their phone on any screen at some point.

Nice assumption there.

> The numbers speak for themselves both when it comes to webstats, sales stats, usage etc.

How so... You make it sound so obvious. Webstats, sales stats, usage... you'll need to elaborate. The sales stats of smartphones vs. desktop/laptop really don't by themselves prove declining use or a shift in use. In my industry (which is not tech, BTW, so stuff that notion), almost everything across levels is mostly desktop, and the average age of devices is nearing 10 years - this refutes the relevance of sales stats. Primary software is all "native desktop", and real desktop, not some fucking Electron app, not a web app. There are players in this industry that are trying to shift to tablet/smartphone with cloud apps, but they have made little actual conversion - i.e. it is a rounding error. Webstats are not an issue.

No the numbers don't speak for themselves. Continue implying I'm an idiot that knows nothing outside tech (ha!), or explain.

> some people are going directly to smartphones instead.

"some people" are always doing something.

> even in areas where the computer is a primary device, the mobile takes over more and more of that work.

The other argument is that mobile devices have augmented many things without significantly replacing existing systems. And really, while that is all fine, if critical things are still dependent on the "old tech" then it is a counter to the significance of a mobile invasion. From personal experience, in a few industries, in 2017, you could make all the smartphones disappear and people would be sad and there would be some significant efficiency losses, but things would move on. On the other hand, make the old PC infrastructure disappear and everything goes to hell. That, I felt, was in part the gist of the OP's argument, and I agree.

I think you're a prick. You certainly come off that way in text.

> You see that once you have to build solutions for the masses.

Oh, and based on your generalizing assertions, I'm going to opine that your definition of "masses" must be more limited than you think. Unless you mean masses as "media consumers", in which case I'll just say it is not a small industry, but it is a minor part of the economy.


So instead of arguments you resort to name calling. Oh well. Have a great day.

Must be nice to have two handles so you can downvote arguments you don't like.


There are two ways to look at this.

1) It's really sad that we aren't able to make better progress than we have.

2) It's great that we are still far from robots taking over as that leaves a huge option space for startups.

Personally I think many things literally are around the corner and that the corner is getting closer, faster and faster.


Another one gone. John Markoff retired this year, too.


Just to clarify, John Markoff was a distinguished New York Times correspondent, based in San Francisco, who covered the Bay Area tech industry.


Great column. And end of an era. Good luck to him.


Great column. What a body of work! I hope I'll have something that has touched so many lives when I finally hang up my gloves.



