The end of Wintel (economist.com)
53 points by ab9 on Aug 3, 2010 | 28 comments



The author seems to generalize the "market" here. The effects described on these markets are, and will be, much more subtle than the article claims. IT has always been multi-faceted. It's not as if MS, Intel, Oracle, IBM or Cisco could have stopped doing marketing and still be where they are.


There's a good joke about that; it goes like this:

A man is seated next to Bill Gates in the business class of an airplane. "Mr. Gates," he asks, "how come, with your name and brand recognition, your tremendous turnover, and the locked-in nature of your customers, you still spend money on advertising?"

Gates replies: "Do you like the speed at which this aircraft is travelling? How about we turn the engines off?"


I'm not overly convinced. A big reason most major corporations advertise so heavily is simply that it's something big companies do; it can be tough to buck a trend, especially when billions are on the line.

But aside from that, advertising is a business expense that is fully tax deductible. For many big businesses the net cost of advertising is zero.


If something is tax-deductible, that does not mean the net cost of doing it is zero.

For example:

We have a turnover of $100 and a tax-deductible cost of $40, so gross profit before taxes is $60; at a 50% tax rate, the net is $30. If we had not made the $40 tax-deductible expense, the net would have been $50.

So the net cost of stuff that is tax deductible is not zero.
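
A minimal sketch of that arithmetic (the 50% rate and the dollar figures are just this example's assumptions): a deductible expense is discounted by the tax rate, but its net cost never reaches zero.

    # Illustrative only: the figures and the 50% rate come from the example above.
    TAX_RATE = 0.50

    def net_profit(turnover, deductible_expense, tax_rate=TAX_RATE):
        """After-tax profit when the expense is deducted before tax."""
        return (turnover - deductible_expense) * (1 - tax_rate)

    print(net_profit(100, 40))  # 30.0 -- with the $40 deductible expense
    print(net_profit(100, 0))   # 50.0 -- without it

    # Net cost of the expense: 50 - 30 = 20, i.e. 40 * (1 - TAX_RATE).
    # Deductibility discounts an expense; it never makes it free.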


Let's say Cisco spends $1B on advertising and ends up with an extra $800M in sales and $900M in profit after taxes. Was that a good idea?

PS: The actual numbers may look like this. By selling more units, your price per unit usually drops. But sometimes it's better to sell fewer things at a higher margin. For a company like Cisco, maintaining a specific brand image is worth a lot. They may sell cheap home networking equipment, but they don't use the Cisco brand for it.


Obviously not, but the argument was made that because advertising is tax deductible, it was essentially free. Nobody talked about the effective result of the advertising on the bottom line because of the extra revenue it might bring, or how bad it would be to advertise ineffectively.


Obviously not; er, my point was that $1B in pre-tax money is generally worth significantly less than $900M in after-tax money.

Clearly it's not free, but thanks to the tax break, advertising can have a lower ROI and still be a better investment.
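
To put rough numbers on that (a sketch using this thread's hypothetical figures and the 50% tax rate assumed above):

    # Hypothetical figures from this thread; the 50% rate is an assumption.
    TAX_RATE = 0.50

    ad_spend_pretax = 1000        # $M, deducted before tax
    extra_profit_after_tax = 900  # $M

    # Because the spend is deductible, its true after-tax cost is discounted:
    after_tax_cost = ad_spend_pretax * (1 - TAX_RATE)  # 500 ($M)
    print(extra_profit_after_tax - after_tax_cost)     # 400 ($M) net gain

On those assumptions, the $1B pre-tax spend only costs $500M in after-tax money, so a $900M after-tax return comes out well ahead.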


That's an interesting hypothetical. However, in practice most billion-dollar companies' advertising budgets tend to be much less than their taxes.


I like the old John Wanamaker quote: "I know that half of my advertising budget is wasted, but I’m not sure which half."


It's a bit like buying lottery tickets.


That's true, but the author is looking at what's under the hood -- at the moment, ALL of those companies depend heavily on Microsoft and/or Intel. In most cases, both.

Example -- IBM is still one of Intel's biggest server OEMs. And most of IBM's corporate environment runs on Windows. (For that matter, the same could be said for most of the bloated US Federal government.)

The author is also ignoring the fact that the Wintel duopoly has been making inroads in corporate markets as well -- for years now Intel has pretty much owned the low- and mid-range server markets and is making inroads on the higher-end markets (in fact, the only thing that has so far been able to hurt Itanium meaningfully is... Xeon). And due largely to SQL Server, Microsoft has been making large inroads into those markets too.

Apple's growth in the PC market isn't harming Intel one bit -- quite the opposite in fact, because Apple's success is tied to Intel.

I don't think Intel's dominance is in any danger. Microsoft's is another story -- Intel will be able to gain a lot of mobile market share, because Intel has the best semiconductor manufacturing in the business.

That said, I think that ARM will be a tougher competitor than AMD, because ARM is already much, much bigger, and some of the foundries making ARM processors for their partners are larger than Intel, even if they aren't quite as high-tech.

The declining margins in processors are probably why Intel's been going after the graphics market, the cluster-on-a-chip market (Larrabee), and doing the whole-system thing for the mobile market, rather than just the Atom chip by itself.


AMD designed the 64-bit architecture.


AMD designed a 64-bit architecture. There were many 64-bit architectures around long before x86-64 in 2000.


It is probably worth noting that AMD brought 64-bit to the PC world. Intel was telling customers they didn't need it.

AMD actually took a small bite out of Intel and Intel quickly changed its mind.

Breaking the Wintel monopoly will spur a lot of innovation. A third choice like Chrome or Android on tablets or notebooks might do the trick.


> Intel was telling customers they didn't need it.

They were telling customers if they did need it they should buy Itanium.


Intel was telling everyone they needed 64-bit (more accurately, post-32-bit) technology long before AMD created x86-64. It's called IA64, EPIC, or Itanium. Intel made a huge gamble, betting that the next generation of CPUs would be founded on a radically different VLIW-type architecture.

But Intel goofed, because they didn't factor in how long it would take to fully develop the new architecture (something that takes multiple iterations). The first generation of Itanium was hugely expensive, didn't run x86 code well, and couldn't easily masquerade as a plain-Jane x86 machine. (Not to mention the problem of having to modify code so that it could be re-compiled for IA64.)

Meanwhile, AMD looked at the same problem and came to a different conclusion. Rather than a revolutionary new 64-bit architecture, they thought an evolutionary architecture was preferable: simply fix up a few of the most glaring problems of x86 (such as a dearth of registers), widen the registers and addressing to 64 bits, and call it good. They probably figured that since everything under the hood is all RISC/ILP-ish hidden behind micro-code translation anyway, the instruction set isn't super important, as long as it's semi-reasonable. They concentrated on maximizing performance for unmodified code and providing good benefits for using actual 64-bit-specific code (such as more registers, well-engineered 64-bit memory addressing, etc.).

Itanium floundered while AMD64 became hugely popular very quickly. AMD64 processors ran x86 code faster than x86-only processors, so there was no reason not to buy them, even if you never ran any 64-bit code. And once you owned the hardware, it was just a matter of an OS update sometime in the future to upgrade to 64-bit, allowing the x64 transition to be far more incremental and far less of a leap of faith than it would otherwise have been.

Intel quickly realized their mistake and tried to implement a version of x86-64 using the NetBurst (Pentium 4) architecture. This led to some really sad machines and a very awkward time for Intel, as they were hit with multiple setbacks at the same time. Eventually they succeeded with a very substantial redesign of their hardware architecture (the Core series), which paid huge dividends in the 2nd iteration as it proved to be far superior even to AMD's offerings.

It's not at all that Intel thought the 32-bit world would last forever; it's just that they screwed up royally in figuring out where to go from there.

Note that finally, many, many years too late, Itanium is actually a decent platform now.


From a [native compiler] developer's point of view, Intel's fatal flaw was that the architecture is nearly impossible to write a compiler/assembler for. That's what Intel told my company (which makes native compilers). They said we wouldn't be able to make a high-performing compiler because the instruction scheduling, etc., were too complex and not documented well enough for us to do it.

A port of our compiler, which normally takes a couple of months, was estimated on IA64 to be 1-2 years, for a high-performing result.

This is where they failed.

Oh, the AMD64 (what it was called when we started the port) took 2 months, and that was fully optimized.


Yeah, I only mentioned that in passing, but it is a serious issue. All the more pressing because a lot of the promise of IA64 hinged on smart compilers. It was a dramatic change of course for CPU architectures, since the previous trend had been to make the hardware smart enough to figure out how to take advantage of ILP on the fly, even with unmodified code (thus the superscalar architecture of the P5, the out-of-order and branch-prediction strategies of the P6, etc.).


"The first generation of Itanium was hugely expensive, didn't run x86 code well, and couldn't easily masquerade as a plain Jane x86 machine."

THAT was its biggest blunder -- the x86 part. It was pointless, and a waste of silicon, and it also throttled the chip back with a lot of extra overhead.

On top of everything else, it added NO value whatsoever, but it did make introducing the chip a lot harder, and designing it cost a lot more than it should have.

Had they stuck to their original plan of pushing the architecture as far as they could and using a binary translator to provide x86 emulation, they would probably have had a barnstormer that made even IBM quake in its boots. Instead, they hobbled it with the biggest brick they could find.

"Meanwhile, AMD looked at the same problem and came to a different conclusion."

In addition to the list you provided, AMD started abandoning backward compatibility in Long mode.

"Eventually they succeeded with a very substantial redesign of their hardware architecture (the core series) which paid huge dividends in the 2nd iteration as it proved to be far superior even to AMD's offerings."

It took Intel less than 2 years to launch Core after AMD hit the market over the head with... er, Sledgehammer. (Pun not intended, but that WAS AMD's code name.) Core was a 5-year project -- obviously they were already working on it, and turning the cranks almost to the point of melting motherboards was clearly just to extend the life of the NetBurst architecture beyond where Intel thought they'd need to.

I suspect that their original hope was that they'd be able to scale up the performance of Itanium far enough to exceed x86 performance even in emulation, and that since Itanium was a simpler architecture (without hardware x86 compatibility, it's a FAR simpler architecture with a LOT more floating-point throughput per clock cycle, even now), it would be cheaper to manufacture.

Intel blundered with the hardware x86 silliness and AMD caught Intel with its pants down.

I would, however, argue that AMD's ccNUMA architecture was at least as big a deal as the fact that they introduced 64-bit computing to the mainstream -- which was huge.


Fortunately Intel is not quite so dumb as to not have multiple irons in the fire at the same time. But they came closer to killing the company with Itanium & NetBurst than they ever had before.

Itanium is a special case of the classic problem of trying to rebuild from scratch. If you do it right it can be a huge win, if you do it wrong you can kill the company. Generally speaking most times that such efforts fail it's due to ignoring the need to take the new architecture through enough iterations to gain maturity and parity with the old architecture, and ignoring the extensive costs of having 2 parallel architectures in play at the same time and the costs associated with transitioning from one to the other. The efforts that fail almost always assume that the new system will be 100% capable on day 1 and the switchover will be effortless and complete instantaneously (rather than costly, time consuming, and incremental).


Except that Intel wasn't in any danger of dying; otherwise I agree. Even with Itanium's steep costs and delays, NetBurst came pretty close to keeping pace with the Athlon and Athlon 64 -- close enough for Intel to compete on price. Admittedly, that was largely because Intel had the edge in both fabrication technology and capacity, but either way AMD couldn't make enough processors to satisfy global demand.

By way of example, recall that in the first month after Intel started shipping its first 64-bit x86 processor (a NetBurst Xeon whose code name I have forgotten), that processor outsold the combined annual production of G5, Athlon64, AND Opteron.

AMD had a lead in many ways, but wasn't even in the ballpark as far as fabrication capacity.

"The efforts that fail almost always assume that the new system will be 100% capable on day 1 and the switchover will be effortless and complete instantaneously (rather than costly, time consuming, and incremental)."

Agreed -- and it's all the more reason that Intel should have relied exclusively on the binary translation technology that they inherited along with the Alpha development team, technology, and intellectual property.


> AMD brought 64-bit to the PC world

No. For most people, Microsoft brought 64-bit to the PC world. Had Microsoft decided not to support AMD64 in Windows and opted for some other 64-bit x86 architecture, AMD64 would be a curiosity.

It's a bit sad, because I remember reading AMD's announcement on a 64-bit RISC workstation that was already a bit old compared to its newer cousins.


Rather than mod you down: 64-bit servers were where it was at. All those AMD64s in the data centre with >4GB of RAM.

XP64 was crippled by bad drivers.
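
For a back-of-the-envelope sense of why those >4GB machines forced the issue (a sketch, not specific to any particular server):

    import sys

    # A 32-bit pointer can address at most 2**32 bytes, so a single process
    # on plain 32-bit x86 tops out at 4 GiB of address space -- less than
    # the RAM in the servers mentioned above.
    print(2**32 // 2**30)       # 4 (GiB addressable with 32-bit pointers)
    print(2**64 // 2**30)       # 17179869184 (GiB with 64-bit pointers)

    # On the machine running this, sys.maxsize hints at the pointer width:
    # it is 2**63 - 1 on 64-bit builds of CPython.
    print(sys.maxsize > 2**32)  # True on a 64-bit build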


I wonder if Alpha would have succeeded if, in addition to NT, Microsoft had ported Office, the whole BackOffice family, and Visual Studio. It's not that hard to blame them for the fact that we still use hypertrophied 8080s.


Does Intel make chips for cell phones?


There are rumors circulating that Intel is currently in talks to acquire Infineon's wireless business. http://www.eweek.com/c/a/Mobile-and-Wireless/Intel-Infineon-...


Infineon does not produce application processors for the mobile market -- except perhaps based on ARM technology. They do make baseband and RF chips that are used in cell phones.

Intel definitely wants to get into the mobile/cellular business and has made a similar attempt before (search for DSP/C -- now owned by Marvell).

In fact, Intel was even making ARM-based application processors in the past (called XScale and derived from DEC's StrongARM design). Eventually they sold the entire division.

They would love to have a version of Atom powering mobile phones. Right now ARM is the de facto standard.


Very informative. Thank you.



