The rise and fall of AMD: How an underdog stuck it to Intel (arstechnica.com)
160 points by sk2code on April 22, 2013 | 72 comments



It's not so much that AMD "stuck" it to Intel as that Intel shot itself in the foot. The only reason AMD was successful was that Intel made a gigantic mistake with NetBurst ("10 GHz or bust!").

When Intel finally rectified that mistake and released Conroe, it was game over for AMD. AMD simply can't compete with Intel in either architecture or fabrication.

What AMD could've done was concede the PC market to Intel and focus on the emerging mobile device market - the classic disruptive attack. ARM went that route and now has a market cap that's 10X bigger than AMD's. Qualcomm adopted a similar strategy with Snapdragon and now has a market cap equal to Intel's.

Lesson: never fight a dominant incumbent in its own game. You will get killed. Play a different game. Better yet, invent a new game where you know the rules best.


That's only part of it.

Intel had an epic once in a generation stumble with a combination of very bad, very high impact choices all at once.

The NetBurst (Pentium 4) architecture was hugely expensive to develop and manufacture and also extremely underwhelming. But that's only part of it. They also staked the next-generation advancement of the instruction set on a revolutionary and untested architecture (a VLIW variant) with certain compromises, all of which turned out to be a huge and costly mistake. It took almost a decade for that architecture to fully mature, and even today it's still not very suitable for everyday computing. They hitched their system wagon to RAMBUS memory, which priced their systems out of the consumer-grade sweet spot, and their long-term contract with RAMBUS screwed them when DDR came around as an equally fast or faster alternative at a much lower cost. And they struggled to make decent chipsets for several years. One of the last pre-RAMBUS/pre-NetBurst chipsets was the legendary 440BX, which was hugely reliable, flexible, and high performance. After that, Intel chipsets had a long period of utter mediocrity.

In that window AMD made several strong moves. They produced high-performance, low-cost chips using a very solid architecture. They developed an excellent next-generation 64-bit instruction set (x86-64) which shared a great deal of heritage with the IA-32 (x86) architecture and consisted mostly of very sensible changes and extensions. And they were able to create hardware which both ran existing 32-bit code extremely well and ran new 64-bit code at competitive levels of performance. They also managed to put together a "whole system" in terms of CPU, chipset, and RAM which sat at an excellent intersection of affordability and performance.

Unfortunately for AMD, Intel very much learned from their mistakes: they adopted DDR RAM, vastly improved their chipsets, cut their losses on missteps like IA-64 and the early EM64T systems, adopted the AMD64/x86-64 architecture, and developed some excellent CPU technology in the form of the Core, Core 2, and Core i{3,5,7} architectures. Meanwhile, AMD ran into a few stumbling blocks and has had a tough time getting over them, let alone getting back to competing head to head with Intel again.


IIRC, there were several points during that period where AMD tried to become the default CPU supplier for the large computer builders (Gateway, Dell, etc.) on the strength of far better price/performance ratios, yet frustratingly those builders stuck with Intel. I'm guessing Intel offered some sweet, sweet long-term deals to keep AMD out of the picture.

The landscape could have looked very different today had those deals gone through.

I think AMD actually ended up with the superior technology all the way back with the Athlon line (when Intel was still on the PIII), with a better memory architecture and a few other good odds and ends.

They tried to get back in the game with some bold moves, like buying ATI, but it just hasn't panned out for them.


Three things have always been in Intel's favor: their truly massive and always cutting-edge fab capability (which no other CPU manufacturer has been able to even come close to), their extremely well-funded R&D teams, and their industry relationships (largely a consequence of the other two factors).

Intel is able to field several different processor lines and architectures simultaneously, and also develop multiple new architectures at the same time. That's not something that most other companies can do. They've also been reasonably good at recovering from mistakes quickly (something that Microsoft has also been fairly good at).

That said, it'll be interesting to see how things pan out with the changes to the computing landscape that are in the works now.


> things that have always been in Intel's favor: their truly massive and always cutting-edge fab capability (which no other CPU manufacturer has been able to even come close to)

On a flight a while back I struck up a conversation with a PhD who specialized in chip design. He suggested that if AMD were able to manufacture their chip designs in Intel's fabs, they would beat Intel's chips in both performance and power.

The idea was that because AMD uses fabs that are, basically, two generations behind Intel's, their designs have to be more innovative to work within the constraints of the older process.

I'm no electrical engineer, so I took him at his word, but it certainly points to Intel's fabs being a distinct advantage.


I wouldn't go that far. Intel definitely has an advantage in process technology, but it's not as though they rely on it exclusively to stay ahead. For example, currently the AMD and Intel CPUs in the server market are both at the same level of process technology, 32nm, and Intel's CPUs are still overall superior.


Intel's Ivy Bridge processor has been out for almost a year now, and is on the 22nm fab process.


Ivy Bridge still hasn't been released in the server space. The server market currently has Sandy Bridge chips. The Ivy Bridge chips are due at the end of the year.


It's 22nm FinFET. The FinFET alone is equivalent to a 0.5-1 generation advantage in power or speed.


Wasn't AMD moving to have (x86) chips made at TSMC?

Maybe if they partner with IBM. Even if Global Foundries is two generations behind Intel, that's still one of the most modern fabs, and keeping a fab constantly updated is very expensive.


I've always felt that Intel's fab division should have been forced to split off from its design division. It would be better for the industry if architectures had to compete on their merits and all designs had access to the same quality of fabs.


Very sweet long-term deals, yeah. Dell managed to meet Wall St's earnings expectations almost entirely through payments from Intel not to ship AMD systems; they even got in trouble because they fiddled the books to hide where the money was really coming from: http://www.pcpro.co.uk/news/359770/intel-sweeteners-made-up-...


> I'm guessing Intel offered some sweet sweet long term deals to keep AMD out of the picture.

I don't know whether Intel underbid AMD's pricing or not.

What we know Intel did do, though, is that they ran a marketing blitz of Intel Inside ads so that when consumers went to buy a computer and saw "AMD" instead of "Intel" they reacted negatively. Based purely on the marketing, customers would view the computer sporting the "Intel Inside" logo as the premium product.


Methinks they did a wee bit more than that.

http://www.nbcnews.com/id/33882559/ns/business-us_business/t...


AMD did some interesting things. Another point I'd add is taking RISC vs. CISC off the table with the x86-outside, RISC-inside K5. This opened up the field for a lot of innovation in the core.

What Intel did best was learn well from both AMD's successes and their own stumbles. I'm sure NetBurst and Itanium yielded valuable lessons about branch prediction.

I'm not sure if it was luck, but Intel also benefited from the shift to mobile. IIRC, the original Intel Core line was designed for mobile but ended up dominating the Intel roadmap. At the same time, AMD was trying to make inroads into the server market, mostly because it was lucrative.

Perhaps that stifled their innovation in some key ways. Anecdotally, all the "big iron" architectures are dying off. Certainly if you wound back to 2000 and said ARM would be the upstart in the server industry, you'd have been laughed at.


Interestingly both Intel and AMD pulled the same trick at about the same time with regard to CISC vs. RISC. The Pentium Pro was also a RISC core that used micro-op translation to present an x86 exterior, and it was introduced about half a year prior to the K5, though it didn't hit the consumer market for a while.
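Purely as an illustration of that "crack CISC into RISC micro-ops" idea, here's a toy sketch in Python (everything in it is made up for illustration; a real decoder looks nothing like this): a single x86-style read-modify-write instruction gets split into simple load/op/store micro-ops that a RISC-style core can execute and schedule independently.

    # Toy sketch of micro-op "cracking" (illustrative only): one CISC-style
    # read-modify-write instruction becomes three simple RISC-style micro-ops.
    def crack(insn):
        op, dst, src = insn              # e.g. ("add", "[0x1000]", "eax")
        if dst.startswith("["):          # memory destination: read-modify-write
            return [("load", "tmp0", dst),
                    (op, "tmp0", src),
                    ("store", dst, "tmp0")]
        return [insn]                    # register-only ops pass through as-is

    print(crack(("add", "[0x1000]", "eax")))
    # -> [('load', 'tmp0', '[0x1000]'), ('add', 'tmp0', 'eax'), ('store', '[0x1000]', 'tmp0')]

The payoff is that everything downstream of the decoder only ever sees the small, uniform micro-ops, so the core itself is free to innovate.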

Also, as you point out, Intel has been good at cross-pollinating between different development lines and teams. The Core architecture was a ground-up redesign, but it was heavily influenced by the Pentium M work done by the Israeli division. And before that, the Pentium III design drew a lot from the work done on the low-budget Celeron line.


NetBurst wasn't Intel's only big mistake. The other one was being unwilling to disrupt their own 64-bit strategy, which rested on Itanium, with a "low-end" 64-bit design.

I am grateful that AMD forced Intel to rectify both these errors. Too bad AMD wasn't able to ride that wave to long-term success.


> The only reason AMD was successful was that Intel made a gigantic mistake

I don't think that's entirely fair. AMD has put out some darn good chips that should be considered successful on their own merits.


Excuse my pedantry, but QCOM's market cap has more to do with their intellectual property around the CDMA, 3G, and LTE standards than anything else.


>>Lesson: never fight a dominant incumbent in its own game. You will get killed. Play a different game. Better yet, invent a new game where you know the rules best.

What about Google[1], Apple[2], Facebook[3], etc.? Not disagreeing with you, but I'm genuinely curious what you think about these companies. I could probably think of many more examples, but these are the ones that sprang to mind when thinking of companies that "fought the dominant incumbent in its own game".


1) Google - when Google came into existence in the late '90s there were no dominant search engines. The orthodoxy at the time was that there was no money in search. Accordingly, nobody wanted to be a search engine and everyone tried to be a portal. The dominant player back then - Yahoo - emphasized its use of human editors and partnered with AltaVista and Inktomi for search.

So Google never really defeated a dominant search engine - it merely took advantage of a giant vacuum in the market.

2) Apple - Apple never defeated Microsoft in the PC game. Instead it focused on emerging markets: iPod, iPhone, iPad. Eventually these new products (smartphones and tablets) became powerful enough to erode the PC market. Even now Microsoft is still dominant in the PC market; it's just that the PC is no longer where the action is.

3) Facebook - there were no dominant players in social networking when Facebook was founded. Myspace was the biggest player, but "big" was a relative term back then. Even at the height of its popularity, the vast majority of the population were not Myspace users. Sure, if you were a 16-year-old in 2004 you were probably on Myspace, but outside of that demo Myspace was far from dominant. Most people were not on any social network. In short, it was a wide-open market.

I'm sure there are examples where a challenger took on a dominant incumbent head on and won. But those are the edge cases. You pretty much need all the stars to align perfectly to have a chance. It's not a strategy I'd bet on.


1) I remember AltaVista being clearly dominant in the search engine space. Yahoo was not the dominant player in search, and never has been; they were a curated directory of sites.


1) Google > Apple in mobile

2) Apple > Nokia in mobile

3) Facebook > Google in ads


1) Apple was never a dominant incumbent in mobile. It focused narrowly on the high-end market. For the vast majority of the population, especially in emerging markets like Asia and Africa where Android is now strongest, the iPhone was simply unaffordable. Google didn't fight Apple head-on; instead it took advantage of Apple's weakness in the mass market and open-sourced Android, which resulted in a flood of low-cost smartphones that quickly captured a large majority of the market - the part of the market where Apple never had much of a presence to begin with.

In short, Google played a different game.

2) Nokia rose to power on simple cellphones. It was never a dominant player in smartphones. In fact, smartphones were a wide-open market with no dominant players when Apple entered. So Apple never needed to fight an incumbent.

3) Google still sells 10X more ads than Facebook, so it's rather premature to say Facebook has beaten Google in ads. And even if that does happen eventually, it'd be because Facebook avoided fighting Google in search ads (unlike Microsoft with Bing) and instead invented the new market of social ads and newsfeed ads. Once again, it's the strategy of avoiding fighting the dominant incumbent in its own game and inventing a new game instead.


What exactly do you mean by google > apple in mobile?


After Apple hit the market, everyone was buying an iPhone. Now most people have Android phones.


Not really. While the iPhone has been influential and a huge success for Apple, it didn't really kill any of the existing smartphone incumbents. "Everyone" was only buying an iPhone for fairly limited values of "everyone" (e.g. excluding several continents). Instead, Android killed off those incumbents, taking advantage of the opening the iPhone had created.

So it's really Google > Nokia (Symbian), Microsoft (Windows Mobile), Palm and RIM, with Apple off to the side somewhere. And the complex, multi-faceted Apple vs Google battle in mobile isn't an upstart vs an incumbent, it is a battle of two upstarts who have become the incumbents.


I'm not sure whether it was intended or not; regardless, one can take "game" to mean much more than merely "market".

If you play by the same rules, same norms, same fundamentals, etc. as hugely entrenched businesses, then you'll often fail. But that's not what Google, Apple, or Facebook did; they changed the game when they came on the field.

Google is an excellent example because the market was saturated with search engines at the time. But back then search worked very poorly, and the dominant strategy of search engines was to become "portal" sites. Google made several game-changing moves. They created PageRank, which led to better search results. And they used a combination of cutting-edge software techniques (map-reduce, sharded databases) along with cutting-edge datacenter techniques (consumer hardware, massive parallelism, heavy automation). This combination made it possible for Google to create a product which was both superior (more accurate, faster) and cheaper to operate, which in turn made it possible for them to monetize differently (AdWords, etc.) in ways that wouldn't have been possible or profitable for others. The rest is history. Similarly, Apple changed the game when it entered the laptop, music player, and phone markets.
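For a rough sense of why PageRank was such a step up, here's a minimal power-iteration sketch of the core idea (a toy, not Google's actual implementation; the graph and parameters are made up): a page's rank is roughly the chance a random surfer lands on it, so pages linked to by other well-linked pages float to the top, which is much harder to game than plain keyword matching.

    # Toy power-iteration sketch of the PageRank idea: each page repeatedly
    # shares its rank across its out-links, damped by a random-jump factor.
    def pagerank(links, damping=0.85, iters=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iters):
            new = {p: (1.0 - damping) / len(pages) for p in pages}
            for p, outs in links.items():
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
            rank = new
        return rank

    # "c" is linked to by both "a" and "b", so it ends up ranked highest.
    print(pagerank({"a": ["c"], "b": ["c"], "c": ["a"]}))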


Apple played a different game: focus on home usage. They created hardware that looked great, so you wouldn't mind having it in the living room, and they supplied a productivity suite that matters for home use: iLife.


> AMD began life as a second-source supplier for companies using Intel processors.

Ars should know better. AMD started in 1969, making their own logic parts. In the late minicomputer era, their Am2900 series of bit-slice[1] components was king, being used to build CPUs for models of DEC PDP-11 and PDP-10, DG Nova, Xerox Dandelion ("Star"), Wirth's Lilith, Atari vector arcade machines, and countless other machines.

[1] https://en.wikipedia.org/wiki/Bit_slicing

(Not especially relevant disclaimer: I work for Intel.)


Ars hasn't known better in a good long time.


Agreed. Their quality has gone down a lot over the past couple of years. Every article they write is full of mistakes and inaccuracies. Honestly, the only reason I still visit the site is to see Aurich Lawson's artwork.


It's funny that you mention mistakes and inaccuracies, because I'm actually waiting for the author of the OP article (Cyrus) to get around to replying to an email I sent him 4 days ago pointing out that the claims about Silk Road in http://arstechnica.com/tech-policy/2012/06/fbi-halted-one-ch... are completely false and this can be proven by merely reading the PDF used as the source for the article.

Thinking about complaining to him on Twitter; maybe if I start being more public about it he'll bother to do something about a year-long mistake...


I pinged him on Twitter and he fixed it within the hour. I guess I've learned a lesson here.


Yeah. I find it kind of sad just how bad people are with email. I hate the idea of having to rely on Facebook or Twitter to get people to respond.


What a wave of nostalgia. I took a job at AMD in 2005, right at the zenith of their success. I was totally enamored of the great technology that went into K7 and K8, and I was ready to help this underdog company stick it to Intel and turn the microprocessor world on its ear.

I worked there for six years, through fumble after disaster after boondoggle. I could go on and on about all the reasons I believe AMD went down the tubes -- and I'm really looking forward to reading the second installment -- but I think a lot of it reduces to the dysfunctional corporate culture alluded to in this piece.

During my time at AMD -- and the old-timers confirmed that it was ever thus -- there was always the sense that every project was make-or-break for the company, that we were always on the brink of disaster. Long-term strategic planning is simply not in the company's DNA. We lurched around like a headless chicken, and when -- through a combination of good products and missteps by Intel -- AMD finally got a taste of success, we squandered it in the most ham-handed and disastrous (and predictable) way.

P.S. Bulldozer project was a total horror show, beginning to end.


> P.S. Bulldozer project was a total horror show, beginning to end.

I'm intrigued. In your opinion what were the pain points - Management? Schedule? Technical? Foundry? Methodology (EDA synthesis vs old school hand-layout etc)? Something else?

All of the above?


The whole project was over-scoped and over-ambitious. Flush with the success of K8, AMD decided to undertake a completely new from-scratch processor design. New architecture, new cell libraries, new design methodologies, new tools -- we chucked everything out and started over completely.

It's like AMD had their first taste of champagne with the success of K7/K8, and we immediately got drunk and fell on our face.

I would also like to single out for opprobrium Bruce Gieseke, the technical director of the project, who shoved an utterly impractical and labor-intensive design methodology down our throats, and would not relent even when it became clear how much it handicapped the project. We should have been trying to synthesize much more of the chip from day one.

Overall, the project was plagued by delays, bugs, and dead ends. It was way over budget and well past schedule; in the end, it came to market at least two years too late to have an impact. By the time Bulldozer-based products reached market, the technical innovations of the new architecture had already been bested (or at least matched) by Ivy Bridge. And of course, Intel has Haswell on deck; AMD, having poured all its resources into Bulldozer, has nothing left in the tank.

But Bulldozer was also disastrous for all the resources it leeched away from other projects, and for the way it focused the company's energy on a product whose market was at best flat, if not already shrinking. There were some really promising projects that got cancelled so that AMD could throw more engineering resources at Bulldozer.

management: fail

schedule: fail

technical: would have been awesome in 2009

foundry: Working with Global Foundries was not entirely smooth, but it would be inaccurate to lay too much blame here.

methodology: fail++


Thanks for the write-up. In particular, I'd seen comments[1] that seemed to imply big disputes between the old-school (highly tuned custom chip layout) and new-school (progress via faster iteration with synthesis/automation) design styles.

[1] e.g. http://www.xbitlabs.com/news/cpu/display/20111013232215_Ex_A... (ignoring the misleading interpretation in the article itself).


I worked with the engineer cited in that article. He left the company, on less than amicable terms, pretty early in the bulldozer project. I'll just say he lacks credibility, and leave it at that.


From my perspective, AMD really screwed up the AMD/ATI merger in the worst way possible. When AMD bought ATI, they were well positioned to beat everyone to market with an APU. But the problems started almost immediately: neither AMD nor ATI did anything to combine resources. They should have put in the time and money to merge teams at all levels of the company. Instead they built a few small groups that included senior personnel from both units, then left most of the lower-level teams working on whatever they had been working on previously. Never mind that there were a lot of really clever people all over the place ready to contribute great ideas.

In other words, there was zero global direction down the ranks. It was just business as usual: keep doing what you've always been doing, and maybe we'll show you some nice slides a few times a year about how great APUs will be. This lack of organization meant that no one had any idea what anyone else was doing. Worse -- even if you wanted to find out, there was absolutely no company-wide documentation or organization for anything. Your only hope of getting information was that one of your co-workers had bookmarked some magical page with the info you required. This just got worse when you accounted for the problem of elitism. The hardware teams were just so much better than the software teams - after all, software is easy, so what sort of useful input could those code monkeys offer? And far be it from the software teams to actually talk to someone from QA; those QA people were beneath notice. Finally, add in a very wide distribution of personnel seniority, insane levels of paranoia about job security, grade-school-level office politics, and completely disparate management styles, then hit blend.

So really, the results are not at all surprising. You can't have two companies pretend to be one while playing tug-of-war, and still be competitive.


> They should have put in the time and money to merge teams at all levels of the company

How would forcing the teams to merge quickly rather than gradually achieve anything other than messing up everyone's development schedules?


I'm not suggesting they should have broken up all the teams and made new ones. There are much smarter ways to merge teams that involve gradually easing them together. However, having five teams in the company doing nearly identical things is not "gradually achieving" anything.

If (big if) those teams had development plans they had to follow, then those plans should have been adjusted so that eventually all the teams were working toward a common purpose. If you just leave the teams alone and hope for the best, not only will they avoid chances to work together, they'll often go out of their way to make sure those chances never arise.


Yes because teams are like liquids, just pour them in a cup and stir that stuff up. sigh


that's not what was said at all. sigh


Not to mention that they did it just as Intel was releasing Conroe.


How I remember those days! After the complete disaster of the first PIIIs came the Athlon. It was just so amazing, so unbelievable, that you could get that much power without breaking the bank.

And it was the same story until about 2006: you could get an Athlon or an FX and have more than enough CPU power to run almost anything on the market at the time. Even the Sempron, the cheap option, was good enough; I remember guys in forums getting the mobile versions and OCing 'em to the very limit.

But then came the Core 2 Duo, and AMD literally had nothing against it. The first Phenom sucked big-time; there is no other way to put it. The Phenom II was much better but not good enough, and the only talk about it was how you could turn on the disabled cores in the X2 and X3 variants, since they were the same silicon as the quads.

I really wanted the FX to be as good as the Athlon-era FX, but it wasn't, not even close.


Hang on, what about consoles? AMD are doing the CPU and GPU for the PS4, and presumably the next Xbox too if the rumours are true.

Surely this is quite a windfall? Particularly if they're pumping out the same part for ~5 years, much longer than the average CPU stays on the market. Is this not a cash cow?

I'm disappointed AMD are no longer competitive in the desktop CPU performance market, but I'm not sure I understand why they can't compete with Intel on CPUs given the stable revenue offered by consoles and their competitive GPU line. Is it pure R&D budget? Is Intel too far ahead tech-wise? Does Intel have the best engineers? What is it?


Console parts are commodity, not cash cows. I'm sure AMD will make a profit, but not much. NVIDIA and Intel never made much money here either.


Apparently AMD will have the upper hand in the next-gen video game consoles - PS4: http://en.wikipedia.org/wiki/PlayStation_4#Hardware and Xbox 720: http://www.digitalspy.co.uk/gaming/news/a471564/xbox-720-to-... - or at least that's the rumor.


The margins are likely extremely low, and the performance doesn't seem to be anything amazing; it's likely current-generation or slightly behind. Don't forget that consoles will still have to sell for <$500 and make some profit.


Indirectly, it does make AMD CPUs and GPUs more attractive to PC gamers:

http://www.eurogamer.net/articles/digitalfoundry-future-proo...


Which is another shrinking market. AMD needs a mobile and/or GPU/CPU integration strategy. I think ARM is about to eat everyone's lunch for consumer products. Not sure what may happen with Intel, as there's still some life left for servers.


> Not sure what may happen with Intel, as there's still some life left for servers.

For now. But sooner or later ARM will become powerful enough to run servers. The chips are cheaper, and the operating costs will also be lower since ARM draws less power. Before you know it, data centers will switch over to ARM - first as a trickle, then as a deluge.

ARM will do to x86 what Intel did to SPARC & Alpha & PA-RISC & MIPS. Intel will have to keep retreating to higher and higher end niches.
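To put a rough shape on the power argument, here's a back-of-envelope sketch; every number in it is an assumption I made up purely for illustration, not real data:

    # Back-of-envelope sketch: why per-server power draw compounds at
    # datacenter scale. All figures below are made-up assumptions.
    SERVERS = 10_000
    X86_WATTS, ARM_WATTS = 250.0, 100.0    # assumed per-server draw
    PUE = 1.5                              # assumed cooling/overhead factor
    PRICE_PER_KWH = 0.10                   # assumed electricity price, $/kWh
    HOURS_PER_YEAR = 24 * 365

    def yearly_power_cost(watts):
        kwh = SERVERS * watts * PUE * HOURS_PER_YEAR / 1000.0
        return kwh * PRICE_PER_KWH

    saving = yearly_power_cost(X86_WATTS) - yearly_power_cost(ARM_WATTS)
    print(f"assumed yearly saving: ${saving:,.0f}")   # ~$2.0M under these assumptions

Under those (made-up) numbers, a 10,000-server fleet saves on the order of $2M a year in electricity alone, before the cheaper chips are even counted.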


What do you think is the moat that will keep Intel from doing to ARM what they did to the RISC chips? Intel has a better process and very capable designers. They are quickly moving down-market. Why assume that ARM moves into Intel's territory before Intel moves into ARM's?


AMD has only stayed in the game to begin with because Intel was 'forced' to license x86 to them in order to avoid government anti-trust prosecution.


And we should be glad that they did, or history might have turned out very differently. I doubt that an Intel without any real competition would have innovated as quickly as they have.


It's a very interesting notion, that we'll never get to test unfortunately.

Without the artificially propped-up AMD, perhaps the market would have served up a far better competitor than the half-the-time-mediocre AMD that is now on life support.

ARM is tremendous evidence that the market can produce competition to beat Intel. I think the ARM model is kind of like how Windows and Android both managed to beat Apple in market share, in terms of the business model (ARM being distributed among numerous licensees, advancing the market faster than a solo supplier would).

It's also worth considering that it nearly always takes a new epoch / inflection / radical shift in markets to dislodge industry standards (and that as a consequence AMD may never have stood a real chance). Which is also why ARM now has a chance to rock Intel.

I'm not aware of too many standards that have been killed off in tech without a big shift in the underlying technology segment in question.


Exactly! It's like Intel kept AMD just barely alive the whole time. Now that ARM has taken over much of the CPU market, that may not be necessary anymore, and Intel can just mercilessly end AMD as a CPU company - not with technology but with capital.


Does Intel face a brighter future in a world dominated by ARM?


I found it interesting that Bill Gates had a hand in AMD buying NexGen, the company that gave AMD its K6 technology and brought them Atiq Raza.

I don't really know what was going on between Microsoft and Intel around 1995, but it appears pretty clear that Bill wanted to have at least two x86 chip suppliers available to PC manufacturers.


A book recommendation relevant here: Inside Intel is a great book about Intel's founding and its early days of sparring with AMD. It's old - I bought my copy in the late nineties en route to Silicon Valley from London - but it's a great scene-setter and backgrounder for these sorts of stories. Great read.


So what's it called, and where can we find it?


It's called "Inside Intel", and appears to be available at Amazon. http://www.amazon.com/Inside-Intel-Worlds-Powerful-Company/d...


AMD also had a niche in supercomputers, for example the Cray XT/XE series. Not sure the absolute volume really made a difference there, though; the number of Cray sales would be dwarfed by ordinary PCs and servers.


Cray's next generation is going Intel BTW.


This is a very scary image for AMD:

http://i.imgur.com/pCiDEKP.png

Their revenue is falling, and falling consistently.

They've pretty much lost as a CPU manufacturer in every area - desktop, laptop, tablet, phone.

Their only option at the moment is to keep making the best damn APUs and GPUs they can, and to try to invent something new. I wish they'd come out with a 1000-core CPU or something.


Easier said than done, but I more or less agree with your conclusion (not the 1000-core CPU part, though).

They could gut it out to become profitable, and go long on currently fringe markets that could be the future. One interesting market off the top of my head is 3D printing. It would be pretty interesting to have a CPU one could print out to go with the 3D-printed AR-15s from Defense Distributed… sudo headshot :P


That revenue chart is too short term to learn anything from.


Unified CPU and GPU memory is an awesome concept that game developers in particular love (but not just them). We'll see if unified memory in both next-gen consoles spurs something interesting in PCs. I'm not holding my breath, though, since Intel have no interesting GPU story.


It makes a lot of sense for laptops and tablets and console-like PCs (e.g. SteamBox). There may be a market there for AMD.


But for the former and the latter, people are likely to wish for Intel CPUs instead (even if they don't actually need them).





