I agree, seems like it's just some hype. It would be pretty damn surprising to see AMD sold off given the various contracts in play (as others have commented).
My understanding is that AMD and Intel's cross-licensing agreement would terminate if AMD was acquired. Previously this would have discouraged acquisition, but at this point I wonder if Intel needs AMD64 more than AMD needs IA32, SSE, etc. AMD's acquirer could renegotiate the agreement knowing that it could walk away from x86 and focus on ARM and GPUs.
"It does not really need server processors or high-end graphics cards."
I think that Microsoft needs server processors for its Azure cloud. With enough volume, Microsoft could undercut all other cloud providers on hardware acquisition costs if it owned a manufacturer of CPUs, chipsets, and GPUs for OpenCL applications.
If you build your own servers through an OEM like Quanta, Compal, or Foxconn (which, by the way, make most branded servers anyway) and then put your own CPUs, GPUs, and motherboard components in them, you have lower acquisition costs for your cloud computing compared to the others.
Thus it makes sense for Microsoft on Xbox & Azure and possibly hardware for Windows tablets.
They have had all their recent APUs and GPUs fabbed at TSMC on its 28nm node. Dunno what their most recent GF product line was, last I remember was Richland 3 years ago.
Microsoft is pretty much one of the only companies that could acquire AMD and turn it around. Much like how people complained about Nvidia GameWorks giving it an unfair advantage against AMD, Microsoft has the ability to develop in-house features that only ATI/AMD hardware could take advantage of initially, and give Nvidia/Intel a real run for their money.
It definitely helps that Microsoft could likely get a significant discount on the purchase price as well, since AMD is seen as dying. The arrangement between Intel and MS is very old and involves significant cash flowing both ways. I imagine that as Microsoft faces more internal pressure to show more profits, this line of thinking was bound to come up.
> Microsoft has already paid AMD around $1.26 billion for
> Xbox One chips. The acquisition of AMD could save it
> around a billion per year on Xbox One chips alone
Uh, that doesn't quite add up. Does AMD really make that much per chip? And wouldn't a fair chunk of that money also go into R&D...
> Microsoft pays around $100 for every Xbox One system-on-chip to AMD
I think someone forgot to account for manufacturing costs, hence the error in the calculations. The savings will be much smaller, but I believe the real benefit will be putting Sony in a terrible situation: they'll have to pay Microsoft for the production of the PS4 GPUs, and for the next-generation console they may be left with only Nvidia as an option (the article mentions other options, but currently these two are the main players, plus we never know what will happen in 5-10 years).
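To put rough numbers on that skepticism, here's a quick back-of-envelope check using only the figures quoted in the article ($1.26 billion paid so far, ~$100 per SoC); the per-chip manufacturing cost is a made-up placeholder, just to show how the "save a billion per year" claim assumes the full $100 is pure margin.

```python
# Back-of-envelope check of the "save around a billion per year" claim.
# Figures from the article: ~$1.26B paid to AMD so far, ~$100 per Xbox One SoC.
# The manufacturing cost per chip is a hypothetical placeholder.

total_paid = 1.26e9          # cumulative payments to AMD (article figure)
price_per_soc = 100          # what Microsoft reportedly pays per SoC
assumed_mfg_cost = 40        # assumed foundry + packaging cost per chip

chips_so_far = total_paid / price_per_soc
print(f"Chips implied by $1.26B at $100 each: {chips_so_far/1e6:.1f} million")

# Saving ~$1B/year on chips alone requires shipping ~10M chips/year AND
# pocketing essentially the full $100 per chip. If fabrication and packaging
# still cost something like $40 per chip, the real saving shrinks accordingly.
annual_volume = 10e6
naive_saving = annual_volume * price_per_soc
realistic_saving = annual_volume * (price_per_soc - assumed_mfg_cost)
print(f"Naive saving (price treated as pure margin): ${naive_saving/1e9:.1f}B/yr")
print(f"Saving after an assumed ${assumed_mfg_cost}/chip mfg cost: "
      f"${realistic_saving/1e9:.2f}B/yr")
```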
That's only $125 per Xbox sold, which doesn't sound like a lot to me. That includes the CPU and GPU in the APU format which is pretty much the bread and butter of the entire system. On a PC, a CPU and GPU combo with comparable specs at the time of release would be what? 3x that?
I buy mid-range video cards exclusively and that's a good $150-200 with tax right there. I can't imagine also getting a competitive CPU tossed in for that amount. Hell, my PC can't play much at 1080p and the consoles pretty much play everything at 1080p. Granted, my rig is a couple years old, but it's kinda amazing how powerful this generation of consoles is.
Sounds like MS and Sony are getting a bargain here.
PS4/XBONE mostly run games at 720p or 900p at 30fps, which isn't great by PC standards. A new $150-$200 dedicated card would be able to hit 1080p/60fps with comparable quality settings. In the sub-$600 space consoles are a bargain, but they have a lot of limitations compared to a slightly more expensive PC.
> On a PC, a CPU and GPU combo with comparable specs at the time of release would be what? 3x that?
It's hard to actually figure that out, since the Xbone/PS4 APUs are pretty shoddy 8-core Jaguars with mid-tier GPU-class shader clusters tacked on. The A10-7850K right now retails around $125-$130. It has four Steamroller cores at 3.7GHz, unlocked, versus eight 1.75GHz Jaguar cores in the Xbone. It has 512 shaders at 720MHz, compared to the Xbone's 768 at 853MHz. The more expensive 7870K just ups the stock frequency to 866MHz. Of course the GPU is also unlocked on these parts, so you can just run that 7850K at frequencies identical to or higher than the Xbone's (I have a 5800K running at 4.5GHz/1066MHz CPU/GPU respectively; it just takes a better cooler).
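For rough context, a quick sketch of theoretical FP32 throughput from the shader counts and clocks quoted above, using the usual shaders × clock × 2 FLOPs/cycle (FMA) estimate; it deliberately ignores memory bandwidth, which is where the Xbone's embedded RAM comes in below.

```python
# Rough theoretical FP32 throughput from the numbers quoted above:
# GFLOPS ≈ shaders × clock (GHz) × 2 (one fused multiply-add per cycle).
# Ignores memory bandwidth, which is where Kaveri vs. the Xbone's ESRAM differs.

def gflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2

parts = {
    "A10-7850K (512 @ 720 MHz)": gflops(512, 0.720),
    "A10-7870K (512 @ 866 MHz)": gflops(512, 0.866),
    "Xbox One  (768 @ 853 MHz)": gflops(768, 0.853),
}

for name, g in parts.items():
    print(f"{name}: ~{g:.0f} GFLOPS")
```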
The more significant difference is the 32MB of embedded ram on the xbone SoC, whereas you need expensive 2400mhz+ DDR3 to really get results out of Kaveri, and that still isn't going to come close to the throughput of the xbone.
All things considered, it probably exceeds the Xbone in theoretical CPU performance by quite a bit, although in the real world it matters less, since game engines can hyper-optimize their pipelines across the eight cores to minimize the benefit of four beefy, fast cores.
So to answer your question... it would probably cost $150 retail if AMD made an APU that just dedicated more die to the GPU and got, say, 1024 shader cores on there at 1GHz. Basically an R7 370 on die. The Xbone almost kinda has an R7 360 on board running at 4/5 the frequency, by comparison. Then they just throw four or eight Carrizo cores in there (the mid-range, not-desktop, not-mobile SKUs). The 370 is basically a faster 265 (same chip), and those are selling for $100 on the lower end, probably $90 without shrouds, and probably $80 at break-even sales. And you don't need to fab the PCB or bake in any DRAM, so the per-volume cost of those shader cores is probably half that. You could even throw some embedded DRAM in there to make the GPU faster like the Xbone part, for probably $20 averaged.
And when I say $150, I mean releases for $150 MSRP and sees price cuts putting it at Xbone rates or lower within months.
> I buy mid-range video cards exclusively and that's a good $150-200 with tax right there. I can't imagine also getting a competitive CPU tossed in for that amount. Hell, my PC can't play much at 1080p and the consoles pretty much play everything at 1080p.
Well, that GPU is expensive because it's not just a socketed GPU processor: it's the RAM, the video bus, a PCI bus, capacitors and voltage regulation, the PCB, shroud, aluminum cooler, fan, etc. A CPU on its own is pretty inexpensive to produce compared to the complexity of a discrete GPU part. And $50-$60 CPUs that blow the Jaguar cores in the Xbone out of the water (Pentium G3258, Athlon 860K) are already available.
And a lot of console games this generation are not releasing at 1080p. Some are still upscaling 720p or 900p to 1080p because these consoles have modest performance specs. At release, the 360/PS3 had CPUs and GPUs competitive with some of the best-in-class hardware. When the Xbone and PS4 released, they were already budget tier in both. And remember, console games are using lower-resolution textures, lower view distances, lower-poly-count models, reduced or no AA, minimal particle effects, no fancy shadow effects, etc., because, one, you cannot really see the difference between medium-tier and max-tier graphics at ten feet away, and two, these consoles are not capable of performing anywhere close to what discrete desktop-class graphics cards dating back to 2010 could achieve.
As an exclusively Linux user, I'd find this worrying; I'm not convinced Microsoft owning an entire "vertical" would bode well for us, even the new shiny Microsoft of Tomorrow.
None of them have shown any desire to support open firmware. Intel makes SSDs, Wifi cards, CPUs, and motherboard chipsets that all have binary blob firmware you cannot audit or trust.
AMD really only makes GPUs with blobs, and that is probably more in the ATI legacy than anything. They have been pretty good about coreboot support of their vanilla chipsets for years, even if it is usually delayed several months to a year post release.
Current-gen AMD chipsets require binary components on the CPU ("Binary PI") for RAM init, like Intel. And like Intel, they now have a separate CPU running signed code (the Platform Security Processor, ARM-based).
Yes, and I'm currently all Intel (I was AMD-only for nearly a decade), but the influence of the world's largest OS vendor owning the 2nd-largest x86 supplier would be powerful even for those of us not using AMD stuff.
Not completely happy about this, especially as it introduces some conflict not just in the Windows ecosystem, but also with Sony's Playstation 4.
However, if no one else is interested in buying AMD, then that's probably still better than AMD remaining independent (and dying). Does Microsoft have the necessary chip expertise and more importantly commitment to compete toe-to-toe with Intel, though?
The very first thing I'd like to see Microsoft (or whoever ends up buying AMD) do is manufacture the latest CPUs and GPUs on the most cutting edge non-Intel process technology available (which seems to be owned by Samsung right now). That along with the new high-performance Zen core should at least give AMD a fighting chance against Intel. Some overhauled, easier to understand branding of AMD's chips probably wouldn't hurt.
> Does Microsoft have the necessary chip expertise and more importantly commitment to compete toe-to-toe with Intel, though?
Do they need that deep of chip expertise -- beyond what they'd get by acquiring AMD -- to compete with Intel? If they've got AMD, they can drive their own instruction set evolution and tie Windows-compatibility certification to their instruction set, and force Intel to follow along.
With Windows "runs best on Genuine Microsoft processors", and ARM making inroads everywhere else, what's Intel going to do?
Their market cap is less than what Microsoft paid for Minecraft. Although they have another $2B in debt...
I thought the x86 cross-license agreement prevented this though?
You mean the x86 deal they have with Intel? Intel tried to sue AMD when AMD spun off their chip fabs into GlobalFoundries, but Intel lost. Besides, AMD has a gun to Intel's head, due to Intel's AMD64 (64-bit x86) license from AMD. Destroying AMD's ability to produce x86 chips might make them retaliate.
Edit: ...or renegotiate. Having AMD producing x86 CPUs significantly reduces government antitrust attention towards Intel.
Except that won't work. 386 compatibility is required, since AMD64 is an extension to it, not a replacement. One needs an x86 foundation to build a workable AMD64 CPU upon.
This article is rumormill clickbait (and likely complete bullshit), but it actually would be an incredibly smart thing for them to do, assuming they could keep the x86 licensing deal.
The variable manufacturing cost of making chips is tiny...maybe $3-5 per x86 chip. The rest of the expense of production comes from amortizing the cost of design and plant setup...the majority of which is caused by the need for so many variants for the consumer segment.
So if Microsoft dumped the consumer-focused CPU and GPU segments, they could drastically reduce the design and plant setup costs of production, probably by an order of magnitude or more. All they would need is their existing Xbox chips and a handful of heavyweight server chips. With much lower design and plant setup costs, combined with extremely high-volume production, it isn't hard to believe they could drive the total cost per chip down to $10-$20 (rough math sketched below).
And at those price levels, they could likely undercut the PS4 as well as all the other cloud vendors in their respective segments.
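Here's a sketch of that amortization argument; the $4 variable cost comes from the parent comment's $3-5 estimate, while the design/setup costs and volumes are purely hypothetical placeholders meant only to show the shape of the math, not actual AMD figures.

```python
# Sketch of the cost-amortization argument:
#   unit cost = variable cost + (design cost + plant setup cost) / volume
# $4 variable cost is from the parent comment's $3-5 estimate; everything
# else is a made-up placeholder to illustrate how trimming the number of
# consumer variants while keeping volume high pulls unit cost toward $10-20.

def unit_cost(variable, design_cost, setup_cost, volume):
    return variable + (design_cost + setup_cost) / volume

# Many consumer SKUs: lots of separate designs amortized over modest volumes.
broad_lineup = unit_cost(variable=4, design_cost=2.0e9, setup_cost=1.0e9, volume=60e6)

# Trimmed lineup: a few Xbox + server designs amortized over very high volume.
trimmed_lineup = unit_cost(variable=4, design_cost=0.3e9, setup_cost=0.5e9, volume=50e6)

print(f"Broad consumer lineup:   ~${broad_lineup:.0f} per chip")
print(f"Trimmed Xbox+server mix: ~${trimmed_lineup:.0f} per chip")
```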
> If Microsoft bought AMD, then Sony would be faced with a bad set of choices: put money in Microsoft’s pocket every time it sells a PlayStation, or try to create an entirely new platform by using technologies from Intel, Nvidia, ARM or Imagination Technologies.
It seems like everyone is putting money in MS's pocket, with all the patent licensing they do. NVidia has stated that they will never do a console-specific chip again. That said, I don't think that they would have a problem with supplying an off-the-shelf Tegra model in a console for 10 years. High-end Intel GPUs don't exist (at least not outside of Intel), but might by the next console generation.
Though this would be quite an audacious multi-year project, Microsoft actually is the best positioned company currently to compete with Apple. I've never been a Microsoft fan, but I actually think this would be a very good thing for the current state of personal computing, both desktop and mobile.
Having another Apple-like company, one that takes its entire stack seriously, from hardware to user experience, would be a big win for the industry.
Although I like Apple's products a lot, I'm constantly dismayed that other companies can't keep up. It's like the Chicago Bulls in their dominant era where there was basically no other team that could touch them, or Ferrari's F1 team at their peak where they'd basically never lose.
Industries succeed when there's serious competition. Intel had to innovate because AMD was going to eat their lunch. Nikon can't take a break or Canon will trounce them.
This isn't really new; the difference is that the binary blob is loaded at runtime instead of being part of your UEFI image or whatever. Intel has surely had "binary blobs" before, possibly in some write-only memory too.
Looks like click bait to me.