I've been an AMD guy for years, mostly because I was a poor college student when they were getting their start, and AMD had the best bang for your buck around.
Fast forward, and my last purchase was an Intel for the first time in my life, for no other reason than that AMD just couldn't keep up, not even in the same realm.
Lo and behold if they do this again I'm going to regret not waiting a bit for the upgrade :) I'll always have a soft spot in my heart for AMD and I'm rooting for them. Even if that sounds a bit silly :)
> Fast forward, and my last purchase was an Intel for the first time in my life, for no other reason than that AMD just couldn't keep up, not even in the same realm.
It's quite hard for me to picture an enthusiast who pays the premium to get the best single-thread performance, doesn't already own a discrete graphics card, and is then unhappy with what Intel HD graphics provides but would have been fine with the extra sliver of performance of an AMD APU.
If you're maxing out your processor, both graphics options are either going to be more than sufficient for desktop use or woefully inadequate for games and compute.
Maybe this will change with AMD's recent HBM announcement.
You're spot on. I still game, although not as enthusiastically as I did in the past. The extra horsepower of my GPU probably isn't strictly necessary for what I do, but Tim Allen.
Well, my machine has an i7-4930K, which has no on-board GPU; I use an Nvidia K2000 for video output.
On a laptop, even the old GMA500 drivers are enough for 2D rendering. If you're playing games, why use on-board graphics at all?
Onboard graphics are "good enough" for most scenarios, less so for graphics production and gaming, which leaves onboard (or on-die) graphics mattering mainly on mobile platforms. And AMD simply isn't competitive there due to power consumption.
I do use on-board graphics on my travel netbook for 3D programming on the go (just as a hobby), and Intel's on-board GPUs were never a match, especially given their OpenGL drivers.
Intel has improved their OpenGL support by leaps and bounds lately. Mostly because they need a good implementation for Android tablets.
The support is actually better than what AMD has now. As an example, AMD has (or had, it's been a few months) a horrible bug in OpenGL compute shaders that pretty much breaks them completely. Whereas with Intel it works nice and dandy.
But yeah. Price-wise, a cheap AMD laptop (sub-500€ range) is the better choice all in all. For anything more expensive, Intel + Nvidia tends to dominate completely.
That's a bit of a niche, though. The enthusiast gamer crowd (and other niche markets requiring hefty GPUs) will be using a discrete card anyway, while non-gamers will be generally fine with Intel.
Nvidia GTX 980. The sad part? I honestly don't run anything that requires the GPU horsepower that often. But OH. MY. GOD. it renders that console sooooo hard!
All kidding aside, I've also been an 'Nvidia guy' for years due to their Linux drivers. I know ATI does pretty well nowadays, and I've owned exactly one of their high-end GPUs (cashed in when the Bitcoin frenzy crashed a few years ago and all these cheap GPUs were coming onto the market).
It was a fine card, but all things being equal, I tend to stick with Nvidia unless there's a strong reason not to (like a super cheap upgrade, LOL!)
edit:
And it's kind of weird, the biases you get as you age. I will never own an MSI motherboard. I would rather compute on an old 6502 processor than deal with them again (I'm exaggerating, but you get the idea). I've had so many people tell me MSI does perfectly fine, but I've had 2 boards and trouble with both. Asus/Gigabyte or bust. I know it's silly and I don't even care :)
It's been a long time since I was last excited about what AMD has to offer. Hopefully this puts AMD back on the map; the x86 market needs a second strong contender. I might choose AMD for that reason alone, if they get within a 20% ballpark of Intel's IPC. Especially Skylake-comparable AVX-512 performance would be good.
I also hope they'll put some more effort into the Catalyst drivers. I had a pretty horrible experience under Ubuntu 15.04 recently...
That's only true for Intel Skylake mobile/desktop CPUs. I find desktop and mobile offerings too unreliable for anything serious, due to lack of ECC support. Data corruption on non-ECC systems has bitten me more than once. I've learned my lesson.
Skylake Xeons (the only Intel chips I'm interested in anyway), on the other hand, will support AVX-512. The software I'm writing already has rudimentary AVX-512 support. The prospect of processing 64 16-bit elements per clock cycle per core is pretty exciting: 32 per instruction (512/16), and hopefully another 2x from dual issue.
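For what it's worth, that throughput arithmetic is easy to spell out; these are just the comment's own figures (the dual-issue assumption included), not a measurement:

```python
# Back-of-the-envelope AVX-512 throughput for 16-bit elements,
# using only the figures from the comment above (not a benchmark).
VECTOR_BITS = 512
ELEMENT_BITS = 16
ISSUE_WIDTH = 2  # assumes two vector instructions can issue per cycle

lanes_per_instruction = VECTOR_BITS // ELEMENT_BITS       # 32 lanes
elements_per_cycle = lanes_per_instruction * ISSUE_WIDTH  # 64 per core
print(lanes_per_instruction, elements_per_cycle)
```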
Besides, we're talking about AMD chips here, aren't we?
I hope the 40% IPC increase is real and brings their CPUs into i7 territory.
I'm an AMD "fan", but we all know how AMD is all bold claims 1-2 years before release, and when the time comes... well, if they deliver half of that, it's already a win.
I did, a year and a half ago. I decided upon FX-8320, as for me, more cores was better. In my PhD I'm doing automatic parameter tuning of optimization algorithms, and I thought it would be handy to have as many cores as possible also back home (the actual runs are of course run on a monstrous computation server somewhere in the basement of the university). To have 8 separate cores has proven to be handy every once in a while. Just yesterday I did some parameter tuning on a machine vision side project.
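As a minimal sketch of why the extra cores pay off for this kind of work, here's an embarrassingly parallel parameter sweep with `multiprocessing`; the objective function and the parameter grid are made up purely for illustration, not taken from the commenter's actual project:

```python
from multiprocessing import Pool

def objective(step_size):
    """Stand-in for one tuning run; a real run would launch the
    optimization algorithm with this parameter and score the result."""
    return (step_size - 0.3) ** 2  # pretend 0.3 is the sweet spot

if __name__ == "__main__":
    candidates = [i / 10 for i in range(1, 10)]
    # one worker per core of an FX-8320
    with Pool(processes=8) as pool:
        scores = pool.map(objective, candidates)
    best = min(zip(scores, candidates))[1]
    print("best step size:", best)
```

Each candidate evaluation is independent, so eight cores give close to an 8x wall-clock speedup as long as the runs are CPU-bound.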
I got the A10-7850K APU on release and I couldn't be happier. I built my main dev PC with it. The integrated graphics adapter is more than enough to drive 3 monitors (it can do 4 as well) running Gnome Shell. All that using the free drivers, which is important to me because I need it to be hassle-free.
I've never noticed any slowness but then I don't do heavy video processing or hours of compiling C. Light gaming and WebGL demos are no problem either.
I used AMD for the cluster because AMD's nested virtualization is still better supported in Linux than Intel's. This comes down to the fact that AMD virt is simply better architected.
Single thread performance sucks, but if you are able to use all the cores, then price/performance of the whole chip is great.
Their embedded APU series are actually competitive – mainly because unlike their Intel equivalents they aren't artificially crippled by disabling a bunch of useful instruction sets.
Other than that? Nope. Not since the first Phenom came out.
No doubt. But I'm not doing anything with OpenCL, and I have a dedicated desktop machine for gaming (with a dedicated AMD card), so I never found a use for their desktop/notebook APUs.
About a year and a half (maybe two?) ago we purchased a dual-socket Opteron 4386 for use as a build machine. A lot of this was based on the fact that we only had about $1500 to spend and we wanted the most bang for the buck. The SPEC gcc benchmarks showed the CPU doing quite well against significantly more expensive Intel machines. So we jumped on it, and haven't regretted it for a minute. It's a good combination of lots of cores and fast single-threaded performance (for the linking and packaging). Combined with the SSDs and ECC RAM, it's quite a screamer.
We have a number of Intel machines for other purposes, and it still costs probably about 2x in Intel land to get a machine that beats it. Maybe closer to 3x if you buy the machine from HP/IBM.
I recently needed a new home/carry-around laptop, and got myself an HP Elitebook 725, which has an AMD A10-7350B CPU. It cost me 900€ (incl. 24% VAT) with 8GB RAM and a 256GB SSD.
At work, I have used an Elitebook 820, which is an Intel i5 with an otherwise similar configuration. That one costs 1350€, with similar RAM and SSD.
The two machines are completely identical in appearance and behaviour. So far I am very happy with the AMD version: no perceivable difference in performance. It could be that the 725 with the AMD CPU eats the battery somewhat faster, but when the Intel version of the same hardware is almost exactly 50% more expensive, I'm not complaining.
If I needed i7 performance, then the AMD offering wouldn't be there.
I bought an HP MicroServer N40L (AMD Turion II Neo N40L dual core) two years back, and added in a low-profile AMD Radeon 6450. The combination is powerful enough to host a media server, play some older games and stream more modern Steam games from my Intel desktop, but low-power and cheap enough to make me very happy with my purchase.
The new generation MicroServers are Intel Celery pieces - more powerful but far more expensive (although I can't say for sure whether the price tripling involves other factors).
I have a home+working desktop.
I wanted it to be as compact as possible, so no GPU card, while still retaining 3D capabilities.
I think in this use case, AMD's APUs are still competitive.
For the sake of precision though, this assumes that any Intel + GPU card combination consumes more power (and is noisier) than a comparable AMD APU for 3D workloads.
They have an advantage in the admittedly small niche of poor gamers with laptops. I could play relatively modern games on my old laptop with an AMD integrated GPU (I think it was an A6-something) much better than on my newer low-end laptop (with an Intel HD 4400).
I have nearly a couple hundred AMD workstations, soon all A8/A10 6000- or 7000-series APUs.
On any one metric Intel can win, but factor in price, and as a whole package I think AMD is the better choice for covering all kinds of work. A good-enough CPU and a free GPU, as far as I'm concerned.
No, but if you spend a lot of time on OpenGL software user fora, you'll see that Intel is far and away the least headache-inducing graphics option on Linux, and the only realistic option on the BSDs.
If you saw the jump Intel made by introducing HT, you wouldn't think this a bold claim. Look at the first-gen i7 compared to its predecessor. So, AMD added SMT, 3D transistors, improved caching, etc. But yeah, more details would be much appreciated. I can't wait to see a diagram breaking down the architecture layout.
I'm not surprised by the jump; I've been waiting for AMD to make such a jump so they could try to keep up with Intel's horsepower.
To expand on this - the Bulldozer FPU is supposed to be SMT, but the integer cores are single threaded; two integer cores share an FPU, making a 'module'.
This is apparently the same scheme that DEC used with one of the Alphas. It's interesting that as respected as (I think) the Alpha was, both Intel (Netburst) and AMD (CMT) tried to adapt aspects of the design and it seems that in both cases it hurt their respective businesses.
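Applied to an eight-"core" Bulldozer-family part like the FX-8320 mentioned elsewhere in the thread, the module scheme works out to:

```python
# Bulldozer topology: two integer cores share one FPU, forming a
# "module". Numbers below are for an FX-8320-class eight-core chip.
integer_cores = 8
cores_per_module = 2

modules = integer_cores // cores_per_module  # 4 modules
shared_fpus = modules                        # one FPU per module
print(modules, shared_fpus)
```

So "eight cores" means eight integer pipelines but only four floating-point units, which is part of why FP-heavy workloads scaled worse than integer ones on these chips.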
AMD and Intel used to compete. It appeared neck-and-neck at times. Now AMD has 1.1% of the market cap of Intel. And that's with AMD also competing with Nvidia on graphics.
Ouch.
"PlayStation 4 features an AMD x86-64 Accelerated Processing Unit, in hopes of attracting a broader range of developers and support for the system. The PlayStation 4's GPU can perform 1.843 teraflops. Sony calls the PlayStation 4 'the world's most powerful console'." -- http://en.wikipedia.org/wiki/PlayStation_4
Both previous-gen ones had consumer-level PowerPC processors made by IBM, a product category which no longer even exists, so I wouldn't put too much importance on that.
By extension, all current gen consoles have AMD GPUs, and two of the last gen ones had ATI GPUs. I have no clue how much royalties/licensing fees that gives them.
A quick Google search fails to bring up the original suit text, but from what I recall it was pretty scathing. I hope that cash infusion is at the root of this announcement and that AMD can get back in the game.
Unfortunately with antitrust lawsuits, they usually happen too late to stop the damage from being done, as we're seeing in this case and as we've seen with Microsoft and IE.
The way this could be solved without "abusing" antitrust power too much is by looking for "anticompetitive behavior" before companies actually get 90%+ market share in a market. At that point it's obviously too late, and with heavy lobbying-influenced politicians and institutions today, the punishment is unlikely to fit the crime, too, and the monopolists will only get a slap on the wrist and get to keep their monopoly (which makes the crime all worth it).
Example of anticompetitive behavior: Microsoft not allowing other browsers to be installed on the Windows mobile platform. It doesn't matter that Windows Mobile only has 3% market share. Why wait until they get 90% market share (again) to do anything about that awful and anticompetitive policy?
Same goes with Apple which doesn't allow other companies to compete with its "core apps". Why would you ever let something like that happen? It should be quashed from moment zero. Not to mention Apple already has a very strong position in the mobile market (especially in the US), but that should be beside the point.
What I always wonder is, what is market dominance worth?
Intel has $50B/year in revenue, net income of $10B/year. So they paid about one month's profit as a penalty for strategies that helped them dominate the market for years.
Which makes me wonder if anti-competitive practices start to look, even in spite of possible litigation, financially prudent.
I'll buy Intel next time just because my impression is that their motherboards play nicer with Linux. If you run Linux, Intel + Nvidia seems unavoidable.
Nvidia in terms of the binary driver? I was under the impression that the AMD open source driver is better than the Nvidia open source one. Also under the impression that AMD's open source approach is much better (releasing much more internal stuff, cooperating).
I'm not really all that pro-open source when it comes to drivers. I kinda view that as part of the proprietary product, and I don't care much about whether it's open source or not.
AMD's issues have mostly boiled down to single thread performance as well as power consumption. I am not sure how exactly simultaneous multithreading will help in single thread scenarios.
As for the 40% performance improvement, I don't know if people remember, but AMD's K10 (Barcelona/Phenom) was so hyped that hardware forums at the time (2007) exploded with fictional tests showing outrageous numbers. Bulldozer, sadly, was for quite some time even slower than K10.
I have never owned an Intel CPU and I hope I will never have to buy one just because there will be no one else to buy from. :(
I had both Intel and AMD computers in that era. AFAIK, the AMD K7 Athlon ran on less power (watts) and was cheaper at the time than the Intel Pentium 4, despite lower clock speeds. But performance-wise both were almost equal, and the Pentium 4 3GHz model introduced HT (HyperThreading) for better multithreading performance. The Athlon 64, on the other hand, was the first x86-64 CPU.
The AMD special sauce back in those days was not raw clock speed but IPC. AMD would market, for example, an Athlon XP 2400+ clocked significantly below 2.4GHz but roughly comparable in performance to a 2.4GHz Pentium 4.
This, combined with a lower price point, is what made AMD chips so desirable. I do not recall power consumption.
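The implied-IPC arithmetic behind those PR ratings can be sketched like this; the clock speeds are the actual ones for those two parts, and the "equal throughput" assumption is the PR-rating premise, not a measurement:

```python
# PR-rating arithmetic: an Athlon XP 2400+ actually ran at 2.0 GHz
# but was rated against a 2.4 GHz Pentium 4. If total throughput
# (IPC * clock) is roughly equal, the implied IPC gap follows.
athlon_clock_ghz = 2.0
p4_clock_ghz = 2.4

# ~1.2, i.e. ~20% more work per clock on the Athlon
implied_ipc_ratio = p4_clock_ghz / athlon_clock_ghz
print(f"Athlon IPC ~{implied_ipc_ratio:.1f}x the Pentium 4's")
```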
It's interesting that in the course of less than 10 years the positions flip-flopped, and Intel's architecture came to dominate AMD's in IPC and power consumption, making price AMD's primary weapon in the CPU battles. There is also a small draw from AMD's construction-equipment cores (Bulldozer and its successors) being much more clock-happy than Intel's current offerings, so some enthusiasts concerned only with clock speed still preferred AMD in recent times.
The primary benefit is that you can put together an entire system based on one of AMD's top offerings right now for less than Intel's best CPUs cost. AMD is strictly a budget option currently.
At this point it's a surprise to see AMD able to bounce back again. They're strong survivors: always late, but still going. Impressive and kind of dramatic at the same time.
This comes a couple of years too late in the game, but I'll probably use their new Zen FX in a desktop build just to encourage an alternative to Intel.