Hacker News

People repeat that meme that SandyBridge doesn't need to be replaced so often. You said it two or three times yourself.

While that's the prevailing opinion, I don't necessarily agree. I think it's SandyBridge owners trying to convince themselves more than anything else; really it's just being swept up in the groupthink.

Skylake is the first chip that makes a very strong case as an upgrade. You gain NVMe support for your PCIe SSDs, DDR4 (which has shown an improvement over DDR3 in some benchmarks), roughly 20% IPC improvement (5% per gen, give or take), a DX12_1 feature-level IGP, CPUs with 128MB of L4 cache that absolutely destroy chips without it for gaming (Broadwell had it first and Skylake's is improved), vastly better power efficiency, and Thunderbolt 3 support.

Seven pretty good reasons off the top of my head. You can dismiss each of these if you want, but all of this is very attractive in reality.

The whole story is that SandyBridge is only competitive, in gaming, if you overclock to 4GHz+. You still lose out on the other improvements, though, and any stock SB system compared to a stock SL system will look pretty sad once you factor in the platform updates.

If it's gaming you care about, take a look at the benchmarks of the 128MB L4 Broadwell chips compared to Devil's Canyon, let alone SandyBridge. Both get crushed where it counts, and Intel is just now getting Skylake 128MB L4 CPUs out the door. If you don't care about gaming, Skylake still crushes SB.




> Skylake is the first chip that makes a very strong case as an upgrade. You gain NVME support for your PCIE SSDs

http://www.amazon.com/Ableconn-PEXM2-SSD-NGFF-Express-Adapte...

If you care about NVMe, just get a $20 expansion card. Besides, NVMe SSDs are expensive; a Mushkin Reactor 1TB is $210.

Hell, the fastest NVMe SSDs plug directly into PCIe lanes. So if I actually cared about the faster speeds, I'd jump to an Intel 750 SSD.

http://www.newegg.com/Product/Product.aspx?Item=N82E16820167...

Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

Yes, if I had a laptop that only had room for an M.2 card, then maybe I'd get the Samsung M.2 card. But even if one were given to me for free, I'd rather have the $20 PCIe expansion card.

I can't think of a single situation where I actually need the onboard M.2 card on the Skylake motherboards, aside from the $20 convenience.

> roughly 20% IPC improvement (5% per gen give or take)

I admit, this is a good thing. But it's very little, especially when you consider that the iPhone 6 to iPhone 6s jump was a 70% CPU performance improvement AND a battery improvement, yet many people don't consider that enough of a jump.

http://www.imore.com/a9-processor-iphone-6s-and-6s-plus-70-f...

Soooo... FIVE years gets you +20% speed, while ONE year gets you +70% speed on phones. That's why desktops aren't getting upgraded.
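For what it's worth, the two figures are consistent with each other: compounding the rough 5%-per-generation estimate quoted above across the four generations between Sandy Bridge and Skylake lands right around 20%. A quick sketch (the 5% figure is the comment's estimate, not an official Intel number):

```python
# Compound a per-generation IPC gain over the four generations
# separating Sandy Bridge from Skylake (Ivy Bridge, Haswell,
# Broadwell, Skylake). 5%/gen is the rough estimate from the
# comment above, not a measured or official figure.
per_gen_gain = 0.05
generations = 4
total = (1 + per_gen_gain) ** generations - 1
print(f"{total:.1%}")  # 21.6%, i.e. "roughly 20%"
```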

> DX12_1 feature level IGP

You'd buy a $300+ CPU without buying a $100 GPU? The cheapest of GPUs are significantly better than the IGP. Hell, if I cared about a DX12_1 IGP, I'd get an AMD A10 for half the cost and twice the IGP performance, with drivers that actually work in games.

Except I game at levels that far exceed even AMD's superior IGP. I also care about adaptive sync / G-Sync technology, which isn't supported by Intel Iris. So I have an R9 290X. Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

> CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming

NOT on the desktop. Crystalwell is laptop-only, and 45W to boot. Compared to 20W laptop chips, I don't see the Crystalwell L4 cache actually being useful to the majority of people.

In fact, I don't even know of any laptops with Crystalwell that I'd recommend to anyone. Here's a challenge: name me a good laptop with Iris Pro / Crystalwell. Hint: MacBook Pros use AMD dGPUs for a reason.

And hell, we aren't even talking about laptops. We're talking about desktops, and Crystalwell is UNAVAILABLE in desktop form. It's irrelevant to the desktop user, even if you thought that paying $600+ for a CPU was cost-effective (instead of buying a Sandy Bridge E5-2670 for $70 from Amazon).

Basically, you get DDR4 RAM and +20% IPC. That's all I actually think the last five years will get you. Or you can buy an 8-core, 16-thread E5-2670 for $70... hell, two of them, plus a nice server board for $300, and have a BEAST of a machine.

http://www.techspot.com/review/1155-affordable-dual-xeon-pc/


The base MacBook Pro 15" uses Crystalwell and has no dGPU.


Yeah, but would you seriously recommend it over the AMD Tonga (R9 M370X on the upscaled version)?

The 45W i7 is a heavy burden to carry with Crystalwell. Might as well get better graphics if you're going for the 15" Pro.


There is no way in hell I would prefer that AMD chip in my system over an all-Intel system. AMD GPUs are just terrible to use on Linux, and most people around here want the ability to run that natively without issue. Not to mention the added complication of tacking that AMD chip onto the laptop, both from an engineering/reliability stance and in software complication.

You missed the irony of calling 45 watts a heavy burden. An R9 M370X adds about 50 watts of TDP by itself, along with its needless complexity. If someone wanted to reduce TDP and that complexity, they could step down to the base Intel IGP. But if stepping up, Intel's solution makes a lot more sense.


Your loss man. The benchmarks don't lie.

Good luck with your overpriced Crystalwell failure. If you got actual benchmark scores to talk about, please respond to me here: https://news.ycombinator.com/item?id=11536519

But I actually know the benchmarks of everything you're talking about like the back of my hand. Your argument has no technical legs to stand on whatsoever. Don't feel bad if I'm just calling out your Bull$.


Wait, what are you talking about? That was in no way a response to what I said to you here. You don't need to change the topic just because you're wrong and you know it.

No one wants that AMD chip in their MacBook. It adds complexity both in engineering and software. There's PLENTY to talk about technically there, and about why avoiding it is a good idea. Not to mention Intel's best-in-class Linux support.


It's actually kind of annoying to have graphics card switching - it caused a number of problems in my old 15" MBP, to the point that I opted for integrated this time.


>Besides, NVMe SSDs are expensive

Yes. If you're bargain hunting for gaming hardware, you should just buy a console. And if you're seriously suggesting putting an Intel 750 into some old system like SandyBridge... no comment. I would never recommend anyone bother doing that.

Step up to Skylake and an NVMe setup and do it right. Skylake i5 setups can be had for cheap. You're just arguing to argue on that point, whether or not you have anything useful to add. The SB argument is common knowledge, an age-old argument with no new information or insight.

>Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

I'm not into cost-effective bargain hunting. Anyone who would gimp a nice Intel 750 SSD on a non-NVMe system is a fool, and you've suggested it.

>The cheapest of GPUs are significantly better than IGP.

No, they aren't. The point about DX12_1 IGPs is that it's there, it's modern, and it has already sucked the life out of the low-end space and is moving into the midrange with Iris Pro. Your stance is the 2010-era view on computers. Same era as SandyBridge, TBH.

>I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris.

This demonstrates how much you know, and why people shouldn't listen to what you're saying, which can be heard on any PC gaming forum a thousand times over. This is HN, though, and it won't fly.

Intel has already committed to FreeSync. It's incoming with Kaby Lake, and the rumor is that it may be enabled for Skylake.

>Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

Wrong on its face. You just haven't cared to investigate recently.

>NOT on the desktop. Crystalwell is laptop-only, and 45W to boot.

Nope. The Crystalwell chips are going into NUCs from here on out. There's a 128MB L4 NUC coming in 2 1/2 weeks and a 256MB NUC coming in 12 months.

The fact that you're talking about gaming and recommending an E5-2670 for that is just silly. That might be a good machine for compiling code. Even if that's your goal, it's still a bad idea when distcc can utterly embarrass that old, power-hungry chip.

For gaming, Broadwell already demonstrated what Crystalwell adds with a standalone GPU. And it's a game-changer: it's faster than the i7-6700K. Yes, it is. And it definitely mops up where it counts (99th-percentile frame times) against SandyBridge too.

In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have (even with an Intel 750, if you made the ridiculous decision to actually put one in SB). There might be more cost-effective ways to build a gaming rig, but if you're into saving money on hardware and gaming, buy a PS4.

I understand it's the prevailing thought among PC gaming kiddies, but holding your grip tighter on some old SandyBridge system won't change that in reality it's fallen pretty far behind in both overall platform performance and power efficiency.


I can't find Iris Pro 580 on benchmark sites, because no gamer gives a care about that for gaming.

The Iris Pro 5200 GT3e achieves Passmark 1,174.

http://www.videocardbenchmark.net/video_lookup.php?gpu=Intel...

If the Iris Pro 580 GT4e is twice as good (Intel only claims 50% better), that's still not very good. That's utterly awful, actually.

A $100 GPU is the R7 360, just off the top of my head. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125...

Exactly $99 on Newegg right now. It achieves Passmark 3,150.
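Spelling that comparison out, using only the Passmark figures quoted above and Intel's claimed 50% GT4e-over-GT3e uplift (an assumption, not a fresh benchmark):

```python
# Rough estimate: if the Iris Pro 580 (GT4e) is 50% faster than
# the Iris Pro 5200 (GT3e, Passmark 1,174), as Intel claims, it
# lands around 1,761 Passmark. All inputs are the numbers quoted
# in this thread, not new measurements.
gt3e = 1174
gt4e_est = gt3e * 1.5               # Intel's claimed uplift
r7_360 = 3150                       # the $99 dGPU cited above
print(round(gt4e_est))              # 1761
print(round(r7_360 / gt4e_est, 2))  # 1.79: the $99 card is ~1.8x faster
```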

No one gives a care about a $600 Crystalwell chip that performs worse than a $100 dGPU. It's utterly awful. You'd be insane to actually recommend this product to anybody. You claim that you care about performance. Do you even look at the benchmark numbers, bro? Your claims are so far from reality that I really just don't know how I'm supposed to respond to you.

Yes, a $600 Chip. I'm assuming this, unless you can figure out a cheaper Skylake Iris Pro: http://ark.intel.com/products/93336/Intel-Core-i7-6970HQ-Pro...

----------

EDIT: I see that you're an anti-AMD guy. Okay, whatever. That's why there's another company out there.

http://www.amazon.com/ZOTAC-GeForce-DisplayPort-Graphics-ZT-...

Nvidia GTX 750 Ti, $105 right now on Amazon. Passmark 3,686. Still utterly crushing your $600 iGPU with cheap-as-hell GPUs, no matter the brand.

http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+75...

Dude, I'm running what was at the time a high-end R9 290X, although it's more of a midrange card now due to its age (Fury / 980 Ti era). It has a Passmark of 7,153, and you're seriously suggesting I "upgrade" to a Crystalwell Iris Pro that only achieves ~2,000 Passmark?

------------

PS: Skylake performing 20% faster than Sandy Bridge after five years of updates is awful.

-----------

> I'm not into cost-effective bargain hunting.

Then why the hell are you bringing up M.2? The Intel 750 is the best of the best and plugs directly into PCIe. Sandy Bridge handles it just fine.

-----------

> In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have

Crystalwell has to beat a $100 dGPU first, and the benchmarks say otherwise. My bet? Crystalwell fails to beat an R7 360 / Nvidia GTX 750 Ti (Nvidia's LAST-generation BUDGET card), and I get to laugh at its worse-than-console performance numbers despite the $600+ launch price for the chip.

But hey man, show me those benchmark numbers if you disagree with my assessment.


"Bro", "dude", "man". So I know I'm talking to some little kid at least.

But I have to say: anti-AMD! Yes, very perceptive, as I type this on a machine with a Radeon in it. Consider the fact that other people can criticize products they have extensive knowledge of.

Judging from the rest of your response, my post went completely over your head. And you're quite the troll, trying to change the points I made in an attempt to "win" the argument. But it is amusing: some kid calling Intel's 20% IPC boost from SB to SL awful shows how much you don't know. I have friends who work at Intel. Go back to PC Gamer; you have no idea what you're talking about. Some dumb kid sees ONLY 20%, alongside massive power reductions, and doesn't realize that computers can do more than just play video games.

You also failed reading comprehension and the ability to hold a conversation. Congrats. Either way, there's no way around the fact that in 2 1/2 weeks I'll be benchmarking an R9 Fury against an i7-6770HQ with a PCIe NVMe SSD and some DDR4, absolutely crushing any SandyBridge system you own.

Enjoy your old E5-2670 and SandyBridge with an Intel 750. What a total fruitcake. A better use of your time would be to go read about logical fallacies, as you just spent an hour typing about a strawman you created to beat on, with points I never made.

Here's what you want to hear, because you just want to argue: you're right, I'm wrong. Hope you feel better now. I'm not giving you any more help. I get it, you like your poverty gaming rig. See ya, kid.




