Hacker News
An Inconvenient Truth: Intel Larrabee story revealed (brightsideofnews.com)
81 points by s3graham on Dec 7, 2009 | 15 comments



Fantastic article. This is the best piece I've seen written on this subject so far, with lots of context and a good background on what's happened over the last five years.

One thing I don't see mentioned, and which has been a huge red flag for me, is Intel's seeming preoccupation with raytracing. As a graphics programmer, I find it absurd that Intel would hope to mount a technological disruption of an industry that has been following a very steady course since SGI and Pixar were founded in the mid-80s. I suspect that the Larrabee approach, as a whole, is a case of arrogance: believing they can jump ahead of everyone else without having to slog through the hard, painful work of incremental improvement that has been going on at nVidia and ATI/AMD.

That being said, the author correctly points out that Intel has made many technological missteps in the past, and has always eventually recognized its mistakes and changed course.


I think they may have a plan B ready:

http://techresearch.intel.com/articles/Tera-Scale/1826.htm

It's not trivial to engineer a massively multi-core general-purpose processor. I would try it with MIPS or ARM instead of x86, but I guess that is not an option for Intel.

AMD, however, doesn't need to preserve the x86 ISA's status as an industry standard. Quite the contrary: anything they do to hurt Intel will only make them stronger. And they have some folks with massively multi-core expertise in-house.


That chip appears to be worse than Larrabee in every respect. In particular it is much harder to program, which would just make Intel's late drivers later.


Before SIGGRAPH in August 2008 Intel was talking about raytracing, but since then the PR focus has been almost completely on rasterization. IMO raytracing was a diversion (sort of like Cell's mobile agents with true AI); it was never serious.


It has been about one year since I last heard any talk of raytracing, so that sounds about right. I'm glad they're not wasting time on that!


"Building 100 or so millions of lines of code for the driver part is a herculean task, but the sources at hand claim that they are working on target"

That's quite a driver!


It's lots of test code, and lots of microcode that gets loaded onto the board when it inits.

Most of that code isn't going to be running directly in the kernel's address space in the shipping driver.


The same author says "Apple ditches 32nm Arrandale, won't use Intel graphics":

http://www.brightsideofnews.com/news/2009/12/5/apple-ditches...

If this rumor is true, it's quite an extreme move. Are 3D graphics really that important nowadays? I thought most laptops were bought for work, not games.


Apple is likely looking to unify its GUI rendering path to the greatest degree possible, and moving things to OpenGL is the easiest way to do that while keeping everything nice and fast. They might lose some margin on a $10 or $15 increase to a laptop's BOM, but the potential savings in engineering hours could be tremendous.

There's also the chance that a desire to use OpenCL more widely for internal applications means that having a GPU that can run CUDA or its equivalent is becoming a de facto requirement for all Macs.


Laptops are increasingly people's primary computers. If you're going to buy a laptop anyway, the question becomes "Do I even need a desktop?" If the laptop plays WoW or whatever other popular games acceptably, that gives it a significant advantage.


Between all major desktop operating systems offloading more and more desktop rendering via OpenGL, and those same operating systems adding increasingly available GPGPU APIs, the GPU really does directly affect desktop performance.


Is this someone's attempt to be hired by TechCrunch?


More like beating AnandTech + Ars Technica on their own beat by practicing journalism instead of stenography.


Seems like they are aiming at a different target. This article goes very deep, with very specific inside knowledge of long-running development projects. TechCrunch is more about current events across the industry.


Or current "rumours", in some cases!

This guy's article is much too in-depth to be a TechCrunch article, IMHO.



