Intel to Cut 12,000 Jobs, Forecast Misses Amid PC Blight (bloomberg.com)
637 points by MichaelRenor on April 19, 2016 | 502 comments



So Intel is cutting 11% of its workforce, Goldman Sachs just reported a 56% drop in profits, Morgan Stanley had a 50% drop in profits, Netflix missed subscriber growth estimates, etc. ... yet the Dow just hit a 9-month high, and the S&P500 is now above 2100.

The whole market is overvalued, not just the tech unicorns.


Goldman's stock price is approximately 25% off its 12-month high. [0] Morgan Stanley is approximately 35% off. [1] Seems like the market is properly taking in their bad news. Some parts of the market do fine, others get hit. And yes, some might be overvalued. When interest rates are low, the future cash flows of companies are discounted back at a lower rate. So if all things are equal (which they never really are, but humor me), low rates imply higher prices for both equities and bonds, and especially for startups whose cash flows are mostly in the future.

[0] https://www.google.com/finance?cid=663137

[1] https://www.google.com/finance?cid=660479
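A toy discounted-cash-flow sketch of that rate effect (all numbers invented; the 6%/2% rates just stand in for a "normal" vs. a low-rate world):

    # Toy DCF comparison: the same cash flows are worth more at a lower
    # discount rate, and the effect is largest for cash flows that arrive
    # far in the future (the "startup" profile). Numbers are made up.

    def present_value(cash_flows, rate):
        # discount each year's cash flow back to today
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

    mature_co = [100] * 10                 # steady earnings, years 1-10
    startup = [0] * 7 + [300, 350, 400]    # nothing until years 8-10

    for rate in (0.06, 0.02):              # "normal" vs. low-rate world
        print("rate %.0f%%: mature=%.0f startup=%.0f"
              % (rate * 100, present_value(mature_co, rate), present_value(startup, rate)))

    # The late-cash-flow profile gains roughly twice as much, in percentage
    # terms, when the discount rate drops from 6% to 2%.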


Both GS and MS (as well as other FIs) are off their highs because of all the volatility in the markets over the past 6 months and worries about losses from bad energy loans. BAC, WF, and Citi are all off their 12-month highs by a significant margin despite doing relatively well in Q1.

Also, this has nothing to do with discounting the cash flows; it's mostly stock buybacks that are driving all the action:

http://www.bloomberg.com/news/articles/2016-04-19/early-warn...


> Both GS and MS (as well as other FIs) are off their highs because of all the volatility in the markets over the past 6 months and worries about losses from bad energy loans.

No, they are off because they are making way less money (down 55% from a year ago!). In general volatility can be good for investment banks because it means higher trading volume. Revenue has been way down thanks to fewer deals (remember when tech companies had IPOs?) and reduced fixed income trading volume (GS revenue from fixed income trading is down 48% from a year ago).


It would be interesting if they could release metrics on trading or investing activity, the way some software companies release stats about active users.

Then you could calculate revenue per trade (or something more useful along those lines) to use as a signal in these cases.


They break out a lot of numbers, but I'm not sure how useful metrics on trading activity would be. Not all trades are equal, so something like revenue per trade is pretty meaningless.

http://www.goldmansachs.com/media-relations/press-releases/c...

The numbers are pretty brutal, especially in institutional client services.


These go hand in hand. The debt issued to buy back the stock is cheap. :-)


I think you may have overlooked the recent DOL proposal and its potential impact on profit margins if it goes forward. Good for consumers, but bad if you have to improve processes, hire people to handle them, etc.

http://blog.emoneyadvisor.com/industry-news/trending/complet...


When similar rules were introduced in the UK, profit margins went up. Matt Levine on Bloomberg View talks about it regularly.


>worries about losses from bad energy loans

GS has $11BB in oil sector exposure. MS is $4.8BB. BofA and Citi are around $20BB. JPM is around $15BB. These are fractions of total loans outstanding (MS is 5%, all the rest are much smaller). Each of these banks has more in reserves than oil loan exposure. Does that sound scary to you?

Again, relatively sophisticated investors understand these things. This is basic research. Where is your evidence?


How much exposure do they have to the financial sector? As in 2008, we should be wary of cross-pollination.

The limits on the banks' exposures depend on counterparty risk. GS's exposure is far more, but they insured most of it with other banks, so please allow me the simplifying assumption that they insured all of it with BAC. Which means that (hypothetical scenario) if BAC goes bankrupt, suddenly GS's exposure to oil goes up to, say, $100BB. Suddenly the reserves are woefully insufficient. Then there's the sudden risk that GS goes belly-up, which would increase everyone else's exposure. One of them goes, and ...

Additionally, oil fuels the economy. Oil is what builds the roads, what keeps everything on the roads moving, what keeps planes in the air and boats going forward. Oil provides significant parts of our electricity supply, and so on and so forth. So you can look at the oil sector problem in 2 ways. Either you look at the supply side, which is producing somewhere around 2% more oil than the market is willing to buy (at any price). This is short-sighted: "If we'd all just put 2% more gas in our tanks, there wouldn't be a problem," which is not realistic.

The other way to look at it is from the demand side. The market is simply not buying 2% of the oil, except to store it. Why not? One explanation would be that there is a global recession and the oil price move is simply the result of that. The fact that the price crash happened with oil production/supply constant (even slightly declining) would seem to support this. For instance, the Baltic Dry Index plunged before oil started having problems, same with container shipping, and this explains quite a bit of the excess capacity in oil and, I'd say, caused the price drop. Oil crashed because manufacturing (the source of the demand for shipping) crashed a few months before the oil crash (and hasn't recovered).

In other words: you've identified the wrong problem. Oil is a symptom of the underlying situation, not a cause. You say banks are capable of withstanding one aspect of a greater problem? Well, I'm not saying that's bad, but it's not reassuring at all (and may not be true due to financial engineering).


You talk as if sophisticated investors can't read a balance sheet or don't understand buybacks.


The GAAP P/E of the S&P is now at 24.28, which is near, if not at, a record high. That's bad. It's very overvalued historically. Banking stocks are doing badly because interest rates are so low across the curve that they can't make money from the spread. But even at that, they are overvalued, along with various momentum stocks like Tesla and Netflix (which we saw implode yesterday).


I started investing my retirement money recently. That's about as good a signal as any that it's going to plummet.


Ah yes, the old maerF0x0 indicator. Beloved in some technical analysis circles


I've heard it many times in the back-rooms: "Wherever maerF0x0 goes, don't."


I hear the name even comes from the sound you make when you've just taken a mouthful of your breakfast toast, open the newspaper to the financial pages, and see what's happened to the value of your investments.


What a lovely mini thread! Thanks for the quick laugh :)


If you're investing for retirement in an index fund then where is the problem? Just wait until the recession is over (and perhaps buy more cheap stock). If you want to retire during a recession, don't liquidate all your assets. Only as much as you actually need. If you believe that the market is never going to recover, why invest in the first place?


It was partially a joke. But yes, dollar cost averaging across a recession will make the total portfolio look OK. I do wonder if we can always continue to make higher and higher "highs" in the market though, meaning any investments I make at the top will make $0 gain.

I do agree with you overall about indexing, however "just wait until the recession is over" is market timing, and I don't believe I (or others) can do this dependably well.
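A toy illustration of the dollar-cost-averaging point (prices invented):

    # Dollar-cost averaging through a dip: a fixed contribution buys more
    # shares when prices are low, so the average cost per share ends up
    # below the starting price. Prices here are made up.

    prices = [100, 80, 60, 70, 90, 100]   # index level each period, incl. a dip
    contribution = 500                    # fixed amount invested per period

    shares = sum(contribution / p for p in prices)
    invested = contribution * len(prices)
    avg_cost = invested / shares

    print(f"average cost per share: {avg_cost:.2f}")
    print(f"final value: {shares * prices[-1]:.2f} on {invested} invested")

    # Buying only at the 100 "top" would indeed sit at roughly $0 gain here,
    # which is the worry above; spreading purchases across the dip is what
    # pulls the average cost down to ~80.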


The high P/E is driven by interest rates being low.

One can argue the way to judge stocks value is not by P/E but by E/P relative to interest rates; that is whether their excess return relative to risk-free assets is justified by their risk.

Interest rates remain 5% below their long-term historical average, which can justify E/P going up significantly. Currently stocks offer a 3.5% return over some classes of t-bonds, well in line with historical norms.
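For concreteness, here is the comparison being described, using the P/E quoted upthread and a placeholder bond yield (the yield is an assumption, not a real quote):

    # Earnings yield (E/P) vs. a risk-free yield, i.e. the comparison
    # described above. The P/E is the figure quoted upthread; the
    # treasury yield is a placeholder assumption.

    pe_ratio = 24.28                 # S&P 500 GAAP P/E mentioned in this thread
    earnings_yield = 1 / pe_ratio    # ~4.1%

    treasury_yield = 0.006           # hypothetical short-duration Treasury yield

    premium = earnings_yield - treasury_yield
    print(f"E/P = {earnings_yield:.2%}, premium over treasuries = {premium:.2%}")

    # With these inputs the premium comes out around 3.5%; a higher-rate
    # environment would shrink that premium for the same P/E.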


This is very true. It's earnings yield versus fixed income yield that matters most. If one is out of whack with the other, then there is arbitrage. (All things equal, if there's a much better yield in equities, then people will move money from bonds to equities, or vice versa.)

The current danger is that both will move down at the same time, and then where should one invest?


What else would you expect to happen in a world with negative interest rates?


Man, I do believe the market in general is overvalued, but not based on the P/E ratio. That metric looks almost normal:

https://www.quandl.com/data/MULTPL/SP500_PE_RATIO_MONTH-S-P-...

Check how high it was in 2000 and 2008; we aren't even close.


If you look at your chart, those peaks in ~2002 and ~2009 were market bottoms, when earnings had already dropped precipitously; and they were also black swan events. Historically, on both a forward P/E and a trailing P/E basis, the market is overvalued, even without a black swan. Also, when you look at the debt market right now, it does not look healthy at all. That's usually a sign that something is brewing, usually a recession. However, stocks can continue higher; blow-off tops are commonplace in the last phases of a bull market.

My fear is that the central banks are pumping so much liquidity into the market that they are driving up equities and pushing people out of safer assets into things like high yield bonds and momentum stocks. If we do have a recession, the pain could be worse than usual (for stocks) for the mere fact that the debt market could have liquidity problems when tons of funds begin to pull their money out at once from HY.

Not to mention all of the corporate buybacks that companies are doing by leveraging, because cash held overseas is too expensive to bring back.

You can still invest in solid companies, but companies like Tesla, Netflix, anything with a super high P/E is going to be taken out back and shot (that doesn't mean the companies will go out of business, only that their stocks are much like Amazon in the 2000s.)


2002 was a black swan event (because of 9/11); 2009 not so much, many institutions had been calling the housing bubble since 2004 and even Greenspan acknowledged it in 2007. A bubble that is acknowledged as such 2 years in advance by the central bank is not a black-swan event (defined as a surprise event that is impossible to predict).

The 2008 bubble was eminently predictable. The problem was that no one knew the exact trigger, and no one had a politically acceptable means to deflate the bubble until the domino effect started, and then hedge funds and then major banks started collapsing.

Until the knife started falling, no one had a financial incentive to stop. After all, subprime mortgages have crazy interest rates, and if you're BoA or JP Morgan Chase, the government will probably step in to stop your collapse...


> companies like Tesla, Netflix, anything with a super high P/E is going to be taken out back and shot

Not that I disagree with you, but I believe a contrarian viewpoint would be something along the lines of "We are currently in the midst of an economic revolution as increasingly large swathes of activity are digitized and lingering mechanical/human processes are computerized. Companies likely to be successful in this new economy are unlikely to be the same ones which were successful in the old."

(To which the obvious rebuttal is probably "People are always saying things are about to be different, and they're usually wrong.")


Those gigantic spikes represent plunges in earnings, not spikes in share prices... If you look right before the spikes, you'll see PE ratios in the mid- to high-twenties.


Netflix at -12% is hardly an implosion. For a growth stock like Netflix, it's not too uncommon. Just look at their chart, it's had plenty of jumps in either direction when their quarterly report exceeded or failed to meet expectations.


>The GAAP P/E of the S&P is now at 24.28, that's near, if not, a record high

Not even close to a record high. It's slightly high, but is it surprising that people are willing to put a premium on earnings with negative interest rates?


No worries, the Fed will do SOMETHING and save us, SOMEHOW. No moral hazard to see here.


If you pick and choose stocks, you can tell any story you want. Intel cutting its workforce is more of a reflection on the state of the PC market rather than the US economy as a whole.

I'm not saying the market is overvalued or not, but this is in no way indicative of that.


^ To add to this, I'd say the over-valuation(s) are a side effect of our monetary policy being in never-ending "stimulus" mode.

On Intel, I think the "everything is going mobile" line is PR for investors; the PC market doesn't have any foreseeable growth potential atm.


The stimulus mode is a permanent regime now. It's good in the short term, but I do not know if anybody understands the mid- and long-term consequences of this new permanent mode.


"The stimulus mode is a permanent regime now"

As a general FYI to those reading -- please fact check before commenting.

The Federal Reserve's quantitative easing program ended on Oct 29th, 2014 (538 days ago!) It is difficult to assert that a program which ended over 500 days ago is permanent.

Additionally, one may be tempted to assert that "ZIRP is a permanent regime now" too, to add to the narrative that central banking has entered a new era.

For anyone who did not see that news, ZIRP (zero interest rate policy) ended on Dec 15, 2015 with a rise, and is currently believed to rise again at the next meeting.


Yes, it's permanent. Interest rates are still super low, meaning companies can still take on debt to buy back their stock. The effect of quantitative easing remains until interest rates start going up and the Fed starts selling the government bonds back to the banks.


>meaning companies can still take on debt to buy back their stock

Please explain the problem here. You're acting like there's some big deception imposed on the public due to companies choosing to return capital to shareholders via one specific mechanism. As if somehow they're buying back stock and "fooling" people into thinking they're making more money or something, and stock prices are irrationally rising. The buybacks, earnings, and financials are all open for everyone to read. It's evident from some of your commentary that you can't be bothered.

If you don't agree with the price that other people are willing to sell for shares in this market, you are more than welcome to take the other side of that trade and sell into every buyer on the planet. I'm sure you're not doing that.


Shorting a stock isn't always practical. Stock buybacks can be deceptive if a company buys back stock with debt and doesn't have the proper cash flow to pay it off. Non-technical investors see the stock going up and keep piling more cash in. If you think markets are perfect, you have a lot to learn, my friend.


>Shorting a stock isn't always practical.

Nor is it remotely the only way to assume a bearish position.

>Stock buybacks can be deceptive if a company buys back stocks with debt and doesn't have proper cashflow to pay it off.

You can't tell the cash flow and debt levels from the financial reporting? You don't account for this in your valuation? Where is the deception? Report it to the SEC.

>Non-technical investors see the stock going up and keep piling more cash in.

An equivalent number of people are "pulling cash out". A stock trade is just that: a trade. Again, what makes you a better judge of the "correct" price than those actually making the deal?

>If you think markets are perfect, you have a lot to learn my friend

Not sure where that comes from, but you're right, I do have more to learn. That said, it looks like I'm a few levels up on you (and far less confident in my ability to predict anything).


Interest rates have already started going up.


They went up once and the fed is signaling a lot of caution. The earlier analyst consensus was 4 hikes this year .. it is down to 2. The latest commentary I'm hearing says normal rates by 2019 .. which is psycho IMHO.


What they ended is increasing their balance sheet. They are still rotating into new debt as old debt matures, so they have not actually rolled back their old QE programs; they just have not increased them.


Just take things in perspective, not just based on the latest quarter's news:

http://b-i.forbesimg.com/jessecolombo/files/2014/01/united-s...

http://moneymorning.com/wp-content/blogs.dir/1/files/2015/09...

http://www.naic.org/images/capital_markets_archive/2013/1305...

Whether or not they do a token 0.5% hike (they probably won't), it's obvious that something very basic has changed, and quite likely irreversibly.


Why did you link three pictures of the same graph?

They also don't support your hypothesis that something has changed irreversibly.

Raising the interest rate more than 0.5% would have likely had a negative impact. It's very likely the Fed will raise it again later this year.


Just because the interest rate is slightly above zero doesn't mean money isn't cheap anymore.


There is nothing mysterious about the short, mid or long term effects of artificially low interest rates and quantitative easing.

edit

Here's a link to read about it. https://mises.org/library/unseen-consequences-zero-interest-...


Have you been following the current earnings season? We are witnessing horrible earnings while stocks keep going up and up. All the losses in August 2015/February 2016 have virtually disappeared after the oil price recovery.


You also have a large increase in corporate debt, some of which, maybe a lot, has gone into stock buybacks. It's a way of translating ZIRP into stock price while nothing was created.


Oil is a leading indicator of demand. The fact that it picked up is a sign that demand picked up. Stocks are continuing to rise because investors see the current drop in earnings as a blip, not as a market catastrophe.


In the old days, broad economic demand moved faster than production declines. So with production being vaguely constant, the price told you a lot about main street employment.

In the modern era of frac wells, where depletion rates are like 50% in 2 years, broad economic demand now moves much more slowly than production declines.

That's the problem with secondary recovery... The "balance sheet" looks good in that you'll profit (unless prices tank), but the cash flow is terrifying: you don't dig a well and collect for 30 years like in the old days, you do exotic processes and get high production rates for like two years, then production drops to zilch.

On a large, century size scale it screws up Hubbert graphs. You can ramp up for a century, but extensive secondary production means the decline slope will be just a couple years instead of a nice symmetric century or whatever.

By analogy, it's like switching farmland from an olive orchard to corn production and being surprised at fundamental shifts in the financial sheets. (Yo, it's just another plant, right? Well, there's a bit more to it...)
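A crude sketch of the two decline profiles being contrasted (the decline rates are assumptions picked only to roughly match the figures above, not field data):

    # Conventional vs. frac-well decline profiles, per the comparison above.
    # Decline rates are invented, chosen to roughly match the "50% gone in
    # 2 years" figure; this is not field data.

    def yearly_output(initial, annual_decline, years=30):
        return [initial * (1 - annual_decline) ** t for t in range(years)]

    conventional = yearly_output(100, 0.05)   # slow decline over decades
    frac_well = yearly_output(100, 0.30)      # (1-0.3)^2 ~ 0.49, i.e. ~50% in 2 yrs

    for name, curve in (("conventional", conventional), ("frac well", frac_well)):
        early_share = sum(curve[:3]) / sum(curve)
        print(f"{name:>12}: {early_share:.0%} of 30-year output arrives in years 1-3")

    # The frac profile front-loads both production and cash flow, which is
    # why the 30-year "collect forever" picture no longer applies.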


No, oil has been spiking up due to incessant rumors of production cuts. Even though the latest talk about a production cut was a no-go, oil spiked a bit earlier this week due to the Kuwaiti oil worker strike.


If they can't sell computers in the digital age where software has taken over everything, you can think whatever you want but that does reflect poorly on the economy. Yes, many factors are at play. It is still a terrible sign.


More computers are being sold than ever before. The market is absolutely saturated with computing devices. And aside from the intense competition Intel is facing, they're suffering the same issue that the GPU makers face -- I once upgraded yearly because the gains made it worthwhile. Now I upgrade pretty much only when something fails.

It is absolutely picking a narrative.


Intel screwed up and didn't become a major player in the mobile market. They could have, because they are still the world's best chip maker, but they didn't. The mobile market is experiencing the most growth and also the shortest product lifetimes. I still get a new phone every 1-2 years because the upgrades are worth it, but my laptop needs upgrades less often, and my desktop less often than that. Mobile phone processors today are where desktop processors were a decade and a half ago in terms of growth potential.


Intel has tried, and failed, and tried again in the mobile segment.

They sold off their ARM processors to Marvell. They've made various forays into wireless, though WiMAX didn't work out so well for them.


> I still get a new phone every 1-2 years because the upgrades are worth it

Really? About the only reason why I'm getting a new phone every so often is when the old one stops being updated. (My Nexus 4 lasted 3 years, and it would have done another year.)

Planned obsolescence, if you want.


The phone market is still rapidly evolving. For instance, I have a Nexus 5X right now. The fingerprint scanner on it alone is worth it. It was an afterthought to me when I got the phone, but now it's the single most important feature on the phone to me. The camera is also better than any phone camera I've had before.

Also, I play some games on my phone, and having better hardware helps a lot with that.


What is so useful about a fingerprint scanner?

I could see how an infrared sensor like in that new Caterpillar phone could come in useful, especially if you are in a building trade, but a fingerprint scanner?


You know that it's used for unlocking the phone, right? It's nothing like the use case you're describing.

You have to get a phone with one and use it for a week or so until its use becomes routine to really understand how big of a game changer it is. I subconsciously unlock my phone now every time I pick it up. It's amazing. Every day it saves me probably 60 seconds in total typing in stupid PINs, and those savings add up real quickly.


I'm still using my Nexus 4, that thing has pretty much already set the "good enough" bar we've come to reference for why PC sales are tanking. The only problem is a lack of security updates, but we can't blame the hardware for that.

Incidentally I had a HTC Desire HD before and that became unusable on newer versions, as is the Nexus 7 now. These two clearly didn't have the good enough hardware specs.


Is half of college graduates being unemployed or working a min wage job picking a narrative too? Is pointing out that wages are completely stagnant and not rising picking a narrative too? What information would you need to receive to admit the economy is not healthy?


Yes, don't go studying 17th century Romantic Literature expecting a job.


Romanticism is a 19th century thing.


Obviously that's not what he studied in college.


Might be the joke?


Nah, that would be too clever.


I said 50% of graduates are min wage or unemployed. Not the 1% who majored in romantic literature. You also didn't address the fact that wages paid to employees are completely stagnant.


I think the point went over your head. It is not the system's fault that a person who chooses to graduate in a non-marketable field does not have a job, e.g. a person who graduates in Romantic Literature. Everyone I know who graduated in CS has had no trouble finding a job. One that is not at McBurger.


Where are you getting this 50% number from?

http://www.epi.org/publication/the-class-of-2015/ For young college graduates, the unemployment rate is currently 7.2 percent (compared with 5.5 percent in 2007), and the underemployment rate is 14.9 percent (compared with 9.6 percent in 2007).


Anecdote: I was an avid gamer who would upgrade whenever the price/performance was good enough for my budget. However, I bought almost top-tier stuff back in 2008 and haven't upgraded the core of my system since. I don't game like I used to, but it still does everything I need it to do. I used to do yearly-ish new builds as well.


Me as well. I built an i5-2500K system back in 2010(ish) and it's still my main home desktop. With 32GB RAM and an SSD it's as fast as the machine at work that is two generations newer (or, more correctly, it's imperceptibly slower).

Things have really levelled out for average loads, even developer loads (I mostly do web dev, run Vagrant machines, that kind of thing).

I can't see me upgrading til this thing dies tbh.


Same boat. I built in 2009 and have a Q9450 @ 3GHz and a Radeon 5870 with 8GB RAM and a one-time top-of-the-line Intel 160GB X25-M SSD. Still works great. There's only one thing I don't like about it: with 3 monitor outputs enabled, the idle temp on the GPU is pretty hot (~86C). I'm hoping a newer machine will bring that down. Single output it's about 57C, so a huge difference.

But I got tired of 190F slowly being pumped out of my case and into the room. Its replacement is finally on the way. I preordered Intel's Skull Canyon NUC[0]. Got 32GB of DDR4-2800 memory and a 512GB Samsung 950 Pro PCIe/NVMe M.2 SSD. I'll be daisy-chaining a single DisplayPort cable to 3 new LCDs as well.

Pretty huge leap in performance. It just made sense to stop building new computers and jump on the NUC bandwagon. All I do is development, League of Legends and the rare CS:GO. The ~GeForce 750 performance levels that NUC will provide will be enough. The inclusion of the Thunderbolt 3 port for an external GPU case really put my mind at ease. Not that I intend to utilize it, but I'm glad it's there. Same upgradability as any other machine: SSD/RAM/GPU. The CPU is soldered, but I never once replaced a CPU after building a computer anyway. Other than the few Athlons I killed from overclocking in ~2001.

I'll probably upgrade more often if these new gaming NUCs are as good as I think they'll be. Next upgrade for me will be a 10nm + Thunderbolt 4 NUC. And the final perk: it's all-Intel, so it'll work great with any Linux distro natively. That's worth a lot to me.

Unless Intel failed hard with this thing, which I highly doubt.. it's Intel.. I'm all-in on NUCs from here on out.

[0]http://www.newegg.com/Product/Product.aspx?Item=N82E16856102...


> Me as well. I built an i5-2500K system back in 2010(ish) and it's still my main home desktop

Good pick: it's still one of the faster chips around. However, its TDP is 95W, which is fine for a desktop. Since then, Intel has been concentrating on delivering the same (or, often, much less) speed with lower TDP ratings.

These address the market need for thin, high-priced laptops -- preferably without fans -- that you can use in Starbucks.

You could "upgrade" to a new Intel Core i7-6600U that's actually slower than your i5-2500K but has a TDP of only 15W.


What you folks are saying is mostly true. VR and 4K are the drivers for a whole new compute cycle in the home. I think the current core counts for Xeon are good enough for the cloud (by this I mean that I think CPU isn't the bottleneck for the average EC2, Google Compute, or Azure instance). Anyone care to comment?


For VR/4K the CPU is way less important than the GPU, and we still have some headroom on those even with the existing process. What will be interesting is when everyone else's process catches up with where Intel is now; they've generally stayed out in front of everyone else for a long time (except AMD for a spell).

I'm really looking forward to VR if it catches on, though. Having an insanely high resolution headset so I can dump multiple monitors for programming is a big win; combine that with something that has the portability/form factor of a Surface Book/MacBook Pro and you'd be able to program as capably from a hotel room as at your desk at home/work.

That would be the biggest shift in my work habits since I went from Windows to Linux in the late 90's.

Also, I think once everyone else can get down to the same size as Intel we might start seeing more exotic architectures. Intel has often won with the "with enough thrust a brick will fly" approach to engineering: it doesn't matter if your chip is clock-for-clock more efficient if Intel is operating at a level where they can put 5 times as many transistors down in the same unit area and ramp the clock speed way up.


Good point. Nvidia seems like the exciting company in this regard. I have a bad feeling they're going to tumble because the latest announcements related to Pascal have focused on deep learning instead of 4K for consumers. As far as I know, they haven't even announced their consumer Pascal cards yet. The rumor I've read online is announcements in June.


What would it mean for Nvidia to focus on 4k? Do the new screen resolutions require architectural innovations in the GPU? (Honest question. I'm not a graphics engineer)


Me neither, so take this with a pinch of salt.

Mostly it's about shuttling around 4 times as much memory for each screen as well as 4 times as much processing.

1920x1080 has ~ 2 million pixels. 3840x2160 has ~ 8 million pixels.

Internally, IIRC, this is often done as vectors before being rasterised and having various filters and shaders applied, but that step requires storing multiple buffers etc. It's the same reason a card that will play a game comfortably at 1024x768 will run away crying at 1920x1080, I guess.
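Back-of-the-envelope on the memory side (assuming 32-bit colour and a 60 Hz display; real buffer counts vary a lot by renderer):

    # Rough framebuffer math for 1080p vs. 4K. Real renderers keep several
    # buffers (front/back, depth, G-buffers...), so treat this as a lower
    # bound; bytes-per-pixel and refresh rate are assumptions.

    bytes_per_pixel = 4        # assume 32-bit RGBA
    refresh_hz = 60            # assume a 60 Hz display

    for name, w, h in (("1080p", 1920, 1080), ("4K", 3840, 2160)):
        pixels = w * h
        frame_mb = pixels * bytes_per_pixel / 1e6
        scanout_gbs = frame_mb * refresh_hz / 1e3
        print(f"{name}: {pixels / 1e6:.1f} Mpixels, {frame_mb:.1f} MB per buffer, "
              f"~{scanout_gbs:.1f} GB/s just to scan one buffer out")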


I can see how higher resolutions will require the GPU to have more memory or more FLOPS or both. I guess my question is, how, if at all, are the improvements required to support 4K different from improvements that target something like deep learning performance?


Not quite, about that last part. Clock rates have been more or less stationary over the last decade or so. What Intel has been focusing on for some time is cache, and how to practically always have the right data in the cache at the right time. But yes, having more transistors to work with does allow them to pack more cache onto the die.


Yeah that wasn't perhaps very clear, I meant historically they ramped the clock up (back in the P4 days).


And ran into a brick wall, iirc.


We still need more CPU in the cloud. There's an almost unlimited appetite, and we still often look for optimizations to our programs that reduce CPU load. (At least at Google. Not sure how much I am allowed to say in more detail, though...)


Touché. I've also been involved in building clouds, but ours was focused on customers building web apps. Those are probably not compute bound. I can see why data processing workloads would need more CPU.


> VR and 4K are the drivers for a whole new compute cycle in the home.

My i7 desktop will be four years old in June. I've done the research and all I need to run the Oculus or HTC Vive (or the upcoming games that look good) is a graphics card upgrade.


As a software developer I constantly find myself optimizing and scaling across CPUs. Sure, those programs I optimized 10 years ago run blazingly fast now. But as computers got faster, the possibilities and expectations also increased.

I would like to compare CPUs with roads: roads increase traffic, not the other way around. So if you make a faster CPU, the demand for a faster CPU increases.

So if Intel stops making faster CPUs, the demand will (unexpectedly) go down.


Good point, except Facebook is targeting 6K per eye. People are definitely going to buy hardware, but maybe not in huge numbers.


That's what I've got as well. I'll probably build a new machine (actually, I'll just buy a NUC) this Spring. Another big factor in not upgrading is the effort involved in setting up a new Windows machine. I'm sure I'm in for at least 30 hours on that front.


That's precisely the "issue", and it's only a problem if you make it a problem. The early-adopter and rapid-early-development churn is slowing down because processor technology is maturing and has reached full market penetration, so the market for new devices is shrinking. It's like a hotdog stand calling it a "market crash" when they've sold everyone a hotdog and nobody's hungry after lunchtime.

The automotive industry faced the same issue a long time ago, and their solution was to turn cars into a fashion statement. You no longer upgrade your car because your old one is worn out, you upgrade to signal your wealth through conspicuous consumption. The mobile phone market is somewhat the same way. The desktop PC market? Not so much.


Ain't that a bit sad, all that energy and pollution going towards planned obsolescence so that he with the shiniest toys wins?

There is some legitimate utility derived from novelty -- I don't want to be walking around in my grandfather's battered rags either -- but I wonder if some day we can shift towards something more sustainable and less paying to dig consumerist holes then paying again to fill them back in.


Yeah, I never went in for replacing things just for the 'new car smell'. That's probably why I drive a nearly-30-year-old car. :)

The point about your grandad's battered rags is that they're battered rags, though, not that they're old. His good leather jacket is probably still perfectly serviceable today.


My leather jacket is 20 years old and in great shape.


Maybe they should adjust their business to account for this new cycle of usage, and not fight it


That's why they're cutting 11% of their workforce.


I must say, I'm surprised by how much their workforce swelled. Why did their workforce rise so much after 2009? Was this in the Bay area? Did this rise bring in additional revenue/create new business lines?


One of the things mentioned in the story is that the "period of [staff] growth includes its two largest acquisitions, McAfee in 2011 and Altera last year".

I'd assume these acquisitions added to revenues.....


The PC isn't going anywhere, but the need to update every 30 months has gone away.

It's a sign of market maturity. Intel shouldn't be trading at a premium if it's selling a highly predictable, stable product to a consolidating marketplace.


Intel has always sold predictable, stable products. But the rate of increase in the core value delivered by their primary product line has slowed dramatically.


Computing devices are selling like hotcakes, far more so than ever before.

But most of them are full of ARM chips fab'd by companies not named Intel.


>>Computing devices

Hmm... only if I kid myself that mobile phones (even the smart ones) are actually computing devices on which "I can compute whatever I wish to compute AND the way I can compute on a PC". Most of the ARM devices sold are NOT computing devices for the general person who purchases them.

It reminds me of the fallacy of calling smartphones "supercomputers in one's pocket".

Maybe the ARM-based devices are "computing devices" for companies like Samsung, Google, or some carmakers, but certainly not for the general public.


"sure UNIX is nice, but you need a mainframe for REAL work" "sure NT is nice, but you need a UNIX workstation for REAL work" "sure mobile is nice, buy you need a windows PC for REAL work"

I crunched to inbox zero on my phone while commuting to work, and now I'm typing this at my "real work computer".


I read that statement more in terms of the freedom you're allowed to build and run applications for such devices. Mobile devices usually discourage the user from doing this.


> I read that statement more like in terms of the freedom you're allowed to build and run applications for such devices

Even if we disregard how arbitrary that definition of "computing device" is, it's still a false proposition. Anyone can easily get a browser, a word processor and spreadsheets on their mobile device or tablet (ARM). That's about as much computing as a significant chunk of the population requires.

I have a Debian chroot on my Android device, which gives me Vim, GCC and javac out of the box.


How much "computing" do you think the average user does with a PC?


How much photosynthesis do you think the average planet does with its star? Enough to justify chloroplasts?

How much complexity do you think the average mind generates with its time? Enough to justify the ATP?

How much thinking do you think the average human does with his brains? Are brains worth it? Stay tuned.


Given the growth in trends such as ARM for mobile devices and dynamic resource allocation on the cloud, I still don't think you can make that broad a statement.


Can you make the statement that the economy had nothing to do with it? A business' success or failure is reliant on the economy on a fundamental level. I imagine if college graduates weren't struggling to find jobs and pay off their massive student debts they would have sold more computers, despite the trends working against them.


Because it isn't relevant. This is the impact of moving to the cloud. Those broke and indebted college kids are buying $800 iPhones instead of $800 PCs, because they don't need the hassle.

I'm a power user. I have a 4 year old MacBook and a 2 core * 4GB VDI session. I won't replace the MacBook for another 18 months.

Even in enterprise IT... Every project we did in 2004/2005 required a server purchase with two Xeon sockets and a NIC that might be Intel.

Now the average marginal unit of virtual server needs 1/50 of an Intel Xeon. And a lot of those Xeons are at AWS, which squeezes the margins.


They're focusing on being "the cloud", which definitely makes business sense when you consider that 10% of the world's electricity goes to data centers. Microsoft's next US-East Azure building will be a mile long! I think Intel will do fine if they focus on that market instead of PCs.


Yes, but most of those "computers" (that people do keep buying) happen to be smartphones these days.


What are you going on about?

Both Intel's sales and profit are even YoY in 2015, and up from 2013. This doesn't have much to do with computer sales.


> state of the PC market

This is funny because the margins on ARM processors are razor thin since everyone and their grandma compete with each other. Meanwhile Intel can charge a healthy markup for their server processors, which are increasingly in demand as more people on the planet have smartphones.


I don't see why job losses are a bad thing. The last generation of machines was built so well that people don't need new ones. Good! This should be a great thing.

Valuing growth over sustainability creates these backwards-ass goals of hiring more, building more and growing. The way this is spun is pretty horrible.

Amazon and Wal-mart losing is a good thing for the environment and society as a whole.

You yell, "But people lose their jobs..." There are more jobs, plus why does everyone need a job anyway? Can't people in Silicon Valley make $150k ~ $250k a year, everyone else get a minimum income, and we all work together to make better art, smaller factories and a world that will last much longer by recycling and rebuilding and not needing to buy an endless supply of shit?!

"Ending is better than mending. The more stitches, the less riches." -Brave New World, Huxley.


> You yell, "But people lose their jobs..." There are more jobs, plus why does everyone need a job anyway?

This is kind of silly. Job losses are a bad thing precisely because we don't yet live in a world where you can get by on a guaranteed minimum income without a job.

If you think it's worthwhile to push society in that direction, then great, but don't kid yourself that believing in it is the same as having already accomplished it.


Some European countries have already achieved this; your unemployment benefits after getting fired are a significant fraction of your previous salary, and slowly ramp down. You pay for it in taxes (and then some), but at least you know you can still make rent if you get canned.


You also pay for a bureaucracy to perform administration and fight abuse. That consumes a fraction of the system's resources. It's hard to argue that this adds much value, or any.

What this system boils down to is a compulsory savings scheme. So an alternative would be a voluntary savings scheme, which would have higher efficiency by not consuming resources for administration.


Of course, the obvious downside to a voluntary savings plan is that it isn't effective at actually creating savings. We know this because in the United States we don't have compulsory savings plans, and thus have a voluntary system. So how's that working out? Not well. Approximately half of all Americans save 5% or less of their income[0], which isn't enough for emergency situations. Another study in 2013[1] said 27% had no savings at all. Now, the main reason for this is quite simple: a lot of people don't make enough to have anything to save.

[0] http://money.cnn.com/2015/03/30/pf/income-saving-habits/

[1] http://money.cnn.com/2013/06/24/pf/emergency-savings/


I think maybe only the Swiss here have a really sustainable model; the others I know are the typical socialist let's-make-more-and-more-debt-with-unsustainable-social-benefits kind (happy to learn if it works somewhere else and can keep working in the foreseeable future without creating huge debt, forcing people to work into their late 60s, etc.).

What the Swiss specifically have: we all pay social deductions, within 3 pillars. If you end up unemployed, you are entitled to 1-2 years of unemployment benefits at 70% of your former income, with a reasonable cap. That is, if you worked 100%, on a permanent contract, for at least a year in a row. Otherwise you're not entitled, or the payments are much lower.

You have to actively search daily for a new job and prove it to your unemployment counselor, otherwise they will cut benefits. The benefits are really structured to pay you while you are looking for a job, not to fund some extra-long holiday on everybody else's budget.

Motivating setup, and since there is a safety net of that 70% of my income, I am not frightened by the prospect of losing my job and needing to store enough cash to live at least X months/years without any income. Handy when one has a big fresh loan on his shoulders.


There's nothing particularly Swiss about it, similar schemes exist throughout Northern Europe.

In Norway you can get 1-2 years benefits at about 60% your previous annual income with a cap near median national income.


These are nice schemes, but the caps are worrisome. Modern professionals in North America pile on so much debt (student loans of a few 100K, mortgage debt of a lot more, etc.) that the country's average income doesn't reflect reality. In the Bay Area, I hear laments from people making 200K that they cannot make ends meet. It is a crazy world.


You can have your student debt repayments frozen here while unemployed, at the national bank's rate. Plenty of people with high mortgages too, but duh. We aren't living in a perfect world.


Right, they are obviously bad for the people losing their jobs and for the society. But are they bad for Intel? I think that was his point.


>I don't see why job losses are a bad thing.

You'll see it if you lose your job and can't find a new one, as happens to millions of people the world over.

Add a medical or family emergency to that, and you'll totally get it.

Unemployment in the real world is not about some sci-fi automation fantasy. It might be one day, but not yet. And even when historically it leads to new jobs or other benefits (not always a given), the pain and trouble is still there for those unfortunate enough to be left on the wrong side of those new jobs.


The more jobs are lost, the quicker the demand to fix it, with Universal Basic Income or something similar. We know it has to happen eventually, but the sooner we man up and go for it, the sooner millions of people can get relief from their suffering.


> I don't see why job losses are a bad thing.

As long as we move quickly to "you don't need a job to live" because otherwise these job losses are destroying people's lives.


Unless these folks had degrees in Working At Intel, losing a job is not the same as becoming permanently unemployable. Job loss sucks. But it's pretty hyperbolic to say it's destroying peoples' lives. Particularly for skilled workers, finding a new job isn't all that far out.


>Unless these folks had degrees in Working At Intel, losing a job is not the same as becoming permanently unemployable.

A 50-year-old Intel employee might find it quite difficult to get a new job any time soon. Not to mention that where they live they'll have some other thousands of ex-colleagues, recently fired, looking for similar types of jobs.

We're talking about thousands of people -- few of them will be highly sought-after chip gurus. In fact, obviously it's the less sought-after ones that will be let go.

Add a recent mortgage, medical issue or kid at college, and there go their pension savings (or worse). Have the same layoff happen to their spouses near the same time (which is not that rare), and they're pretty much screwed.

Really, it's as if people have no real world experience with these things...


That depends: if people can apply for redundancy, you can lose the good people.

Most of the UK's mobile industry was staffed by ex-Cellnet/BT people who took redundancy with some massive tax-free payouts; senior guys could get 5-figure payoffs and an extra 6 years on their pension.

The head of Vodafone in the UK was an example, and as someone said, at his level they threw in a gold-plated wheelbarrow for you to take your cash away in.


This is largely a problem because of the pitiful savings rate in the US. If you've been saving 0% of your pay, being laid off at 50 is a disaster. If you've been saving 30-50%, this is likely to be a non-issue. Highly paid people in this country spend way too much of the money they earn.


Not only in the US, and not only because of being "highly paid" but "spending too much".

In most of the world, including large parts of western Europe, unless you're upper middle class and higher, it's either hand-to-mouth or smallish savings (nowhere near 30%).


Which parts of western Europe?

I see via Google that the minimum wage in Germany is 1473 euros per month for a full-time worker.

According to [1], food, housing, necessary household items, and public transportation look like they could reasonably total less than 800 euros per person. Add another 100 for having fun and for the occasional wardrobe expense, and you still get to save nearly 40%.

Having children ruins all of this, of course, but that's pretty easy to avoid.

[1] https://www.expatistan.com/cost-of-living/berlin
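The arithmetic behind that, with the figures above (rough cost-of-living estimates, not a real budget):

    # Checking the claim above; the cost figures are the rough estimates
    # from the comment and its linked cost-of-living page, not real data.

    net_income = 1473      # quoted German minimum wage, EUR/month
    essentials = 800       # food, housing, household items, transport
    extras = 100           # fun + occasional wardrobe

    savings = net_income - essentials - extras
    print(f"savings: {savings} EUR/month, i.e. {savings / net_income:.0%} of income")
    # -> about 39%, the "nearly 40%" figure.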


Germans save around 9 to 11% of their disposable income, of which the average (post-tax) amount is around 25,000-30,000. This puts their annual savings at around $2,200 to $3,300.

That's nowhere near being able to save 30-40% of your pay, as the parent said. And Germans are some of the biggest savers in Europe. For places like France, crisis stricken "PIIGS" (Italy, Spain, Portugal, Greece etc) and especially the UK it's even worse.


There's a large difference between being able to and choosing to save that much. I didn't say that it was the norm, but it's certainly possible if you have a near-median salary. Not doing so is a choice if you make a reasonably good salary, albeit an extremely common one that most people don't make consciously. It requires consciously tracking your spending, and being willing to forgo or at least cut back on some of the luxuries that have become much more common in the US and presumably Europe in the past 30 years (like eating out frequently). In return, you can get better financial security, even without an astronomical salary.


You paint a contrived scenario and then lament it. What is the purpose?

Where do you have stats showing all of these job cuts will happen in the same location? Where are the stats that show when a person loses their job, their spouse is also likely to lose their job?


It is not a contrived scenario. My dad was laid off 5 years before retirement, and guess what? All his skills were tied to working in the Intel organization, and nobody is going to hire a senior manager in Folsom a few years before retirement. Please try to be a bit more compassionate; it actually affects real people.


The "contrived" scenario is something that has been played again and again. as opposed to the "no worries" attitude.

As for the same location: typically companies have offices in a few locations -- not individual people equally distributed around the country.

Plus, who said that their spouse is "equally likely to lose their job"? It's just an example of a thing that can happen, and does happen, not something that anyone claimed inevitable or "as likely"...


20% of suicides are linked to unemployment.


you have "turtles all the way down" of bubble mentality


A relative few Silicon Valley elites living like kings while everyone else lives in a tiny box trying to raise their families on their meager Basic Income stipend doesn't sound like a utopia to me.


How does that work? If I have a meager income I'm not buying an iPhone or a desirable target for ads from Instagram.


Apparently it works just as djslumdog says -- people like him/her will continue doing the small amount of important work that can't be automated, and will be compensated handsomely to do so. They will fly around the world, enjoying vacations, live in large homes, eat lavish meals.

Everyone else will live in 100 square foot apartments in Soviet-style concrete bunkers, eating ramen (or possibly Soylent). But they won't need to work! They'll spend their entire day painting and composing music!

A lot of people here push this vision, and of course they always imagine themselves to be part of the elite. It really is insufferable.


Poor people are among the most overweight because food is so cheap in the US. I don't see why implementing Basic Income would magically make food expensive or make people have to "eat ramen or possibly Soylent."

Anyway, the world you are so worried about is already here. A small number of people do continue to do important work that can't be automated. Most other work is unnecessary busy work. Artists and musicians already live off of government largesse, either by taking "McJobs" that are artificially highly paid because of the government minimum wage, or by actually being on welfare.

Since people can't wrap their heads around the concept of Basic Income, we end up with hordes of people paid to do stupid shit that doesn't matter. The endgame is that we're all government employees in a Dilbertesque hell. Moving papers from one pile to another and back again. Digging holes and then filling them again, since letting people decide what to do during the workday would be too disruptive.


Using the large number of overweight people as a sign that healthy food is available cheaply is an amazing feat of backwards logic IMO. A significant cause of being overweight is a diet that is too imbalanced, where your body is missing certain important nutrients while getting too much of others. That is the exact opposite of healthy food. And in fact, the "cheap food" most people consume is exactly that: fast food and high-calorie snacks. That sounds a lot more like the concrete-bunker-and-ramen theory again...


If you supplement with protein powder (which is also cheap), count calories, and lift weights, fast food is just fine for not getting fat. Lots of people on /r/fitness have done McDonald's or Chipotle.


> The endgame is that we're all government employees in a Dilbertesque hell.

I once heard about the British Colonial Office. The time when it had the most employees was precisely the time when it was dissolved because Britain did not have colonies anymore. They were all just shuffling papers around all day.

I sadly don't have a source for that. Does anyone have a link, by any chance?


> Does anyone have a link, by any chance?

C. Northcote Parkinson, 1955. http://www.economist.com/node/14116121

"A glance at the figures shows that the staff totals represent automatic stages in an inevitable increase. And this increase, while related to that observed in other departments, has nothing to do with the size - or even the existence - of the Empire."

What I don't know is where Parkinson got his own figures from. Some of them are cited in the article, but I think not this one.


But what's the alternative? Don't give anyone even a meager means of living? Communism? (That worked out well.) In the coming years, tons of people will be out of jobs and incapable of participating in the shrinking, not-yet-automated-away economy. What do we do?


You don't get it? Basic income IS communism; we/they also had a guaranteed income because everybody had to have a job (= income), even if the job didn't make sense.

Elites (political elites under communism vs. the people still actually working in the BI scenario) have an amazing life with various luxuries. The rest will be allowed to exist, not die of starvation, and have a lousy place to sleep. Truly a bright future...


You haven't answered the question: what is the alternative?

At least with BI, you don't have to toil away at a time-wasting job that's just busy-work: you can do something you enjoy, and maybe we'll all get lucky and you'll write the next Harry Potter series and become a multimillionaire, while paying a bunch of taxes to keep the system going. If not, at least you have the dignity of not digging holes and then filling them back in all day long to justify your paycheck.

If you don't like this, then let's hear your alternative. The only alternative to this that I can think of is to ban automation, or strictly regulate it so that any automation which does a job that humans can do is banned (allowing only automation which does things that humans cannot do). Do you really want to go that route?


Pedantic, but an iPhone is probably a way better purchase than most things if you are low income.

For $400 you get a device that will let you watch movies, play games, check email, get phone calls, etc.

Granted you could also get a cheaper phone but this replaces your TV and computer! Smartphones are really high-value!


Why do you think only an iPhone can do that? Pretty much any $50 used smartphone can do all those things.


Honestly, $50 smartphones tend to suck a lot. Even the highest-end Firefox OS phone for a while had major keyboard lag. Chances are trying to use things like Skype/VOIP over them would be painful....

I think if you're looking to invest in one useful thing, a good phone would be a great purchase. Maybe the Moto G at $200?

Amortized over a year, you're talking about a lot of bang for the buck.


I still use a Samsung Galaxy S3 with CyanogenMod that I bought for 60€, and it still meets my needs. If you are on a budget, buying a new, shiny smartphone is a waste of money imho, as you can get a cheap, used but fully functional one for a much lower price.


So do I, on CyanogenMod 10 (Android 4.4.4), and it handles all web/music/video needs. I guess the latest games, which you wouldn't really want to play on a mobile anyway, might have issues.

The S3 generation of hardware has stood up very well.

My only reason to upgrade would be for some good security hardware that can support full disk encryption with no slowdown.


>> watch movies???

How do you expect that person with a meager income to pay for the network bandwidth and usage bill?

>>Smartphones are really high-value!

This I agree with wholeheartedly. At least for people like me, who enjoy reading books, smartphones (with sites like Project Gutenberg) provide such a utopia that I feel like I'm living in heaven. Of course, I know that smartphones don't solve all problems automatically, but from a knowledge-seeker's PoV, smartphones are of great value.

Also, I understand that not all people like to read, but that's a different thing to ponder upon.


Don't worry. For BI to occur, the kings will first fall. Remember, they chose this course, and will follow it until the end; they always do.


It's important to distinguish "a bad thing for the economy in general" from "a bad thing for an individual's share of the economy". Job loss without a corresponding drop in production is a good thing for the economy as a whole: it's getting what's demanded more efficiently. It's terrible for the unemployed individuals in question, since they lost the leverage they were using to extract a larger share of the total economic output.


Yeah, but what's the societal point of efficiency when wealth has become so concentrated, that the economic ladder has been pulled up?


While that might be a nice long-term view, it isn't fun to go through the entire process for the average person; heck, it's not even pleasant for many of the "1%ers".

If a lot of companies suddenly go bust that are today supposedly getting in the way of innovation, it might be good for long term innovation, but in the short term, say 10-15 years it can really be hell. Not to mention that in this 10-15 year period while the economy is hemorrhaging it makes the entire economic ecosystem very vulnerable, and significantly reduces its capacity to deal with external attacks.

That is, while US companies may start failing, who's to say a foreign company won't take their place? Then you have to start taking isolationist measures, which is a whole other mess.

The ideal scenario would be to see these companies slowly fade out over a period of decades and be replaced by others, which I honestly don't see happening as of yet in SV, but things like this generally do come out of left field.

So if we see these companies shrink rapidly, with no one else to take the mantle, I honestly think it would be a dangerous situation as it can be a catalyst for something bigger.


>The whole market is overvalued, not just the tech unicorns.

Yep. That's what Peter Thiel is saying as well: https://news.ycombinator.com/item?id=11485376


I am not surprised. I am a huge fan of Thiel, his book Zero to One was a real eye opener for me.


Aside from whether he's right or not, that is the belief one would expect him to hold; his only other option is "looks like I'm a bad investor after all".


One can't be a bad investor and learn from their mistakes, presumably?


Yes one can learn but when one's investment is correlated to how one talks about it, best to keep those lessons to yourself and make better investments in future.

"My investments are over valued, oh the lessons I have learned"

"All investments are overvalued, oh the lessons we will learn"

are two very different statements.


You should start a new index, called the 3 index, that just looks at 3 companies. That should be enough to gauge the market, right?


The DJIA only has 30 which is always surprising to me (to its credit it tracks with the S&P 500 pretty well).


The DJIA is also share-price weighted. It's an objectively bad index and I find it surprising that it's still widely followed.
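
To make the weighting point concrete, here's a minimal sketch (all numbers are made up) of how a price-weighted index like the DJIA and a market-cap-weighted index like the S&P 500 can tell very different stories about the same two companies:

    # Toy illustration of price weighting (DJIA-style) vs. market-cap
    # weighting (S&P-style). All numbers are invented.

    components = [
        # (name, share_price, shares_outstanding)
        ("MegaCorp",  50.0, 5_000_000_000),   # huge company, modest share price
        ("TinyCo",   500.0,    10_000_000),   # small company, big share price
    ]

    # Price-weighted "index": just the average share price (divisor ignored).
    price_weighted = sum(p for _, p, _ in components) / len(components)

    # Cap-weighted "index": average price weighted by market capitalization.
    total_cap = sum(p * s for _, p, s in components)
    cap_weighted = sum(p * (p * s / total_cap) for _, p, s in components)

    print(f"price-weighted level: {price_weighted:.1f}")   # TinyCo dominates
    print(f"cap-weighted level:   {cap_weighted:.1f}")     # MegaCorp dominates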


>yet, the Dow just hit a 9-Month high, and the S&P500 is >now above 2100.

It's because bad news is good news in the twisted, upside-down world of speculating on the Fed's interest-rate policy direction via the stock market. Expectations of rate hikes get pushed further into the future with every bad economic indicator released, and there are many (retail sales, auto, housing, industrial production, rail traffic, oil and gas, capex). I'll continue to wait for the day the market finally figures out that interest rate policy frameworks are broken, the Fed has no more ammunition, and the stock market jumps out the window rather than taking the stairs. At that point, I'm buying lads.


What's a lad? As someone who instinctually believes this is a house of cards, understanding how to prepare financially is a real challenge.


A group of males, like a flock of swans ;)


Deposit it in an FDIC-insured account or US Treasury securities of short duration.


I love it when people make doom and gloom predictions. If you are so convinced, put your money where your mouth is. Short the S&P500 with everything you've got and post a screen shot here.


I've sold all of my stocks and RSUs in anticipation of a crash.


>The whole market is overvalued, not just the tech unicorns.

Who is it overvalued by, and how is the "true" value determined?


Suppose the market just tracks widget factories. Every factory produces the same quality widgets, all shipping is flat-rate by distance and there are no tariffs or taxes. In this market, it's easy to determine market value of any particular widget factory: market share of world-wide production.

Suppose everyone expects the demand for widgets to grow by 1% per year for the next 10 years. This growth might get priced into the market with some sort of net present value calculation. Capital will flow into building new widget factories to fund all of this new construction.

Now what if actually an asteroid hits a city with lots of widget demand. 50% of demand is wiped out instantly. Well in this case, everyone was overvaluing the market for widgets! All of those net-present-value calculations were way off. The market will correct.

Or maybe the asteroid takes out 50% of production. Suddenly much more investment in widget factories is required. The price shoots up to capitalize all of the construction, since demand has not changed.

In either case, before the asteroid the market was not correctly valued since no one had yet priced in the asteroid.
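
For what it's worth, a minimal sketch of the kind of net-present-value reasoning described above, with assumed numbers (1% growth, a 5% discount rate, a 10-year horizon); the point is just that when expected future cash flows change, the "correct" price changes with them:

    # Rough sketch of how expected growth gets "priced in" via net present
    # value. All inputs are assumptions for illustration.

    def npv(cash_flows, discount_rate):
        """Discount a list of future yearly cash flows back to today."""
        return sum(cf / (1 + discount_rate) ** (t + 1)
                   for t, cf in enumerate(cash_flows))

    base_cash_flow = 100.0          # this year's widget profits
    growth = 0.01                   # 1% expected demand growth per year
    rate = 0.05                     # discount rate

    expected = [base_cash_flow * (1 + growth) ** t for t in range(1, 11)]
    print("value with 1% growth:", round(npv(expected, rate), 1))

    # Asteroid wipes out 50% of demand: every future cash flow is halved,
    # so the valuation drops by half and the market re-prices.
    shocked = [cf * 0.5 for cf in expected]
    print("value after demand shock:", round(npv(shocked, rate), 1))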


It wasn't really overvaluing the price of widgets.

Valuation only includes all available information.

You can't expect random acts of God to be priced in.

Overvaluation is when the psychology is that everyone is bullish, but the facts don't back that up.


This is what I was getting at. "Valuation" of "the market" is far more subjective and psychological rather than objective and empirically measurable.


The actual information from markets is in estimated future prices, not the current bid/ask spread.

The true value is determined by what happens next, for short to medium values of "next."

Current fundamentals simply don't support a future of continuing growth.

A correction is certain. The only questions are when, how fast, and how much.


>Bubble

Four companies don't represent the market. Come on. I swear some users here will wish bubbles out of the air to gripe about.


I don't think anyone above said "bubble". Overvalued is a different thing: markets are usually either over- or undervalued, and are probably overvalued today. Bubbles proper are kind of rare, the last major clear one being the 2006 housing bubble. You could argue there's a bond bubble presently.


You're right - claiming the markets are overvalued like the unicorn bubble isn't literally saying there is a bubble; it's just implying there is.

There is most likely a corporate bond bubble, as we know from representative market signals. With the S&P 500 though, four companies are not representative signals.


Yup, overvalued. The GMO report, which is about the best source I've found for this stuff, had the projected 7-year return for US equities at -2%/year, which is not great. https://www.gmo.com/docs/default-source/public-commentary/gm... p9


Asset inflation caused by the Fed flooding the banks with cheap dollars.


That's what happens when you add more money to the system. It flows to highly liquid assets first.


Well, there's no point in putting your money into bonds right now. For all the interest you get you may as well use it to stuff your mattress.


To be pedantic: the market is perfectly valued at every moment. Of course market prices will change, sometimes very quickly.

BTW, this is a great site for tracking S&P ratios: http://www.multpl.com/


The efficient market hypothesis (the idea that the market is by definition perfectly valued at every moment) is highly controversial, and actively rejected by many finance and economics experts of all political views.


My point is simply a tautology: the price is the price. Now the price may change, someone may have their thumb on the scale, there may be shill bidders driving sentiment, etc. But at the moment you get a bid/ask spread, that is what you get.

Practically speaking, I personally don't believe the efficient market hypothesis due to the very obvious meddling by political actors, central banks, national treasuries, etc. Behavioral finance has also consistently demonstrated that humans do not react rationally in markets.


That may have been your point, but it's not what you said. "Perfectly valued" means that the price is what it should be (for some definition of "should").


"Perfectly valued" to me means that at that moment, the price reflects the actions of everyone. It includes the actions of all buyers/sellers who choose to participate and not participate, the regulators, as well as the meddlers.

My point is a pedantic one referring to the OP using the word "overvalued". The market price is never over/under valued. It is merely the market price at that moment.


Your definition of price and value, while correct to a certain standard, gives no illustrative power, nor does it lend itself to analysis or discussion.

As you said, it is merely tautological.


Your point sucks.


Please add 'capitalists' to that list of meddlers, to be fair. ;-)


Not the OP, but those finance and economics experts are contradicted by Gottfried Wilhelm Leibniz and his belief that we live in the best of all possible worlds.

In other words I think it's futile to try to attach moral attributes (for example saying that a market is "best" valued at the moment or not) to things like financial transactions, which don't have any intrinsic moral values associated with them. Otherwise we risk talking about long-dead German philosophers when trying to illuminate the problem.


You can thank the Fed for that. I always wondered why Janet Yellen worries so much about the stock markets. I mean, isn't it her duty to look at the economy and make decisions? Instead, all I see are ways to placate the market by postponing things forever.


US Steel just got rid of 25% of its non-union workforce. ATI just announced that it would be laying off 1/3 of its non-union workforce.

Carrier is moving manufacturing operations to Mexico.

We're on the verge of something really big and really bad.


Of course it is... it has been and is being totally propped up by the Fed monetizing debt with equity purchasing, i.e. QE I, II, III, with QE IV being considered soon [1].

This is nothing but an asset bubble that is sure to burst someday, akin to what the gov't, the credit agencies, and Wall Street did with housing and mortgages back in the 2000s, which led to the meltdown and TARP.

[1]http://www.safehaven.com/article/39140/jobs-report-moves-fed...


How can the whole market be overvalued? Prices are relative, no?


It works like this: imagine you have a huge amount of money, enough to buy whole companies like Apple, Coca-Cola and Walmart. You want to keep your money and even make it grow, so you're looking for good businesses to buy.

I have one company for sale at USD 329 billion; it grows 50%-100% a year for now and currently generates USD 3.29 billion.

I have another company for sale at USD 219 billion; it is shrinking at 1.4% a year and currently generates USD 14.69 billion.

And finally I have another company for sale at USD 200 billion; it is growing/shrinking between -8% and 60% a year and generates USD 7.35 billion.

Most people prefer the second one, which makes you a good 7% return a year. The first one looks good, but will take at least 6 years to be as good as the second, and the last one is so-so. The companies were 1) Facebook, 2) Walmart, and 3) Coca-Cola. (We're ignoring how many assets and liabilities they have for simplicity.)

And as you see, even though Facebook has a promising future, we don't know if it will reach as good a value/margin level as Walmart, so we may agree it's overvalued at its current price.

Hope this sheds some light. I'm not a trader or anything, so this info is very simplistic.

P.S.: This analysis is called fundamental analysis; you can go deeper, and honestly it has worked very well in my portfolio. I bought stock in Gerdau (GGB), a Brazilian steel company, because I studied it as a whole business and discovered it was priced very low. That was back in November; I bought those stocks at 1.22 and recently they reached 2.44. I basically doubled my money in 6 months. Remember, stocks aren't just tickers; they are little parts of a big business.
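
For the curious, here's a rough sketch of the same comparison in code, using the ballpark figures above. The growth rates are simplifying assumptions (50% for Facebook, -1.4% for Walmart, and an assumed 5% for Coca-Cola), so treat the output as illustrative only:

    # Back-of-the-envelope fundamental comparison: earnings yield
    # (earnings / price) and a naive "years to earn back the purchase price"
    # if the assumed growth rate continues.

    companies = [
        # (name, price in $bn, earnings in $bn, assumed annual growth)
        ("Facebook",  329.0,  3.29, 0.50),
        ("Walmart",   219.0, 14.69, -0.014),
        ("Coca-Cola", 200.0,  7.35, 0.05),
    ]

    for name, price, earnings, g in companies:
        yield_pct = earnings / price * 100
        # naive payback: accumulate growing earnings until they cover the price
        total, years, e = 0.0, 0, earnings
        while total < price and years < 100:
            total += e
            e *= 1 + g
            years += 1
        print(f"{name:10s} earnings yield {yield_pct:4.1f}%  naive payback ~{years} yrs")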


That's very interesting, thank you for posting this!


sure, man. glad it helped you.


In one sense yes — if the sum total of the market is '1' and everyone is just some fraction of that. Then there can only ever be relative price changes.

But the sum of all prices does in fact vary. I'm sure you're familiar with Tulip Mania. Prices can be bid ever-higher in a cycle, and they can also decline in tandem as in the depression.

For the entire market to have an expanding total price, new money has to be entering the market, or the net real value of the underlying financial assets has to be declining. Either way, the entire market can become overvalued or undervalued, especially compared to non-financial (or illiquid) assets denominated in the same currency — e.g., wages, energy, or land.


> How can the whole market be overvalued? Prices are relative, no?

Only if you subscribe to the "Greater Fool" theory[0]. On the other hand, if you think the stock price should reflect some sort of intrinsic value, say the net present value of the stream of all future dividends, then it could be the case that you would never recoup your investment with any stock currently.

[0] https://en.wikipedia.org/wiki/Greater_fool_theory


Short answer: No. E.g. if you buy 'shares' in an index (via an ETF), you're effectively buying a weighted basket of the component shares in the index. If the individual shares are overpriced then so is the index as a whole. Hence the S&P500 has a price:earnings ratio, and you can decide for yourself if that ratio is looking 'toppy'.
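
A toy illustration of that point, with invented numbers: the index's P/E is just the aggregate price over the aggregate earnings of the weighted basket, so if the components are expensive relative to their earnings, the index is too:

    # Sketch: an index-level P/E falls straight out of the weighted basket.
    # Tickers, weights, prices, and earnings are all made up.

    holdings = [
        # (ticker, weight in index, price per share, earnings per share)
        ("AAA", 0.50, 100.0, 8.0),
        ("BBB", 0.30,  60.0, 1.5),
        ("CCC", 0.20,  40.0, 2.0),
    ]

    index_price    = sum(w * p for _, w, p, _ in holdings)
    index_earnings = sum(w * e for _, w, _, e in holdings)
    print("index P/E:", round(index_price / index_earnings, 1))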


This is effectively a claim that the dollar is being undervalued.

Quite a few people would disagree.


The market movement can be explained somewhat by the fed's rate hike expectations changing (worsening economic conditions means that the fed is more cautious about raising rates, so discount rates are lower and valuation models are higher). It's also about what the market expected earnings to be. Yes, the tech earnings are beating up the NASDAQ, but people actually expected worse from the financial sector given the very low rates.


If helicopter money is the next step after 0 to negative interest rates, the stock market may be a pretty good deal right now.

Get paid a lot more money, receive double-digit gains on your retirement account year after year -- and be completely fucked in the process, ironically, because prices went up even faster and it's more profitable to simply hoard things than sell them.


"People's QE" seems to be the very last thing they want to do, which I think really tells a story here.

Considering the money the government is still pumping into the market directly, I don't think people's QE will have the same effect. It really depends on how it's executed though.


Healthcare as a sector is doing pretty well. Consolidation down to 3 major players will likely increase their margins further.


Which aspects are you referring to?


> The whole market is overvalued, not just the tech unicorns.

To the extent that Intel and the tech unicorns are complements, Intel's profits declining may actually signal that tech unicorns are undervalued.


Can you explain why the Goldman and MS stocks didn't drop? Overvalued means the stock prices should drop.


I have no opinion on whether GS and MS are overvalued, but here's a fun quote from Keynes. "The market can stay irrational longer than you can stay solvent."


It feels mostly all fake.


Money's gotta get parked somewhere.


Spot on!


hard to understand the market.


This is what the end of Moore's Law looks like.

* Tick-tock is dead.

* 10 nm is severely delayed.

* EUV is severely delayed.

* Significant layoffs in R&D

* The ITRS roadmap is vaguer than it's ever been.

* Giant mergers are up (Intel+Altera, KLA+Lam, etc.), concentrating the industry more than ever.

* And ultimately: A 5-year-old PC still works just fine.

When I say this is the end of Moore's Law, I'm not trying to be dogmatic. Of course there will still be a semiconductor industry and of course there will still be amazing technological progress. But it seems the rate of that progress is slowing, and now the industry is adjusting.


And ultimately: A 5-year-old PC still works just fine.

I suspect that is really the dominant factor here. It's not that the progress within the chip industry has stopped; there is far more number-crunching power in the CPU and GPU in my latest PC than in the one from five years ago. And if you're doing things like playing demanding games or modelling skyscrapers in a CAD package, that progress is probably very useful. It's just that for what most people use PCs for, it doesn't matter, because they were already good enough so unless their old one broke they don't need a new one anyway.

Not only that, but when it comes to replacement, smaller and more convenient devices like smartphones and tablets will do everything a lot of people need without needing a PC at all these days. If you mostly used a PC for things like staying in touch with your friends or retrieving information from a web site, rather than anything creative beyond a quick bit of typing or anything that needs more powerful equipment, you might not even have a laptop any more.

Given that Intel has never had much penetration in the mobile device market compared to ARM designs, it doesn't seem that surprising that the demand for their products is waning as the mass market moves in that general direction.


Moore's law does explicitly talk about semiconductors - with the recent strides in light-based computing (however small) we might be able to continue the overall computational power trend in the future.

I don't think we have to dig this grave just yet.


Interesting point. If we are witnessing the end, would we recognize it at the time? Moore's Law has been something we've become accustomed to, so it almost seems hard to believe that it will end even if we are told that it will.

Makes me wonder what research money will be spent on instead of just faster chips.


Worst case scenario: nothing. The R&D goes away. Look at aerospace for a nightmare scenario.

In the early 20th century we got fixed wing flight.

In the teens and 20s we got motorized fighters and the first passenger planes.

In the 40s we got jets.

In the 50s we broke the sound barrier and orbited Sputnik.

In the 60s we landed on the Moon.

In the 70s we... stopped going to the Moon.

In the 80s nothing much happened except declassification of a few things (stealth) that were developed in the 60s and 70s.

In the 2000s we grounded the Concorde. Passenger flight got slower and more expensive.

In 2016 we fly on passenger planes no faster than what we used in the 70s and 80s, and we're stuck in low Earth orbit.

1969 was the peak of the aerospace industry. With the exception of SpaceX (which is really just picking up where NASA left off), we are less advanced today than we were in the 1960s.

There are many places we could go beyond conventional Moore's Law: multi-dimensional chips, optical, quantum, exotic materials with very low power consumption, etc. But if what we have is "good enough" and there is little demand for anything faster, the R&D dollars won't be spent. If anything, the shift toward mobile computing and wimpy thin-client endpoint devices might actually lead to a pull-back and loss of capability similar to the one we saw in aerospace after the 70s.

The consolidation we are seeing is not a good sign. This is what happens when an industry decides it's now a cash cow and it's time to go out to pasture.

We also could have a base on the Moon and Mars right now and be working on our first interstellar probe to Alpha Centauri. Physics didn't stop us. Economics and politics did.


That's not entirely wrong but you're ignoring the massive price difference for air-travel between the 60s and "slower" 2016.

The price drop in air travel can be attributed to:

1. Technology: cheaper (per seat), more efficient planes.

2. Consolidation of airlines (and airplane manufacturers).

3. The disruption of full-service airlines by low-cost carriers.

In semiconductors we have been getting along with 1 and a bit of 2. I would say that the disruption of Intel by ARM is an example of 3, because Intel is not incentivized to compete at those low price points.


3. The disruption of full-service airlines by low-cost carriers.

In the US at least, this was due to political deregulation. The process was started by Nixon, but finished in law by Jimmy Carter (!), note also the leading lights of the Democratic party in the signing picture, e.g. Teddy Kennedy 2nd from right: https://en.wikipedia.org/wiki/Airline_Deregulation_Act


There's no carbon on the moon. Pretty challenging for a human settlement.


This is pretty depressing.


Maybe it is. But if the general public is happy with what they already have, manufacturers tend to iterate instead of innovate. Getting from Europe to Australia two times faster, for ten times the normal ticket price, makes me settle for the slower option.

Even a 200% increase in CPU performance won't bring the general public back to desktop PC's they don't really need, because they are satisfied with what they have. We need something truly disruptive and get people interested in something else.

Right now there are a lot of experiments (AR, VR, wearables, transparent screens, lightfield, new batteries, etc) which will, probably when combined into a truly attractive package, change everything. Again.


One must understand that this is re-structuring. Intel is probably letting go of some divisions it no longer intends to pursue. In the not-so-distant past, Microsoft did an internal re-structuring when Satya Nadella became the CEO. It isn't as bad as it is portrayed to be as most people end up getting re-hired in other groups or take the severance and join a new company.


and I am not down-voting you, I am re-structuring your karma.

Yes, this may be good for the future of Intel or the employees losing their jobs... but right now, 12,000 people are having their lives affected in drastic, unexpected ways.


I agree that it's likely the majority of those people are in unenviable positions. I've been through cuts; it sucks, but at the same time, as you say, it may be untenable for Intel to keep carrying them. Also, it's possible Intel should have been more cautious in hiring (and maybe some of those people would have opted for personally more stable jobs, or maybe they would have taken worse jobs -- it's hard to say).


It doesn't say that they are firing 12,000 people. Certainly some people will be laid off, but a reduction in head count of 12,000 is distinct from layoff count.

Assuming people stay in a job 4 years, you can get a 25% reduction in headcount per year by just not hiring. Intel has over 100,000 employees. They are likely hiring 10k-25k people per year just to stay at a constant size.
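
The arithmetic behind that, spelled out under the comment's own assumptions (roughly 100,000 employees and an average tenure of four years):

    # Attrition math: with ~4-year average tenure, about a quarter of the
    # workforce turns over every year.

    headcount = 100_000
    avg_tenure_years = 4
    annual_attrition = headcount / avg_tenure_years   # ~25,000 leave per year

    # With a full hiring freeze, headcount falls by attrition alone, so a
    # 12,000 reduction needs well under a year of not backfilling.
    print(annual_attrition)                      # 25000.0
    print(12_000 / annual_attrition, "years")    # ~0.48 years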


I doubt with the CFO being let go most people will end up getting re-hired in other groups. Their revenues are down, they're closing up shop in some areas or realizing they need to run some areas with less people. It's actually a pretty big deal.


the CFO being let go

He's not "being let go".

From the article: Stacy Smith, who has been chief financial officer since 2007, will move to a new role as head of manufacturing and sales

I'm not sure what to make of the move, but if they really wanted to let him go they wouldn't have given him this different role.

Edit: just to elaborate, Intel is first and foremost a manufacturing company. Their manufacturing currently has a 61 percent gross margin. You don't put someone in charge of that if you want to let him go!?

But I'm not an Intel employee or close observer of the company. Maybe some Intel insiders can chime in with that they think this means.


Intel's profit has grown 3 percent and their revenue is up 7 percent compared to the same quarter a year ago.


You're right. I made a bold statement without looking into the details. Looking at their yearly balance sheet, though, their yearly liabilities are going up. They've got quite a bit of debt.

A good 1/3 of their assets are in property, plant, and equipment. The fair market value of that stuff can tank if they don't keep up the pace of growth, say if they can't sell as many of their products anymore because of a shift to other technologies.

This is for sure a move because their revenue forecasts are grim in many areas they operate in.


News like this out of the blue is strange. I was pretty sure that the post-PC era was a sham. It still is, right?


I recently started working on a little e-commerce project. I found a way to indirectly estimate the sales of some of my competitors' products. So I downloaded about 13 GB of this data, processed it down into about 80 million rows, and then ran a bunch of stats on it. Now, 80 million rows is not "big data" scale, but it's not super small either. With my 16 GB of memory (which I think I paid $400 for), my humble quad-core i7 that I paid $300 for, and my 500 GB SSD that I paid $100 for, my less-than-$1k desktop crunched through this dataset with no effort at all. I consider myself a "power" user, and my humble machine can do everything I throw at it, and more. It'll be a long time before I upgrade again, and when I do, I'll probably buy the cheapest processor on the market. I'm probably not alone here.
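
For anyone curious what that kind of workflow looks like, here's a minimal sketch in Python/pandas. The file name, column names, and chunk size are hypothetical stand-ins, since the original data isn't described in detail:

    # Stream a large raw dump in chunks, reduce it to the columns of
    # interest, then run summary stats on the aggregated result.

    import pandas as pd

    chunks = pd.read_csv("competitor_dump.csv",
                         usecols=["product_id", "date", "est_units"],
                         chunksize=2_000_000)      # keeps memory well under 16 GB

    # Partial per-chunk aggregation, concatenated into one Series.
    reduced = pd.concat(chunk.groupby("product_id")["est_units"].sum()
                        for chunk in chunks)

    # Re-aggregate across chunks, then look at the distribution of
    # estimated sales per product.
    per_product = reduced.groupby(level=0).sum()
    print(per_product.describe())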


My 3+ year old Lenovo laptop running an i5 with 8GB of RAM and a 500GB HDD (at 5400 RPM, no less) easily lets me multitask with Word, Photoshop, and 20+ Chrome tabs open.

The only time I've felt my computer to be "slow" was when I tried to use Photoshop and After Effects simultaneously.


Yup. I do plenty of "medium data" work on my overclocked i5-2500k and 64 GB of RAM on Windows 7 / Ubuntu mixed machine. I've had the processor since the 2500k was the thing to have (years ago). No reason to upgrade yet. Got a new GPU after 6 years but that was about it.


This sounds intriguing, care to share any insights? Is this for Amazon?


Yeah, I want to hear about estimating competitor sales too.


Maybe I'll do a write-up in the future. I assume everyone does it already, but on the off chance they don't... I don't see any upside in writing about something that might help my competition.


There are plenty of PCs.

Here's the problem. Running a five-year-old PC used to be an issue.

Today, running a five-year-old PC means running an Intel Sandy Bridge i7-2600K (Passmark score: 8,518), while a modern i7-6700K has a Passmark score of 10,987.

FIVE YEARS, and FOUR generations of processors have created a gain of net 28% in multithreaded situations. Far less for single-threaded applications (maybe 15%). And absolutely negligible for gamers (which are 100% GPU throttled).

If you're running a 5-year-old i7-2600K, there is absolutely no reason to upgrade to Intel Skylake. None at all. Maybe you want a new GPU to play those VR games... but Intel isn't making gains anymore in processor speed.

Intel has been trying to get people to buy their power-efficient designs (Skylake is a hell-of-a-lot more power efficient...) so Intel continues to sell laptops at a decent rate. But no one I know has major issues with their desktop speeds.

The only people I know who have upgraded their computers are those who have had hardware failures. There's still no need to upgrade a computer from Sandy Bridge.


People repeat that meme that SandyBridge doesn't need to be replaced so often. You say it 2 or 3 times yourself.

While that's the prevailing opinion, I don't necessarily agree. I think it's SandyBridge owners trying to convince themselves more than anything else, but really it's just getting swept up in the groupthink.

Skylake is the first chip that makes a very strong case as an upgrade. You gain NVME support for your PCIE SSDs, DDR4 (which has shown an improvement over DDR3 in some benchmarks), roughly 20% IPC improvement (5% per gen give or take), DX12_1 feature level IGP, CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming (Broadwell had it first and Skylake's is improved upon), vastly more power efficient and Thunderbolt3 support.

7 pretty good reasons off the top of my head. You can dismiss each of these if you want, but this is all very attractive in reality.

The whole story is that SandyBridge is only competitive, in gaming, if you overclock to 4Ghz+. You still lose out on the other improvements though and any stock SB system compared to a stock SL system will look pretty sad once you factor in the platform updates.

If it's gaming you care about, take a look at the benchmarks of the 128MB L4 Broadwell chips compared to Devil's Canyon. Let alone SandyBridge. Both get crushed where it counts and Intel is just now getting Skylake 128MB L4 CPUs out the door. If you don't care about gaming, Skylake still crushes SB.


> Skylake is the first chip that makes a very strong case as an upgrade. You gain NVME support for your PCIE SSDs

http://www.amazon.com/Ableconn-PEXM2-SSD-NGFF-Express-Adapte...

If you care about NVMe, just get a $20 expansion card. Besides, NVMe SSDs are expensive. Mushkin Reactor 1TB for $210 yo.

Hell, the fastest NVMe SSDs directly go into PCIe lanes. So if I actually cared about the faster speeds, I'd jump to an Intel 750 SSD.

http://www.newegg.com/Product/Product.aspx?Item=N82E16820167...

Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

Yes, if I had a laptop which only had room for a M.2 card, then maybe I'd get the Samsung M.2 card. But even if one were given to me for free, I'd rather get the $20 PCIe expansion card.

I can't think of a single situation where I actually need the onboard M.2 card on the Skylake motherboards, aside from the $20 convenience.

> roughly 20% IPC improvement (5% per gen give or take)

I admit, this is a good thing. But this is very very little, especially when you consider that the iPhone 5 to iPhone6 jump was 70% IPC improvement AND battery improvement, yet many people don't consider that enough of a jump.

http://www.imore.com/a9-processor-iphone-6s-and-6s-plus-70-f...

Soooo... FIVE years gets you +20% speed, while ONE year gets you +70% speed on phones. That's why desktops aren't getting upgraded.

> DX12_1 feature level IGP

You buy a $300+ CPU without buying a $100 GPU? The cheapest of GPUs are significantly better than IGP. Hell, if I cared about DX12_1 IGP, I'd get an AMD A10 for half the cost and twice the IGP performance with drivers that actually work on games.

Except I game in capacities that far exceed even AMD's superior IGP. I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris. So I have a R9 290X. Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

> CPUs with 128MB L4 cache which absolutely destroys chips that didn't have this for gaming

NOT on the desktop. Crystalwell is laptop-only, and 45W to boot. Compared to 20W Laptop chips, I don't see the Crystalwell L4 Cache actually being useful to the majority of people.

In fact, I don't even know of any laptops with Crystalwell that I'd recommend to anyone. Here's a challenge: Name me a good laptop with Iris Pro / Crystalwell. Hint: Macbook Pro use AMD dGPUs for a reason.

And hell, we aren't even talking about laptops. We're talking about desktops, and Crystalwell is UNAVAILABLE in desktop form. It's irrelevant to the desktop user, even if you thought that paying $600+ for a CPU was cost-effective (instead of buying a Sandy Bridge E5-2670 for $70 from Amazon).

Basically, you got DDR4 RAM and a 20% IPC bump. That's all I actually think the last five years will get you. Or, you can buy an 8-core, 16-thread E5-2670 for $70... hell... two of them, get a nice server board for $300, and have a BEAST of a machine.

http://www.techspot.com/review/1155-affordable-dual-xeon-pc/


The base Macbook Pro 15" uses Crystalwell and has no dGPU.


Yeah, but would you seriously recommend it over the AMD Tonga (R9 M370X on the upscaled version)?

The 45W i7 is a heavy burden to carry with Crystalwell. Might as well get better graphics if you're going for the 15" Pro.


There is no way in hell I would prefer that AMD chip in my system over an all-Intel system. AMD are just terrible to use in Linux and most people around here want the ability to run that natively without issue. Not to mention the added complication of tacking that AMD chip onto the laptop both from an engineering / reliability stance and software complication.

You missed the irony of calling 45 watts a heavy burden. An R9 370X adds about 50 watts to your TDP by itself, along with its needless complexity. If someone wanted to reduce TDP and that complexity, they could step down to the base Intel IGP. But if stepping up, Intel's solution makes a lot more sense.


Your loss man. The benchmarks don't lie.

Good luck with your overpriced Crystalwell failure. If you got actual benchmark scores to talk about, please respond to me here: https://news.ycombinator.com/item?id=11536519

But I actually know the benchmarks of everything you're talking about like the back of my hand. Your argument has no technical legs to stand on what-so-ever. Don't feel bad if I'm just calling out your Bull$.


Wait, what are you talking about? That was in no way a response to what I said to you here. You don't need to change the topic just because you're wrong and you know it.

No one wants that AMD chip in their Macbook. It adds complexity both in engineering and software. There's PLENTY to talk about technically there and why that's a good idea. Not to mention Intel's best-in-class Linux support.


It's actually kind of annoying to have graphics card switching - it caused a number of problems in my old 15" MBP, to the point that I opted for integrated this time.


>Besides, NVMe SSDs are expensive

Yes. If you're bargain hunting for gaming hardware you should just buy a console. Or, if you're seriously suggesting putting an Intel 750 into some old system like SandyBridge... no comment. I would never recommend someone bother doing that.

Step up to an NVME setup, Skylake and do it right. Skylake i5 setups can be had for cheap. You're just arguing to argue on that point. Whether or not you have anything useful to add. The SB argument is common knowledge, an age old argument at that with no new information or insight.

>Name me an M.2 NVMe SSD that is more cost effective than a Mushkin Reactor (1TB, MLC, maxes out the SATA6gbps bus, $200), or has faster I/O speeds than a 750.

I'm not into cost-effective bargain hunting. Anyone who would gimp a nice Intel 750 SSD on a non-NVME system is a fool and you've suggested it.

>The cheapest of GPUs are significantly better than IGP.

No, they aren't. The point about DX12_1 IGPs is that it's there, it's modern, and it has already sucked the life out of the low-end space and is moving into the midrange with Iris Pro. Your stance is the 2010-era view on computers. Same era as SandyBridge, TBH.

>I also care about adaptive sync / GSync technology, which isn't supported by Intel Iris.

This demonstrates how much you know, and why people shouldn't listen to what you're saying. Which can be heard on any PC gaming forum a thousand times over. This is HN though and it won't fly.

Intel has already committed to FreeSync. It's incoming with Kaby Lake; the rumor is that it may be enabled for Skylake.

>Intel's IGP doesn't even come close to a $100 GPU, let alone the midrange GPUs.

Wrong on its face. You just haven't cared to investigate recently.

>NOT on the desktop. Crystalwell is laptop-only, and 45W to boot.

Nope. The Crystalwell chips are going into NUCs from here on out. There's a 128MB L4 NUC coming in 2 1/2 weeks and a 256MB NUC coming in 12 months.

The fact you're talking about gaming and recommending an E5-2670 for that is just silly. That might be a good machine for compiling code. If that's your goal, it's still a bad idea when distcc can utterly embarrass that old power-hungry chip.

For gaming, Broadwell already demonstrated what Crystalwell adds for gaming performance with a standalone GPU. And it's a game-changer, it's faster than the i7-6700K. Yes, it is. And it definitely mops up where it counts (99th percentile frame times) on SandyBridge too.

In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have (even with an Intel 750, if you made the ridiculous decision to actually put one in SB). There might be more cost-effective ways to build a gaming rig, but if you're into saving money on hardware and gaming, buy a PS4.

I understand it's the prevailing thought among PC gaming kiddies, but holding your grip tighter on some old SandyBridge system won't change that in reality it's fallen pretty far behind in both overall platform performance and power efficiency.


I can't find Iris Pro 580 on benchmark sites, because no gamer gives a care about that for gaming.

The Iris Pro 5200 GT3e achieves Passmark 1,174.

http://www.videocardbenchmark.net/video_lookup.php?gpu=Intel...

If Iris Pro 580 GT4e is twice as good (Intel only claims 50% better), that's still not very good. That's utterly awful, actually.

A $100 GPU is the R7 360, just off the top of my head. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125...

Exactly $99 on Newegg right now. It achieves Passmark 3,150.

No one gives a care about the $600 Crystalwell chip that performs worse than a $100 dGPU. It's utterly awful. You'd be insane to actually recommend this product to anybody. You claim that you care about performance. Do you even look at the benchmark numbers, bro? Your claims are so far away from reality that I really just don't know how I'm supposed to respond to you.

Yes, a $600 Chip. I'm assuming this, unless you can figure out a cheaper Skylake Iris Pro: http://ark.intel.com/products/93336/Intel-Core-i7-6970HQ-Pro...

----------

EDIT: I see that you're an anti-AMD guy. Okay, whatever. That's why there's another company out there.

http://www.amazon.com/ZOTAC-GeForce-DisplayPort-Graphics-ZT-...

Nvidia GTX 750 Ti, $105 right now on Amazon. Passmark 3,686. Still utterly crushing your $600 iGPU with cheap-as-hell GPUs, no matter the brand.

http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+75...

Dude, I'm running a (what was at the time) high-end R9 290X, although this is more of a mid-range card now due to its age (Fury / 980 Ti). It has a Passmark of 7,153, and you're seriously suggesting I "upgrade" to a Crystalwell Iris Pro that only achieves a ~2,000 Passmark?

------------

PS: Skylake performing 20% faster than Sandy Bridge after five years of updates is awful.

-----------

> I'm not into cost-effective bargain hunting.

Then why the hell are you bringing up M.2? Intel 750 is the best of the best and plugs directly into PCIe. Sandy Bridge handles it just fine.

-----------

> In 2 1/2 weeks you'll see Skylake with Crystalwell and Thunderbolt 3 absolutely delete any SandyBridge gaming rig you may have

Crystallwell has to beat a $100 dGPU first. And the benchmarks say otherwise. My bet? Crystalwell fails to beat a R7 360 / NVidia 750 Ti (NVidia's LAST generation BUDGET card) and I get to laugh at its worse-than console performance numbers despite the $600+ launch price for the chip.

But hey man, show me those benchmark numbers if you disagree with my assessment.


"Bro", "dude", "man". So I know I'm talking to some little kid at least.

But I have to say, anti-AMD! Yes, very perceptive, as I type this on a machine with a Radeon in it. Consider the fact that other people can criticize products they have extensive knowledge of.

Judging from the rest of your response, my post went completely over your head. And quite the troll, as you try to change the points I made in an attempt to "win" the argument. But it is amusing hearing some kid describe Intel's 20% IPC boost from SB to SL as awful; it shows how much you don't know. I have friends who work at Intel. Go back to PC Gamer, as you have no idea what you're talking about. Some dumb kid sees ONLY 20%, with massive power reductions, and doesn't realize that computers can do more than just play video games.

You also failed reading comprehension and the ability to hold a conversation. Congrats. But either way, there's no way around the fact that in 2 1/2 weeks I'll be benchmarking a R9 Fury to an i7-6770HQ with PCIE NVME SSD, some DDR4 and absolutely crushing any SandyBridge system you own.

Enjoy your old E5-2670 and SandyBridge with an Intel 750. What a total fruitcake. A better use of your time would be to go read about logical fallacies, as you just spent an hour typing about a strawman you created to beat on, with points I never made.

What you want to hear because you just want to argue- you're right, I'm wrong. Hope you feel better now. I'm not giving you any more help. I get it, you like your poverty gaming rig. See ya kid.


>> i7 2600k

Sure, that was the top-of-the-line $300 chip, in the days when a whole PC could be bought for $300. What if you're on a five-year-old Pentium G620?


If you bought a $300 PC 5 years ago:

1) you're not the sort of person who buys a new rig every two years, and 2) a $300 PC today will give you exactly the same performance as the one you bought 5 years ago: the minimal gains you get in iron are naturally offset by minor losses in software (which is now built by people with SSDs, so good luck with your little spinning disks...)

The market is now artificially segmented to such a fine level, and moving so slowly at the top, that performance simply does not "trickle down" like it used to. Add to that the move to "power efficient" CPUs (aka: less powerful overall) and you will basically see zero gains if you stick to the bottom of the market.


Not quite "exactly" the same performance. A 20% improvement with today's stuff.

But yeah, it's peanuts. A 20% improvement over five years is pathetic. I'm just calling out your hyperbole, in case others didn't see it. Apple had something like a 50% improvement in a single generation of iPhones, so a 20% difference over five years is very ignorable.

SSDs and GPUs improved dramatically over the past five years. Well... more specifically... SSDs got dramatically cheaper and retained roughly the same quality. So it's worth it to upgrade to an SSD or to get a new graphics card. But Intel doesn't have a GPU offering, and their SSDs are "enterprise" (aka overpriced). Mushkin / Crucial are better brands for consumers... even Samsung (although a bit more expensive).


The cores are basically the same within the generations.

A five-year-old Pentium G620 is only ~25% slower than the Skylake Pentium G4520. Both are dual-core CPUs that are cheaper than $100 aimed at the budget audience.

Frankly, the fact that AMD Vishera FX-6300 still easily beats out the Pentium G4520 in multithreaded benchmarks... this demonstrates the absolute lack of Desktop CPU improvements. I'd only recommend the G4520 to someone who is really sure that they care about single-threaded performance (ie: Gamers). Most people will appreciate the lower total-cost-of-ownership that FX-6300 offers at that price point.

https://www.cpubenchmark.net/high_end_cpus.html

* AMD FX-6300 Passmark: 6,342

* Modern Skylake Pentium G4520 Passmark: 4,261.

The G4520 is a $80 chip, Released October 2015. FX-6300 was AMD's 2012 entry: a FOUR year old chip, now selling for $80 to $90 at Microcenter.

Microcenter has some $0 Motherboards if you buy an FX-6300 from them. That's the kind of benefit you get from buying "old". And since CPUs aren't really much faster, why the hell should you buy cutting edge?

--------

Hell, why are you spending $80 on a new G4520? Facebook just decommissioned their servers. You can get a dual-socket-ready Sandy Bridge 8-core, 16-thread E5-2670 on eBay for $80, or on Amazon for $70.

http://www.amazon.com/gp/offer-listing/B007H29FRS/ref=dp_olp...

Go get yourself a dual-socket 16-core 32-threads E5-2670 Sandy Bridge Workstation, just $80 per CPU.

Intel can't even compete against their own ghost from 5 years ago. Is it a wonder that sales are low?


I think we're in a post-PC era. Yes, I spend my work days on a laptop. But most of my personal computing happens on an iPhone or an iPad. I think that PCs (and laptops) will increasingly become "things we do work on" and smartphones and tablets will become "things we consume stuff on". A lot of PC sales, I think, were coming from people buying them for personal use to consume stuff. That market is changing.


On the consumer side, it's possibly back to a shared-PC era: instead of multiple PCs per household, just one roughly five-year-old PC for doing work, while each person has their own annually updated phone/tablet for everything else.


They're so much slower to interface with though. I couldn't live without keyboard shortcuts.


On top of that productivity app companies are chasing mobile in order to not get left out in the cold, further exacerbating the PC decline.


This isn't at all out of the blue. As the article states, Intel revenues have been declining for over half a decade now, starting a few years after the first iPhone hit the market.


Are you sure about the revenues going down? According to http://www.statista.com/statistics/263559/intels-net-revenue... revenues seem to have gone up.


You can blame it on the end of Moore's law, or you can blame it on mobile computing. (The two are not unrelated.) But it is fundamental.


To me, it's more that Moore's law is over and that observable speedups are much less obvious to the non gaming consumers


To me, Moore's law will end when we all have a mediocre CPU at home connected to fiber... and anytime you, or someone else, needs to do work, we share our CPUs with each other without being aware of it. Then there is no more Moore's law - just one huge SPU (Sharable Processing Unit).


Out of the blue? There have been articles for a long time about the internal mess at Intel as they try to figure out mobile and IoT. One of the latest ones just last week:

http://www.businessinsider.com/intel-leaked-memo-murthy-rend...


I'm thinking it's not so much a post-PC era as a customizable-SoC era, and Intel does not want that at all.


How much of your time is now on a mobile that used to be on a desktop? Their core business is drying up.


Am I the only person for whom the answer to this question is "almost none?"

I'm serious: about 80% of my day is meeting with real people in the real world. Mobile phones haven't changed that.

The other 20% of my day is sitting at my desk creating original work product (mathematical models and thoughtful memoranda) or reviewing the work product of others. Mobile phones haven't changed that, either.

No doubt the drought in PC sales is real and permanent. But I wonder how much of that is because people just don't need to keep their laptops up to date in the age of great cloud services.


Nah. I think there are more than a few of us around. I mostly avoid using my smart phone and I don't have a data plan. I had to buy it because I was travelling abroad and it was a light device I could use to communicate and take photos with. Most of my work is done on my laptop.


Yeah, mobile hasn't figured out a good way to take over the workspace. Some of the tablet/laptop hybrids are getting closer.

As for entertainment though, are you watching Youtube extensively on your desktop?


I know this sounds crazy, but there are still some people who have cable subscriptions and watch TV on a TV. Oh the horror!


Those TVs are now 'smart' along with cable boxes and other peripherals. Is Intel Inside any of those?


> Yeah, mobile hasn't figured out a good way to take over the workspace. Some of the tablet/laptop hybrids are getting closer.

There already is a relatively mobile tablet/desktop hybrid that works pretty great for both consumption and getting work done. It's called a laptop.

> As for entertainment though, are you watching Youtube extensively on your desktop?

Yes. I have a phone, a tablet, a desktop, and a laptop. The tablet is pretty much only used for netflix and textbooks, and the phone is for travelling. The tablet is absolutely worthless for browsing, coding, writing, or gaming; and the phone is only saved by the form factor. If I had (the space for) a TV then the tablet would be a completely unjustifiable purchase.


Laptops typically aren't considered mobile devices (if we're being pedantic). Try running mobile apps on your laptop.

And yes, mobile devices have taken away entertainment share from the desktop as well as televisions.


Sure, say portable if that makes you happier. Not that it matters though, mobile programs are the ones trying to catch up to desktop programs, not the other way around.


No, you aren't the only one that barely uses mobile stuff. My desktops and use of laptops is essentially required for my job (software engineering). I've seen 1 person switch to a tablet, but I'm not sure they have enough power to do the things I need. One day all of this will merge into a single unit, but right now I see separate needs/uses for both.


Great cloud services, or the fact that my work has in no way been accelerated by newer processors. The only noticeable workflow-related speed increase I've had in the last five years is from damn-near-zero-margin SSDs.


Almost none, the only time I use my mobile in lieu of a desktop is when my desktop is unavailable or my home Internet connection is down.

I don't count time I spend out and about on my mobile, since that isn't time I was going to spend on my desktop anyway. (And I don't think having the option of going outside with a mobile has changed how often I do so.)


All that mobile traffic increases server traffic. I don't think the switch to mobile is a big hit for Intel as long as they are the only major player in the high performance CPU market (let's see if AMD can turn it around with Zen, but I don't have my hopes up).


Also chiming in with a "none". All mobile devices have done for me is expand my use of computing devices into realms in which I never used one before.


Less than 5%. My phone stays in my pocket most of the time, even when I'm nowhere near any other device, and everyone else around me is heads down in theirs.


I pull out my phone to get my 2FA number every now and then :D


Yeah, I am not doing any development work on a mobile device. I have yet to go to an office where the cubes are filled with people working on iPads.


I think intel's core business has transitioned from desktop PCs to servers for the farms, which shows no signs of drying up.

Plus, Intel provides almost all laptop chips, and I'm guessing most businesses still have either a desktop or a laptop per person, which will still get upgraded every so often. I doubt workers are going to be using iPads for data entry, although I suppose you never know.


For me, maybe 30%, mostly because I have a computer in front of me for coding most of the day anyway. For my brother's family it's nearer to 90%. He still has a PC for work, although it is on about a 5-7 year replacement cycle, and they no longer need the more-expensive second PC.


> How much of your time is now on a mobile that used to be on a desktop?

None, since I don't want to have (and don't have or own) a portable surveillance and tracking device (also called mobile phone) in my pocket.


It's probably possible for PC sales to decline while PCs remain dominant in computing. If so, Intel probably doesn't need to restructure; they just need to get smaller.


No. Computer sales are at historic lows. We've reached the confluence of computers being "good enough" (there are no longer much performance gains from buying new computers) and more and more consumer computing moving to tablets/phones.


The particulars of Intel's situation aside, there are some (notably not all) watching the markets that believe we may be well into the prelude to a market recession.


We're pretty much post-PC froth/churn.


I'm not sure how much of an effect this would have, but from a consumer's point of view, there isn't as much of a reason to buy a new PC every few years anymore. The laptop I'm using right now was purchased in 2011, and the prices in stores now are very comparable to what I paid back then. I've made up my mind to buy a new one a few times, but then when I go to the store, it just isn't worth it.


Moore's law may become irrelevant long before it becomes invalid. Without rapid increases in usable computing power, the reasons for upgrading have largely disappeared. In particular the chips at the top of the profit margin curve, where Intel loves to play, are much less compelling.

Intel might eventually find itself in the same shoes Kodak did - when your primary business dries up, there's unlikely to be a follow-on that is successful enough to keep the company going.


I'm not convinced that Moore's law has ended (using the "processing power doubles" formulation, not "transistor count doubles").

We've hit the physical limits of our current processes, but we haven't remotely hit the physical limits of... well, physics.

It's possible that we are on the flat before the next curve starts. Whether that curve will reach the consumer space I don't know, but at the top end of computing (supercomputers) there is still massive demand for more processing power.


What makes you think we haven't started to hit the physical limits of physics? A 4nm transistor is about 7 atoms wide. Obviously, a one-atom transistor is pretty much the absolute smallest possible, which would be about 0.5nm. Anything less than 7nm experiences quantum tunneling, which is going to really mess things up, and it will get worse the smaller the size. [1] I imagine the interference between transistors will be fun...

Next, how exactly are you going to make these atom-sized objects in quantity? You can't be manually placing atoms. But lithography has a hard physical diffraction limit, forcing the use of higher frequency light. The higher the frequency, the greater the energy. Etching single-atom features will require hard x-rays, which we may not even have technology to generate. Even if we do, focusing x-rays is not trivial, as you can't just slap in a glass lens. Plus, the focal lengths may be rather long. And what kind of photoresist do you use? Hard x-rays are likely to blast pretty much anything, and the etching characteristics are probably not neat little troughs. How do you ensure you have your one atom stay put when everything around it is being blasted by high-energy x-rays?

I think the physical limits are looming pretty large.

[1] https://en.m.wikipedia.org/wiki/5_nanometer


Maybe, but (and I could be way off; I'm a programmer, not a physicist) those limits are pretty much the limits of our current technology. It's a bit like saying "well, we've reached the point of diminishing returns with this steam engine, that's it, no more progress" while over in a shed somewhere else someone is inventing the AC electric motor.

I'm optimistic because I've seen "end of progress" reports on computing power since I was a kid in the 80's.

We are long way away from this https://en.wikipedia.org/wiki/Bremermann's_limit

We've known for ages that germanium and similar materials offer better characteristics than silicon, but the cost to improve silicon has stayed below the cost to retool for germanium until now. If we do hit the limit at 5nm silicon, then the cost equation changes and alternative materials become worth the investment.


They already use germanium in their manufacturing processes. It's not all germanium because of the challenges that would create. There are also benefits to mixing different-sized atoms into the same lattice to affect the speed at which the electrons travel, so modern manufacturing often implants germanium atoms into the silicon lattice. I do agree that material improvements can change everything, so the often-spouted doom and gloom about stagnation is often unfounded.


Mostly agree with what you said, but just throwing this fun tidbit out there. The stated nm "size" of transistors is no longer accurate (and hasn't been for some time). For example, when a company says it's manufacturing at 16nm or 14nm... that is the size at which the chip designs are drawn and planned, but the physical dimensions are closer to 20nm. The same will be true when they go down to 10 and 7nm; it won't actually be a 7nm transistor in physical dimensions. And as far as lithography, right now they're mostly looking at "extreme" ultraviolet wavelengths, but the problem is that mass production is not currently feasible given how long it takes the photoresist to react. That being said, things like quadruple patterning allow them to get pretty small with EUV wavelengths.


It doesn't matter if Moore's law still holds or not, it's become irrelevant for most of the PC market. That's the point I was trying to make.

There are always good uses for more computing power, but not on the desktop where the big volumes are.


> Without rapid increases in usable computing power, the reasons for upgrading have largely disappeared.

1. Force Windows 10 upgrade.

2. Windows 10 sucks on this computer!

3. Buy Windows 10 compatible computer.


Microsoft's not stupid, they realize that people are keeping their computers longer than ever. They're also trying hard to make Windows 10 run on the widest range of machines possible. So I think it's a priority of theirs to make sure it doesn't suck. They don't want a repeat of Windows XP where nobody upgraded!


> They're also trying hard to make Windows 10 run on the widest range of machines possible.

Why do you say so? Do you work at Microsoft?

Anecdotally: I upgraded a somewhat older laptop to Windows 10 and found that the screen brightness was locked to full. Called up support and was told that no, there's no fix and no, there won't be one for the foreseeable future - mine isn't a "supported system", so it has to stay on Windows 7. Looked up the issue online and found many others with the same problem going back to release day.


Except Windows 10 works, in my anecdotal experience, better than Windows 7 on the same box.


Yes, this thought really makes me wonder about where the future hardware computing advances will come from. How long is it going to take for a new company to 'disrupt' the processor industry in a way that the behemoths of today wouldn't have expected or weren't able to do. I would guess at least 10 years from now.


The end of Moore's Law has broad industry-wide implications. Imagine if Moore's Law had ended in the year 2000 - no smartphones...

A lot of future technologies that rely on exponential increases in silicon capability will not be possible. It's almost impossible to replicate an exponential.

The rate of progress will slow and look more like more mature industries such as aerospace.


Exactly. The tech industry has been incapable of coming out with new functionality that really requires more power.

The only thing I can see which is going to benefit a lot from CPU power is more advanced AI or machine learning built in. But we are only seeing very gradual moves in this area now.

VR and Games are also candidates of course, but most people aren't hard core gamers. They are fine with Angry Birds ;-)

And with the bandwidth we have today, it wouldn't necessarily be a big problem to offload much of that processing to servers and do it as a service instead. In many ways we might be heading back to the dumb clients of old, where the majority of processing happens on a remote server.


The problem for Intel is that AI and machine learning are going to be typically done on the GPU. So, Nvidia. Also, a chance for AMD to recover.


Well, that's just another way of stating that CPUs have fallen way, way behind graphics cards in terms of power. At some point in the past the difference in power wasn't nearly as large.

If there were consumer-grade 16- and 32- core Intel processors with reasonable prices I would consider buying them for the sake of certain experiments. It would save me from dealing with CUDA or OpenCL. But right now that's the domain of pricey high-end server stuff.


Yep, Moore's law is still going on strong in GPUs.


Or maybe we just reached an entertainment / office plateau. Having more compute power benefits high end needs not average consumer ones.


Even games are slowing down a lot in their requirements, aside from that VR garbage.


Interestingly, there's a strategy game series called Europa Universalis, which moved from a 2D map to a 3D map quite a while ago... the reason being that the 2D map was rendered by the CPU, whereas a 3D map could be shunted off to the GPU to deal with, freeing up precious CPU cycles for the game logic.


2D maps can be rendered by a GPU just fine. However, when you're doing the upgrade from CPU rendering to GPU rendering anyway, you likely want to have some returns, so they probably just went to 3D in the same step at little extra development cost.


So Intel has had decades to become more than a PC parts manufacturer (albeit largest one ever).

The issue is creating compelling reasons for buying a new PC. Intel/Microsoft have not been very successful at promoting those sales-drivers.


Yes, and Intel has been trying. Remember, there once was a time when Intel did not make CPUs at all. Intel has remade itself several times, and pretty much each time it required retrenchment and refocus. Intel has been trying for a long time to get wins in the mobile space, without much luck. High gross margin per wafer and high volume is a rare combination, and a difficult act to repeat.

An interesting aspect of Intel culture is how it reacts to body blows, urgently seeking a path out of the darkness along many different paths simultaneously. Remember the legend of "operation crush" -- the sales force was tasked with getting 1000 design wins for the 8088, from whatever place they could find them. Some guy in Florida found a renegade group in Boca Raton working on a skunkworks project called the "PC". Meh, it didn't sound like much, but hey, only 999 more design wins left to go, eh? Bag the order and get on the road again... whoever that guy was, I sure hope he made his quota that quarter.


That's exactly it. The same performance isn't (that much) cheaper, and it's actually difficult to find any sort of high performance laptop (for cheap).

Plus, the last few laptops I've purchased all seem to die JUST after the warranties (extended or otherwise) expire.

Are they finally at least putting 1080p+ screens on these as a standard thing, or is everyone still stuck on that crummy 1366x768 regression?


> and it's actually difficult to find any sort of high performance laptop (for cheap).

I went to a computer store recently and told the salesperson that I was interested in dropping about 2k on a laptop, and I brought a list of specs (at least 16GB RAM, lots of cores, etc.). He suggested I should get a gaming laptop and never gave any suggestions from his own store.

It will be interesting to see what the future holds, because they're already getting close to the physical limits of what they can build in terms of processors. If they start slowing down financially, it may be the case that we'll still be using computers that are about as fast as they are today 7-10 years from now.


128 cores?


While a nice idea, most software cannot even use 4 or 8 cores properly. That's one of the reasons Intel CPUs are still king compared to AMD in the desktop space. If you can use all of the cores AMD tries to cram into their CPUs, you can get good performance for a good price, but the single-threaded performance is an absolute disaster at the moment (and the one thing AMD has focused most on for their next architecture, Zen), which is still what many workloads depend on.
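A quick way to see why single-threaded performance still matters so much is Amdahl's law. A minimal sketch (the parallel fractions below are made-up examples, not measurements of any particular workload):

  # Amdahl's law: speedup on n cores when a fraction p of the work is parallel.
  def amdahl_speedup(p, n):
      return 1.0 / ((1.0 - p) + p / n)

  for p in (0.5, 0.9, 0.95):
      print(f"parallel fraction {p:.0%}: "
            f"4 cores -> {amdahl_speedup(p, 4):.2f}x, "
            f"8 cores -> {amdahl_speedup(p, 8):.2f}x")
  # Even at 90% parallel code, 8 cores buy only ~4.7x; at 50%, barely 1.8x.
  # The serial part, i.e. single-threaded speed, dominates.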


>While a nice idea most software cannot even use 4 or 8 cores properly

No that's wrong. They simply don't need to use 4 or 8 cores.


That's why we have ML and AI scheduled.


Your laptop already has a GPU, which has more cores than that.

Let's take the segment of the market with an NVidia GPU. They have CUDA, one of the most polished development environments for GPGPU today.

When was the last time you wrote something in CUDA? What was it for?


Interestingly enough, I doubt any of the GPUs actually has even 128 physical independent cores (cuda cores are not actual cores).


The problem is that most applications have to be completely rewritten with parallel algorithms, which does not fit well with many application domains (it does not bring additional speed).


Most software workloads can't use 128 cores in any useful fashion.


Even if you get a high-performance processor in there, it's going to get seriously thermal throttled during any longer periods of serious work. I tried to do large-scale C++ on an ultrabook, but doing long compiles was super annoying. Not only did Qt take forever to compile, the whole machine was crawling while it was happening.


Much to my boss' disappointment, it's almost impossible to find a 1366x768 laptop, or more generally, one that isn't 1080p.


If you are in the United States, check Sam's Club. I was just there a couple days ago and they had several 1366x768 laptops--made by HP, if I recall correctly.


Writing this on a Toshiba I got at Staples on sale last year. It has 1366 x 768.


A good friend was laid off from Intel during their last round of layoffs. Without going into details that would identify my friend, he was laid off after losing a game of internal politics. His division's vice president even apologized to him for his layoff.

My friend's story gave me the impression of Intel being a highly dysfunctional company. My friend was sad to leave Intel, but I think it was good for him in the long run.

For those about to be laid off from Intel, I hope it also works out for you.


Are there companies with >100,000 employees who are not highly dysfunctional?


Number of employees as a metric for dysfunction? Why not number of court injunctions? Number of union strikes?

I mean, if we're getting serious, we could write a dysfunction function, D = dysfunctional points:

  D = (points for threat posed by dysfunctional government)
    + 4*(points for direct environmental harm done by company)
    + 2*(points for environmental harm caused by subcontractors)
    + (points for each lobbying dollar spent)
    + 2*(points for frivolous or non-FRAND patent lawsuits)
    + 10*(points for fighting against a free and open internet)
    + (points for trying to track users / violate privacy)
    + 10*(points for getting hacked, releasing user's private data)
etc.


As Andy Glew said, "Only the paranoid survive." I was part of a section laid off en masse under similar circumstances. Our group got a paper "award" for cross-team collaboration, and the partner group got to keep their jobs.

The severance was decent, and my new job pays twice as much, so I've got that going for me, which is nice.


you mean Andy Grove?


Indeed I did. Sorry, Andies.


There's an irrational belief system at hardware companies about Windows that clouds their logic and leads to a sense of separation anxiety. Intel lost the opportunity to get a lead in mobile because they viewed the Linux/UNIX client device category as a side business rather than making it their core platform. They continued to invest substantial resources in supporting Windows 8 and Windows 10 despite the obvious reality that Windows is a dead-end platform with no growth prospects. It isn't just Intel; nVidia/AMD lost the mobile space for this same reason too: too much Windows, not enough Linux/UNIX.

The Windows ecosystem has become corrosive to any industry or company it touches. We now see that the end result of supporting a closed-source legacy platform is 12,000 jobs cut at Intel, due to the lack of excitement and innovation in the PC space. Perhaps Linux will revive the PC market, but in the meantime Intel and their peers at nVidia/AMD have done little to make that a reality in the mainstream sense.


Every single thing you said is complete hogwash and you have provided zero evidence to back up any of your poorly formed opinions.

To begin with, Intel didn't lose an opportunity to get a lead in mobile because of their views on Linux/UNIX. None of the players today have a "lead" in mobile because of Unix. They have a lead because of the touch-based front-end. It wouldn't have mattered what that was running on.

Furthermore, Windows is responsible for creating numerous multi-billion dollar industries and Windows obviously holds a ton of value for businesses that Linux can't even compete with. That's why just about every business runs Windows and not Linux.

The willful obliviousness of nix fanboys like you is boring.


> Every single thing you said is complete hogwash [...] obliviousness of nix fanboys like you

This comment violates the HN guidelines egregiously. We ban accounts that do this, so please don't do it again.

Proper use of this site means commenting civilly and substantively, or not at all. Please read the rules and follow them:

https://news.ycombinator.com/newsguidelines.html

https://news.ycombinator.com/newswelcome.html


> Every single thing you said is complete hogwash and you have provided zero evidence to back up any of your poorly formed opinions...

The evidence is obvious if you read the article about Intel cutting 11% of its global workforce, which is a big deal. This was due to the declining PC business, which for Intel is primarily part of the Windows ecosystem, and the need to restructure to focus on mobile, IoT and servers, which are part of the Linux/UNIX ecosystem of devices.

> To begin with, Intel didn't lose an opportunity to get a lead in mobile because of their views on Linux/UNIX. None of the players today have a "lead" in mobile because of Unix. They have a lead because of the touch-based front-end. It wouldn't have mattered what that was running on.

Microsoft pushed its Windows-first strategy, which meant focusing on laptops/desktops, and Intel's core devices business was tied deeply into this strategy after years of close collaboration. All of the Windows-based mobile devices failed miserably. Microsoft struggled to port Windows to ARM, which didn't work that well. This didn't leave any real opportunity for Intel to get into mobile in the early days; it had to choose between its relationship with Microsoft and breaking out in a new direction and embracing other partners, and it did not do this.

Even the ultrabook was a byproduct of Intel's work with Apple on the MacBook. That kind of innovation just wasn't possible with Microsoft because their strategy was contrary to where the market was headed. There was a huge opportunity to push a Linux-based mobile/desktop hybrid, but Intel never made an effort to move in that direction.

> Furthermore, Windows is responsible for creating numerous multi-billion dollar industries and Windows obviously holds a ton of value for businesses that Linux can't even compete with. That's why just about every business runs Windows and not Linux.

In the past it "created" value, but that's irrelevant to businesses as they are concerned with future value. This is what Intel is doing here in its restructuring: refocusing away from "what things were 5 years ago" to "where things are now and where they are going".

> The willful obliviousness of nix fanboys like you is boring.

Like I said in my earlier comment, there is this irrational defensive thought process when it comes to Windows and you demonstrate that perfectly.



Last I checked they were the #1 H1B sponsor for EEs/Computer Engineers. It's appalling that Congress looks the other way when abuse is clearly evident.


When you pay an H1B employee exactly the same money and benefits, and sometimes even more as a joining bonus and stock awards, how can this be an abuse of the system?


The argument "we can't fill all these open positions, please let us bring in foreign workers" then firing 10,000 people contradicts why they want the government to deregulate the system.


The company has to refocus and change the way it was working, and requires restructuring because of recent losses. This cannot be associated with its stance on a law. If Google has glowing quarterly results and says it needs more H1B workers, are you willing to accept its stance?

My point was that the idea that the H1B workforce at companies like Intel or Google is cheap is just a myth.


No, it just means none of the 10k people were qualified enough. They were not fit for the open positions. Otherwise there is no sense in hiring an H1B with the same salary plus all the H1B related fees and expenses.


> Otherwise there is no sense in hiring an H1B with the same salary plus all the H1B related fees and expenses

I am sure some employers will perceive them as more compliant employees, given their visa status depends on their continued employment.


If you're looking for compliant employees, why go through the trouble of hiring an h1b employee, bringing them from a different country and paying all the fees associated with processing visas and probably a green card process, and paying them the same or higher (usually, the outsourcing body shops are generally the places that pay below market for h1b) ... if you can just find an American to do the same thing?

Hell when you get rid of an h1b worker there are even costs associated with revoking their visa and possibly paying for their flight back!


> If you're looking for compliant employees, why go through the trouble of hiring an h1b employee, bringing them from a different country and paying all the fees associated with processing visas and probably a green card process, and paying them the same or higher (usually, the outsourcing body shops are generally the places that pay below market for h1b) ... if you can just find an American to do the same thing?

The argument is that you can't, because American workers know their rights - rights that every employee should have and stand on, but sadly those from other countries don't always know about.


Intel sponsors the green card of their employees.

Just sayin'.


The issue is a bit more complex than that (and I doubt middle management has a firm enough grasp on how to let go of 10,000 people within a short time frame with proper diligence). There is a vested interest in deregulating the immigration system because it will drive down skilled labor costs; not filling "open" positions allows them to perpetuate their agenda at no cost to them (outside of a job listing). Anyone who has to work under the threat of deportation is going to work harder on average than someone who isn't. It's good for productivity, but bad for all the skilled labor in this country, INCLUDING the people who want to immigrate here for a better life, because they'll be subjected to it eventually too.


The same thing seems to happen everywhere. In Germany the industry proclaims they're in need of around 50,000 software engineers, but if you look for a job, you won't find that demand. Last year VW alone advertised they intend to hire 35,000 software engineers. I don't buy the numbers at all.


Holy bloodbath..

From CNBC:

> Shares of Intel were halted after the bell Tuesday as Intel announced it would cut 12,000 jobs, or 11 percent of its workforce

> The technology company also said the CFO would step down


From CNBC: "The technology company also said the chief financial officer Stacy Smith would leave that role to lead sales."


Hmmm...CFO heading up sales is not something you see every day...

Anyone have any further context around this?


Lateral moves, even downward ones, are part of the corporate culture at Intel. At most large companies this type of job switching at the C-level would come as a surprise, but if you read some of the stuff Andy Grove and others have written about Intel's "matrix" practices, it's not too shocking.

Basically, they believe that experience across multiple areas of the business is a good thing, and that no one should become too attached to a particular position or title.


Listening to the earnings call now.. Stacy wants to learn more about the other parts of the business and BK wants to show off his leadership skills to other parts of the company


That is odd, because they also reported an 18% increase in profits.


quote from zerohedge:

  Instead of using the same tax rate as a year ago, when it had an effective 
  tax rate of 25.5%, this quarter the tax rate was down 7.1% to just 18.4%. 
  Had INTC used the historical tax rate, it would have, surprise, missed.
  
  Worse, INTC just cut its outlook, predicting revenue of $13.5 billion, 
  plus or minus $500 million, for the Q2, about $700MM below consensus estimates.
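Back-of-the-envelope on how much of that is just the tax line, using the GAAP net income figure quoted elsewhere in the thread as the only input (this ignores non-GAAP adjustments and share counts, so treat the numbers as purely illustrative):

  # Illustrative only: back out pre-tax income from reported net income and
  # the effective tax rate, then re-tax it at last year's rate.
  net_income = 2.046e9            # GAAP net income cited downthread, USD
  rate_now, rate_prior = 0.184, 0.255

  pretax = net_income / (1 - rate_now)
  net_at_prior_rate = pretax * (1 - rate_prior)
  print(f"pre-tax ~${pretax / 1e9:.2f}BN, "
        f"net at 25.5% tax ~${net_at_prior_rate / 1e9:.2f}BN, "
        f"tax-rate effect ~${(net_income - net_at_prior_rate) / 1e9:.2f}BN")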


more details on that kind of stuff here: http://www.investopedia.com/articles/analyst/091901.asp


So Intel is dodging taxes as well now?


It's more complex than that; if they foresee lower earnings, different types of expenditures, etc., etc., you'd need a professional accounting firm to calculate that for you.


When you lose money you generally pay less tax. Nothing illegitimate about that.


Intel ranks #14 in H1B use and was a large proponent of the bill to increase the allowance of foreign tech workers.

So I'm assuming their inability to find 'talent' is no longer an issue? Same as Microsoft, IBM, and the numerous other big corps that have had massive layoffs recently, while also claiming an inability to find enough US tech workers?


You're assuming that the 11% is coming from roles that are so hard to hire for it becomes more economical to look overseas for the talent--is that true?

In other words, I think it's just worth pointing out that Intel's a huge company and it might be that these layoffs are coming from roles where market demand is lower.


I think this is spot on. A lot of companies I know that are doing layoffs keep their hands off developers entirely, except maybe the occasional parting of ways with a contracting company.


The people fired don't have the right skills. They hire with H1B because the domestic labor market doesn't have enough people with these skills.


Really, not because they are cheaper and wear handcuffs to their desks, so to speak, since they'd have to go back home if they lose their job?


No, not because of that. In case you did not know, the companies are obligated to pay H1b employees more than the prevailing wage for that position in that county. This is something controlled by the Department of Labor.


What do you mean by this? I know for a fact that I was paid more than some H1B folks at previous companies, with equivalent titles and experience.


> The Immigration and Nationality Act (INA) requires that the hiring of a foreign worker will not adversely affect the wages and working conditions of U.S. workers comparably employed. To comply with the statute, the Department's regulations require that the wages offered to a foreign worker must be the prevailing wage rate for the occupational classification in the area of employment.

The prevailing wage rate is defined as the average wage paid to similarly employed workers in a specific occupation in the area of intended employment.

https://www.foreignlaborcert.doleta.gov/wages.cfm


Very interesting. It seems that violations of this law are probably quite often unreported and/or unenforced.


Not really. "prevailing wage" doesn't mean average wage at that company, it means the wage that posted on the Department of Labor website as the minimum wage for that county.

The posted prevailing wage for an EE with 5 years in Santa Clara county, for example, is about $70k (or at least it was back in 2009); so if H1Bs are paid more than that, it is compliant with the law.


> In case you did not know, the companies are obligated to pay H1b employees more than the prevailing wage for that position in that county

Where's your data on that? Depending on the company I've been at the public H1B numbers were lower than what I would expect.



For years Intel got to ride the rocketship of routine chip-density doubling. For years the new chips were so much superior to the old ones that it paid to replace them and the machines into which they were built.

Now, not so much. I can put a SSD and more RAM in my eight-year-old laptop and make it work just about as well as a new one.

I can switch off the old HP DL380/G5 boxes in my colo, hand them over to the steel recycling guy, move the data to some cloud service, and come out ahead electricity bill vs. cloud bill. I'm not buying many processor chips anymore. Neither is anybody else, except maybe the cloud services. And their bargaining power makes Dell and HP look like the guys in the white-box computer shop down the street.
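Back-of-the-envelope with made-up numbers (the wattage, electricity rate, and cloud price below are assumptions, not measurements):

  # Hypothetical: an old 2U server drawing ~300 W around the clock vs. a
  # small cloud VM at an assumed $40/month.
  watts = 300.0
  usd_per_kwh = 0.12
  cloud_vm_per_month = 40.0

  power_cost_per_month = watts / 1000.0 * 24 * 30 * usd_per_kwh
  print(f"electricity alone: ~${power_cost_per_month:.0f}/month "
        f"vs. cloud ~${cloud_vm_per_month:.0f}/month")
  # ~$26/month per box in power, before colo fees, cooling, and hardware
  # refresh, which is how the cloud bill can come out ahead.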

The processor chip rocket ship has entered orbit; its occupants are now in microgravity. Some other rocket ship will be the next big ride.

It's too bad those folks are out of work. It's too bad plutocrats always behave as if les bons temps rouleront toujours.


> I can put a SSD and more RAM in my eight-year-old laptop and make it work just about as well as a new one.

I did exactly that with an old i3 based all-in-one Sony Vaio PC my dad handed down to me. It previously had a 3.5" 5400rpm spinner and 4GB of RAM and would take a month of Sundays to boot. I installed a 250GB SSD and another 4GB of RAM and it totally transformed the machine. All it cost was GBP62.00 (I got an amazing Black Friday deal on both SSD and memory). Hell, the thing can even run Visual Studio 2015 and a couple of CentOS VM's on Virtual Box and still feel quite responsive.


Didn't they spend $300 million on a diversity initiative recently to hire women and minorities? Who are they firing now?

http://fortune.com/2015/01/12/intel-diversity/


Does anyone working within Intel know the reasoning?

It seems Skylake is doing really well.

Is it mostly electrical engineers working on the processors, or sales and marketing people?


I hope it's the hardware side of the company waking up that the software and "solutions" side of the company is a gangrenous fifth limb.

Things like trying to start an app store when they didn't have content (e.g. Steam) or a platform (Google, Apple, Microsoft), trying to pivot that abomination into a media store after Netflix already won and the studios are all trying to start their own, trying for way too long to deliver solutions of putting parts in verticals (e.g. tablets, micros) where they are at best only competitive on one of price, power, or performance and immediately face-planting, but limping along seemingly oblivious to why any of these projects fail.

The mention of IoT though makes me think this isn't the case. Definitely part of the brain-dead trendy "solutions" thinking. I don't think you're going to compete with commodity cheap micros with the best IP or the best process size... price is king.

There are too many layers at Intel, and that leaves too many of those layers too well insulated, able to believe it was the brand that won them their markets (not the economics: Wintel monopoly in personal computing history, having the best server and laptop parts currently), and that despite the economics of new products and "solutions", the brand will somehow convince people to buy in.

Like, why do I ever see you on Television, Intel? Who is the audience for these ads, potential investors? Shouldn't they rather see higher dividends? People don't go to the mall and buy Intel chips, they go buy Macbooks and they use Instagram. Facebook and Apple buy your chips at a scale where it's all ROI, and you deliver that, so what are the ads for?

This shit is why I left, and why I always did the quick sale on the stock purchase plan.

Intel needs to just keep making the best parts and leave it to the market to conjure up asinine things to do with them. Kill all the product lines where there's no road to being #1 ROI within 10 years, keep those in the lab.


I'd still like to know whose idea it was to try and compete in the Internet of Things market by pitting a buggy warmed-over 486 (the Quark) against ARM's best, most modern chip designs. Sure, Intel got a bunch of headlines in the tech press about their new IoT solution, but the technical details just didn't add up. The performance and power figures were dire, it couldn't run existing x86 code, it looked like a bear to integrate into anything, it just made no sense.


Intel's latest Quark has gotten to 1.5µA sleep current with SRAM retention. That's not the best, but that's good enough for many applications. And it's using 22nm so it may be the cost leader, by far. And that's mostly what counts in embedded.


Thanks for sharing and providing that wonderful insight!

I agree with all your points - the last time I thought about Intel as a brand was in 2002 while wanting to build my own PC (a very small market of geeks at that).

What particular thing would you say makes it difficult for them to build good software compared to Amazon? Intel seems to have a bottomless pit when it comes to money.

Thanks again!


>There are too many layers at Intel, and that leaves too many of those layers too well insulated, able to believe it was the brand that won them their markets (not the economics: Wintel monopoly in personal computing history, having the best server and laptop parts currently), and that despite the economics of new products and "solutions", the brand will somehow convince people to buy in.

Well, as a gamer, Intel (and Nvidia) are just the best.

I had an AMD Athlon and it was an amazing processor. The modern ones just do not compare to Intel's. AMD also pulled ridiculous shit when they stopped releasing drivers for Linux on 2- or 3-year-old GPUs - therefore they can go fuck themselves.

I've got a 4790K and it is just great: it runs cool (just a decent upsized aftermarket radiator for ~$30 - no water!), doesn't abuse my fans, and will probably last me 5 or 6 years performance-wise.

For a bulk buyer of crappy throwaway corporate laptops, AMD might have some appeal...


Exactly, even if gamers were a big market, they follow the ROI too. Intel can't try to be a brand the way Nike is or the way LV is, they sell technical things to technical people but mostly to industry.


> Intel can't try to be a brand the way Nike is

You'd be surprised -- they already are. Their original "Intel Inside" campaign was groundbreaking and helped them immensely in reaching domination in the PC space. A lot of barely-technical consumers will not have heard of AMD or other minor competitors, but they will recognize Intel and buy accordingly. Techies don't care too much about brands, but non-techies do.

Part of the problem they have today is precisely that their brand is basically worthless in the only growing segment (mobile); they must find and occupy all other niches just to stay alive, and marketing is part of that.


Sure, they spend on marketing, they're visible to the public, and their products are popular, those don't mean they have a brand -- a wide and loyal base of customers asking for their products by name for subjective reasons.

The chips sell because they're the best. If people really wanted "Intel" (and knew what they were even buying there) and not just the cheapest / fastest / longest lasting device regardless of how or why it has those properties, Intel's attempts to push into mobile despite failing to hit the required price/performance/power points might have worked like management always pretended it was going to.

They have a very polished image and spend plenty to get broad awareness. Plenty of visibility across all segments, it's not that they failed to capture mobile end-customer's eyeballs, it's that they failed to deliver a mobile chip with an ROI to mobile OEMs.

The most fervent Intel fanboy can't buy your Meego/Moblin/Whatever(Maybe people will forget it failed if we rename it every year)-powered Intel-chip phone if it never hits the market because it fails to have a day's battery life and the OS is basically vaporware (we can't ship updates for existing systems, we're too busy with politics, switching UI toolkits and starting over).

Intel pays to try to build a brand and labors under the assumption that means it has one (for instance, wanting to build their own Platform [SOC+OS+App store] thinking the brand will pull it to market despite reality, instead of just accepting Android from the beginning) but it simply doesn't. All the marketing money looks to be just a bonfire.

Back when AMD got a foothold and was able to be competitive on price, people bought them, I don't remember seeing any AMD commercials.

If people only love you in the categories where you're objectively the best, maybe it's not because you have the cool circuit pattern branding, threw all those parties and bought those super bowl spots.

No matter how much they spend and pretend to be a consumer brand, it just doesn't matter one bit outside the ROI for OEMs and ISVs, since those are Intel's customers, they are only ones making buying decisions of any consequence, and they're never going to buy a part that's detrimental to their own bottom line. Intel Inside partnership was in my opinion about OEMs fleecing the marketing spend for a reduced bill of materials. The sticker meant nothing, no consumers really cared. Enthusiasts only cared when Intel meant it was a better chip, only a minority of that enthusiast minority would prefer a weaker Intel chip over an alternative.

If I were to call my grandmother (70s), mother (50s) and niece (20s) and ask them if they'd prefer a computer with an Intel chip in it, I suppose they'd say yes (and as much as my ego might want that to be because I worked there, it's most probably the ads). However I happen to know that two of the three of the CPUs in question are AMD. Why? I don't know, but the relatives all buy cheap HP or Dell Windows machines (choosing those OEM brands despite better options, real brand effect) on a whim (a.k.a when the last computer's OS installation became unbearable) from the nearest Best Buy (picking that retail brand despite better options, a brand effect) and the OEM chose an AMD part (probably for margin reasons, creating lines of products for undiscriminating consumers) and these relatives didn't notice or didn't care (no brand effect). I wonder if I asked them if their computer had an Intel CPU, if they might even think that they do. I'll try that next time I see them.

No matter how hard Intel pretends they're like organizations with similar marketing spend, they simply don't have a brand like those do: there's no verticals and no consumers aside from self-built-PC enthusiasts, and for consumers the latter are quite a touch more objective than those in fashion or apparel.


I disagree. With all their failures in mobile, you mention it yourself: consumers do know the brand. One of the reasons Intel keeps dominating the PC space is that even at its peak, AMD simply could not get past the assumption that "chips must be from Intel" in many quarters, be they OEMs or consumers. That's real brand power. In fact, their complete dominance is what undermines their own branding: people now assume what they buy will be Intel and don't even check. I bet the Best Buy sales guys studiously avoid mentioning AMD, because it would kill the sale. That's real brand power. Obviously it's not easy, being a component supplier rather than a direct b2c vendor, but I think that Intel punches well above its weight in brand power. It might not be as big as Apple or Nike yet, but it's definitely in another league from competitors.

> The sticker meant nothing

I absolutely disagree. It did mean something, when it was the only sticker. There is a reason stickers became popular right after that.


> Intel can't try to be a brand the way Nike is or the way LV is

They have a surprising amount of advertising at gaming events which seemingly tries to do exactly that, even though they are more-or-less the only option. (Although AMD was (maybe still is? I'm not following that too closely) better at integrated graphics solutions, which were interesting for low-end gaming; most gaming profit is probably near the high end.)

Maybe to promote PC gaming in general, over other activities and specifically consoles?


I can't really figure it out. It's pretty mindboggling.


I asked a friend who worked at Intel and he said part of it has to do with the company's hiring and organizational structure. The basic gist is that people are brought onto teams that only exist long enough to fulfill their initial objective. While some teams are dedicated to long-lived consumer product lines, others might focus on R&D or custom B2B orders. Once a team has fulfilled its responsibilities within the larger company, it is likely to be dissolved and there's no guarantee for room in other teams for the leftover employees.

That's how I understood it, at least. If anyone more familiar with Intel's inner-workings thinks I'm off-base, please feel free to correct me.


Is this a similar issue to what was reported about Amazon getting rid of people after their product launches (even if successful)?


I'm not aware of what you're referring to. The most recent series of layoffs at Amazon was due to poor sales of the Fire Phone [1][2]. Most companies in Amazon's position would have done the same thing.

[1] http://arstechnica.com/business/2015/08/fire-phone-flop-blam...

[2] http://www.wsj.com/articles/amazon-curtails-development-of-c...


I don't believe that's a thing that happens at Amazon. I've worked here nearly 4 years, across several product launches (hardware and software) and only known of one person being let go, and that was for entirely unrelated reasons (politics and such).


Just a guess, but tech companies occasionally release a percentage of their workers to clean up their workforce, boost productivity and make space for new, young and experienced engineers.

Stagnation and idea depletion are dangerous for tech companies; fresh employees bring fresh ideas and boost competitiveness between workers.


Perhaps some people believe that is what happens. But if I were a talented employee, I'd see an 11% staff reduction as a sign to stop ignoring those recruiter emails.


We had layoffs/firings here and that is what happened for a lot of people. There is no point in sticking around when the same moronic manager is kept around while talented, productive people are forced out. Many of us are only here until our shares vest. Our stock price has taken a huge hit as a result.


Which means the best employees are the ones most likely to leave. The ones who weren't fired, but who can't easily get another gig are what you are left with. It's a tough cycle to beat.


> boost competitiveness between workers.

I do not think that this is a good thing...you want collaboration, not politics and back stabbing.


It's a corporation. Morals are not really the top priority. The results are.


I don't think levemi is saying anything about morals -- rather, that results will be better if employees are cooperating with one another rather than playing zero-sum games in the hope of missing the next round of layoffs.


But it could just as easily be argued that a highly political environment which necessitates such behaviors leads to poor morale, quality employees leaving etc. etc. which couldn't be good for the company long term...


I would say it all depends on the point of view of the people in charge. The reality is what they decide and what their take on it is.


I don't know about other people, but my next Intel CPU will have AVX-512, which Skylake doesn't have. I'm expecting AVX-512 to (up to) double integer/FPU vector computation capacity.

I also want to see some proof that future Intel processors (especially 10nm and smaller) have not lost durability due to electromigration [1] (or some other durability issue that small feature sizes can cause). There's already some indication this might be happening on current Skylake CPUs.

Of course not all workloads will get faster, but it could help with things like parsing (XML, JSON, etc.), games, media processing, compression, physics simulation, etc. -- or really anything vectorizable that needs to run on the CPU for one reason or another.

[1]: https://en.wikipedia.org/wiki/Electromigration
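Where the "up to double" ceiling comes from is just register width: AVX-512 doubles the 256-bit AVX2 registers, so each instruction handles twice as many lanes. A trivial sketch (actual gains depend on memory bandwidth and whether the code vectorizes at all):

  # Lanes per SIMD register at a given element width; doubling register
  # width doubles the theoretical per-instruction throughput.
  def lanes(register_bits, element_bits):
      return register_bits // element_bits

  for name, width in (("AVX2", 256), ("AVX-512", 512)):
      print(f"{name}: {lanes(width, 32)} x 32-bit lanes, "
            f"{lanes(width, 64)} x 64-bit lanes per register")
  # AVX2:    8 x 32-bit, 4 x 64-bit
  # AVX-512: 16 x 32-bit, 8 x 64-bit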


You might be surprised to learn that AVX-512 is not available in Skylake - it will only be in the Xeon line. It also won't give you anywhere near double performance, since a lot of AVX2 code is memory bandwidth bound anyway (which is probably one of the reasons it's not in the desktop chips). I would've bought a Skylake so that I could test code on an AVX-512 machine. Now I'll keep my money and wait and see.


Current Skylake Xeons don't have AVX-512 either. For example no AVX-512 in this Skylake Xeon: http://ark.intel.com/products/93354/Intel-Xeon-Processor-E3-...

I know about the bandwidth issues, but you can also use big vector sizes to mitigate them; they give more opportunities for simple schemes for realtime (de)compression of data (usually called packing/unpacking).

But where you can work around bandwidth issues, you can get up to twice as much work done.

I've also been looking for such a chip to test my code. Of course it's possible to use SDE. https://software.intel.com/en-us/articles/intel-software-dev...


Yeah the E3 Xeons won't have it, they're basically desktop chips dressed up. The E5s are supposed to have it though.


Intel finally realized that ARM is dominating growing sectors (mobile and embedded[1]), while they're dominating the shrinking ones (desktop and server).

[1] still can't bring myself to say IoT.


embedded makes more sense :)


My bet is on falling PC sales. I'm surprised this didn't happen earlier, to be honest.

PC sales by year: http://i.imgur.com/IxSo1oy.png


Does this include laptops ?


Yes. Around 60% of "personal computers" (excluding tablets) sold are laptops.

What's interesting is tablets. The number of tablets sold basically equals the number of desktop and laptops sold combined.



I would bet they want to restructure so they can do more stock buybacks...


What Intel needs is some reason for people to do compute-intensive tasks on their computers. Today games are probably the only popular type of app that requires abundant CPU power; for most other popular activities CPU usage is low, which doesn't motivate hardware upgrades. Maybe VR can change this?


Games have been GPU-bound for years and years. Even in 2011 when it came out, Skyrim used (at most) 30-40% of CPU power. Fallout 4 (updated version of the same engine) uses proportionally far less.

In short, no, gamers aren't buying new CPUs every 18 months like they used to, either. GPUs, maybe, but Intel's weak in the gaming GPU space.


>>What Intel needs is some reason for people to do compute intensive tasks on their computers.

It used to be that the frequent Windows version bumps had incrementally higher minimum requirements, and would drive the hardware upgrades. Guessing that curve flattened out...I have Windows 10 running on very old laptops, and it works just fine, so long as I have an SSD.


Software developers have all moved to the server or mobile. Nobody pushing the envelope on PC software any more. That's the impression I get.


Large enterprise companies like this can lay off 10% anytime they want with little effect on output -- there are oodles of people slacking in big corporations basically not doing much of anything except collecting a paycheck.


And do you think these same companies are just as efficient at identifying who's slacking off? I don't think so.

They may know that 10% of the company is slacking off, but they often end up amputating a hand when they meant to just trim the fingernails.


A cynical manager who knows his employer periodically clears out 10% dead weight would keep some dead weight around, so they could deliver the cut without losing the guys they need.


A very common approach is giving every front-line leader the 10% target, rather than actually trimming the bottom 10% from a performance standpoint. Results are as you would imagine.


> “It’s acknowledging the reality that it’s a single-digit growth world,”

Oh, the horror.


I wonder if this is related at all to the Altera acquisition, and how many of those employees will be affected. Intel had to give up a lot of cash, and take on quite a few redundant employees…


Didn't they acquire McAfee too? How did that ever make sense for a hardware company?


Since when did Acquiring McAfee ever make sense for anybody?


it makes sense for McAfee shareholders...


What do you need to efficiently run an artificial neural net? A great mass of configurable simple cells. Sound familiar?

So it's possible the current C-suite actually knows what they're doing, and Intel's dysfunction is concentrated in the middle layers of territorial empire builders. Krzanich, if you're reading, lay off everyone between you and the front-line managers. I know that'll be a lot more than 11%, but still.... Replace them with employees chosen at random; you really can't do worse than what you have now.


- Cuts outlook: sees revenue up mid-single digits, down from prior outlook of mid- to high-single digits

- Also cuts full year margin guidance, sees 62% down from 63% before

- Generated $4 BN in cash from operations, of which it spent $1.2 BN on dividends, $793MM for buybacks and saved the rest for severance

- Notable difference in GAAP vs non-GAAP: GAAP Net Income: $2.046BN (missing expectations), non-GAAP Net Income: $2.629BN


Am I the only one with a sinking gut instinct that this is indicative of a deeper problem in the economy? It feels like a crash is lurking... Maybe I'm just paranoid?


It's just the year % 8 == 0 alarm going off.


What problem are you talking about? The fact that a large company that makes a majority of its revenue from a declining market is cutting its workforce is not indicative of anything at a macro scale.


Interest at zero means that you are stepping on the gas pedal and the car is hardly moving, so of course a crash is coming, a big one.


It's been like that since the last crash.


I think that this is the result of the wider economic turmoil we've been experiencing since 2008. The US economy, in theory, is recovering, but the biggest player worldwide is Europe, and Europe right now can't handle a few hundred thousand immigrants, which should be trivial given the available resources. Europe has political midgets at a time when at least mediocre politicians would be needed all over the place (France, Germany, UK, etc.).

So I think it's mainly the EU's fault. That said, if the US fails to regulate the financial sector in the years to come, we're going to see recurrent financial turmoil until we end up with WW3.


> So I think it's mainly EUs fault

Yeah right, let's blame Europe for yet another US bubble. The last two crises were entirely US-made.


To be honest, the EU debt crisis is 100% self-inflicted. In an attempt to unify the European countries around a set of rules (the Maastricht treaty), Germany is destroying the EU while France is watching.

The EU is in no position to point fingers. Even the lame QE issued by the ECB - a little too late - was so heavily contested by the Bundesbank, it's hilarious. The country that profited the most by boosting its exports is killing the whole project just because the other countries didn't abide by some rules written on paper 10 years ago.


Some of my friends expressed similar concerns, for what it's worth.


Not trying to be condescending, but legitimately not much. Anecdotes don't really help us here.


Jobs created: "Yay! The economy is growing! The economy created xxx,xxx jobs last month! This is a great sign!"

Jobs destroyed: "This can't be a sign of anything. Anecdotes don't help us."


You're comparing things across different levels. You have something at a macro level in one (jobs created across the economy) and at a micro level in the other (jobs destroyed at a single company). The economy as a whole might still net create jobs even with Intel laying off 11% of its workforce. Likewise, you can have the labor market across the economy shrinking while other companies are ramping up hiring. A single data point does not indicate a trend.


I probably should've realized it was a rhetorical question.


Isn't a more direct conclusion that chip progress has slowed? People's upgrade cycles have extended significantly, and thus there's less demand for Intel's products.


If they're cutting entire divisions, there will be a great mix of strong performers in there -- great for other companies who are hiring.

I'd always love to talk to Intel people from the hardware security projects (SGX, etc.).


I really hope this is not going to affect their already ever-lagging release schedule. If I'm not getting Kaby Lake in Q3 this year as promised, I will be severely disappointed.


Kaby Lake in itself is already a disappointment. Intel must have lots of problems with 10nm production if they're willing to slip a full product generation.


I wouldn't call it a disappointment. It will add native USB 3.1 (Type-C) support, which is a major thing for me, considering that USB Type-C is projected to become the most popular socket in the history of humanity. Also: support for Intel Optane (https://en.wikipedia.org/wiki/3D_XPoint), Thunderbolt 3, HEVC, etc. (https://en.wikipedia.org/wiki/Kaby_Lake).


I thought Type-C was just a connector. How is this supported (or not supported) by the SoC? There are USB 2.0 devices with a Type-C connector. This is more confusing than it should be for me...


Skylake supports all these, but needs a separate chip on the motherboard. With Kaby Lake it'll be built into the chip. That should hopefully mean these connections aren't just restricted to high-end laptops.


> I will be severely disappointed.

"I'm not angry, just terribly, terribly hurt", as Marvin the Martian famously put it.

(https://www.youtube.com/watch?v=Cwxc_zLH560 if you insist, it's only 72 seconds. Also featuring the line "there's a growing tendency to think of Man as a rational thinking being, which is absurd! There is simply no evidence [...]". Scriptwriter on a roll there.)


Intel's IT department is so big it was hard for me to believe. On the other hand, I am pretty amazed that they were so serious about keeping internal application communication encrypted end to end, in every layer.


They're likely a very juicy target for economic espionage.


Hope some of those folks are chip designers and they decide to go work on the RISC-V/lowRISC projects.


Maybe it's time to innovate a bit? I have zero reasons to upgrade my 6-year-old Mac Pro; I would need to spend 5K to get a mild performance boost.


I agree but that's a bad example. Macs are notoriously expensive for the performance they deliver.


That's a stereotype from a time long gone. The difference in price between a Mac Pro and an identically configured Dell workstation is minimal. On the low end, yes, there is a price difference. In the mid tier, the 27" 5K iMac costs about the same as Dell's 5K monitor without the computer (they use an identical LCD).


I bet VR will lead to more PC sales in the coming years.


There's this thing called the stock market where bets like this are quite common. If you're certain, now is the time... :)


> The company said it’s shifting focus to higher-growth areas, such as chips for data center machines and connected devices.

Where have I heard this before? I think in a little book called the "Innovator's Dilemma". Can anyone predict what happens next?

I wonder if Intel will try to push Atom into "Core i3" and make single-core Core i7's next to "increase profitability". They've already started making dual-core Core i7s - I mean how ridiculous is that idea?! Isn't a dual core Core i7 supposed to be a Core i5? Do their brands still mean anything anymore?


Intel is in a very cyclical industry. They over hire when the economy is doing very well, and they cut when things are not. I got a pink slip from Intel back in 2001 just after I graduated college when the dotcom bust happened.


"Robots will soon begin taking human jobs in places like retail stores, fast food restaurants, construction sites and transportation. The key technology that will fuel the transition is inexpensive computer vision systems, and the number of human jobs at risk numbers in the tens of millions. More than half of the jobs in the United States could be eliminated."

http://www.amazon.es/Manna-Visions-Humanitys-Future-English-...


Makes sense. My 5-year-old laptop has an i7-2820QM, 8 vcpus, is plenty fast for the stuff I do, 4600 bogomips, 45W. My recent upgrade was a NUC with a Pentium N3700, 4 real cores, 3200 bogomips, 4W. Pretty impressive.


I think if there's any time Apple would switch to their own ARM designed chips for Macs, it's now. This along with Intel slowing down from their Tick-Tock schedule will probably do it.


Why? The PowerPC->x86 jump was made largely because of the need for more performance from the chips (raw power) and better performance per watt (for laptops). Intel wooed Apple with what they had available at the time as well as the impressive roadmap of where they were going. I often get 10 hours of battery life on my MacBook Pro, so they did something right!

I totally think Apple will go to ARM, but I can't see Apple making the shift until OS X gets similar performance on ARM as on x86. When Jobs introduced the x86 switch, he said they had had an internal version of OS X running on x86 since day 1. I'm sure they have a version on ARM as well right now, and are just waiting for it to get "good enough". Apple's marketing on recent iOS devices has also been very interesting: lots of use of the phrase "Desktop Class". Also, the GeekBench scores and AnandTech teardowns support the hype: Apple's ARM chips truly are best-in-class and rapidly approaching mainstream x86 performance.


You already made the case for it. In a couple of years, Intel performance will simply not matter to 90% of users. They will care more about battery life, size, screen quality, weight, thinness, quality of build materials, speed of the SSD, etc.

But the big difference from the PowerPC->x86 switch is that Apple now has the majority of their products on a different architecture. Apple is a company that likes simplicity. They are going to want to have everything on one hardware architecture. As laptops get thinner they will basically be able to use identical hardware in iPads and laptops. It will just be a different hardware exterior and OS.


I think it's obvious why and you've stated part of it. Apple's designs are rapidly approaching the level of Intel's and Intel is stretching out their R&D and reducing their investment in this area. So Apple will be asking themselves, why give away all that margin to Intel when we can do it ourselves and keep it? (Intel's gross margin is 62%, no idea what it is for their consumer processors specifically).


I misunderstood you then. I read your post as "Intel in financial trouble? This will make Apple leave them".


Even so the current Apple SoCs are phone/tablet parts which cannot match broadwell IPC or clock frequency, except maybe on the lowest-end mobile parts. You can be sure that they're working on that, but my guess is it's at least two years away from being on a retail shelf.

And on the desktop and server side, even longer. I doubt we will ever see a 64-bit ARM part that turbos up to 4GHz+ with haswell/broadwell-like IPC. You can say it's because the desktop market is dead/declining/unnecessary now/etc, but it's also because nobody else is able & willing to match Intel's superb physical design. It's a shame that Intel management has failed to monetize that more effectively.


That is the bet I placed some time ago: I predict Apple switching to ARM in about 3 years. I believe the work on that is already happening in secret.

As demonstrated by things like the iPad Pro, ARM is already more than capable of the things 90% of users are going to do. Realistically Apple would be best off with an Intel-based Mac Pro, but Apple likes to keep things simple, so I think they will just beef it up with loads of ARM chips clocked as high as possible with plenty of cooling or something.


While there are obvious market valuation issues at play, I think this signals more about Intel and its future strategy. The layoffs come shortly after their $16.7 billion acquisition of Altera was completed.

Intel really missed out on mobile and with PC sales rapidly declining it looks like they are going to refocus on enterprise and data centers. ARM and NVIDIA/GPU computing are also expanding rapidly in those areas and that will pose a major threat to Intel.


I wonder how CEOs feel about their bottom line when they're off by such a big percentage in the workforce they need.

Couldn't they have predicted this sooner?


As you read these comments, I'll repeat mine from an earlier post: Don't come to HN for legal, medical, or economic advice!


Being in hardware is getting harder and harder.

Increasingly the footprint of hardware is becoming sparser, replaced by software, etc.

It is time to make the push to make open-source hardware mainstream, from the point power comes on to where software picks up.

There are many, many really good reasons to do this, but in the end, to me, it will define how free the world is.


This will have a pretty bad effect on the company's morale. Depending on how long the layoff process runs (1 month vs. 1 year), everyone will feel extremely uneasy going into work, knowing today could be their last day.


Wondering if Intel is moving out of some markets completely - mobile chips (phones/tablets) is the one market that comes to my mind. The margins on those are almost non-existent currently (on the low/medium end).


How competitive is Intel in mobile chips these days, have they been improving?


I think the trouble with application processors is that they carry lower prices and margins than desktop and server parts, so even if Intel does great there, it's not that great financially. Intel is a relatively high-margin business, so it either needs to find markets where that works, grow volume very drastically (hard given their already very high volumes), or shrink.
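
To put rough, purely illustrative numbers on that (the prices below are invented; the 62% gross margin is just the company-wide figure quoted upthread, and real mobile margins are lower):

  $300 desktop/server chip at 62% margin ≈ $186 gross profit per unit
   $30 phone/tablet chip   at 62% margin ≈  $19 gross profit per unit

So even at the same margin, Intel would need roughly 10x the unit volume in mobile to replace one desktop-class sale.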


Guess they should have kept going with StrongARM/XScale. They could've had a mature, trusted product line by now.


Unlike capitalism, globalization is zero-sum.


Almost exactly decimation.


Closer to nonation, technically, but decimation is the word everyone already knows.


"decimation" is also the word with meaning. "nonation" is just turning "nineth" into a verb and then somewhat of a gerund, like "ninething". Without the historical context, it has little meaning.


Yet that is exactly the pattern used by decimation.

  decimus (tenth) + -ate (convert to verb) = decimate
  decimus (tenth) + -ation (resulting state of -ate form) = decimation
It could be used with any Latin-rooted ordinal number. Primate, secundate, tertiate, quartate, quintate, sextate, septimate, octavate, nonate, decimate, undecimate, dodecimate.

Primate, secundate, tertiate, and octavate have meanings other than either "reduce by 1/Nth" or "reduce to 1/Nth". Tertiate is actually roughly equivalent to "threepeat". Those words have about as much inherent meaning as "tenthing", but some had meanings extrapolated from "decimate" as early as 200 years ago. The "reduce to 1/Nth" meanings would be more accurately encapsulated with the "an-" (to) prefix, as in "annihilate": literally "to nothing (verb)". That would give you antertiate, anquartate, anquintate, et cetera for the reciprocal-valued words.

As I said, "decimate" is the most well-known of the bunch. "Annihilate" is also well known. But every last one of them is a perfectly acceptable English combination of Latin roots, and should be understood easily enough.
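
To make the pattern mechanical, here's a throwaway sketch (the root list is just the one from the paragraph above; nothing in the output is standard vocabulary apart from decimate/decimation):

  // Throwaway sketch of the "Latin ordinal root + suffix" pattern described above.
  const roots = [
    'prim', 'secund', 'terti', 'quart', 'quint', 'sext',
    'septim', 'octav', 'non', 'decim', 'undecim', 'dodecim',
  ];

  const verbs = roots.map(r => r + 'ate');    // primate, secundate, ..., nonate, decimate, ...
  const nouns = roots.map(r => r + 'ation');  // primation, ..., nonation, decimation, ...

  console.log(verbs.join(', '));
  console.log(nouns.join(', '));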


> should be understood easily enough

I think you'll find you have an easier time communicating if you stick to terms that people have heard before, even if they're slightly less accurate than new creations.

Of course, if you're trying to get famous, you should absolutely invent new words. Take Thomas Friedman's "glocalization": that one didn't do so well. Stephen Colbert's "truthiness", though, was a gem.


I refuse to participate in the evisceration of English vocabulary. If you can't guess the meaning of an unfamiliar English word from context, your working vocabulary simply isn't large enough. A staggeringly large number of English words are derived from common elements filched or inherited from Greek, Latin, Old Norse, Anglo-Saxon, and Norman.

Knowing the root words is sufficient both to guess the meaning of many of the 600k+ English words that are in the dictionary but not typically used, and to formulate new, English-sounding words with intuitively obvious meanings. This is similar to knowing the IUPAC chemical naming rules, which let chemists name molecules in such a way that other chemists will know how a molecule is structured just by reading the name. (Z)-Hex-3-en-1-ol, for instance: "hex" means the longest carbon chain is six atoms long, "3-en" means there is a double bond following the third carbon, "(Z)" means it is in the cis configuration, and "1-ol" means there is an -OH alcohol group on the first carbon in place of a hydrogen. English is less formally structured, but it still has rules.

Truthiness = truth + -y (similar to) + -ness (quality of being)

Therefore, truthiness is the quality of being similar to truth, which is as Colbert's character described it. It follows the rules. It reads like an English word with unambiguous meaning, and is therefore adopted as though it already was an English word.

Glocalization = global + local + -ize (convert to verb) + -ation (state resulting from the verb action)

This smashes two dissimilar words (global, local) into one portmanteau that has ambiguous, unclear meaning (glocal), and then tries to extend it with regular English suffixes. Portmanteaus are less readily adopted without literary backup. Dodgson's frumious (furious + fuming) never would have made it without the Bandersnatch, and slithy (slimy + lithe) required a bit of explanation from Humpty Dumpty. Needless to say, Dodgson was much better at it than Friedman.

This is why I like to say that people who know English well are sesquilingual, because you need to know a little bit of several other languages to know that many of the words.


> Without the historical context, it has little meaning.

This is what school is for.


"non" is 9, "deci" is 10, and the workforce was reduced by 11 percent. How on earth is 11 closer to 9 than to 10? Or were you just trying to sound smart?


I think if you reformat that percentage as a fraction, you'll understand what he's going for.
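
Spelling it out, with the 11% figure from the headline:

  1/10 = 10.0%   (decimation: lose one in ten)
  1/9  ≈ 11.1%   (nonation: lose one in nine)

An 11% cut is closer to one in nine than to one in ten, hence "nonation".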


Right now, the only content in this article is "Developing...".



I saw that as well, refreshed a few times (thinking it was failing to pull in content via some over-complicated API call) and then saw the article appear.

I gather this was just a stub to help the site's SEO rankings, sort of like how support teams that are measured on "time to first response" will quickly reply to any new ticket with something like "Thanks for getting in touch. We'll look at your ticket and get back to you shortly." It may take them 24 hours to get back to you, but they've stopped the clock. Maybe Google News gives extra juice to sites that are first to break a story.


http://www.cnbc.com/2016/04/19/intel-reports-q1-2016-earning...

"Shares of Intel were halted after the bell Tuesday as Intel announced it would cut 12,000 jobs, or 11 percent of its workforce, by 2017 due to restructuring. The technology company also said the chief financial officer would leave that role."


I am actually curious. Did everyone who upvoted the title post not click on the link at all?


clearly, no.


HN's scoring algorithm should ignore or discount upvotes if the user did not click the article link before voting. For extra credit, HN could also watch for page-visibility changes indicating that the user, in addition to opening the article, actually switched to the article's tab. :)
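
Just to sketch what the "extra credit" part could look like on the client side (the endpoint, the item id, and the link selector below are all hypothetical, nothing HN actually exposes; document.visibilityState and navigator.sendBeacon are real browser APIs, though):

  // Rough sketch only: report that the reader plausibly viewed the article tab.
  const itemId = 123456; // hypothetical story id

  let clickedArticleLink = false;

  // Hypothetical selector for the story's title link on the comments page.
  document.querySelector('a.storylink')?.addEventListener('click', () => {
    clickedArticleLink = true;
  });

  // Page Visibility API: when the reader comes back to this tab after clicking
  // the link, infer that they at least switched to the article tab.
  document.addEventListener('visibilitychange', () => {
    if (document.visibilityState === 'visible' && clickedArticleLink) {
      // sendBeacon survives tab switches better than an ordinary request here.
      navigator.sendBeacon('/hypothetical-read-signal', JSON.stringify({ itemId }));
    }
  });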


I sometimes read an article at work or home, and then later read the comments and upvote later at home or work. Don't assume that everyone operates the machine the same way you do, nor with the same goals.


They just announced earnings. Press release for restructuring announcement:

https://newsroom.intel.com/news-releases/news-release-intel-...


[flagged]


I think it's called Twitter Bootstrap, and it's terrible for both user experience and performance. So much hate...


A bubble? Is the next crisis coming?


GoodBYE!


Oh man, they're even better than AMD at laying off people.



