
Do you think Intel can come back, or is Intel too dysfunctional/inefficient to recover?



I had friends at the Intel fab in New Mexico in the 90s, and this type of turf war was prevalent internally at the fab even then. Managers were constantly jockeying for position at the expense of manufacturing processes.

Just one anecdotal story: one of the wafer guys had an idea to improve a process. He went to his boss, who shot it down instantly. The wafer guy knew he was right, so he went to another manager who liked the idea and implemented it. It ended up working and being a great idea. Instead of being rewarded for his idea, the wafer guy was fired by his manager for failing to follow the chain of command.

It's built into the culture at Intel to be territorial to advance. It's been that way for at least 30 years. I personally don't see it changing any time soon.


I'm currently reading High Output Management, which is 25 years old, so your comment addresses the time when the book was written.

Grove certainly gives off quite utilitarian vibes. He also says that a collaborative culture is important. Your comment suggests that neither he nor his successors managed to make Intel's culture collaborative enough.

Higher management is probably often an iterated prisoner's dilemma: collaborating is best for the company, but defecting is good for the ambitious manager.
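A minimal sketch of that payoff structure, with made-up numbers purely to illustrate the dilemma: mutual collaboration is best in total for the company, but whatever the other manager does, the ambitious manager comes out ahead individually by defecting.

    # Illustrative prisoner's-dilemma payoffs for two managers
    # (numbers are made up): tuples are (my payoff, their payoff).
    payoffs = {
        ("collaborate", "collaborate"): (3, 3),  # best combined outcome for the company
        ("collaborate", "defect"):      (0, 5),  # the collaborator gets exploited
        ("defect",      "collaborate"): (5, 0),
        ("defect",      "defect"):      (1, 1),  # worse for everyone than mutual collaboration
    }

    # Whatever the other manager does, defecting pays the individual more:
    for theirs in ("collaborate", "defect"):
        mine_if_defect = payoffs[("defect", theirs)][0]
        mine_if_collab = payoffs[("collaborate", theirs)][0]
        print(theirs, mine_if_defect > mine_if_collab)  # True both times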


Is it a good book you'd recommend otherwise?


Not the parent, but I read the book, and the principles have become so common nowadays that I never really had a wow moment. It was more like "Yeah, this guy is telling me things I already know." It is nice to see the common knowledge built up from scratch rather than just accepted the way it is, but I wouldn't consider the value you get from the book worth the time today.


I agree that you get most of it by reading HN for a few years. However, it is a small, old book, so it's not that much of an investment.

It did have a few ideas that were new to me, and it presents a holistic, consistent management philosophy that you don't get from reading a hodge-podge of blog posts.

Two ideas that were new to me: matrix organisation is inevitable for large companies in search of the sweet spot between agility and efficiency, and knowledge workers are middle managers.

Overall: Good book but not "you have to read it" level.


The difference is that back then Intel still had a quasi-monopoly. Even AMD's occasional successes from the late 90s on were usually temporary and never an existential threat to Intel, so they could afford a certain level of internal dysfunction.

But we are now living in a world where mobile is more important than desktop, and mobile chip designs have enough power to be a threat to the desktop market, and perhaps even the server market in the not-too-distant future. This is something that should worry Intel.


I think they have been a commodity for a long time. Rephrased, the discussion is similar to, "Will GE make a comeback in the refrigerator market?" Intel isn't going anywhere; they have many, many big customers to satisfy who simply cannot shift to ARM due to ROIs <= 0. And after 30+ years of this, I also don't think there's anything new or revolutionary in silicon fabrication on the horizon. I stopped getting excited about CPU tech when AMD crossed the 1GHz barrier in the PC space. Could be burnout though.


> I stopped getting excited about CPU tech when AMD crossed the 1GHz barrier in the PC space

That was only around 2000. There has been a lot for me to get excited about since then:

* AMD64

* SMT

* Dual-core / multicore

* On-chip integrated graphics that weren’t awful (low-end discrete graphics aren’t a thing anymore)

* Gigantic L3 and L4 caches (a side-effect of on-chip graphics)

* Turbo mode making a comeback (yes, I know Turbo Mode switches actually slowed down computers)

Judging by Apple’s direction, it’s looking like we’ll start seeing actual (ultra-fast) RAM on-chip too; with Optane, that could mean we can finally move away from block-addressed storage and have a massive flat memory space for everything.


AMD64... aka Yamhill on Intel. Holy shit, I had to sign waivers and get a special Unix account just to ACCESS the RTL model with Intel's 64-bit iHDL.

I kinda laughed when AMD beat Intel to the punch not once, but TWICE. (1GHz, then 64b).

I also worked on the 740, 810 and 815 (both hardware and drivers; oddly, I did a lot of stuff). Game devs HATED us for the 810/815 because it shipped in such huge volume that they had to target it as the LCD.


Can you elaborate on all of that? I'm unfamiliar with "RTL", "iHDL", "740", "810", "815". I assume "LCD" is lowest-common-denominator?


Correct me if I am wrong.

RTL https://en.m.wikipedia.org/wiki/Register-transfer_level

RTL is a way to model a digital circuit, and some common HW design languages are VHDL, Verilog and Intel's proprietary iHDL.

Numbers are just Intel chipset families.
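For a rough intuition of what RTL means (this is not real RTL, just a plain-Python sketch of the registers-plus-combinational-logic idea; actual RTL would be written in Verilog, VHDL or iHDL):

    # Toy sketch of register-transfer-level thinking in plain Python:
    # state lives in registers, and on each clock edge the next value
    # of each register is produced by combinational logic.
    def counter(width=4, cycles=20):
        count = 0                                    # a "register" holding the current state
        for _ in range(cycles):                      # each loop iteration = one clock edge
            next_count = (count + 1) % (1 << width)  # combinational logic: a wrapping adder
            count = next_count                       # the register latches the new value
            print(count)

    counter()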


> And after 30+ years of this, I also don't think there's anything new or revolutionary in silicon fabrication on the horizon.

Why so? A few more material and device generations are on the way. Litho has gone through more generation changes in the last decade than in any previous one.

Whole new device classes are on the way: optics on silicon, new memory, logic in the backend, and logic on package.

Semiconductor engineering is still career suicide though; that hasn't changed in the last few decades.


This is my "640K should be enough" comment, but we've reached the limits of optical fabrication with ~2nm. And really, it is just smaller transistors. Yes, that glibly does a disservice to the process engineers making this miracle happen, but I can get excited about "look, smaller transistors!" only so many times in my lifetime. :)

> optics on silicon,

Yep! That, and quantum and post-von Neumann architectures. There's some weird shit on the horizon that I probably won't live to see in household products, but I don't think it will be optical deposition by layer. It might, who knows; I'm officially talking out my arse now. :)


Optics may come earlier than people expect.

And not even for I/O, but for on-die communications and buses.

High-speed SERDES may well be replaced by same-die optics as a lower-transistor-count, dumber, and cooler solution.


> Semiconductor engineering is still career suicide though; that hasn't changed in the last few decades.

This is so true. I am currently a hardware engineer and planning to switch to software. If you are not at Nvidia, Apple, or other mega-growth hardware companies, you are suffering at least a 40% pay cut compared to the software industry while working about twice as hard, using obsolete tools and flows, with almost no opportunities for technical advancement or significant impact. Meanwhile the annual raises are cost-of-living adjustments. It's truly depressing.


> Semiconductor engineering is still career suicide though; that hasn't changed in the last few decades.

Wow, so I need to thank myself for accidentally bumbling towards ops and dev? Meh, for me it’s just that in the EU there’s a total dearth of opportunities in the industry.


People making WordPress websites are making more than PhD research scientists in Taiwan and Korea.


TSMC announced that they're raising salaries by 20% across the board, so that should improve things a bit: https://www.youtube.com/watch?v=T6DaxXukog4


And they cut the bonus at the same time.


Depressing


> Semiconductor engineering is still career suicide though; that hasn't changed in the last few decades.

Could you go into more detail?


Most semiconductor jobs are in Asia, and in those same Asian countries you will earn several times less than a software person of the same seniority.

The long-term trend is that the industry will get less and less labour intensive as more and more old, labour-intensive fabs are retired and new ones are 100% AMHS (automated material handling).


AMD came back. Fabless, bruised, but also thin, agile, and with a 10-year plan that, thanks to TSMC, they can execute more or less on schedule.


AMD came back from being second fiddle in the right market. Intel would have to come back from being first fiddle in the wrong market.


Competition is good for markets. There's still not enough competition in the general CPU market, but it's getting closer: Intel, AMD, Apple, Qualcomm, Samsung, Broadcom.

Intel has money and equipment and existing market share: there's no reason for them not to be profitable for the long term, as long as they stop acting like they don't have any competition (as they did until 2018) or like their only competition is AMD (as they're currently pretending).


Intel will have a chance to regain relevance in several years when the industry reaches the next disruptive inflection point. That will probably be tied to the arrival of a new hardware form factor like AR goggles or direct neural interfaces or something.



