The death of corporate research labs (dshr.org)
445 points by fanf2 on May 21, 2020 | 248 comments



Hold up. Lack of anti-trust enforcement is blamed (among other things) for the end of corporate R&D, but monopoly breakup is exactly what killed Bell Labs! The smaller companies that resulted weren't going to fund their own labs, and the only reason the lab existed was to find new markets to explore. In fact, nearly all of the examples of successful R&D labs came from corporations that so dominated their industry that they put money into finding new investments: Xerox, Eastman Kodak. Google might meet this definition.

I think we also overrate the significance of the corporate labs. There are not a lot of successful examples where the host company actually profited from the invention. Just a lot that bungled them or prematurely killed them (like when AT&T almost invented the internet). Or snuck out accidentally (like the Xerox Alto).


It wasn't being too large that forced the split-up but AT&T's anti-competitive practices and position. The regional Bells were still big enough to run their own research labs.

Imagine if Alphabet were broken up: Google Search could still afford to run a research lab, as could YouTube.

The problem, I think, is how easy it is for large companies to acquire smaller ones. It's how they expand or enter a market; they can't be bothered to bootstrap a new org unit, so they just devour smaller, innovative and creative companies. Look at Google: they couldn't be creative and patient enough to compete with YouTube, so they gulped up YouTube. It's the big-corp M.O.

So why bother with R&D when you can just buy a smaller company that does the R&D, tests the market and builds a brand for you? My answer: you will suffer from brain drain and reputation loss. When you buy a smaller company, consumers assume that brand is now dead. You become a cemetery of dreams and ideas. You become an IBM, HP, Xerox or AT&T. Once the damage is done it becomes nearly impossible to recover from. I like IBM as the best example: they are doing a superb amount of innovation even today, but all their initiatives lack traction or a competitive edge. They have a ton of smart people working on brand new areas of tech like quantum computing, but their reputation and overall culture have not been great, and they've been declining consistently. Look at Yahoo. Yahoo!! They had legitimate means to compete with Google toe-to-toe, but they relied too much on acquisitions, as did Verizon, which recently acquired them for a meager $4B.

In the end I blame all this on how publicly traded companies prioritize quarterly profits over multi-year growth. Acquiring bumps up the stock price for a while; spending billions to compete from scratch or develop a new concept is risky, so the stock goes down.


> Imagine if Alphabet were broken up: Google Search could still afford to run a research lab, as could YouTube.

It is far more likely that YouTube would go bankrupt, since running a free video service where the catalog is almost infinite and cache hit rates are much lower is very costly. Couple that with losing all the other advantages of being with Google, like access to talent and infra, and I don't see YouTube surviving without Google. It's possible that the slack would be picked up by Facebook video, and that is an even worse and more closed platform.

I don't see a cheap way of solving the YouTube problem. (Yes, P2P exists, but it has too many problems, like battery consumption on phones and NAT.)


That would be fantastic though: we would have an influx of startups competing in the video space, since they wouldn't have to compete with a YouTube artificially propped up by a completely unrelated business model.


Yeah, startups aren't going to magically make things cheaper than YouTube, so any that arise would most likely follow a paid model.

Now that might be fantastic depending on where you come from. If you have disposable income, it is a huge win to have a non-ad-supported video platform, which might even be more privacy-protecting. If you are a poor kid using YouTube to watch MIT OCW, Khan Academy and innumerable other resources, well, you are screwed.


If you are a poor kid, you upload your interpretation of a Bach piece and get slammed by an unjust, extralegal pseudo-copyright mechanism that is in essence an attack on human culture.

MIT and Khan Academy could easily host their content using BitTorrent and reduce their bandwidth usage by 95% ( https://nrkbeta.no/2008/03/02/thoughts-on-bittorrent-distrib... )
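To make that concrete, here is a minimal sketch (Python with the libtorrent bindings) of what self-hosting over BitTorrent could look like: publish a .torrent for a lecture directory and add the institution's own HTTP server as a web seed, so the origin only serves peers that can't find anyone else. The directory name, tracker and URLs below are made up for illustration.

    import libtorrent as lt

    fs = lt.file_storage()
    lt.add_files(fs, "ocw-lectures")                   # hypothetical directory of video files
    t = lt.create_torrent(fs)
    t.add_tracker("udp://tracker.example.org:6969")    # hypothetical tracker
    t.add_url_seed("https://ocw.example.edu/videos/")  # origin server doubles as a fallback seed
    lt.set_piece_hashes(t, ".")                        # hash the pieces; run from the parent directory
    with open("ocw-lectures.torrent", "wb") as f:
        f.write(lt.bencode(t.generate()))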


Why do you think that the dozen startups remaining in the space after a possible YouTube demise wouldn't be burdened by the same policies? I think the copyright lobby would try their best to tighten their control at the moment a huge corp stopped throwing its weight against them.

What YouTube gives the poor kid is instant, free access to a huge audience. Yes, it comes with an (admittedly not very high) risk of a false positive and a copyright block. I suppose the kid is smart enough to also keep copies of the performance on other services, and safely stored locally.


Oh, they absolutely would! Every music streaming service is already beholden to a tiny record industry cartel, even though there are a number of music streaming services.

But it's simple: video is more versatile, and by removing YouTube's monolithic dominance you remove the single point of failure that copyright cartels have been able to attack.

As an example, imagine a separate YouTube clone dedicated to education that was actually willing to fight for fair use! Or a YouTube that didn't automatically demonetize you for swearing.

The internet already gives these kids (and the rest of us) both access and an audience; we have just been stupid enough to lock large parts of our cultural heritage into corporate silos, protected by "intellectual property" laws. Competition wouldn't solve this completely, but it would make it harder to distort the market.


That's basically what PeerTube tries to do. It doesn't work well in practice for most people and organizations that try it because most videos only have 0-1 watchers at a time.


Yeah, exactly. Because they have to compete with a behemoth subsidized by an invasive advertising company.


Thanks @cannabis_sam, never thought of this but yes absolutely. The tech is slowly getting there... but innovation would speed up a lot more if there was a significant enough pain (and therefore significant enough upside).


What are you talking about?


I'm talking about competing with an entity like YouTube, which is funded by an internet advertising behemoth. Is that what you're talking about?

YouTube can afford to offer a lot of value to the customer, much more than it would have been able to sustain by itself without the Google support.

Without offering all that value, YouTube would, by definition, not be as attractive to consumers.

Therefore consumers would be more likely to choose alternatives like PeerTube.

Therefore projects like PeerTube would have more support and would develop more quickly.


I think YouTube fits in perfectly with Google. It allows them to offer another medium of advertising (video) to their customers. If YouTube were split off, both creator earnings and YouTube earnings would fall through the floor, because the ads wouldn't be worth as much without Google's targeting algorithm behind them.

A company unrelated to their business model is something like Waymo.


Netflix made it and proved a market for paid streaming services. Google has bungled this again and again and is currently in the middle of pitting its ad revenue against its top content creators, which will only wind up killing both. They've already demolished their recommendation algorithm in favour of just showing repeats of what you just watched, so they can cover their ass. Creators are still using third-party services to get paid, since almost anything can get you demonetized nowadays. Giant channels like JRE are jumping ship. Nobody wants to pay for YouTube Premium if all they're going to do is kick the legs out from under all the worthwhile channels and turn the service into Cable TV 2.0. All this means they've got much bigger problems than paying for storage and delivery.


Netflix and YouTube are not even remotely comparable. The vast majority of Netflix traffic can be served from hardware appliances installed at ISPs, or from the caches of Netflix DCs, since the tail is very small. Plus Netflix is very expensive outside of the developed world in terms of PPP, whereas YouTube is free for even the poorest.

They do have problems, yes, but there is far too much good content on YouTube, with a very, very long tail. In lockdown I have tried dozens of different cooking channels for different things. It is the magic of YouTube that so many people are able to take their content to so many viewers seamlessly, at no cost or loss of convenience to them.
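A toy model makes the caching point concrete: if requests follow a Zipf-like popularity curve, a fixed-size edge cache of the most popular titles absorbs most of a small catalog's traffic but a much smaller share of an enormous one's. The catalog and cache sizes below are made up, not real Netflix/YouTube figures.

    import math

    def zipf_hit_rate(catalog_size, cache_size):
        # Under Zipf(s=1) popularity, the share of requests served by caching the
        # top-k titles is H_k / H_n; use the log approximation of harmonic numbers.
        H = lambda n: math.log(n) + 0.5772156649
        return min(1.0, H(cache_size) / H(catalog_size))

    print(zipf_hit_rate(5_000, 5_000))        # small, Netflix-ish catalog: 1.0 (fully cached)
    print(zipf_hit_rate(500_000_000, 5_000))  # huge, YouTube-ish catalog: ~0.44 with the same cache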


I agree that there is no _free_ way of solving the YouTube problem. As for cheapness? We don't yet know what the space looks like because YouTube kills off its competitors by undercutting them preemptively. If we hadn't had YouTube all these years, by now there's at least a possibility that other approaches (ad-driven, subscription-driven, etc.) would have been explored for low-cost user-generated content, and we'd have a better idea of what the actual cost is, as opposed to this bifurcated monopoly between the networks providing professionally-produced content (including burgeoning streaming services) and YouTube capturing the remaining content.


It seems a competitor like Dailymotion was able to survive on its own, YouTube competition aside. I'm not sure it's viable with YouTube around, but without Google in the picture it seemed profitable.


Eh, another venture-backed story with no road to profitability, handed off between various acquirers and allowed to bleed money in the hopes of drawing business.


Were you talking about Facebook, YouTube, Google or Dailymotion?


In this case, I was lumping dailymotion in with the others you mention.


Google makes about $15B a year from YouTube.

That pays for a lot of storage. And storage is very cheap at that scale.

https://www.google.com/amp/s/www.theverge.com/platform/amp/2...


The cost of streaming video is not storage; it's dominated by bandwidth, and bandwidth is expensive when you can't cache videos the way Netflix does.


a) Google doesn't pay for bandwidth. They have peering agreements because of all the dark fiber they bought (https://www.lightreading.com/optical/dwdm/google-dark-fiber-...)

This isn't exclusive to Google though - most large bandwidth users have peering agreements.

b) Google does have edge caches[1]. It's true that the long tail of YouTube videos is longer than Netflix's, but since Google doesn't pay for traffic (see (a)) this only affects speed, not cost.

[1] https://peering.google.com/#/infrastructure


I would argue that in a system where the content is "almost infinite" the average value of each content item is "almost zero". Cache hit stats would be interesting, to see what percentage of the content is never watched.


> My answer: you will suffer from brain drain and reputation loss. When you buy a smaller company, consumers assume that brand is now dead. You become a cemetery of dreams and ideas. You become an IBM, HP, Xerox or AT&T. Once the damage is done it becomes nearly impossible to recover from.

Would Apple be a counter-example? Multitouch, PA Semi, Siri, TouchID, etc. Lots of Apple technology was acquired, but nobody laments the death of those "brands".


Because all the tech Apple acquired was acquired before it was a big, recognisable brand. Also, Apple are very good at making it look like they came up with it themselves. Ask someone who doesn't follow the movements of big tech companies 'Who invented Siri?' or 'Who invented TouchID?' and they will say Apple.


You mention Siri as an example of Apple taking credit. Siri was on the App Store before Apple bought it and they kept the name.

But honestly, who but geeks cares what the general population thinks about who invented it?


I worked at SRI, though not on Siri. (Wish I had; they got a nice payday when it was bought.) Siri was weaker before Apple bought it, because there was only so much they could do with the public APIs. Apple buying it was the best outcome. It allowed Siri to come into its full potential, while at the same time inspiring others to make their own.


The same happened with the Workflow app. It was a great app when it was independent, but it became much better when Apple bought it and integrated it into iOS.


> Siri was weaker before Apple bought it, because there was only so much they could do with the public APIs. Apple buying it was the best outcome.

Umm, this is exactly the problem the article talks about.

The fact that you had to pay obeisance to Apple to get access to private APIs and data is a failure of the anti-trust mechanisms.

The fact that Apple could just buy up Siri without anti-trust mechanisms kicking in meant they could just sit in their castle and let somebody else do the hard work.


Why would I trust random third parties with the level of access that Apple has? Should Apple also allow access to the Secure Enclave by third parties?

VCs don't invest in startups to help them build low-profit "lifestyle businesses". They fully expect them to be acquired or, rarely, to go public. Just as an anecdote we are all familiar with, only two YC companies have gone public (PagerDuty and Dropbox), and Dropbox is still not GAAP profitable.

If the government takes away the ability of companies to get acquired, investment in startups will dry up and $BigTech would be the only ones with money to do research.


I guess the line of reasoning above is that anyone concerned with remaining competitive in the global market should care, because (again, following the reasoning above) the general population is what decides on economic policy in a democracy.


Apple just acquired Dark Sky and turned off the API, website, and Android app. To me they look like a bunch of lazy dicks who can't be bothered to write their own weather app.


I don't know what the current figure is but traditionally Apple's spending on R&D has been quite low for the size of the business.


People here are constantly lamenting the lack of good multitouch on anything but a MacBook, especially in Linux. If FingerWorks had stayed an independent company they could also have sold their touchpads to Microsoft for the Surface. Of course, Apple bought them before they were a well-known brand; maybe that's why their acquisitions are more successful.


Where do we draw the line between innovation and new features?


> Imagine if Alphabet were broken up: Google Search could still afford to run a research lab, as could YouTube.

This doesn't make sense to me. Google Search and YouTube are just front ends for Alphabet to make money through advertising.

If Alphabet truly were broken up, Google Search and YouTube would not be able to use Google Ads jointly in the same way, and they don't make money otherwise.


> My answer: you will suffer from brain drain and reputation loss. When you buy a smaller company, consumers assume that brand is now dead.

Not so sure about that. Maybe if you read HN a lot, but I think most users won't even notice.


"Facebook bought WhatsApp - guess we're moving the family group chat to Signal"


If that anecdote were applicable to the broader market, wouldn't we have seen WhatsApp's numbers decline and Signal's numbers rise after the acquisition?


Signal is very popular, but WhatsApp was acquired before Signal was well known (2014-15, I think), and its user base mostly doesn't even know it belongs to FB. I have had multiple people act shocked when I tell them WhatsApp and Instagram are FB.


I was being sarcastic


Your core consumers, the ones foundational to your brand, will always care. But more often than not, if there are no alternatives, it just becomes long-term disinterest in your brand.

Fun example: Star Wars fans and the Disney acquisition.


Considering Star Wars media continues to print money for Disney, and core Star Wars consumers were appalled at the franchise before the acquisition (the prequels), that's not the best example.


On the other hand, people complain about Apple "sherlocking" apps when they release Sidecar for example. By your logic,

> they couldn't be creative and patient enough to compete with YouTube, so they gulped up YouTube

So what can they do? If they had launched "Google Videos" they would have surely been accused of squashing small companies... What option do they have?


> Imagine if Alphabet were broken up: Google Search could still afford to run a research lab, as could YouTube.

Is YouTube even profitable?

Doesn't Google make 90%+ of its revenue from ads?


YouTube's main value is its market dominance. That's why it's so terrible to use: there's no competitive incentive to actually improve the user-facing aspects of the service (the advertiser side is a COMPLETELY different ballgame, though). They already own the video space and have Google/Alphabet's financial muscle to keep competitors away.


> Is YouTube even profitable?

Yes, extremely[1].

Google (and most large bandwidth users) doesn't pay for bandwidth because they have peering agreements. Storage is cheap.

[1] https://www.google.com/amp/s/www.theverge.com/platform/amp/2...


Your link does not say whether it's profitable.

Revenue != Profit.


Agreed.

Hence my comments pointing out that the things that most people seem to think stop YouTube from being profitable (bandwidth and storage) aren't really factors for Google.


Under Google, not so much. But you just need liquidity for R&D, the point being that you will develop new concepts that improve profits. Netflix, Hulu and Amazon Prime all came after YouTube; perhaps good R&D without Alphabet's shackles would have let them compete in those markets and more (video conferencing, for example, like Zoom).


Netflix had a profitable DVD business, Hulu has the backing of major studios, Prime has Amazon.

I don't think YouTube would have survived without Google.


Ironically, acquiring generally lowers the acquirer’s stock price in the short term.


Increasingly they don't even buy small companies. They just copy. Look at half of Microsoft's recent activity.


Recent? They have been copying since the 1980s...


True. There was a phase of acquisitions in the interim.


Anti-trust enforcement can take many forms (breaking companies up is just one of the most extreme and obvious ones), but often just the threat of action was enough, since for a while the US government backed it up.

A big reason Bell Labs was created and perpetuated was that AT&T feared being broken up. Bell Labs was effectively a PR vehicle they used to show the US government that they were giving back to the community, and an excuse to continue operating as a de facto state-sanctioned monopoly. Today's toothless FTC and DoJ don't really inspire that kind of fear.

Source (sort of, paraphrased): this is a big theme of The Idea Factory, which describes the heyday of Bell Labs.

(Edited to further develop my thought)


This isn't quite accurate: they were mandated by law to fund Bell Labs, and barred from profiting from its inventions outside of the telephony domain.


Well, a lot of (maybe most) deep machine learning breakthroughs are coming directly from research scientists at Facebook, Google, NVIDIA and other major tech companies, or from research organizations associated with them or their executives.

It's difficult for academic labs to keep up, actually.

Whilst modern economics and the startup model make it easier for companies to let startups do the actual idea testing (and then buy up the successful experiments), there are a ton of companies that are still driving entire industries with their corporate-funded research.

We should also probably note that a lot of the funding for some very successful academic labs comes from these large corporations as well.


> It's difficult for academic labs to keep up, actually.

That's probably more an intrinsic problem with academia (lower wages, limited career opportunities, limited resources compared to companies).


There is a spectrum of the kinds of research one might do. My advisor always used to remind me that if a team of good engineers at a big corporation can do something, it's not a good research problem for a grad student in a uni lab to tackle. So we naturally tended toward more speculative/fundamental problems, which seems like a good thing: instead of trying to compete with industry labs, we augment them.


This is kind of odd logic though. If a well funded team of experts who care about the utility of the results would focus on a problem, you shouldn't.

What makes you think the advisor's projects were genuinely more speculative or more fundamental, versus just dead ends that nobody cared about? It's not like corporate labs don't fund fundamental work. Google has been funding quantum computing for a long time now.


That sounds like a good thing, but realize that some day industry will build an application based on your ideas and patent the result. Patents are a monopoly on technology, and working for a university, probably the last thing you had in mind was that other people would monopolize your inventions so that their use would become limited.


University labs are generally good at dealing with patents. Researchers do generate patents where it makes sense, and the processes are set up such that the university gets some ownership, with the option for the researchers to transfer the knowledge to companies, e.g. a student graduates and founds a startup with the patent as foundational IP. This is all intentional, and it's one of the ways the labs do their job of disseminating their discoveries and transferring knowledge to industry.


Still, patents are a way to provide exclusive access to technology to those who pay. That's not something I personally would like to see happening to my inventions if I worked at a university. It's also not how universities traditionally used to operate.

Further, I feel that if it's paid for by the public, it belongs to the public.


This is how my advisor and I reshaped my recent masters thesis. I was thinking of an engineering problem, but he reminded me of some really cool ways we could do basic science. It ended up being really rewarding and still novel.


My advisor made the exact same point. Although we did receive funding from startups to work on adjacent problems.


It's also an artifact of the strategic choice to pursue these technologies.

Due to the resource and scaling requirements (as well as the inherent skill needed), replicating these efforts for expansion purposes is difficult, and that's one reason (not the only reason) these sorts of modeling approaches are being pursued.

A lot of technologies are being chosen because of their inherently high barrier to entry, which limits competition risk.


I used to think monopolies were inherently progress-limiting, but as you point out it's a mixed bag. Bell Labs was an amazing center of innovation even as AT&T stifled the telephone market. I'm not sure corporate labs could exist today with the self-defeating (and legally incorrect) notion that the primary duty of a public company is to make money for shareholders. That philosophy today would/will prevent large companies from investing in their own future via plush corporate labs. There's a great quote from Alan Kay that Xerox did actually make enormous profits from Xerox PARC, as the lab invented laser printing, but Xerox decided not to make trillions by pursuing its inventions of the graphical personal computer and Ethernet.


Ultimately, a lot depends on individuals. Incentives, systems, and such can make things common or uncommon, but innovation tends to be uncommon anyway.

Google was founded, and mostly run, by people with a pretty broad academic interest in science and technology. Pioneering AI breakthroughs is what they wanted, and they found ways to make it logical for Google to do. Both the existence and the nature of their R&D reflect that. Apple and Microsoft were not; they're interested in products. What rocks Zuck's boat is reach. I don't think he'd be that excited about purely technological breakthroughs, only about ways these can extend FB's footprint.


Microsoft Research is one of the biggest and most well known industry research labs and it's been around since 1991, so I don't think the comparison to Apple applies.


How do you evaluate Microsoft Research, then?


Just about everything in the article fails the sniff test for me.

Aside from the issue you mentioned, the article seems to be making the claim that we have less innovation today. Which just seems like a ridiculous claim to me.

It also mentions "pervasive short-termism" as an obstacle. It might be an issue in large public companies, but it's certainly not in the private equity markets. Innovative companies have access to huge amounts of capital and can stay unprofitable for such a long period of time that many people think we're in a bubble. A multi-billion-dollar valuation and "may never be profitable" is perfectly ordinary these days, and certainly doesn't lend any credibility to the author's argument.


> the article seems to be making the claim that we have less innovation today. Which just seems like a ridiculous claim to me.

Looking at economic growth, it's extremely hard to deny. The years ca. 1945-1975 constitute an exceptional period of innovation far surpassing anything today, both qualitatively and quantitatively. (New marginal innovations in tech or pharmaceuticals today tend to be about 10-50x more expensive than just a few decades ago.)

Relevant reading: The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War, by Robert J. Gordon.


In the Enlightenment you could make discoveries about lightning with a kite and a key, in your free time when you weren't working as a publisher or founding America.

In 2020 you need 10,000 scientists and $9bn to build a hadron collider, and 99.9% of people won't even understand whether your result is important or not.

Are we less good at science now, given that we're spending far more resources for far less impact? Or rather, are we better, but the lowest-hanging and juiciest fruit got picked first?


> In the Enlightenment you could make discoveries about lightning with a kite and a key, in your free time when you weren't working as a publisher or founding America.

And yet those discoveries weren't made in the previous thousands of years during which people had been standing around in thunderstorms. The enlightenment represented a paradigm shift where a new innovation - the scientific method - was applied en masse for the first time and it led to revolutions in science, technology, and society. Whenever a paradigm shift happens, be it the practical steam engine or long range electricity transmission, you invariably have a lot of suddenly accessible low hanging fruit. The mid 20th century had a surprisingly large number of such paradigm shifting innovations such as solid state transistors, integrated circuits, jet engines, nuclear reactors, antibiotics, solar cells, the internet, lasers, credit cards, the list goes on. Entire new fields of science and sectors of industry popped up in a very short period of time.

We continue to innovate, but our innovations are mostly evolutionary, and there have been comparatively few paradigm-altering inventions in the past 30 years, especially when one considers how much larger the population of sufficiently educated and affluent individuals who could produce such innovations has become and how much easier it is for such innovations to be communicated. It is impressive how good we've gotten at picking higher fruit, but we should still be asking why we are struggling so hard to find low-hanging fruit.


The big problems in science can be solved by much smaller teams. The LHC is just proving out the dominant paradigm. It's not gonna help us understand dark matter, and _that's_ the real anomaly in the current physical understanding of the world. Crack that and you're the next Einstein/Newton, or at least Planck/Schrödinger/Hooke.

There’s plenty of room for innovation in green energy tech, housing materials, ag science, etc... We’re objectively spending more to get those than we used to need to. And that’s WITH google and scihub! Something is rotten.


You seem to have a fundamental misunderstanding of how scientific research (especially physics) works. You take a model. You figure out some observable predictions of that model. You run an experiment to test a prediction. If it's right, the experiment stands as support for the model's accuracy. More importantly, sometimes the predictions are wrong. After a pattern of anomalies, someone comes up with a model that explains them. Rinse, repeat.

The idea that the "big problems" can be solved by small teams pontificating is missing all but that last step.


We must be talking past each other, as I understand it quite well.

My point (which I understand is controversial) is that there's no requirement for billions of dollars and thousands of scientists to make the next big breakthrough. There's still low-hanging fruit. Less than there used to be, but it's still there.

The LHC and big-$$ funding are _one_ way to do it. So far it's been unable to solve the biggest problems we have in physics (agreed that it's helped disprove some things).

We're at a point now that is similar to where things stood in 1900: some people say "we've almost dusted this physics thing off and figured it all out". I'm fairly confident something is going to come along from some small corner of physics that will knock all our socks off the same way quantum mechanics and relativity did... it may arrive this year, or it may take another 120 years to get there.

My point is that the big problems in science can be solved by much smaller teams. Sure, it will take an army to completely work out the implications of the breakthrough, but the breakthrough "aha" can still come from an individual or a small team.


> It’s not gonna help us understand dark matter

In case anyone wants a quick link to the LHC dark matter program, check the ATLAS [1] and CMS [2] summaries.

TL;DR: The LHC has ruled out a good chunk of a few popular dark matter variants already. There are a lot more searches in the pipeline. They haven't found anything yet, but of course nature doesn't just drop particles in the first place you look.

There are also some crazier plans in the works [3] which use the existing beams to look in other places.

[1]: https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/CombinedSumma...

[2]: https://twiki.cern.ch/twiki/bin/view/CMSPublic/SummaryPlotsE...

[3]: https://en.wikipedia.org/wiki/FASER_experiment


> We’re objectively spending more to get those than we used to need to. And that’s WITH google and scihub! Something is rotten.

What's rotten, exactly? Take a look at this lovely graph [0] showing the cost of solar cells going from $76.67/watt in 1977 to $0.74/watt in 2013 (the price of installation is obviously more than the cell production cost, at about $2.5-$3.5/watt today).

[0]: https://cleantechnica.com/2014/09/04/solar-panel-cost-trends...
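For scale, a quick back-of-the-envelope using only the figures quoted above puts the overall drop at roughly 100x, or an implied average decline of about 12% per year:

    # Implied average annual decline, from $76.67/W (1977) to $0.74/W (2013).
    start, end, years = 76.67, 0.74, 2013 - 1977
    overall = start / end                       # ~103.6x cheaper overall
    annual = 1 - (end / start) ** (1 / years)   # ~0.121, i.e. about 12% per year
    print(round(overall, 1), round(annual * 100, 1))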


Surely you see the difference between explaining what dark matter is, and making solar cells cheaper.

Which one fits your definition of an important scientific question that we should answer?


The years 1945-1988 are exactly the years of the Cold War as well. It seems likely to me that much of our innovation did not come from corporate research labs alone, but from government and defense spending. And defense spending was not just bullets and band-aids, but also basic research. (My favorite example is that the Lambda the Ultimate papers were paid for by the Office of Naval Research.)


It's not surprising that the period that covers the invention of transistors, integrated circuits and microprocessors saw significant technological leaps. You can make the same observations about the periods when steam, gas and electric power came to market.

What you're describing, though, is a small number of companies making significant technological leaps, not the quantity of innovation. People also argue that the invention of the internet, and the subsequent global proliferation of high-speed internet, represents the same sort of revolutionary leap forward (one that we're going through right now).


People argue that, but it's not anywhere in the numbers (or, even more importantly, in lived experience). The argument can be made that this innovation just lags the invention of the internet, but that's kind of unfalsifiable and looks increasingly like rationalisation.

One key difference is that inventions like the microprocessor or the steam engine are deeply fundamental. They're at the 'backend' of the chain. Internet-based digital services are at the consumer end, which has produced some growth but could be argued to mostly fuel hedonistic consumption. (It's hard to figure out what deep growth watching hours of TikTok creates.)

In his book Gordon points to the thought experiment of going to sleep in 1870 and waking up in 1970 in NY. You'd be in an entirely transformed world: cars, buildings reaching into the sky everywhere, electrified subway stations, computers, drugs that save countless millions of lives, modern agriculture and so forth. Go to sleep in 1970 and wake up in 2020 and what's changed, other than people staring at tiny screens?


But you're not describing the pace of innovation at all. You're just describing your own judgements about the value of modern innovations. What would COVID-19 look like in 1970? I bet there wouldn't be many people working from home...


COVID-1970 would have spread a lot more slowly due to a much less globalized world. The West probably would still not have experienced it, nor would the Soviet Bloc, due to the then-fresh Sino-Soviet split and heavily militarized border.


I think the Spanish flu might not agree with that assessment. Both plagues also did a pretty good job of spreading to pretty much all territories that had any form of contact at the time.


IMO the better example is the 1957 Asian flu pandemic. It also started in China in December-February, but had not yet registered in the West as of May.

It also never crossed the Iron Curtain into Warsaw Pact countries.


2070 will very probably be entirely different from 1970, though. What has changed is that computers have allowed us to go to space (at scale) and achieve other incredible feats.


I remember thinking in the 1980s that the year 2000 would be something unrecognizable. You can feel this optimism in movies like Back to the Future, Blade Runner, Terminator or Escape from New York: they would set entirely different worlds in the relatively near future, largely due to expectations formed by the then-recent rate of technological change.

In reality (aside from a tremendous change in the political landscape) the 80s and the 00s weren't particularly different to live in.


The ability to have all information at your fingertips everywhere is a big big change.


The choice of New York drives some of this. Go to sleep in Shanghai in 1970, and wake up in 2020...


You're focusing on semiconductors, but the same period saw a large number of new plastics, Velcro, contact lenses, credit cards, the atomic clock, the orbital rocket / artificial satellite, the laser, the seat belt, fibre optics, the LED, and many other materials science breakthroughs.


And I guess this is the reason WeWork and its kind are notoriously hated by many people: it sells itself as a "tech startup" while it is not.

"Tech startups" seem to be synonymous with what the author calls the "successors of corporate labs", and that seems to be why VCs seem to be OK with pouring money in for a semi-long-term payback.

People/companies that play on this expectation (such as WeWork and Theranos) get moral shaming, mostly from tech people, while the rest don't seem to get the point of the hate. (Not much different from how some people do/didn't see the point of a corporate lab as more than a cost sink.)

Come to think of it, a company/CEO that overly focuses on the "tech" image might be a warning sign that they are just snake oil (such as the Theranos CEO mimicking Jobs).


I think you're confusing innovation with technological innovation. There are plenty of ways to innovate without advancing technology. There are also plenty of fields of research unrelated to technology.

Even if you were only concerned with technological innovation, referencing well funded non-technical companies doesn’t negate the existence of any of the well funded technology-focused companies.

Also, Theranos is a perfect example of private equity funding technological research. The fact that it was a scam doesn't take away from the interest its investors had in new technology.


Really interesting point regarding the many-year-unprofitable VC-backed companies being almost a replacement form of corporate lab. Though most of those aren't pursuing "hard technology" equivalents of inventing transistors or the like. Well, perhaps quantum computers at Google, but they don't seem to be pursuing that correctly. What are your thoughts on hard technology like that?


I think that moves the goalposts a bit. UNIX was one of the most significant products to come out of Bell, and it's still used today. My laptop, my PlayStation and all of the production servers I'm responsible for use software based on innovations, based on innovations, based on innovations... of UNIX. For things like microprocessor, display, storage, network, camera and microphone hardware, all of that is still advanced via large corporate research, and it continues to be innovated to such an extent that we came up with things like Moore's Law to describe its pace.

A lot of the technology I use at work is funded innovation: Linux (largely Intel), React (Facebook), Rust (Mozilla, kinda), Golang (Google), Objective-C, Kotlin, TypeScript... There are also a huge number of AI startups and some very well-funded self-driving-car startups. For hardware, it's unlikely that a startup is going to disrupt companies like Intel, AMD, Nvidia or Samsung, but that's only because they're already innovating at such an impressive pace. Even those companies don't come up with all their own innovations, though. For instance, the technology in the new displays Samsung recently announced comes from a company called Nanosys, which has been funding-reliant since 2001. Theranos also comes to mind, which, while it was a scam, was a very well-funded one.


>Linux (largely Intel)

Wasn't MSFT the biggest $$ donor, and Red Hat the one with the most devs committing?


Very good points.

I think we (I, maybe) default to the mistake of thinking in terms of "one big systemic explanation." Ultimately, something like investment in or success at innovation doesn't follow strict rules. The factors that go into it tend to be local. Any practicable "Theory of Innovation" is likely to be true locally, at best.

A few years back Neil deGrasse Tyson, advocating for re-funding of the NASA space shuttle, made the argument that private companies will never pioneer space exploration: Mars missions and such. He had Elon in his sights, but also others who were starting to "invest in space" at the time.

It was a good argument. It was logical, consistent with 50 years of space exploration experience and parsimonious with analogies from other industries, history, etc.

Neil was also wrong, and I think he'd agree to moderate that argument significantly today.


Is the Space Shuttle really a great example? Two of the five exploded, killing people, and it ended up being more expensive and less efficient than the non-reusable Russian equivalent.

https://gizmodo.com/the-space-shuttle-was-a-beautiful-but-te...


There were 135 Shuttle missions flown. Two ended in disaster.

There is a huge difference between a 1.4% failure rate and a 40% failure rate.

NASA got a whole lot of science done with those missions.


But more missions would have been flown if the Shuttle had been more reliable. This is a kind of survivorship bias, where we don't know how many missions would have been flown with a different design.


Is explode the right verb for Columbia?

Also, it's not as though the Russians and Soviets didn't have their share of in-flight fatalities as well.

The shuttle was capable of missions that wouldn't have otherwise been possible. It also ferried more people at once.

That said, the shuttle was severely underutilized and hence cost a lot more than it should have.


> Is explode the right verb for Columbia?

I don't think so, really.

Columbia didn't explode in the sense of a massive detonation of something onboard. It had a hole punched in the wing during takeoff. During re-entry, hot gases penetrated the wing and led to failure of the control surfaces, and the resultant loss of control (it was gliding at this point) caused heating and dynamic pressures that eventually led to vehicle breakup [0].

Strictly speaking, Challenger also wasn't destroyed by an explosion. The failure of an O-ring on one of the Solid Rocket Boosters let pressurised burning gas destroy the attachment hardware that held the SRB in place. The SRB ripped away, causing the entire stack to tumble. Challenger ended up at the top and had its back broken by aerodynamic forces at the same time as the thin skin of the external fuel tank shredded, leading to a sudden massive burn of the fuel it contained (which caused the 'explosion' effect so visible from the ground). The crew compartment 'survived' all this and continued ballistically upward before falling back to the sea, where it was destroyed on impact, killing the crew. [1]

[0] https://en.wikipedia.org/wiki/Space_Shuttle_Columbia

[1] https://en.wikipedia.org/wiki/Space_Shuttle_Challenger_disas...


No, and it's not precisely the argument he was making. He was arguing for manned exploration generally, and large-scale space projects generally. I was abbreviating, poorly, as you point out.


Very true. I think there is also a lot of hindsight bias in the article and the comments.


He was wrong, although the space exploration pioneering that SpaceX may do (landing on Mars, for example) seems more likely to be motivated by passion than by shareholder returns.


But most companies are motivated by passion. The idea that it's all about shareholder returns is a myth put about by the ideologically motivated. Most companies don't even have public shareholders to begin with, and of the rest, it's not like Apple, Facebook or Google ever even tried to care about what their shareholders thought. They were totally open that they were motivated by passion and would invest billions into whatever pet projects the CEOs happened to find cool. Heck the trend has been to define voting classes such that shareholders are formally excluded from even marginal influence over decision making.

This stereotype of the corporate executive who only cares about increasing profits is based on a kernel of truth, but only because the world is full of mature companies whose leadership are devoid of ideas for where to go next... so why not focus on optimisation ("shareholder returns")? But then again, the world is full of NGOs with no vision, and it's full of academic departments churning out low-quality grant applications just so they can expand their labs. Lack of vision and a focus on money is hardly unique to any one kind of group. At least shareholder returns are about making money for other people rather than yourselves!


It seems like a good funding model for basic research to leverage the size of monopoly companies like those mentioned.

In a monopoly position they are no longer concerned about competition, so they don't need to worry about their position anymore. Startups, by contrast, are constantly in survival mode, so basic research with long-term benefits is about the last thing they would spend money on.

Long-term basic research in academic settings is also not very viable. It seems to favor vocal people with grant-hunting skills over true researchers. Also, the academic environment seems to diverge from the practice of science toward something more like the practice of religion, studying and promoting topics that are either completely ungrounded in reality or outright politicized.

Monopoly companies, on the other hand, have a good track record of amazing research that is still practical but has a long-term vision, which can greatly benefit the entire society.

So monopoly companies should not be broken up but forced to allocate a percentage of revenue to this kind of research, as in the examples of Xerox PARC and AT&T Bell Labs. If companies have the privilege of no competition, they should pay heavily for it.

Also, since monopolies often fall into the category of too-big-to-fail companies necessary for the economy, they should be paying a bailout tax: you know they will need bailout money in a crisis, so better to fund it upfront instead of being so surprised that we need bailouts every time the market wobbles.

Monopolies are not very well studied by political and economics circles; everybody fears them and thinks of them as bad. They are inevitable anyway: infrastructure networks can't form a competitive market unless a new technology appears. Maybe monopolies have a natural role in society.

Embrace, don't fear, the monopoly. But understand that it is a monopoly and that you grant it expensive privileges it would otherwise use to extract rents from society. Make it pay appropriately for those privileges and you can form a nice symbiotic relationship with monopolies.


>Startups, by contrast, are constantly in survival mode, so basic research with long-term benefits is about the last thing they would spend money on.

Sure, startups are in survival mode, but they are trying to survive while getting someone else to pay for testing an idea. Aren't startups like VC-funded research, in a way?


Read The Innovator's Dilemma. It's all about asymmetric competition and how to compete against a monopoly. All of the FAANGMs got to be that way by toppling a previous market leader.


The book "The Idea Factory: Bell Labs and the Great Age of American Innovation" is a great story about the history of Bell Labs and how the anti-trust suit effectively killed it.


This statement ignores the reason put forth by the authors of the research paper mentioned in the article: that anti-trust enforcement discouraged _mergers and acquisitions_, forcing companies to invest in research.

FTA: >Arora et al point out that the rise and fall of the labs coincided with the rise and fall of anti-trust enforcement: Historically, many large labs were set up partly because antitrust pressures constrained large firms’ ability to grow through mergers and acquisitions. In the 1930s, if a leading firm wanted to grow, it needed to develop new markets. With growth through mergers and acquisitions constrained by anti-trust pressures, and with little on offer from universities and independent inventors, it often had no choice but to invest in internal R&D. The more relaxed antitrust environment in the 1980s, however, changed this status quo. Growth through acquisitions became a more viable alternative to internal research, and hence the need to invest in internal research was reduced.

A good question would be: although there was a decline in these kinds of research labs, did the market make up for that decline through the current strategy of investment in entrepreneurs and startups?


> There are not a lot of successful examples where the host company actually profited from the invention.

Maybe we should not hold on to the bureaucracy patents bring, especially considering that in some countries international patents are used as a recipe for copying innovations.

Tesla has open-sourced its patents because it is of the opinion that competition is won by attracting the best engineers.

> Elon Musk: "After Zip2, when I realized that receiving a patent really just meant that you bought a lottery ticket to a lawsuit, I avoided them whenever possible."

https://www.tesla.com/en_GB/blog/all-our-patent-are-belong-y...


Further, they're assuming causality in the direction of a decline in corporate R&D causing the rise of (generally government-supported) university research. But I don't see them providing support for that assumption.

Could it be the other way around? That government-supported research in the universities had (at least to some degree) the effect of crowding out the corporate labs? I mean, if the government is going to foot the bill, then why would I want to spend my own money on it?


> I think we also overrate the significance of the corporate labs. There are not a lot of successful examples where the host company actually profited from the invention. Just a lot that bungled them or prematurely killed them (like when AT&T almost invented the internet). Or snuck out accidentally (like the Xerox Alto).

Only on HN would someone argue that we've overrated the significance of the invention of GUIs because it didn't make the company that invented it enough money.


Microsoft Research has had a lot of features and products come out of it. They also publish an incredible amount of basic research.


Bell Labs and Xerox PARC are the poster children for successful corporate R&D labs, but I wonder if, to some extent, they were in the right place at the right time? The advent of modern computing and digital telephony provided rich soil for impactful research.

I worked for a large R&D lab a while back after completing my PhD, but the organization turned out to be completely directionless. Funding went to snake-oil salesmen who charmed executives with flashy proposals that they could never deliver on. I don't think there were any major successes during the time I was there, or since.

They also paid their Bay Area researchers about half the salary of FAANG senior engineers, so they really couldn't retain top talent. This is the downside of being a corporate R&D lab that's not funded by a near-monopoly.


I think the largest corporate R&D lab at the moment must be Huawei's. The last time I heard from someone working there, they had more than 10K PhDs working for them; even if only half of them work on R&D, that is really massive. It also helps that its founder is an ex-military R&D engineer.

The silver lining is that, since R&D is necessary for progress and innovation, it will happen elsewhere. I foresee that the majority of R&D will move to universities, and industry-sponsored research will become the norm rather than the anomaly. This is also fueled by the fact that graduate students' stipends are much, much lower. You did mention that Bay Area researchers are paid half of what FAANG senior engineers make. In most developed countries, the salary of university graduate students is probably a quarter (4x lower) of what company researchers make. Developing countries have it worst, paying 10x lower than the Bay Area.


Google has a ton of PhDs, too, but the vast majority don't do R&D. I'm guessing the same is true at Huawei.

Anecdata: My team of 12 or so at Google had (I think) 4 PhDs, and what we did was the usual "turn one proto into another" Google work that barely required a CS undergrad degree, to say nothing of a PhD. My wife has a PhD, works as a software engineer, and also does pretty routine data plumbing work.


I too worked in an R&D lab after my PhD. Which projects to fund was always a problem. No-one knows which projects are going to succeed. Do you stick with a group that has been making steady progress for 5 years or switch to something else that sounds more revolutionary?

Under different leaders my lab flip-flopped from "blue sky, do what interests you" phases to "we need to focus on value" phases.

Neither produced anything revolutionary. But the blue-sky phases did produce some useful work.

Research is a risk. Funding it is like gambling. Don't bet more than you are prepared to lose.


> Bell Labs and Xerox PARC are the poster children for successful corporate R&D labs, but I wonder if, to some extent, they were in the right place at the right time? The dawn of modern computing and digital telephony provided rich soil for impactful research.

I think this is part of it. But also, there were just so few companies doing serious R&D for technology; breakthroughs were new and unique. These days there are so many different companies constantly innovating that breakthroughs are the expectation/norm.


Yes, there are two things that drive innovation: competition and new technology/timing.

Many of these companies have beneficial monopolies/oligopolies for a while; they know that if they don't innovate, others still will, even if there are no major competitors at that time. They know when timing and technological capability (hardware, software, design or other) are at a point where they must progress.

Amazon is a great example of this, with how much they reinvest in research and development. It would be hard for a non-engineering/product-innovation company to compete. Right now Amazon's market-leading position is beneficial; in some cases they overstep, but mostly it is beneficial. Google might be another. Both have flashes of abusing their position, but mostly they are still innovating and pushing forward. These companies also inspire small startups to make products, acting as extended research and development divisions from which the best outcomes are selected. While some acquisitions are bad, mostly the fact that startups can be purchased by one of the larger companies shows the acquirers are into R&D, and it leads to more of it.

ISPs and banks, for instance, are two areas where monopolies/oligopolies are stifling innovation and growth; with ISPs in particular, our network suffers due to the grip they have on this needed utility.

Bell Labs was also from a time that had more engineer/product/creative people in leadership roles with the ability to influence the power structures, so direction has changed quite a bit with that. R&D is very hard to justify to the value extractors, even if the value creation is clear or maybe just isn't obvious yet.

The mere fear of missing out on technology/timing and potential competitors is the only thing that drives innovation at all. Once monopolies/oligopolies start to use their power position for holding others back by stifling competition rather than them moving forward and using their power position as a booster for product/innovation value creation, that is when anti-trust is needed.

Microsoft, for instance, started to abuse their position in the 90s, so the anti-trust action started. What the world got out of that was the Apple resurgence (Apple even got a $150 million investment from Microsoft at the zero hour to stay afloat), Google, Amazon, etc. It even turned out well for Microsoft, as they are a much better company today, recognizing that innovation, not limiting competition, is the way forward.

Without the mere fear of being broken up, Microsoft slowed.

With the anti-trust case, it slowed them down just enough to allow competition to get closer.

Anti-trust is the blue shell in Mario Kart.

The anti-trust blue shell is very much needed: if the main player gets too far ahead and abuses their position, the game is no fun. Anti-trust is the rubber-band AI system that keeps the game competitive [1].

Additionally, monopolies/oligopolies are bad when the value creators (engineering, product, creative) lose power to the value extractors (business, finance, marketing) in a company.

It really isn't the fault of the value extractors for extracting the most value from the created value, but if there is no competition or balance between creation and extraction, that leads to stagnation in value creation and eventually to more power plays that abuse market-leading positions to stifle competitors. Ultimately we all lose when that state is entered.

Here's a great quick point by Steve Jobs about product stagnation, the managers/business side and how they can run amok if not kept in check to allow value creation to continue, and about the problems that arise in monopolies when only the business/managers are in charge. [2]

> It turns out the same thing can happen in technology companies that get monopolies, like IBM or Xerox. If you were a product person at IBM or Xerox, so you make a better copier or computer. So what? When you have monopoly market share, the company's not any more successful.

> So the people that can make the company more successful are sales and marketing people, and they end up running the companies. And the product people get driven out of the decision making forums, and the companies forget what it means to make great products. The product sensibility and the product genius that brought them to that monopolistic position gets rotted out by people running these companies that have no conception of a good product versus a bad product.

> They have no conception of the craftsmanship that's required to take a good idea and turn it into a good product. And they really have no feeling in their hearts, usually, about wanting to really help the customers.

Market leaders should always fear the anti-trust blue shell, when that fear is gone the game is not competitive and we all lose.

[1] https://www.giantbomb.com/rubber-band-ai/3015-35/

[2] https://www.businessinsider.com/steve-jobs-on-why-innovation...


I think the nature of research is to create the chance to be at the right place at the right time. Bell Labs and PARC in particular, many of their inventions do not seem to be inevitable extensions of what came before.

Of course, it's also possible to run a research lab poorly and do nothing of value, naturally I don't know anything about your experience.


> but I wonder if, to some extent, they were in the right place at the right time?

It's certainly true for PARC, which owes its success in large part to the ARPA research community that preceded it. Many of the same people from that community came to PARC at the beginning, as government funding was drying up elsewhere.


What fertile soil is there today for such labs to reappear?


HP?


I am curious what the author thinks about the same problem in pharma/biotech.

On the one hand, they are kind of _forced_ to do research in one form or another to power the pipelines, so clearly corporate research is still alive there.

However, given the pathetic track record the industry seems to have, and an arguably complete lack of real innovation (almost all of the drugs in most pipelines are just antibodies or their variants), some form of corporate research rot seems evident here as well.

One problem I often see on the pharma side is how revealing it is when the CEO of the org is not a technical person. GSK recruited the CEO of L'Oreal as its new CEO. TBH I can't for the life of me figure out how someone who sold lipsticks can make decisions on which preclinical program has the highest chance of success in a human being. If the CEO of a company is not versed in the fundamental technology behind what they make, can the company actually be successful?


I think you’re really not giving L’Oreal the credit it deserves. It’s much more than just a lipstick vendor. Sanofi, Europe’s largest pharma company, has its roots in L’Oreal (L’Oreal previously was the parent company of Synthelabo, which used to be the #3 pharma company in Europe until it merged with Sanofi).

L’Oreal also does research in things like methods to regrow human skin/hair, which aren’t on the same level as cancer treatments, but they do involve clinical trials and medical research. Hell, even something like launching a new facial moisturizer requires testing akin to clinical trials.

Aside from that, Emma Walmsley wasn’t CEO at L’Oreal, and she left there in 2010 to join GSK, where she worked for 7 years before being promoted to CEO in 2017. It’s not as if she went from being some business bean counter straight into being the head of GSK; she had 7 years to build GSK domain knowledge before taking the helm.


Thanks for the context, but I suppose I just couldn't pass up on the poetic phrasing of "selling lipstick" a la Steve Jobs' "selling sugar water" phrase.

Point still stands though. Even most of my PhD friends often lack fundamental understanding of how biology works, and I find it hard to believe an MBA can ever catch up no matter how much training they get. I'd argue that in general it also shows - biotechs run by scientists seem to do the actual path breaking research just the same as in technology.


You don't need to have a deep understanding of biology to have a decent grasp of drug development. You're operating at a higher level of abstraction. You could argue such a CEO wouldn't be able to independently gauge the value of a new innovative medicine or approach, but that's where the CSO and others come into play.


The issue is, what if the CSO is actually not competent? Can you tell if they're bullshitting or not? What if they are competent but giving incorrect advice because it's their pet project and invalidating it now means they look bad? How can you tell?

In the end, I personally believe that the CEO needs to know enough of the underlying technology their company is working on to smell bullshit. Otherwise their execs have a very high likelihood of taking advantage of them. I've seen it happen on a smaller scale repeatedly in my lab: if a professor is not well versed in one field of science, the postdocs and students will take advantage of it at every turn.

This is especially true in biology. It doesn't matter how good your nanoparticle drug looks: if you don't know that there are fundamental problems with immune recognition, half-life, biodistribution and non-specific binding, you won't know not to invest further. This evaluation is not something a CEO should outsource to another C-level exec.


What good is technology if you can't bring it to market? A CEO who hires smart and listens to their VPs doesn't need to know the technology. Often the "smart people" are the worst CEOs. I would be willing to go as far as saying that Cook may be a better CEO at this point in Apple's life than Jobs would have been.


Walmsley isn’t “an MBA”, she has a Master’s in Classics and Modern Languages ;)

Real world experience trumps a degree in almost all cases. It certainly helps, but you don’t need a PhD to be a scientist, and you definitely don’t need one to lead scientists.


Pharmaceutical companies largely focus on the mechanical bits of clinical trials, logistics, manufacturing, distribution, marketing, etc because those are much more predictable at the scale of a large multinational. They're not the R&D powerhouses they once were because they have run out of easy small molecule drugs.

Since the market size/fit/price sensitivity is extremely predictable and there are so few customers to negotiate with (insurance companies, nationalized healthcare, medical purchasing groups, etc.), this is far more efficient for pharmaceutical companies than having a broad and risky drug discovery pipeline. Instead they acquire early and mid stage biotech companies with promising pre-clinical trial results or phase 1/2 successes - in essence externalizing the risk of early R&D even further to academia, venture capital, private equity, and retail investors. With the increasing availability of easy capital of the last few decades, this has become even more of a win-win for investors and the pharma giants recently.


Exactly. Pharma companies being large multinationals are very risk averse and very lawyer run. People also underestimate the value of everything except inventing the drug. Everything else costs a lot of money to do and a lot of expertise to do well.


> it's revealing when the CEO of the org is not a technical person

Merck's CEO is a lawyer.

> they are kind of _forced_ to do research in one form or another

Modern pharma companies spend more on legal and marketing than they do on R&D [1]. Valeant and Michael Pearson are emblematic of that horrifying trend.

MBA types and McKinsey alums have very different priorities than most hackers/researchers/builders, and it is their priorities that are driving investment decisions these days. That really needs to change.

[1] https://www.bbc.com/news/business-28212223


Spending more on marketing than R&D has been a favorite thing to zero in on but it's not really a useful metric. See here: https://blogs.sciencemag.org/pipeline/archives/2013/05/23/an...


The article you linked isn't great.

It attacks a study that no one finds credible, which labeled a company's entire SG&A as "marketing" and found that "marketing" so measured was 10x R&D. It also predates the BBC link.

So, what you meant to say is "calling SG&A marketing is not really useful," which is true. But that's not what the linked BBC article did at all.


I'm not that knowledgeable about pharma, but wasn't Valeant an anomaly rather than the trend in how modern pharma works? I mean, Michael Pearson was fired, and Valeant even had to change its name and rebrand after its fraudulent "buy pharma, cut R&D, spike the drug price" scheme was exposed. Or are there more Valeant-like pharmas, with Valeant just the tip of the iceberg?


> TBH I can't for the life of me figure out how someone who sold lipsticks can make decisions on which preclinical trial has the highest chance of success in a human being.

I'd say a CEO's job is to hire someone who is qualified to make those decisions, and to make sure there's enough money to keep the lights on so they can do that job.

In that respect, running a company like L'Oreal isn't so irrelevant.

Not saying your broader point is wrong though. Someone needs to have a vision for the future, and I'm not sure you can really have a strong vision without having been through the trenches yourself.


>If the CEO of a company is not fundamentally versed in the fundamental technology they make can the company actually be successful?

The fundamental technology of modern pharma is the same as that of a cosmetics company: marketing. Convincing people that this lipstick or that anti-cholesterol drug matches their lifestyle best, or, even more, selling people on the lifestyle of that lipstick, that drug, etc.


The original document seems to suggest that the life sciences, at least as far as publication is concerned, appeared to buck the trend.

> ... firms in the life sciences such as Pharmacia, Lilly, Bristol Myers Squibb, Pfizer, and Amgen significantly increased publications. In the case of Pfizer and Amgen in the 2000s, this increase in publishing kept up with changes in R&D expenditures. One key feature of the pharmaceutical industry during this time period was the strong merger activity. However, comparisons with other sectors that also experienced strong merger activity suggests that the publishing behavior of firms in the life sciences was not simply an artifact of merger activity.[1]

[1] https://static1.squarespace.com/static/593d9b08be65945a2e878... Page 29


There is a fundamental difference between a CEO at a company operating at GSK's scale and one at, for example, a few-hundred-person biotech startup. It's a mistake engineers and domain experts make to assume the CEO always needs to be technical. In small companies a technical CEO can be a benefit, because the CEO is effectively the leader of everything and often has their hands in a lot. But that just does not scale to a company the size of GSK. The challenges of leading a company that size are fundamentally different and start to look a lot more like the challenges of leading any other big multinational.


CEOs are usually business people. The "product" they work with is the development and sustenance of the business and shareholder value. They sometimes have domain knowledge but if they don't, the board should have access to a sufficient supply to make sound technical decisions.


Their goal is development and sustenance of the business; that's not their product unless they're Berkshire. They can only sell products profitably if they make products that work. All the marketing in the world can only do so much (not saying it can't salvage anything) if a drug just doesn't work. And a CEO should know, as well as any human can, whether a drug they will be spending billions on has a good chance of working or not. In the end, that's the most important decision for a pharma company and a CEO cannot just outsource it to some other experts. From where I sit, that is the reason pharma companies don't deliver as much value as they should, given the investments they make.


Their goal is development and sustenance of the business; that's not their product

That's why I put "product" in quotes. The CEO's concern is the business. Domain experts handle the tactical decisions.

a CEO cannot just outsource it to some other experts

I'm sorry but that is precisely what happens in big business all over the world. Call it "expert guidance" if you want, but if you really think the CEO is solely responsible for making domain decisions, as well as all the business development ones, you're misinformed. They will have views, but those will almost certainly come from the business end, not the domain end.

Apart from anything else, domain knowledge can change so fast, a CEO can't be expected to stay at the bleeding edge and manage the business.


I think the patients doing great with ‘just antibodies’ would take issue with your analysis.


There’s a school of thought that CEOs don’t matter, or matter much less than we might have imagined.

Further reading: https://duckduckgo.com/?q=ceos+dont+matter&t=h_&ia=web


Did you just link a DuckDuckGo search query and call it "Further Reading"?

We know how to search, thank you very much. If you want to help us, you can link a specific book/article, and call that further reading.


Ah, right, yeah, fair point. My apologies.

I vaguely recall reading something on HN a while back, but also didn't want to get accused here of linking to a biased source, as I'm not really familiar with the leanings of any of the ones I found in that search.

Bit lazy, but had my reasons. Will try to do better next time.


Why aren't there blue sky research labs funded by curious billionaires? Xerox PARC's key computing inventions cost a total of US$48 million in today's dollars [1]. There may not be so much big cheap low-hanging fruit now, but there is still plenty within reach. Bezos could be doing this instead of or as well as Blue Origin (which he's been funding at US$1 billion/year [2]).

[1] https://www.forbes.com/sites/chunkamui/2012/08/01/the-lesson...

[2] https://en.wikipedia.org/wiki/Blue_Origin


Think of what the Vision Fund could have achieved with this mindset.

I’d love for a billionaire to offer 18-25 year-olds, $25k for a summer to explore new projects.

You could fund 1000 kids with promising projects for $25 million.

If the next great innovation will start as a toy, we need to encourage people to make more toys.


A few do.

- There are three Allen Institutes (Brain Science, Cell Science, and AI), funded by Microsoft's Paul Allen in Seattle.

- The Gates Foundation (also funded by Microsoft, I guess) funds research, though mostly at existing institutes.

- The Howard Hughes Medical Institute funds a bunch of research at its Janelia Research Campus, plus provides lavish support to investigators at universities.

- Eli and Edythe Broad put up about $700M to endow the Broad Institute at Harvard/MIT.

- The Michael J Fox foundation funds a lot of Parkinson's Disease research.


Don't forget:

- Chan Zuckerberg Initiative

- Simons Foundation

- Gordon and Betty Moore Foundation

- Sloan Foundation

You can even put companies like Numenta on this list, which is pretty much blue sky research, even if nominally for profit


Yes!

I was actually just looking at a job posting at the Simons Foundation, which makes its omission that much more embarrassing. The Schwartz Foundation also paid for some stuff (and a lot of pizza) in grad school, so I should give them a nod! CZI and Simons/Flatiron have their own buildings. I think GBM and Sloan are more grant-making organizations.

There's also DE Shaw Research, which is particularly interesting since the billionaire in question works there himself.


I mean, isn't that what Peter Thiel is doing? Giving college age people $100k to drop out and do a startup?

https://thielfellowship.org/


With a startup, there's a very strong profit motivation. With blue sky research, you're just exploring and discovering and inventing, and doing that without a strict goal of profit leads to unexpected things that wouldn't easily be found in other settings.


Not just startups (though the focus might have shifted toward them in recent years, not sure). The Wikipedia article says that Fellows are funded to "drop out of school and pursue other work, which could involve scientific research, creating a startup, or working on a social movement".

https://en.wikipedia.org/wiki/Thiel_Fellowship


In some ways, yes. I was advocating for something a little smaller and tied a little less directly to commercial products.

More of a mashup of the MacArthur and Thiel Fellowships.


Isn't this basically what universities do with graduate students? They tend to be a bit older and the pay is maybe a little less, but universities do allow a pretty high level of freedom.

Of course there are lots of problems with academic culture, like the overemphasis on maximizing citation count. But the setup you describe would also require some form of simple metric to track progress, to ensure that the $25k isn't going down the drain.


A lot of academic research is locked into a particular model where every project has to be doable by 1-3 main people, most of whom are "trainees", and produce a (positive) result in 1-3 years. It's not totally impossible to do other things, but the career incentives push pretty hard in this direction: trainees need 1st author papers, it's harder to fund postdocs after a few years, etc.

Proper staff scientist jobs would help break out of this mould and might even be more cost-effective by reducing churn in the lab and providing people with more guidance.


==But the setup you describe would also require some form of simple metric to track progress, to ensure that the $25k isn't going down the drain.==

Maybe. I don't think MacArthur tracks the spending of their Fellowship (a much larger amount).

==According to the foundation's website, "the fellowship is not a reward for past accomplishment, but rather an investment in a person's originality, insight, and potential". The current prize is $625,000 paid over five years in quarterly installments. This figure was increased from $500,000 in 2013 with the release of a review of the MacArthur Fellows Program. Since 1981, 942 people have been named MacArthur Fellows, ranging in age from 18 to 82. The award has been called "one of the most significant awards that is truly 'no strings attached'".==

https://en.wikipedia.org/wiki/MacArthur_Fellows_Program


The article claims that university research is primarily curiosity-driven rather than commercially driven, so it ultimately discovers fundamental solutions instead of practical ones that can be sold.


University research is strongly optimized to please grant committees.


> Xerox PARC's key computing inventions cost a total of US$48 million in today's dollars [1]

This sounds like a form of survivorship bias. If you knew in advance which research would pan out, it wouldn't be research. You also need to include the cost of research done by other companies that led to nothing, or to something less impactful.


Hold on. Corporate research labs are _fundamentally different_ than academic research labs.

Why? Time horizon.

Companies will not fund research that has a more-than-20-year expected time to product. Usually, they won't fund things that will take more than 10 years to go from R&D to product. That's because of investment-- think about startups, what LP wants to put money in a fund for more than 20 years?

On the other hand, academic labs, funded by governments, often do work that pays off more than 20 years later. Think about Watson and Crick - their work on DNA in the 40s and 50s led to antibody drugs in the 1980s that are now being used widely today.

Bell Labs was the exception that tried to do long-term, basic research work, and its failure proves that corporate funding will eventually dry up for any long-term research. There's just no business case.

The US has the biggest tech and biotech economy in the world because the US government (that is, US citizens and US society) was smart to fund longterm basic research at the highest level in the world, building universities that attract some of the best talent in the world. Corporate research labs do more short-term work.


Bell Labs was a massively successful experiment. It didn't just try to do basic research work; it delivered: synchronous sound motion pictures, the photovoltaic cell, encrypted speech transmission, the transistor, the laser, electronic music, cellular telephones and optical tweezers, and it discovered the cosmic microwave background, among other things.

Why did it fade, though? AT&T was able to fund Bell Labs when it still had a monopoly over the telephone market. The article claims that antitrust laws were relaxed in the 80s, but AT&T agreed to be broken up in 1982 (https://en.wikipedia.org/wiki/Breakup_of_the_Bell_System). This breakup led to increased competition and innovation in telecom, but it came at the cost of Bell Labs losing funding until eventually being spun off into Lucent.

Peter Thiel argued in the book Zero to One that only monopoly (or pseudo monopoly) businesses can afford to fund surplus salaries and corporate research labs that cost billions of dollars with potentially little or no gain for many years. I would argue that if a corporation exists in a competitive industry, their margins are driven too low to fund basic research.

Also note: even in the U.S. corporations fund a good chunk of basic research in academia: https://www.sciencemag.org/news/2017/03/data-check-us-govern...


Corporate labs do fund very long-term research. Google was interested in and working on AI more or less from its early years (anyone remember Google Sets?), and it is still funding fundamental AI research more than 20 years later. They have now been funding self-driving cars for more than 10 years and still do.

The reason it appears rare is that funding research on the assumption it might be useful more than 20 years from now is extremely wasteful. Companies don't work that far ahead because the risk of just going down a dead end for half a lifetime is very, very high. In academia that doesn't matter because people are rewarded merely for researching novel things; in the real world people are rewarded for doing things that are useful.

It matters. For every DNA you can cite, others can cite dead-end branches that, despite decades of research, have gone nowhere and probably never will. In CS, how many programmers are using Haskell every day? It's been in development for 35 years yet virtually nobody uses it. The new languages that gain traction (Rust, Swift, Go, Kotlin, etc.) invariably come from corporate R&D labs, and outside of a few bits of useful syntax they borrow little from academic research languages. Even Haskellers have now admitted that laziness was a dead end; new FP languages like Idris don't default to it. Practically the entire field of PL research was swallowed up by FP and continues to be dominated by that paradigm (e.g. dependent types), despite the vast majority of PL users being uninterested in them.

Computational epidemiology. It's been researched for >20 years yet the models are always wrong. There are private sector epi models, but not surprisingly little work is done on them because there's obviously a missing piece, and developing huge and insanely complex simulations (e.g. the 15,000 LOC monster Ferguson produced) is obviously a dead end.

String theory. How much time has been sunk into that? Not a single testable prediction.

And in biology: one biotech firm found that 9 in 10 papers don't replicate. Papers that professional labs can't replicate are not "useful in 20 years"; they're "not useful today, or ever". People had been selectively breeding for many centuries; agritech firms would eventually have figured out the structure of DNA if Watson and Crick hadn't.

Meanwhile we tend to take for granted all the long term R&D projects the private sector does because it's much better at finding useful outcomes quicker. It doesn't need to wait 20 years to find products out of the research it does, and that's good!


> Practically the entire field of PL research was swallowed up by FP and continues to be dominated by that paradigm (e.g. dependent types), despite the vast majority of PL users being disinterested in them.

I couldn't disagree with this more. I'm biased because I really like PL research, but when I look at modern languages like Rust, Haskell's shadow is plain to see. ADTs, immutability and parametric polymorphism for instance.
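
For what it's worth, here is a minimal Rust sketch (my own toy example, not from any real codebase) of the features usually traced back to the ML/Haskell family: an algebraic data type with exhaustive pattern matching, a parametrically polymorphic function, and immutable-by-default bindings.

    #[derive(Debug)]
    enum Shape {
        Circle { radius: f64 },
        Rect { w: f64, h: f64 },
    }

    // Exhaustive pattern matching over the ADT's variants.
    fn area(s: &Shape) -> f64 {
        match s {
            Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
            Shape::Rect { w, h } => w * h,
        }
    }

    // Parametric polymorphism: works for any comparable T.
    fn largest<T: PartialOrd>(items: &[T]) -> Option<&T> {
        items.iter().fold(None, |best, x| match best {
            Some(b) if b >= x => Some(b),
            _ => Some(x),
        })
    }

    fn main() {
        // Bindings are immutable by default.
        let shapes = vec![
            Shape::Circle { radius: 1.0 },
            Shape::Rect { w: 2.0, h: 3.0 },
        ];
        let areas: Vec<f64> = shapes.iter().map(area).collect();
        println!("{:?} -> largest area {:?}", shapes, largest(&areas));
    }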


I used to agree.

These days I think there's a lot of wishful thinking along these lines, as if nobody would have noticed without Haskell that re-using lots of global variables leads to frequent bugs. C++ got the const keyword in 1985, before Haskell even existed. Templates were proposed in 1986. And would nobody have developed the notion of first-class functions without academic PL research? Given that even C has function pointers, it's hard to argue that.

Rust is hardly related to Haskell. If there's a shadow it's a very small one. Rust's primary research idea is adding linear types to an imperative type system. There's no laziness, it's not pure, and the syntax is obviously C-based rather than ML/Haskell-based. The similarities to C++ are much stronger than the similarities to Haskell.
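
To make the "linear types in an imperative type system" point concrete, here's a toy sketch of my own (strictly speaking Rust's ownership is affine, "at most once", rather than truly linear): once a value has been moved, the compiler rejects any further use of it.

    struct Connection {
        addr: String,
    }

    // Takes ownership; the caller's binding is consumed by this call.
    fn close(conn: Connection) {
        println!("closing {}", conn.addr);
        // conn is dropped here
    }

    fn main() {
        let conn = Connection { addr: "10.0.0.1:80".to_string() };
        close(conn);
        // close(conn); // rejected at compile time: use of moved value
    }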

When I look at the huge quantity of taxpayer money sunk into this line of programming languages, and how much impact it's had, I can't really support it. Academia/Haskell supporters like to lay claim to ideas and argue they "came" from academic PL research, but when you look into the histories and timelines that's just clearly not true. The ideas were either already in development a long time ago, or they were trivial and easily thought of.

Meanwhile, like I said, laziness is now a dead end. I remember one of my CS lecturers who worked on Haskell-related DT research when I was an undergrad. He sang the praises of laziness and how much better it made everything. They don't think that anymore, and new son-of-Haskell languages don't default to it. That entire line of PL theory was born, lived and died entirely within the taxpayer-funded public sector.


That funding is rapidly drying up for biotech. We have huge underinvestment in many different forms of healthcare and novel medicines.


I know it's not a popular topic around here, but I'm confused that corporate taxes are not mentioned at all. Wasn't one major incentive for creating research labs that such investments in R&D were tax deductible, and didn't the drastic lowering of these taxes since the 80s make that point moot?


https://slate.com/business/2012/07/xerox-parc-and-bell-labs-...

I’m surprised you are the only one mentioning this. You change the rules of the game, you change the game.


I actually work in a corporate research lab, with 1000+ researchers working on all kinds of things. One thing that proves tricky is the "mission" these kinds of labs have. There is always a tension between direct work for business units on next year's product, and research into novel ideas and methods. The former is what keeps the lab afloat financially. The latter is what, it is hoped, will keep the company as a whole afloat in the future. But that includes investing money in projects that will not pan out and just burn money. Finding this balance is hard.

Tensions like these might very well be the reason corporate research has declined.


R&D is not dead.

R&D has been rising globally for over 70 years with no major declines. Globally, $1.7 trillion USD a year is currently spent on R&D, roughly 2% of global GDP. There is more research going on right now than at any point in history, at any scale.

R&D, similar to startups, is full of broken dreams, survivor bias, and so on: those complaining that their source of funding dried up, that they never got lucky, etc.

There is no magic recipe for success in R&D — and anyone that tells you they are able to outperform the market at scale as it relates to R&D outcomes is lying.


> There is no magic recipe for success in R&D — and anyone that tells you they are able to outperform the market at scale as it relates to R&D outcomes is lying.

This is complete nonsense. This logic applies well to stock and bond valuation, where you have armies of traders optimizing from public information. It's a pretty frictionless market

It's like telling a parent their kids can't outperform the market at scale in science. Of course they can. They just can't outperform the market at scale in math, athletics, leadership, foreign languages, science, and everything else all at the same time.

In virtually all other domains, you have better organizations and worse organizations. I've been in organizations that do R&D brilliantly, and ones that do it horribly. Did the ones that do it horribly die because of market forces? No. They did other things better.

An organization has many pieces: R&D, marketing, branding, advertising, legal, engineering, sales, strategy, finance, logistics, etc. Most organizations I've worked at were really good at maybe one or two of those, followed industry best practices in most areas, and were pretty bad in a few.

I can promise you that MANY people can outperform the market at scale in relationship to R&D outcomes. We just can't outperform the market at scale in ALL of those areas at the same time. Organizations and individuals have areas of focus.


What is the recipe? — (You have written a lot of words, no recipe.)

Honestly, I love research and would be happy to be wrong, but all I hear you saying is that someone did it, so it must be possible. That's called survivor bias; it's not a recipe, and it would not double the global output of R&D for the next 10-1000 years.


There isn't a simple recipe someone can follow. A recipe is what brings you up to "industry best practices," and about where a typical business might perform.

Excellence requires focus, dedication, thinking things through from first principles, having the right people in place, etc.

The closest I can offer to a recipe is to hire a CEO / President / co-founder early on who has a track record of having R&D successes in former positions, who has a great depth of knowledge, who thinks deeply, and have them focus the organization on R&D, and to do this before the culture is set.

Of course, that's not always a winning strategy. If you do that, that same person is unlikely to have that same depth in, for example, customer engagement, negotiations, or legal.

Most hard things don't have recipes ("What's the recipe for an effective fighter jet?"). If they did have simple recipes, they usually wouldn't be hard. But that doesn't make them impossible (we have a whole fleet of effective fighter jets).


To be direct: you failed to acknowledge my core point in response to your prior comment, which is that what you're describing is "survivor bias". You then went on, in the comment I am currently responding to, to say that the recipe is to fund, hire, or co-found with the survivors. The recipe you provided is still based on survivor bias, at best an optimization-based strategy, and it would never double the global output of R&D for any meaningful amount of time.


To be direct: You made a nonsense statement, and now you're changing your claims. Your statement was: "anyone that tells you they are able to outperform the market at scale as it relates to R&D outcomes is lying." This is a false statement.

Your question was never about doubling global output of R&D. That's not a point one can even argue meaningfully; there's no way to offer more than an opinion there.

Your question was about being "able to outperform the market at scale." It was nonsense. Plenty of people and organizations can and do outperform the market, consistently, over many decades. That's not survivorship bias, any more than weightlifters beating the general population at lifting weights is survivorship bias, or Stanford CS majors having stronger technical skills than the general market is survivorship bias. It's a counterexample. Survivorship bias would apply if these were one-offs (a company or individual makes ONE breakthrough, at random).

I'm signing off this thread. This is dumb.


One could even argue that a lot of startups are VC-backed R&D; it's just that the VCs do not care about the scientific, not-directly-monetizable outcomes.


I've worked in a couple of corporate research labs. Yes, you'll be expected to do more convincing about the value of your research for the company. That raises two problems:

- Some great researchers are not very good at convincing.

- From a purely financial standpoint some research does not make sense considering the risk (more ground breaking and longer usually means more risk).

I'm not sure that the alternative is not much better though.


> - Some great researchers are not very good at convincing.

Reminds me of this. https://www.smbc-comics.com/comics/20101209.gif


I've only got one speed; sorry. Compare this to interest rate policy.

This article mentions a lot of research labs shutting down in the 90s. The 90s were also when the current 30-year period of <8% nominal interest rates started, along with other easy-money policies.

Any company that invested heavily in the future would have been a loser versus those who worked on credit. It isn't surprising that none of the big corporations are investing in research: the investment framework levers have been set to 'short term' for a very long time now, and it makes more sense to buy up innovative competitors. It isn't surprising that long-term investment in research vanished from the corporate world.


The lower the discount rate is, the more the long term matters.


Is that an NPV reference? NPVs are for comparing options; and low discount rates theoretically mean that people are more willing to burn money up front for potential long term gains. But research then starts losing out to other, riskier, decisions. Nobody* borrows money to fund steady long term corporate research.
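
To put rough numbers on the mechanics we're both describing (a toy illustration with made-up figures, not a model of any real decision): $1M received in 20 years is worth about $673k today at a 2% discount rate, but only about $149k at 10%.

    // Discounted value of a single future cash flow.
    fn present_value(future: f64, rate: f64, years: u32) -> f64 {
        future / (1.0 + rate).powi(years as i32)
    }

    fn main() {
        println!("{:.0}", present_value(1_000_000.0, 0.02, 20)); // ~672971
        println!("{:.0}", present_value(1_000_000.0, 0.10, 20)); // ~148644
    }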

Low interest rates manifest as people starting Uber and Tesla rather than big companies finding budget for a research lab. Ford is competing with a company that doesn't feel a real need to have profit margins - that is a threat in the present. Big, cash burning machines with potential multi-billion dollar payoffs are where the credit goes and where the winners live - not boring people doing long term research in corporate labs.

It isn't like R&D is a losing proposition in this age - look at Apple's peak for example - but the resources are being directed to people who own assets or are shooting to control entire markets. Research labs aren't making companies winners.

* I'm sure there is somebody, but there won't be many.


I have often found it baffling that tech giants, with near unlimited resources to fund all sorts of things, don't establish these kinds of labs. I'm aware that there are many things called something like "{Google, Facebook, Twitter, ...} Labs" but from what I understand they are significantly different from the corporate research labs of old.


Google Research is a thing. It’s even cited in the article as a positive example.

https://research.google/


Microsoft Research is a notable exception


Also Mozilla Research


Why spend so much on product research when you can simply wait for some startup to create a new product and bring it to market, then either acquire them or create a clone and drive them out of business by abusing your dominant position?


I'm not sure what "corporate research labs of old" did differently from what Google Research, Microsoft Research, or FAIR do. They are all significant investments in personnel and equipment by these firms, they produce fundamental research, and they are arguably as good at it as the best universities in the world (if not better at some things, owing to sheer scale). A lot of MSR's work isn't even directly applicable; people work on theoretical computer science and optimization theory while being permanent researchers there. Similar cases exist across the industry.


The Google model is better though, in my view. The problem with the older-style "lab" approach, typified by Microsoft Research, is that it basically recreates academia inside a company. There are then often huge difficulties integrating research output into actual products. MSR does OK but is still quite notorious for producing long-term, expensive research projects that never get integrated into any product (Singularity/Midori being perhaps the most famous), whereas the Google model doesn't have that problem. Google has an extraordinarily high rate of incorporation of research ideas into products.

This is better for a lot of reasons, obviously, it makes it easier to justify funding the projects. But it also ensures that the researchers truly understand the nature of the problem they're solving. Seeing actual translation queries on Google Translate helps direct the next piece of MT research towards what kind of things people really use the tech for. Building actual big data systems helps train the researchers in how to work with huge data sets and puts them on research paths they otherwise would have been unable to explore.

Basically it recognises that a hard division between research and practice doesn't really make sense when working at the cutting edge of technology. You want the constant feedback from real world practice to guide the theory.


"Fail fast" is not conducive to good research. It takes persistence in the face of failure.


Facebook Research makes real strides in many areas.

Disclaimer: I work at FB, but my employment there does not change my opinion. I worked at Uber and my opinion of Uber's "research" was different ;)


Aren't Google's "other bets" basically just huge research labs?


Exactly. Calico Labs for example is under Alphabet and devoted to researching aging and longevity:

https://www.calicolabs.com/


Why would that be a better arrangement than, say, universities?

The main issue with corporate research is that it's not easy for them to capture the output; it very quickly leaks into the public domain, despite patents. So why bother?


This is one thing that I really have to credit MS for doing right.



Why would ad companies need research labs?


At the minimum: to find more effective ways of getting people to purchase the products being advertised. There's been a lot of interesting research done on recommender systems and data mining that has been fueled by the growth of the ad-funded Web.

Eventually, though, once a company gets large enough, it needs to diversify in order for it to not be totally dependent on one revenue stream, lest it find that stream disrupted in the future. Apple used to be strictly a personal computer company; now it sells not only computers, but it sells smartphones, tablets, headphones, smartwatches, and other consumer electronics, as well as provides services for a fee. Google has a lot of activities, and Facebook has been branching out into VR through its Oculus acquisition.

Having a research lab could be helpful with exploring other areas that the company could branch out into.


You're sadly downvoted, but you're making a good point. The digital ad companies are at this point operating mature stable businesses in monopoly markets. Do such companies really need to invest in R&D?


How do you think those companies got to be mature monopolies? Someone had to do the research to find out which “hot singles near you” banner ad has the highest conversion rates.

I say that tongue-in-cheek, but only partly. There actually is a ton of research that goes into following things like cultural trends and human psychology to ensure maximum ad consumption. Advertising in the middle of a mobile game or on Netflix requires an entirely different school of thought than advertising on traditional cable TV or news websites.

A notable “innovation” (though many may not respect it very much) is the new paradigm of advertising stuff via Instagram influencers and the functionality built into the app to facilitate it. Someone had to research and design that, just like such companies now are probably trying to find the best way to advertise in VR/AR platforms or in rideshares.


Digital ad tech may be mature but search is not. If it was then the experience of using any search engine would be very much the same but in my experience that is not close to being true yet.


But isn't the experience pretty much identical? Google, Bing and DDG look essentially the same; the two main differences are the colour of the lipstick and the search index. The main experience (type in words, get suggestions while typing, confirm, and have it show relevant webpages plus various info/image/video boxes) is universal now.


Right, but the visual experience isn't as much of a differentiator as the "does this answer my question" experience, which relies on fairly state-of-the-art methods in NLP, clustering, etc.


They are not monopolies. They own a segment of users but NOT a segment of advertiser budget. They compete with each other for their true customers: advertisers and their money. If one platform gives a higher ROI than another then more advertiser budget will shift to it. Sure, most large advertisers will diversify but even a 1% shift means billions in this case so worth having people try to optimize.


Every large ad company has a bunch (or a lot) of PhDs who publish papers and generally try to out-innovate their competitors. There's research on human behavior, economics (auction theory, etc.), optimization theory, machine learning, etc. For example, modern ad systems are the largest auction houses in human history by orders of magnitude.
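
As a toy illustration of the auction-theory side (made-up example, not how any particular ad platform actually prices things): the core of a single-slot, sealed-bid second-price auction fits in a few lines. The winner pays the runner-up's bid, which is what makes truthful bidding a dominant strategy in the single-item case.

    // Returns the winning bidder and the price they pay (the second-highest bid).
    fn second_price<'a>(bids: &[(&'a str, f64)]) -> Option<(&'a str, f64)> {
        if bids.len() < 2 {
            return None;
        }
        let mut sorted = bids.to_vec();
        sorted.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap()); // descending by bid
        Some((sorted[0].0, sorted[1].1))
    }

    fn main() {
        let bids = [("ad_a", 2.50), ("ad_b", 1.75), ("ad_c", 3.10)];
        if let Some((winner, price)) = second_price(&bids) {
            println!("{} wins and pays {:.2}", winner, price); // ad_c wins and pays 2.50
        }
    }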


Despite this mostly being an attempt at snark:

To research new infrastructure technology to reduce expenses or simplify operations (making outages less likely, feature development easier, etc.)

To get better at targeting ads by investing in data science/machine learning/AI

To create new products through which to serve ads, reduce CAC, or build goodwill or diversify

To generate publicity and press for recruiting purposes, public perception, etc.

To acquire patents or to genuinely just try something new and figure out how to use it later


There are so many reasons, but the main one is that despite making their money from ads, they are not only ad companies; they have many products.

Even disregarding the product reasons, research often makes these companies look good to both a technical audience and sometimes to politicians who value "innovation".


Because they want smart people, but they’re ad companies.


Because the companies that HN likes to sarcastically call "ad companies" actually ship many sophisticated products to their customers (both users and advertisers)? Products with features and infrastructure enabled by research done in their research labs?

They could stop working on their products but they wouldn't last long. The consumer tech world changes rapidly and is fiercely competitive.


I'm not sure about the premise of the article, but I do concur that until relatively recently startups have been largely unable to develop new technology and commercialize it. During the 2000s, if you were doing a "hard" tech company, investors did not have the patience to invest in these companies and allow them to grow. I worked at several of these startups, many backed by big VC names you'd recognize, and they killed these companies with their short-term focus.

Later in that decade, with the success of Facebook, investors turned away from these investments and were more interested in finding the next Facebook. Essentially the 2000s were a kind of lost decade in fundamental technology development. Most of what was built during the decade was the beneficiary of cheap computing enabled by 50 years and trillions of dollars of semiconductor investment. Now that we are hitting the edge of what's possible, it's time for big, long-term, hard-tech investment.

It's only more recently that people like Elon Musk and, to a lesser extent, Larry Page at Alphabet have led the way by being willing to take big bets on technology development. In the last several years, we are finally starting to see some venture capital follow, with investors taking a long view and betting on a few "moonshots." In the last couple of years there have been big investments in computing with companies like Cerebras doing wafer-scale computing, PsiQuantum and Rigetti in quantum computing, and various optical computing companies, to name a few. Of course, there has also been considerable investment in the AV space, and the AV space will need lower-power, cheaper solutions that you won't be able to simply buy off the shelf and slap together, as almost all of the AV companies are doing today.


We could also embrace the idea of government being the investor of first resort, and not leave it to the whims of VC.

https://www.nytimes.com/2019/11/26/business/mariana-mazzucat...


If you're looking at aerospace/defense startups, this is usually the case.

It's not unheard of for them to employ people whose primary role is grant writing, to try to get (for example) SBIR funding.


I have a different take on this: corporate research labs died because we aggressively clamped down on monopolies.

When you don't have a monopoly, the investor mindset is that the company should be laser-focused on "core competencies" (buzzword, but important) and return excess capital to shareholders, who then provide it to other companies that will innovate in the field. Keep in mind, the universe of alternative investments goes beyond the stock market/PE/VC.

Capital is tied to shareholder value. When you can't point to something creating value, there isn't a reason for capital to stay. For a company to maintain a research lab, it needs to be perceived as something other than a cost center. In contrast, Bell was able to sustain its own full-fledged R&D labs because it was the monopoly, and shareholders expected the labs themselves to create new avenues of profit.


I'm half with you: when you have a money-printing machine (a monopoly), it's easier to justify spending money on R+D.

But at the same time, we've seen a decline in corporate R+D since the start of the neoliberal era - the article mentions that this started at around Nixon's time. This is the period of time where Milton Friedman's ideas started to gain widespread acceptance:

>“there is one and only one social responsibility of business– to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game,”

Many organisations have taken the idea of "profits over everything", and interpreted it as "quarterly profits over everything". R+D labs don't result in quarterly profits. Much of the research ends up being profitable years down the track. So corporate R+D is killed off.


Can confirm. I have seen this happen from very close up in the European research arena. The big corporate labs were either completely gutted, or severely constrained and reoriented towards very short-term practicalities, becoming basically just development centers fulfilling contract research.


Europe's suckiness at R&D is well documented. I am European. It hurts.

https://ec.europa.eu/eurostat/statistics-explained/index.php...


This article misses the fundamental way in which research is funded. The US Government funds research through grants, and each scientist runs their own little shop doing independent research... The problem is this isn't a directed research program.

The Manhattan Project, the Space Race, etc. had program managers telling scientists what problems to solve, and other scientists building the architecture, identifying gaps, and running parallel experiments. It wasn't about the individual scientist, it was about the program; so we had Feynman running computer simulations.

There was an end goal and people took less interesting, but necessary, jobs to move the project forward. Also, results were expected - not papers, but actual results to problems.


Thanks for sharing this point of view: it does indeed seem to be sorely missing from the article as well as from this thread.


The sheer scale of the short-sighted profit-seeking tunnel vision pandemic among American corporations is terrifying to anyone whose life doesn't revolve around spreadsheets. It results in businesses looking absolutely amazing right up until they run off a cliff.


While I fully agree with the outline of the change in scientific organization (the move from private labs to public science in the universities), I think they have misidentified the reasoning. It is not the anti-trust deregulation, which waxed and waned many times. It is the fall of the USSR, the end of the Cold War, and the consequent reorganization of scientific financing. Public universities were somewhat anti-military and didn't receive much funding with potential military applications; all of it went to private labs. After the end of the Cold War, this situation changed.


The latitude offered researchers at the labs is missed, but the main corporate role of labs was to draw in and divert the attention of the most innovative researchers from activities that might upset markets. Innovative results were collected, patented and systematically shelved.

Thus, RISC was developed at IBM in the '70s (the 801) and shelved until the '90s, as it would have outperformed the 360/370 series. Xerox PARC created the key components of modern personal computing, but Xerox management saw no reason to bring any of it to market.


What would you call OpenAI (I know it’s not exactly corporate but..), Google Brain, any of the labs competing for quantum computing research, SRI International, etc etc

I work in an "Applied Research" group now. While not a classic corporate research lab, we do get plenty of freedom to experiment with new methods.

Perhaps there are fewer fundamental improvements because we don't have a fundamentally new medium? So currently most advances are interactive, or build off earlier fundamental advances.


Have they ever been alive, though? They seem more of a PR stunt, to show shareholders that the company is invested in the future.

In my country, if you land a job in such a lab, you know you can bullshit around and play with some stuff. But nothing will ever come out of it. For profit, the corporation will acquire some product or startup.

If you really want to innovate freely, found your own startup. With the additional benefit of reaping the money yourself.


Bell Labs basically re-invented electricity. The invention of the transistor was one of the most important inventions in history, on the scale of the dynamo, gunpowder, or fucking metal tools.


To be fair, there were plenty of people working on creating a transistor, so if Bell Labs hadn't been there, somebody else would have done it. It is not even clear that they were really the first.


I work for one of the last traditional corporate research labs in Silicon Valley, and I consider myself very fortunate to have my position, especially in the time of COVID-19. I love doing research. But I fear losing my job, partly because I don't know where I'd find another research position in light of the 30-year decline of industrial research combined with the impending budget cuts that will hit academia starting with this upcoming school year.

I've noticed a profound shift in the past decade away from corporate research labs such as IBM Labs and HP Labs where they worked on medium-term projects developing research prototypes that were sometimes passed onto product teams. In their place, companies such as Google have pioneered a different model of research (https://research.google/pubs/pub38149/), where PhDs are hired as software engineers who solve research problems and write production code, focusing more on shipping production code rather than writing papers (although there have been many great papers that have come out of Google, most notably the MapReduce and Spanner papers). I'm noticing that the vast majority of my PhD-holding friends in computer science are hired as software engineers rather than as researchers. The ones who started out at places like IBM Labs or HP Labs with the titles "Members of Technical Staff" would often end up taking positions at other companies as software engineers.

This development may be fine for researchers who want to work on production code and who don't mind de-emphasizing publishing in exchange for product development. However, what about researchers who want to focus on solving research problems that cannot immediately be applied to products? I'm finding that there's decreasing room for these types of researchers in this economy. More companies have a short-term mindset these days, partly due to changes in management style (e.g., the rise of Carly Fiorina-style CEOs), but also due to the fact that the computer industry has shown repeatedly that large 800-pound gorillas can be taken down by smaller companies. "Why invest in long-term research and long-term planning if there is no guarantee of a long-term future" is the logic of many companies, big and small. The alternative to industry is academia, but there are only so many professorships available, and for those professors, there is only so much NSF grant money to go around, which is highly competitive to earn. Professors at research universities spend a lot of time fundraising; it costs a lot of money building and maintaining a lab that is well-resourced enough to perform the research and publish the results necessary to gain tenure.

Short of a major cultural change where companies are encouraged to invest in research at the levels that Xerox and AT&T did back in the 1970s and where we see an expansion of academia similar to the post-WWII boom (which is unlikely in the United States), the future I see for those wanting to work on problems that don't lead to immediate productization is independent research done in a researcher's spare time when not engaged in "money-making" activity. After all, Einstein did brilliant work while he was a patent examiner, and Yitang Zhang did amazing research while being employed as an untenured lecturer. I would advise today's computer science PhD students of this current reality of research employment. If one wants to work on self-directed research projects, then that person must be willing to have a self-funded research career; all researchers need to be concerned with funding, whether that funding comes in the form of a direct salary, a grant, or indirectly through the salary of an unrelated job.


In general I agree with the warning in your comment, but I think there's a second challenge besides funding at universities. I mean, funding is always hard, but right now it's a decent time for computer science funding. There's the DoD (for things more about AI and security), the NIH (for things with some health relevance), industry (ML and topical problems), or the NSF, which is more generous towards CS than towards any other field today. And funding isn't as much of a problem for PhD students, who should be supported by their advisors, so they get a good several years of solid, focused research time.

The challenge I'm talking about is papers, specifically the game of publishing. I feel like PhD students spend more time optimizing for publications than doing actual research. There's an obsession with the number of papers, so everyone is trying to eke out a paper from every semi-failed experiment, overfitted model, or unfinished prototype. No one wants to throw away effort on something that basically failed, so they're trying to find the perfect narrative, frame their results so they look good, or slice and combine them into something submittable. Every deadline is worth submitting to, the number of selective conferences is growing, and there's an incentive to "get on" another paper as a co-author (which means building collaborations, helping out, editing).

For each paper, there's months spent writing, giving and receiving feedback, making figures and formatting. Each submission usually requires some change to the format and language, so upon rejection, the paper is edited and targeted towards the next conference. Then there's the submission game of proposing reviewers, choosing the right track or subcommittee, interpreting reviews, writing multi-page rebuttals, editing and getting feedback from co-authors about the rebuttals, and in the best case a month later, preparing the camera-ready version, the back-and-forth with the publisher, and finally preparing and practicing the conference presentation.

So before great research can truly come out of universities, I think publications need to be deemphasized. This could be a simple norm like judging researchers on their best 3 papers for faculty hiring and research awards. In turn, that would reduce paper submissions, increase paper acceptance rates, and finally -- leave more time for actual research at universities.
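As a rough sketch of what that norm would mean in practice (my own illustration, not something proposed in this thread; the scoring function and numbers are hypothetical), a "best k papers" metric only counts a researcher's strongest work, so padding a CV with marginal publications stops paying off:

    def best_k_score(paper_scores, k=3):
        """Score a researcher by their k strongest papers and ignore the rest."""
        return sum(sorted(paper_scores, reverse=True)[:k])

    # A candidate with three strong papers outranks one with ten marginal ones.
    print(best_k_score([9, 8, 8]))    # 25
    print(best_k_score([3] * 10))     # 9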


The challenge is that companies have poor track records of determining what the market will want, so how can you know what to research unless it is a better version of what you already do?

Companies like Intel and Nvidia and the pharma companies do that kind of research because you can reasonably guarantee that someone will want a faster chip or a better cure for a disease.

But beyond that, what does the market want next? If you can answer that, you can operate a lab. If you can't, then you are just blindly stumbling around. University research is probably also a heck of a lot cheaper, as people will work for a lot less if they get a degree at the end of it.

I think they would be better off creating a prize system for problems they want to solve (the return on investment for XPrize is amazing) and letting the university researchers figure it out.


Startups are more or less externalized R&D facilities, and the acquisitions happen only after a team has eliminated sufficient risk. Sometimes that's in the form of revenue, but sometimes it's research or some other signal.



Research is risky, it's easy to cut for short-term gains, and it's very hard to quantify before a proof of concept, which makes it very hard to justify keeping. Few companies are good at the whole pipeline needed to turn research into profits, and research is best done by people with equity. Those are just some of the reasons the modern world has moved to a model where small, new companies solely do research and then either sell/license it or sell themselves to bigger companies.

Drucker wrote about this in (I think) the 90s. It's actually a much more sensible way to do this work.


It's worth keeping in mind that Bell Labs existed because the government paid Bell to keep it open. I forget the precise agreement, but simply having the lab earned Bell money; it wasn't a cost center.

EDIT: I'm looking for a reference for this. I saw an article that says it was funded with 1% of company-wide revenues. This is undoubtedly true, but a graduate student I know who knows about such things says that a substantial fraction, or maybe even all of it, was reimbursed.


I asked some people who were there, and they said that Bell was a regulated monopoly and the government allowed them to charge customers an additional 10% for investment. After competition and deregulation were introduced, things got increasingly profit-focused (as in, research had to be justified on that basis). After the company broke up, there was some kind of cooperative lab whose funding was shared equally among the seven child companies. However, that only worked well while they were prevented from competing with each other. Once competition was instituted, the companies started asking the lab to sell results to individual companies, and it went downhill from there.


I worked at GEC Hirst in the early 90s. It was a fascinating place, and I learned a lot in my research into local loop technologies in the early days of the commercial internet. However, it was clear that the writing was on the wall when the on-site library was closed down.

I left shortly afterwards.


As a Dutchie, I want to point out Philips-Natlab, which invented the CD(ROM).


I worked at a corporate research lab at a well known tech co. It was mostly sales and marketing to be honest.


Why so many identical links to the cited paper? I would understand if they had anchors, but they haven't.


The corporate research labs have been replaced with the likes of DeepMind (hell-bent on doing research only in neural networks) and Google X (the personal playground of the company's founders, with no known scientific or business output). The fact that big tech does not invest in basic research anymore does not bode well for the future of big tech.


What an ill-informed comment - Google, Microsoft, and Facebook are among the largest producers of original research in CS, especially in machine learning these days. You can look at any major conference in the last few years, and it will not be surprising to see one or two of these names at the top in terms of papers contributed.

You're partially right if the argument is "basic research" = less applied fields / "purer" math, but even there Microsoft Research has been a significant player in the TCS and optimization communities.

A lot of the current research done at universities is also funded by these organizations, either via grants (monetary or equipment, like from Nvidia) to research labs, or scholarships and financial support to students. A large part of AI advancement in the past few years has been precisely because of the amount of effort these firms have put in (along with many others; they are just the most public).


There's this other division of the corporation you might be interested in called Google Research.


fantastic summary of some pretty interesting research from 2019...

alas, it's likely that nobody in any position to do anything about this will bother



