
That's a great advertisement. It's too bad that groups like Westinghouse's R&D division and Bell Labs aren't around anymore. Are there any real corporate research groups doing basic research anymore?



Honestly, I'd say that Google is...

I mean... they make smartphones now; how does that make sense beyond "well, we locked a bunch of geeks in a room, and they thought this would be cool... so we did it!"?

Or how about Google Books? Or all the dark fibre they purchased... or the Google datacenter-in-a-shipping-container experiments? Perhaps my vision is skewed as an outsider looking in, but Google seems to make a lot of really cool things.

They seem to adhere to the "innovate now, monetize later" model, which is really cool.


They didn't have a bunch of geeks sitting around thinking that up - they bought a company that was actively working on it. Android isn't a Google invention - it was an acquisition.


What I do like about this principle is that these "thinkers" do draw out awesome ideas from a source. I will sound esoteric here - and you can shrug it off - but there definitely is a way of tapping into that superconscious knowledge through the untapped ether. I believe most of history's brilliant ideas came from those kinds of environments.


youtube.com/testtube


There are a few, but I would say that most basic research these days is academic. However, today's Universities are similar enough to corporations that their professors and grad students can be considered to be "corporate research groups."

If we kill the Java school (or more generally "University is knowledge worker trade-school") mentality, Universities will stop being able to signal their value through "owning" large numbers of researchers, and all the demand for innovation will shift back to corporations.


OK, I have to disagree, simply because I think that's a wonderful argument and HN bans empty ++ posts.

Companies that do basic research are a different kettle of fish. They make money on breakthroughs and innovation. Java schools make money on students, so the thinking is an entirely different kind - formalizing existing knowledge (so the students can study something more "academic"). Sure, you can formalize code as category theory, algebra, sets, and so on, but it's not exactly an out-of-the-box breakthrough. Also, the researchers you get are excellent students, not engineers who made innovations in the field and got kicked up to the lab.


Java schools don't make money from the students by "formalizing existing knowledge", though. It's more of a prisoners' dilemma: they make the most money by relying on other schools to formalize knowledge, and then delivering it to their own students as cheaply, but with as much implied authoritativeness, as possible.

But I meant to indict here more than just what you normally think of as "Java schools"—virtually all Universities are receptacles for students who go to them to get a high-paying (and high-status!) job, and thus virtually all Universities cater to those students. Java schools are just more upfront about it.

How Universities actually make their money is a combination of convincing students to study there, and receiving grants. Both of these rely on the status of the researchers in the employ of the University—even though the researchers (the Ph.Ds) are very infrequently the students' lecturers. Students pay not to learn from high-status people, but simply to coexist in an environment with them, and have the "status" of the institution rub off on them.

In order to reverse this process, we have to remove "has studied at high-status University X" as a judging criterion for high-status jobs. That may well be impossible, though, given how embedded the idea has become in our culture.


"even though the researchers (the Ph.Ds) are very infrequently the students' lecturers"

That's not true at MIT; are there any other exceptions to this sad pattern in research universities?

One of the advantages of the lowest-tier schools that don't do (much) research is that their faculty are there to teach first and foremost. When the system works, when the instructor is sufficiently in command of his material, you get a better result than in the mid-tier schools.


> though the researchers (the Ph.Ds) are very infrequently the students' lecturers.

Who does teach at these Universities? Almost all my lecturers were either current researchers or former researchers now focusing on writing and teaching. I can only think of one or two courses where the lecturer didn't at least have, or wasn't working towards, a relevant PhD.


I'm pretty sure derefr meant the ones who already have their Ph.Ds, not the archetypal grad student who's learning how to teach at the same time he's teaching an entire class (instead of a recitation, section or tutorial).


Microsoft Research does a lot of basic R&D. Most technology companies are fundamentally R&D companies. Also, the J. Craig Venter Institute.

One of the issues with the transition from the industrial age to the information age is that our focus has shifted from industrial research to information research. Because so much of the information age is intangible, it's unclear that real progress is happening.

Two things have happened that further hide these issues: the results of research are realized incrementally, and the pace has increased to such a rate that progress is simply expected. News is when transistor density or areal density does NOT double in a year.

Even with industrial research, the issue is that no one is around to see the benefits. When someone figures out how to run a lights-out factory, they will be branded as greedy instead of hailed as innovative.

Even the BP disaster is an example of corporate R&D. Does anyone else find it incredible that we are repairing (or not repairing) an oil well that is 5,000 feet under water with no humans in sight? The fact that the disaster is even possible should inspire awe at the technological progress of humanity.


It would be nice if we were repairing it. :/

However, I think my first reaction to the depth they were drilling at was "That's fucking awesome!" rather than "that's fucking stupid." The stupid part was finding out that they had come to take this for granted and were skimping on safety.

Speaking of which, BP itself does a lot of R&D in the energy field. Even the energy companies want to be in on alternative energy when it arrives.


> Even the energy companies want to be in on alternative energy when it comes.

The oil companies are not stupid. There will come a day when it is no longer economically feasible to extract the remaining oil. Any company that hasn't diversified by then can turn off the lights and go home.


> The fact that the [BP] disaster is even possible should be awe inspiring to the technological progress of humanity.

I wouldn't describe that as "technological progress". Technology is usually described as 1) controlling natural forces to 2) achieve some purpose. The BP disaster fails on both counts.

More generally, sloppy safety standards are a strong sign that a technology is either not fully developed or not fully understood. This holds for engineering as well as for programming.

So the disaster is a sign of a lack of technological progress.


You wouldn't describe mankind's ability to extract oil from 5,000 feet under water as technological progress? You might as well say that the Space Shuttle wasn't technological progress because of the Challenger disaster.


The space shuttle was under enough control that the catastrophe was confined to the crash site.

In contrast, what we see with BP is a failed experiment that has been out of control for months. It is not a demonstration of technological progress. The experiment was falsely labeled as "technology" in order to receive more public trust when it was started.

This is a marketing trick we know from software companies as well. (Fortunately, many companies are honest in that regard.)

However, maybe it is just a question of wording, i.e. whether big, risky experiments count or don't count as technology. My personal understanding of technology implies some minimum degree of maturity. In particular, anticipatory action should at least be possible.


There are over 4,000 oil platforms in the Gulf of Mexico alone. According to Wikipedia, we've had 4 spills in the Gulf, two of which were major (Deepwater Horizon and Ixtoc).

In contrast, we've had 132 space shuttle launches, 2 of which killed everyone on board.
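(Taking those counts at face value, that's a loss rate of roughly 2/4000 ≈ 0.05% per platform, versus 2/132 ≈ 1.5% per launch - about a thirtyfold difference.)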


> we've had 4 spills in the gulf, two of which are major

And yet "we" are completely unprepared for oil spills, needing months to recover from worst-case scenarios (and decades if not centuries to recover from the long-term damage). That isn't "progressive".

> we've had 132 space shuttle launches, 2 of which killed everyone on board

It is hard for me to see to what extent these incidents are comparable.

The damage from space shuttle launches was local to the "experimental room" and was under control in the sense that "just" the shuttles were destroyed and nothing else. The crews were informed of the risks of the experiment and accepted them.


HP Research and Intel both do a lot of basic research as well.


Microsoft Research and AT&T Labs are still doing some fundamental stuff.


IBM's Watson is still up to no good.



