
Yes, let's throw a few more $millions at well-credentialed plodders.

MIT is a zombie: a corpse with delusions of youthful vigor.




As someone who is early in the process of considering Ph.D. programs, I'd be interested if you could expound on that. Also, do any schools seem like the opposite of a zombie?

(This question also goes out to anyone else who has something to add.)


AFAIK the entire field is dead:

http://en.wikipedia.org/wiki/AI_winter

There are various explanations as to the cause of death: the end of the Cold War; the humbling of the mega-monopolies which funded "blue sky" research (mainly AT&T); a general loss of faith resulting from a decades-long lack of progress. Take your pick.

In fact, the entire field of computer science has been stagnant for a while, shiny gadgets to please people with 5-minute attention spans notwithstanding:

http://www.eng.uwaterloo.ca/~ejones/writing/systemsresearch....

Bureaucrats have replaced thinkers:

http://unqualified-reservations.blogspot.com/2007/08/whats-w...

My advice: study physics or chemistry.


While the CS academy has many issues, the field itself is alive and well. Google and MS both do systems research, as do several financial firms and software companies servicing the financial industry.

As a person moving from physics to CS, I personally find computing to be a very interesting place right now.


> Google and MS both do systems research, as do several financial firms and software companies servicing the financial industry

Where, then, is the desktop operating system not built of recycled crud? Where can I see a conceptually original system created after the 1980s?

> the field itself is alive and well

I disagree entirely. It is a zombie, maintaining the illusion of life where there is none.

Hell, UNIX still lives, and this proves that systems research is dead:

http://www.art.net/~hopkins/Don/unix-haters/handbook.html

Why is my desktop computer running software crippled by the conceptual limitations of 1970s hardware? Why are there "files" on my disk? Where is my single, orthogonally-persistent address space? Why is my data locked up in "applications"? Why must I write programs in ASCII text files, and plod through core dumps and stack traces? Why can't I repair and resume a crashed program?


> Where, then, is the desktop operating system not built of recycled crud?

You're willing to look past our massive advances in optimization, control systems, search technology, computer vision, etc etc, and pretend they don't exist...

... because desktop OSes still suck?


> our massive advances in optimization, control systems, search technology, computer vision, etc

Ok, I'll bite. What advances? I'm talking about real change, not incremental bug-stomping by plodders.


Computer Vision, for one, is making massive advances.

I just spent three weeks (class project) implementing a new algorithm to find the minimum cut of a directed planar graph in O(n lg n) time. The algorithm is actually quite elegant:

http://www-cvpr.iai.uni-bonn.de/pub/pub/schmidt_et_al_cvpr09...

This came out of a Ph.D. thesis written in 2008, and was applied to some computer vision problems in the paper I linked above. This isn't a minor speedup or optimization... it yields asymptotically faster results.
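For anyone who hasn't seen the formulation, here is a rough Python sketch of the kind of s-t min-cut setup vision people use for things like binary segmentation. To be clear, this is not the O(n lg n) planar algorithm from the paper; it just leans on networkx's general-purpose minimum_cut routine, and the toy 2x2 "image", the capacities, and the smoothness constant are all made up for illustration:

    # Generic s-t min-cut formulation for a toy binary segmentation problem.
    # NOT the specialized planar-graph algorithm from the linked paper.
    import networkx as nx

    G = nx.DiGraph()
    SOURCE, SINK = "s", "t"

    # Toy 2x2 image: per-pixel (cost of background, cost of foreground).
    # These unary terms are invented for the example.
    pixels = {
        (0, 0): (9, 1),
        (0, 1): (8, 2),
        (1, 0): (2, 7),
        (1, 1): (1, 9),
    }
    for p, (bg_cost, fg_cost) in pixels.items():
        G.add_edge(SOURCE, p, capacity=bg_cost)  # cut => p ends on sink side (background)
        G.add_edge(p, SINK, capacity=fg_cost)    # cut => p ends on source side (foreground)

    # Pairwise smoothness terms: neighboring pixels prefer the same label.
    SMOOTHNESS = 3
    for (r, c) in pixels:
        for q in [(r + 1, c), (r, c + 1)]:
            if q in pixels:
                G.add_edge((r, c), q, capacity=SMOOTHNESS)
                G.add_edge(q, (r, c), capacity=SMOOTHNESS)

    cut_value, (source_side, sink_side) = nx.minimum_cut(G, SOURCE, SINK)
    print("cut cost:", cut_value)
    print("foreground pixels:", sorted(p for p in source_side if p != SOURCE))

The cut separates "foreground" pixels (source side) from "background" ones (sink side); the point of the planar-graph result is getting that same answer asymptotically faster when the graph is planar, as image grids are.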

My vision professor is fairly young, and recently did his own Ph.D. work on Shape From Shading. This is the problem of recovering 3D shape from a single image (no stereo or video). His solution used Loopy Belief Propagation and some clever probability priors to achieve solutions that were orders of magnitude better than previous work. In fact, his solution is so good that rendering the resulting 3D estimate is identical (to the naked eye) to the original (although the actual underlying shape varies, since there are multiple shapes that can all appear the same given the lighting conditions and viewing angles).
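In case Loopy Belief Propagation sounds mysterious: here is a bare-bones sum-product message-passing loop on a tiny four-node cycle, just to show the flavor of the algorithm. The graph, potentials, and iteration count are invented, and this has nothing to do with the actual shape-from-shading model described above:

    # Minimal sum-product loopy BP on a small pairwise MRF (a 4-node cycle).
    # Purely illustrative; all potentials are made up.
    import itertools
    import numpy as np

    N_STATES = 2
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]          # a cycle, so the graph is loopy
    neighbors = {i: [] for i in range(4)}
    for a, b in edges:
        neighbors[a].append(b)
        neighbors[b].append(a)

    rng = np.random.default_rng(0)
    unary = {i: rng.random(N_STATES) + 0.1 for i in range(4)}   # node potentials
    pairwise = np.array([[2.0, 1.0], [1.0, 2.0]])               # favors agreement

    # messages[(i, j)] is the message from node i to node j.
    messages = {(i, j): np.ones(N_STATES)
                for i, j in itertools.chain(edges, [(b, a) for a, b in edges])}

    for _ in range(50):                    # fixed number of sweeps; no convergence test
        new = {}
        for (i, j) in messages:
            # Product of i's unary potential and messages from all neighbors except j.
            incoming = unary[i].copy()
            for k in neighbors[i]:
                if k != j:
                    incoming *= messages[(k, i)]
            m = pairwise.T @ incoming      # sum over node i's states
            new[(i, j)] = m / m.sum()      # normalize to avoid under/overflow
        messages = new

    for i in range(4):
        belief = unary[i].copy()
        for k in neighbors[i]:
            belief *= messages[(k, i)]
        belief /= belief.sum()
        print(f"node {i}: approximate marginal {belief}")

On a graph with cycles this only gives approximate marginals, which is exactly the "loopy" caveat; the research contribution in work like the above is in the model and the priors, not in the message-passing loop itself.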

There is also a ton of interesting progress in the last two decades making functional languages practical in terms of speed (and hence useful). My advisor did his Ph.D. in this area.

The state of CS as a whole is not evidenced by the current state of operating systems. In fact, I'd argue that OS research at this point has less to do with computation than with human-computer interaction, which seems to require more research about humans than about computers.


That sounds like clever engineering to me, not science.


>not incremental bug-stomping by plodders

but no true Scotsman would do such a thing!


Lack of adoption of systems research by desktop operating systems does not prove the absence of interesting systems research.

With the exception of MS, most of the systems research I was referring to is not used in (or intended for) desktop operating systems.


And even systems research that is intended for desktop operating systems can be a hard sell. David Gelernter, for instance, has had a lot of interesting ideas, but getting enough regular users to adopt these systems could be a harder task than developing them in the first place.


Because you don't see a new desktop OS, you assume that computing research is dead?


Oh, it isn't dead. It is undead. Animated yes, but with no life behind it.


> Where, then, is the desktop operating system not built of recycled crud? Where can I see a conceptually original system created after the 1980s?

MS Bob.

I'm serious. And the example I offered shows why a conceptually original system is not necessarily a good thing.


> MS Bob

There is nothing new about straitjacketing computing into an "everyday household objects" metaphor; take "the desktop," for instance. It is a very old idea which simply refuses to die.

And here is what the late Erik Naggum had to say about "user friendliness," the ancient disease which gave us MS Bob:

"the clumsiness of people who have to engage their brain at every step is unbearably painful to watch, at least to me, and that's what the novice-friendly software makes people do, because there's no elegance in them, it's just a mass of features to be learned by rote."

(http://tinyurl.com/ya86frv)

"The Novice has been the focus of an alarming amount of attention in the computer field. It is not just that the preferred user is unskilled, it is that the whole field in its application rewards novices and punishes experts. What you learn today will be useless a few years hence, so why bother to study and know /anything/ well?"

(http://tinyurl.com/yjfpbyq)


I tend to agree with you. I'm personally still waiting for the intelligent compiler that fixes mistakes automatically. I mean, you would think that if it can tell you your semicolon is missing, it could at least fix it without interrupting you, right?... :)


Truth, though hopefully not insurmountable truth.

Maybe things would improve if more people volunteered time toward computer science research. I can envision something like the GNU Project, but for research rather than engineering, with proper administration, goals, tasks, and resources to help establish purpose and vision and attract volunteers to an overarching common goal.

Or maybe even establish something like Y Combinator for CS research, a small-scale NSF if you will. Give a small group of innovative folks a few months of funding to create something new, regardless of whether it has near-term business viability.




