The service [1] described in that article allows authors to let people download an article from the ACM via a link on their personal website, rather than having to host the PDF themselves. There's been no change in the access policy for non-members.
Anderson is so incredibly misunderstood. He makes the same single point over and over: interconnectivity makes niches easier to reach. That's it. No "the blockbuster is dead", no "throw away everything you've ever learned about running your business", nothing even all that profound. He simply points out that niches are easier to reach, that they function a little differently than blockbusters, and that, by the way, there is a long history of established business techniques for addressing them. I don't understand why anything he says is even controversial. He's not saying anything new, just making an observation. Weird.
There's an ACM article with a timeline of new technologies (each with the company that developed it) followed by the date each technology became the basis of a billion-dollar industry (again with a company name). I can't find it now, but it was interesting because the company name was almost never the same on both lines. It's a pretty damning history for any company trying to justify basic research.
Not to suggest that basic research is optional, just that it is fundamentally for the greater good. Kernighan seems to get that. He's focused on the needs of his company but also understands what motivates academics. Interesting...
Facebook and Twitter cater to the completely different tastes of two almost entirely different audiences.
This is the important point. Facebook will never be as succinct as Twitter. Facebook will never fit in the palm of your hand the way Twitter does. I use Twitter-Facebook integration, and no one comments on my tweets on Facebook.
PasswordSafe (Password Gorilla for Linux/Mac users) is a nice option for password management (which I'm sure you've seen and decided against). But the TrueCrypt volume sync works very well: I've been keeping a PasswordSafe archive, along with the TrueCrypt volume, in sync across Linux, Windows, and Mac.
I think the fact that we need to step back when processor technology changes shows that we (the software community) lack a truly pervasive concurrent programming model. Does Clojure really offer an orthogonal solution to this problem?
And when I say "problem" I'm really talking about the scalability of the fine-grained concurrency techniques formalized by Dijkstra (http://en.wikipedia.org/wiki/Semaphore_(programming)).
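To pin down what I mean by fine-grained, here's a minimal Java sketch (the shared counter is an invented example, not from any article) of the Dijkstra discipline: a P (acquire) before every critical section and a V (release) after it.

    import java.util.concurrent.Semaphore;

    public class CriticalSection {
        // Dijkstra-style binary semaphore guarding one shared variable.
        private static final Semaphore mutex = new Semaphore(1);
        private static long counter = 0;

        static void increment() throws InterruptedException {
            mutex.acquire();          // P: wait until the section is free
            try {
                counter++;            // the critical section
            } finally {
                mutex.release();      // V: admit the next thread
            }
        }

        public static void main(String[] args) throws InterruptedException {
            Thread[] workers = new Thread[8];
            for (int i = 0; i < workers.length; i++) {
                workers[i] = new Thread(() -> {
                    for (int j = 0; j < 100_000; j++) {
                        try {
                            increment();
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                            return;
                        }
                    }
                });
                workers[i].start();
            }
            for (Thread t : workers) {
                t.join();
            }
            // Prints 800000 every run; drop the semaphore and it usually won't.
            System.out.println(counter);
        }
    }

The scalability problem is that this bookkeeping doesn't compose: every additional shared variable needs another carefully placed acquire/release pair, and nothing checks that you got them all right.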
I agree that programmers will eventually have to be relieved of concerns such as critical sections, but we will never escape questions like "what are the parallel aspects of this algorithm (or library, or application)?" A compiler will never "discover" large-scale concurrency patterns in any mainstream language (at least by my reckoning). Maybe interpreted languages will ultimately lead the way.
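To illustrate the point, a hedged sketch (the sum-of-squares task is an invented example): even when modern Java makes parallelism cheap to express, it's still the human who answers the "what is parallel here?" question.

    import java.util.stream.LongStream;

    public class DeclaredParallelism {
        public static void main(String[] args) {
            // The data-parallel structure is declared explicitly with
            // .parallel(); the compiler discovers nothing on its own.
            long sumOfSquares = LongStream.rangeClosed(1, 1_000_000)
                                          .parallel()   // the human identified the parallel aspect
                                          .map(n -> n * n)
                                          .sum();
            System.out.println(sumOfSquares);
        }
    }

Remove the .parallel() call and the pipeline runs sequentially; no analysis recovers the parallelism for you. That declaration is exactly what I don't see compilers ever inferring at large scale.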
"[...] students need to be able to do the hard jobs".
Difficulty comes in all shapes and sizes. Your next point is better: "[...] teach concepts instead of tools and languages".
That approach to education sees any language and/or run-time environment as an opportunity to work with an underlying concept.
A study of any language or run-time to the exclusion of an underlying concept is short-sighted. Any student with a language-centric approach to learning faces many difficulties in their professional life. We must all be able to adapt to changes in our field. We must be able to separate the merely faddish from the fundamentally different.
Diigo does a pretty nice job of highlighting. I don't publish my highlights but a lot of other people do.
Socialbrowse might address something I find a bit annoying about Diigo: I don't care what most people think about an article, especially if I don't know the commenters. The highlights of others just clutter my page most of the time. Diigo lets you turn page highlights off, but that removes ALL highlights, including my own...
"the petabyte scale [...] forces us to view data mathematically first and establish a context for it later."
Doesn't establishing a mathematical view require one to posit a theory? I'm not seeing anything profound here. When did we stop treating mathematics as a branch of scientific inquiry?