I agree that computer literacy is becoming as important for everyone as reading/writing and arithmetic skills.
What I'm unsure about though is which specific skills and knowledge are most important. For instance, a huge number of people would benefit from more advanced skills with something like Excel, both in their home life and even more in any kind of job where you use a computer. But then there's a set of people who would benefit more from knowing how to do some other thing on the computer.
Everyone's knowledge of English reading/writing overlaps almost entirely. But with computers, two people can each know a lot and still have hardly anything in common. So what should be taught?
I've long thought that the most important thing is to be curious and willing to try things on the computer, because you'll end up figuring out whatever you want. But maybe there are people for whom computer skills would be very useful but they'll only learn them if it's taught. I know there are a lot of things I learned in college that I wouldn't have discovered on my own because I never would have known to read about them.
Personally, I think that any computer education (well any education generally) is worthwhile, but universal concepts with practical application are the most important.
There are certain concepts that all advanced users of computers know, but about which there is a damaging lack of understanding amongst beginners.
As an example, a basic understanding of HTTP (not necessarily calling it that) would save people lots of money on their phone bill. I know plenty of people who see "1GB download limit" on their mobile contract and think that's fine "because they never download on their phone". When you explain that downloading includes refreshing their Facebook app, checking football scores, etc., they don't understand. Nor do they understand why they always go over the download limit.
So knowing what it means when you say 'download' is a universal benefit. If you're using a computer of some description in the 21st century, downloading something will be unavoidable.
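To make that concrete, here's a back-of-the-envelope sketch. All the per-action sizes and the usage pattern are rough assumptions for illustration, not measurements, but they show how someone who "never downloads" can still blow through a 1GB allowance:

```python
# Rough, assumed data cost per action, in megabytes; real values vary widely.
FEED_REFRESH_MB = 2.0   # refreshing a social media feed
SCORE_CHECK_MB = 0.5    # loading a page of football scores
EMAIL_SYNC_MB = 0.2     # one inbox sync

def monthly_usage_mb(refreshes_per_day, score_checks_per_day,
                     syncs_per_day, days=30):
    """Total 'download' volume for a month of casual phone use."""
    daily = (refreshes_per_day * FEED_REFRESH_MB
             + score_checks_per_day * SCORE_CHECK_MB
             + syncs_per_day * EMAIL_SYNC_MB)
    return daily * days

# Someone who "never downloads": 15 feed refreshes, 5 score checks,
# 20 email syncs per day.
usage = monthly_usage_mb(15, 5, 20)
print(f"{usage:.0f} MB/month")  # already past a 1 GB (~1000 MB) limit
```

Tweak the assumed numbers however you like; the point is that ordinary app use is downloading, and it adds up.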
Conversely, you can get by on a computer without ever using Excel or any kind of spreadsheet application. So while it would be tremendously useful to a very large number of people, it's not a fundamental in the same way as knowing the difference between a file and an application.
It's not necessarily what google expects you to think, but rather what you are most likely to be searching for.
Sometimes people search for content that they might not agree with, because they want to see what is being said there out of curiosity. Not every search is someone submitting their opinion to google, I'd expect that most are not.
You're right, I didn't phrase that well. I should have written "what google expects you to think of", like you said. Still, isn't that the same as "if we divide people into groups based on what they think about topic A, what does the largest group think?" Which to me sounds the same as "what you're most likely to be thinking".
All the people who want the addicts dead (let's say they're 30% of everyone who thinks seriously about drug addicts) will happily rally under "should be shot", while the ones who want them rehabilitated would split into many smaller groups around specific kinds of rehabilitation programs, how those should be administered, and what the best program really is. Though, when you look at it like that, you're still most likely to think "should be rehabilitated", or maybe "should... I don't really have an opinion one way or the other". But then, if google actually did high-level clustering, that is, extracted opinions that are all at the same level of specificity, would those suggestions be useful for a search engine?
I guess the really right way to put it is -- That's what the google crawler has seen written most frequently -- and assume it doesn't really mean what you or I think about things.
> I guess the really right way to put it is -- That's what the google crawler has seen written most frequently -- and assume it doesn't really mean what you or I think about things.
Not what the crawler has seen most, but what people typing the same thing as you have ended up searching for most frequently. (We may be thinking the same thing and just confusing the words.)
I don't believe it's supposed to be "what you're most likely to be thinking", it's just a commonly searched-for phrase. I don't think Google's trying to autocomplete with your opinion because people aren't just searching for their own opinion, they're searching for words that will hopefully return the information they want.
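That reading of autocomplete (rank past queries sharing your prefix by frequency, with no model of your opinion at all) can be sketched in a few lines. The query log and the ranking are made up for illustration; the real system is obviously far more elaborate:

```python
from collections import Counter

# A made-up log of past searches; in reality this is billions of queries.
query_log = [
    "drug users should be shot",
    "drug users should be rehabilitated",
    "drug users should be shot",
    "drug users statistics",
]

def suggest(prefix, log, k=3):
    """Return the k most frequent past queries starting with the prefix."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(suggest("drug users should", query_log))
```

The suggestion reflects what was typed most often after that prefix, not what the searcher (or anyone) believes.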
Exactly. The "drug users" one, for example, leads to an article explaining how a police officer said that. Anyone hearing secondhand about that story and wanting to learn more about the incident would probably Google that phrase.
> "Newly minted college graduates soon entering the job market could be facing another hurdle besides high unemployment and a sluggish economy. Hiring managers say many perform poorly—sometimes even bizarrely—in job interviews."
The first paragraph immediately stuck out. How is this a hurdle for new graduates? The fact that other graduates perform poorly means a given graduate will have an easier time than otherwise.
Do we all disdain math, writing, science, and history because they are forced? I imagine most of us did to some extent, but then again, some of the forced skills have turned out to be useful.
I'm not arguing for either side but I question the idea that CS deserves less attention than the natural sciences in a school curriculum.
There's that, and also the fact that CPUs aren't getting faster the way they used to. I imagine if my CPU were still doubling in single-core performance every couple of years, there would be dev tools that could make use of that power and I would want to upgrade.
And games would definitely be doing much more as well.
But it's not practical to make software that will take twice as long to run when CPUs only speed up 10% every two years.
True. But I think that's the other side of the coin. That is, I'm not so sure we need faster CPUs the way we once did.
Save for extreme gamers and other more esoteric applications (e.g. CAD, video transcoding, etc.), most folks (i.e. the majority of the PC market) wouldn't benefit much from a CPU that's much faster than those now commonly fitted to the slightly-above-average consumer rig.
So, in general, I don't think software makers are holding back from making software that pushes the hardware. I just think it's more difficult to push today's more powerful hardware with typical software applications.
Agreed. There are a host of other potential applications that could push hardware as well.
And, I do believe that if one were to have mass market appeal (i.e. broad utility and demand), then we may see increased PC demand again (provided the PC is the appropriate platform).
But, do you think that there are a significant number of such applications waiting in the wings for PC hardware advances, or do you believe that perhaps no such applications are ready for prime-time as of now?
For me to even guess at where I fit, either "coder" needs to be defined, or the poll needs to ask "how many coders do you think are better than you" not as a percentage of coders, but as a percentage (or number) of people in the world.