The ugly truth is that kids don't study because they aren't expected to.
Let's face it: as long as their kid gets that perfect GPA and is obedient enough, most parents don't give a damn whether they actually learn anything. To my peers it's just a ritual they need to complete in order to have a good "life", whatever that is.
In fact, my teachers have scolded me for trying to derive things on my own. It is quite obvious to them that I should be spending that time practicing for marks instead. Who cares if I actually understand that electric flux counts only the field lines that penetrate the surface? As my teacher put it, they'll check my mark sheet instead.
So, is it any shock that students don't study?
I know that I am learning the secrets of the universe, in some ways, and that within my textbooks lie answers to questions people have pondered for centuries, but the truth is that no one else gives a damn. They know what a muon is because they need to know what a muon is. They can't see nature and they really don't care.
No one expects them to create things; that's left to nuts like me (someone said this to me point-blank). What is expected of them is status, prestige, and a good paycheck stamped to certify that they won the rat race of life. Nothing more. Nothing less. Oh, and a big house in the suburbs to show that they have arrived wouldn't hurt either.
In the end, this is a problem that goes to the core of our society. Most kids are a mirror of how they were brought up, and only if they are taught to see things differently can the status quo change, but as anyone will tell you, that takes integrity and it is just too damn hard. (No one makes the case better than Al Pacino in Scent of a Woman: http://www.youtube.com/watch?v=nqsf0XynGz8)
"Marks herself points out that employers don’t generally care about the content of job applicants’ classes; they’re more interested in whether an applicant graduated, was able to meet deadlines, and work within a bureaucracy"
Sad, but probably true. One of the classes I remember best was set up to give engineering students an introduction to the workplace. So you had business-like tasks (most of which centered on Excel), and you were graded on whether you met, exceeded, or didn't meet expectations.
In one assignment, I made a really fancy Excel spreadsheet (it was a producer/consumer type problem, and my 1s for produced units actually moved across the page). But it had a bug, and the numbers it created were pure gibberish. I found the bug but had no time to fix it. I documented this fact somewhere in the fine print and talked about how I would fix it if I had more time.
It looked cool though, even if it didn't work at all, so I got "exceeds expectations." I had known how demos work long before that, of course, but I think it was my first real demo.
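For the curious, the underlying bookkeeping was roughly this kind of thing (a minimal sketch in Python, not the original spreadsheet; the rates and tick count here are made up):

    def simulate(ticks=20, produce_rate=3, consume_rate=2):
        """Toy producer/consumer tally: each tick, units are produced,
        then consumed from the buffer (you can't consume what isn't there)."""
        buffer, history = 0, []
        for t in range(ticks):
            buffer += produce_rate               # producer adds units this tick
            buffer -= min(consume_rate, buffer)  # consumer drains what it can
            history.append((t, buffer))
        return history

    for t, level in simulate():
        print(f"tick {t:2d}: buffer = {level}")

Get the min() wrong, or forget it, and the buffer happily goes negative: exactly the kind of bug that turns the numbers into gibberish while the animation still looks great.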
As for studying, well, maybe they should ask what happens to it after college? I didn't think we were supposed to stop. Which reminds me, I should fire up Anki and review.
I agree that college should not be the end of learning. However, I wouldn't want to be taught by a professor who says that "employers don’t generally care about the content of job applicants’ classes; they’re more interested in whether an applicant graduated, was able to meet deadlines, and work within a bureaucracy."
If they truly believe that, it will show in their courses. Instead of trying to teach their subject to the best of their ability, they might give lots of deadline-oriented assignments that don't help learning of any kind. Instead of being accessible for students to ask questions, they might make it harder to get in touch with them, rationalizing that "students don't really care about the subject; all they care about is getting a good grade and graduating so they can get a job in some bureaucratic workplace."
Professors with that attitude can turn students off to a subject that they would have otherwise enjoyed, and make it more difficult to learn.
Yeah, in this situation they are forgetting that not everyone in the class is doing it just to check off a box in their degree plan. Recently an email went out that was basically "do data mining, it's big in industry". I'm giving that subject a miss, partly because the lecturer is terrible, but also because he's so industry-focused.
I'm not planning on continuing in academia, but I certainly don't want to take classes that are just there to fulfil degree requirements without going deep into the subject and providing interesting and challenging content.
Not surprising. The degree requirement is largely a (time-consuming, expensive, and inaccurate) IQ and class filter that employers can use without getting sued out of existence.
Agreed. For a lot of folks I believe it's used as a filter, and yes, a very expensive and time-consuming one. Kinda sad, actually. College is not without substantive merit to the student beyond the filter effect, though, so it's not all superficial.
True - except for the small percentage that are heading to grad school to do physics, studying is irrelevant.
The purpose of a college degree is to reduce the number of job applicants HR depts have to deal with.
Since everybody now gets an A in every class, recruiters look for extracurricular activities instead. So studying to understand the causes of WWI in more detail is less useful to you, and to the recruiter, than also being in the junior entrepreneurs club.
They just barely touch on the important part, which is whether or not students are achieving the same mastery of the subject matter. If professors are making courses too easy, that's the thing to fix. Butt-in-seat hours are only a symptom, and focusing on this (especially after acknowledging it's not the end goal!) is oddly Puritan.
I think it is a factor, though. Even with efficiency improvements in studying over time, I doubt the average student can spend half the time and still get an understanding as deep as before across all subjects.
For some specific subjects, though, this may be a natural progression driven by technology; certain things may simply be easier now.
We also have to look closely at where the students are coming from. I think the trend starts in high school; anecdotally, an increasing number of people seem to be developing their bad study habits there. Myself included: I think I have improved as time has gone on, but I still don't feel most classes offer challenges worthy of close attention and study when I have much more interesting stuff on the side.
Even more important is whether students graduate knowing what they need to know for today's world. I don't think that past study habits are even applicable, since what students need to know nowadays may be completely different from before.
College is simply another hoop to jump through for (an increasingly unlikely) shot at the middle class. The act of studying is useful for when you actually want to know something, but the commoditized studying colleges encourage of plowing hours into courses one cares very little about has very little purpose, and students know this. Just another notch on the "American education is an absurdity and a farce" stick.
I don't think you should throw out "increasingly unlikely" without some evidence. I do worry about our education system. But the idea that it's harder to attain a middle-class standard of living than it was in the past is very dubious. And it's a claim that's used to support a lot of bad policies.
What do they consider studying though? Is paper writing counted as studying? Is online homework or problem sets counted as studying? Because I can tell you from personal experience that often those things take up much of my time, and tend to burn me out a bit so that when I have free time, I hang out with my friends and work on personal projects that I enjoy rather than studying the material I just did for homework. If I had more free time available, and less work to do, I would most likely read my book more thoroughly, especially in classes that I enjoy.
College is losing its romantic lustre. Students used to believe in education for itself, but now ambitious students just cram in more classes and activities to graduate faster or pad their grad school applications. Before, a student studying the American Revolution in class might go to the library and flip through the books in the professor's "suggested reading" section on the syllabus looking for a particularly interesting book to read thoroughly. And while he was there he would see other books in the stacks and end up reading for a couple of hours. Then he'd bring up what he learned in conversations with other engaged students.
Students would feel good about studying like that. They were learning. It was making them better than they used to be. Not only that, going to college was what they were supposed to do to succeed, and studying and discussing ideas was what you were supposed to do in college, so reading books about the American Revolution, or Sartre, and talking about them with your friends was pretty closely linked with your future comfort, according to the rules of society.
Less capable students would study for three hours because it was the only way they could pass their classes. (The ones who were smart and ambitious but not academically engaged? They're busy puking with rich men's sons. They pull down the studying averages, but later they'll subsidize all that economically unproductive studying as generous alumni.)
So what happened? Three things.
First, not many people have to study that hard to pass their classes. Professors used to set challenges for students that would force them to study, like difficult papers and exams. Now, that is seen as authoritarian and sadistic. You aren't allowed to test anybody just to make them study, because many people find studying difficult, even unpleasant, so it must be intrinsically bad for you. Studying has to be justified, on a minute, case-by-case basis, by concrete future benefit. This is slightly better than the attitude in some countries, especially in Asia, where forcing kids to study hard is the point of the whole system, and subjects like history and chemistry are just handy tools for what is essentially a benevolent society-wide hazing ritual. However, it's a step back from understanding that studying, like exercise, is actually good for you in certain quantities, and colleges are providing a service in forcing you to do it.
I know that sounds weird to some people, so I'll explain it in a way that hopefully everybody can relate to. Many people pay personal trainers to help them work out. Exercise instruction is usually just a pretext. The real purpose of the personal trainer is to provide social expectations, and sometimes actual social pressure, to force the client to do something beneficial they would not do in the absence of pressure: work hard in the gym.
Sometimes the pressure is merely having an hour-long appointment. Having made the appointment, the client would feel guilty about skipping or arriving late. The trainer also serves as an expert whose judgment about the nature and intensity of the workout is more credible than the client's, so the client would feel unjustified in substituting a lower standard. The trainer also provides helpful emotional support by providing emotional energy and even the threat (which must be real) of disapproval. Of course, this must be calibrated to the client, but if the client knows there is no shame in giving up and going home, then the personal trainer has failed in their job of providing social pressure to help the client succeed.
This explains why colleges used to make kids work hard. Of course, now they can't, because college is seen as compulsory for success, and success itself is compulsory, and nothing compulsory can be unpleasant unless it's exactly the same level of unpleasantness for everybody. It's basic fairness. So professors who make students study hard are assumed to be emotionally damaged reactionary sadists, unless they're charismatic enough to convince everyone that they're actually doing it "for the keeeds."
Second, a lot of middle class students now have the sense of safety and entitlement that used to be confined to the rich kids. So they're partying and assuming that as long as they don't flunk out, their future is assured.
Third, ironically, college itself no longer guarantees admission to an economically stable middle class. Students who are worried about their future need to do something that distinguishes them. Something documented, something they can put on a grad school or job application. Grades mean nothing. Grade inflation has made it pointless to work hard in your classes and learn the subject better than anyone else, because you'll end up with nothing to show for it. Spending time in the stacks reading about the American Revolution doesn't make any sense to kids worried about their future; it's entitled self-indulgence. To them, it's the same as drinking and puking every day, just another form of entertainment that normal, economically insecure people can't afford.
So what kind of kids would spend their time actually studying? Ones with old-fashioned, romantic ideas about college. Ones who aren't concerned about their future. Ones who believe they should improve their minds and let everything else take care of itself. Or ones who are lucky enough to inhabit a social milieu where studying is actually encouraged. Kids in software and engineering at least have open source and the "makers" social crowd to encourage them to work hard. Writers can write for their friends, maybe even try to get published. Kids in history and economics? They're stuck looking for an undergraduate project with an impressive-sounding title.
Nah, students never "cared" about learning. College started in the U.S. as a way for the upper classes to cavort under the pretense of "learning about the world". It has always been a status signal. This is why the middle classes are willing to pay their left nut for a suitably prestigious degree - status signals are to the middle classes what honey is to a bee (or something).
There is perhaps .5% of the population for whom college makes actual productive sense - those intelligent and driven enough to devote their lives to producing original research. Colleges now fill an odd dual role in society: stamping the middle classes with seals of 'approval', and housing the people who produce actual scholarship. There is really no logical connection between the two, and indeed most professors dislike 'teaching'.
College isn't necessarily 'bad', but it is when it saddles people who will be earning $40-60k for the rest of their lives with $200k in loan debt.
The ones who brought up the averages did. College may have been about prestige, but the system is rarely that honest about itself. Most of the people involved in maintaining the system -- teachers, professors, and students, certainly, and also many parents -- believed in the romantic image of college, and the ones who didn't believe, pretended. William James said that whatever you attend to becomes your reality, and thanks to everybody attending so closely to the myth of college as an educational experience, that's what it turned out to be in reality as well.
You're saying 199 in 200 people should not go to college? I disagree.
A lot of people are going to state and community colleges and are not spending very much money to do so. There are very few people who graduate with over $100k in debt. Most have much less, if any.
I am saying there is no productive value in them going relative to their financial burden. For the many who go for free/cheap thanks to state subsidy, that's great. But then we shouldn't worry that students aren't studying, since college doesn't really DO anything in regards to future productivity and the like.
Also, a lot of the "soft" factors people romanticize about college (spawning an appreciation for the arts or some niche intellectual pursuit, civic/social development) have been happening for thousands of years as people simply grow up.
I don't agree with your percentages on how many people truly benefit from a 4-year degree. In all the coverage of Senator Byrd, though, I did notice an interesting fact.
The recently deceased senator received a law degree in the early 1960's. Interestingly, he did not receive a bachelor's degree until the 1990's.
Apparently back then you only needed a high school diploma to go to law school.
Colleges teach a lot of things that are useful to the world and are most efficiently learned there. College doesn't have to be just a place for future researchers, but as a society it would be useful for us to look at what we require a degree for and figure out whether that is really necessary or just a waste of time and money.
Of course, the money is big enough that this is unlikely to happen in a measured, sensible way.
> "those intelligent and driven enough to devote their life to producing original research."
You've heard of engineers, doctors and lawyers, right? Lots of folks go to college, in part, to prepare to work in those fields. They may never do "original research" but can be quite intelligent and driven. I'm going to guess that you might be a grad student or otherwise on the academic/professor career path, if you said something like that. Because there are a lot of people who don't go down that path and actually have an uncomplimentary view of it, pretty much the inverse of what you said. I won't repeat the stereotypes here, but they exist.
I've always figured that if I had to study for a test, there was something wrong. A test should judge if you know the underlying concepts, not how many facts you've memorized.
And think how much better you'd "know the underlying concepts" if, in addition to listening in lecture and staring blankly when they ask questions about the reading, you read a section in advance, did some practice problems, noodled around with assorted Wikipedia pages on the topic, talked to and sparred with your friends about the material, and dropped by the professor's office to nail down some esoteric edge case?
Actually, I'm guessing you did at least some of those things if you were a successful student. All of them are forms of studying; cramming facts for a test is but the slightest part (and, as you say, likely a sign of a poorly-written exam).
A test should judge if you know the underlying concepts
Isn't it a bit presumptuous for the student to dictate the criterion of the test like this? It's often easy for students to get underlying concepts, but education could be so much more, and teachers should be free to design tests of something more ambitious than understanding concepts.
I understood the basic concepts behind playing the piano at a young age: press the keys with variable force at rhythmic times - not conceptually difficult.
But a test designed to determine if I understand that would not measure an ability to apply that material.
Did you really never study a subject where at least part of what you needed to learn was facts? History? A foreign language (noun and verb forms, vocabulary, irregular noun and verb forms, genders, idioms)? Science? Computer science? Math? Literature? All of these subjects involve learning facts - at least some of the time.
Having enough command of background facts to quickly figure things out is useful, but beyond that, knowing where to look up the facts seems like a perfectly fine substitute for memorizing them. The real problem is if you know so little that you don't even know how to figure out what you need.
knowing where to look up the facts seems like a perfectly fine substitute for memorizing them
I suspect that you might revise this statement if you applied the simple metacognitive strategy of thinking about this as if you were a software system: if you put all of your facts on the other side of a high-latency lookup system, as you are suggesting, you limit the size of your brain's base of associative firings. The work that one does to bring facts into the brain instead of leaving them in external storage allows those concepts to be quickly brought into the short-term "working memory" system where they can take part in cognitive activities. Eschewing practice and memorization limits the rate and scope of your thoughts. Worse yet, you won't know what you're missing. (This is the basic reason that Paul Graham's Blub programmer doesn't know what he's missing.)
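To make the latency point concrete, here's a toy sketch in Python (the facts, the 50 ms figure, and the chain length are all invented for illustration): follow a chain of associations where each hop is either a local, already-memorized lookup or a simulated high-latency external one.

    import time

    # Each "fact" points to a related fact, forming a chain of associations.
    FACTS = {f"fact{i}": f"fact{i + 1}" for i in range(1000)}

    def lookup_local(key):
        return FACTS[key]                 # already in your head: effectively free

    def lookup_external(key, latency=0.05):
        time.sleep(latency)               # stand-in for reaching for a reference
        return FACTS[key]

    def chain(lookup, start="fact0", hops=20):
        """Follow a chain of associations, one lookup per hop."""
        key = start
        for _ in range(hops):
            key = lookup(key)
        return key

    for name, fn in [("local", lookup_local), ("external", lookup_external)]:
        t0 = time.time()
        chain(fn)
        print(f"{name:8s}: {time.time() - t0:.3f} s")

Twenty hops is nothing in memory and over a second when every hop is external; scale that to the thousands of micro-associations behind a real train of thought and the point about limiting the rate and scope of thinking becomes pretty vivid.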
I used to think that too - and to some extent it is certainly true - but the problem is that you can't really derive new things without knowing old things, and you simply can't make mental connections between things unless you know them.
I agree there are large classes of things like that, but I think there are also large classes of things where looking them up is fine, and my guess, though I could be wrong, is that the things traditionally memorized in schools lean more towards the second. If you think of the kinds of information that come in tables: I think it's important to know that the table exists, what kind of information is in it, when you would want to consult it, and ideally even how you would recreate it if you needed to. But actually memorizing the table? Doesn't seem that useful, and I think that's the kind of thing traditional tests have focused on (say, memorizing a bunch of combinatorial identities, or memorizing a bunch of properties of chemical elements).
That's especially the case when nobody in the real world would be in some bizarre situation where they're stuck on a desert island having to do things entirely from memory. For example, if you're doing big-O analysis of an algorithm, it's perfectly reasonable to assume that you'll have Maple or Mathematica, or at least a list of common identities, available to use when simplifying your result or solving recurrences.
Indeed, but facts are easy to learn once you understand the concepts. I can memorize that a quicksort is O(n log n) and a binary search is O(log n) all day long. However, if I understand big O notation, I can figure them out on my own and (in theory) never have to memorize what the complexity of an algorithm is again.
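For instance, a quick sketch in Python, just to illustrate the derive-it-rather-than-memorize-it point:

    def binary_search(xs, target):
        """Assumes xs is sorted; each iteration halves the search range."""
        lo, hi = 0, len(xs) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if xs[mid] == target:
                return mid
            elif xs[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    # Each pass does O(1) work and halves the range, so T(n) = T(n/2) + O(1).
    # After k halvings the range is n / 2^k, which hits 1 when k = log2(n),
    # hence O(log n). No flash card required.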
Besides that, I shouldn't have to pay $300 a credit hour (at a cheap public school, no less) to memorize things. I can do that on my own.
I think that's a great argument, but only for some cases, in some disciplines. No amount of conceptual knowledge will tell you what the irregular 3rd person singular present active subjunctive of 'esse' is in Latin (it's 'sit'). No amount of conceptual knowledge will tell you what year the 1st Punic War began (264 bce) or where Vergil was born (modern Mantova - a town called Andes in his time). Some things, you have to memorize.
For that matter, so nobody thinks I'm picking intentionally useless and out of the way factoids, you need to memorize whether your programming language says 'elsif' or 'elif' or 'elseif' too. Would you trust a programmer who had to look that up all the time?
I dunno about Latin, but for most languages, the best way to learn what the verb forms are is to read and converse a lot, and attempt to write using sentences to the limits of your ability, with someone competent to correct (but ideally not to penalize) your errors.
Learning grammar completely abstracted from use is a masochistic and ultimately rather pointless exercise, whose main use as far as I can tell is providing a conveniently simple way to assess effort expended in language courses without having to assess real fluency (which takes more work for a teacher). Harshly grading grammar mistakes discourages people from pushing at the limits of their ability, because it rewards using only words and grammatical constructions that the student is completely confident in. The goal of any language course should be to encourage as much real language use as possible, because that’s the only way to real fluency, including fluency with correct grammar constructions.
The knowledge of the year the first Punic War began will be learned pretty well if you go read 50 letters written during the first Punic War, instead of looking at dates on flash cards. But reading those letters will ultimately be a lot more engaging and relevant to other circumstances. The year itself is only a useful fact insofar as it provides a point of reference by which other events can be compared. Memorizing a bunch of dates for their own sake is in my opinion wrongheaded, especially for non-specialist students (that is, primary students up through undergraduates). Much better to understand the causes and progress of the 1st Punic War than to spout off 100 dates of battles.
No one gives a damn where Vergil was born except the people who live in that town, and maybe some other poets who refer to the town and expect their readers to know the reference. In other words, the main reason to know where Vergil was born is just to impress people in the “in crowd” of snooty scholars, or some teacher with backwards priorities. Much better to spend your time carefully reading Vergil’s works, thinking hard about them, and coming up with something interesting to say.
Your elseif vs. elif spelling is a pretty stupid (in my opinion) thing to intentionally spend time memorizing. Go build some stuff in your language, looking up the parts you need. Every once in a while go through some language references or tutorials, etc. I guarantee you can learn the syntax of most programming languages without ever making a single flash card.
Now, there are some things you need to memorize:
* Medical symptoms and diagnoses, drug interactions, etc. if you’re a physician.
* Lines of a play, if you’re an actor; this holds for other kinds of performances too, such as music, etc.
* Basically, anywhere where knowing the precise fact is the goal. This does happen fairly often in the world; a supermarket cashier might need to memorize which aisle has the bread, for example, and a good teacher should try his best to learn the names of his students.
The thing is, often knowing the precise fact is assessed, and is the explicitly stated goal for students, but is not actually a good goal, pedagogically. Students who do not question the stated goals of their schooling are in my opinion harming their own educations, even if they are able to ace everything by official standards.
2) Thanks to the Flynn effect, people are growing smarter, so they need less time for the same content.
3) Thanks to the internet, it's a lot easier and faster to learn something than it used to be (more CliffsNotes-type resources).
4) People today can probably make much better assessments of what will actually help them in their professional lives, and what things are random rites of passage, i.e. hoops they have to jump through to get their degree.
I wonder if this has been adjusted for the increase in dropout rates? I'm assuming that many of those dropping out did less studying for whatever reason, be it too busy working or just not having what it takes to do this sort of unsupervised work.
I don't study nearly as much as I should if I were to follow the official recommendations, but the truth is that it just doesn't take that long to learn the basic ideas, and once you have those, it is pretty easy to build the concepts on top of them.
It would take far more time if I was studying for a degree in the humanities, simply because they don't have a solid foundation on which to build stuff.
Sounds like the Japanese system. Work your ass off in high school and nighttime cram schools to get into the best colleges, then coast through college, partying, drinking, recuperating from a hellish youth that sucked up all your passion for learning, then graduate and get a job based on your top college's pedigree.
if you want students to study then you should eliminate grades. grades simply encourage them to game the system. of course, gaming the system is a very valuable real world skill.
"if you want students to study then you should eliminate grades. grades simply encourage them to game the system. of course, gaming the system is a very valuable real world skill."
Grades many times give you the push you need to succeed. Without them, many people wouldn't bother studying. A similar thing that happened recently in many public schools is the elimination of winners and losers in sports games (every game is a tie). Is this really going to prepare anybody for the real world?
The only people "gaming the system" aren't interested in learning and probably shouldn't be in college in the first place.
Who knows? But I guess you know (or have heard) of some people who do not bother studying while attending institutions that issue grades.
(I do not want to claim that grades cause non-studying. Just that even with grades non-studying does occur--thus invalidating the grandfather-comment's argument.)
"Who knows? But I guess you know (or have heard) of some people who do not bother studying attending institutions that issue grades."
Some. But why cater to the minority? If the grade system wasn't working, we would see more problems in schools and universities. I also don't really know of a good alternative.
Strangely, I find myself in favour of marking exams (i.e. pointing out flaws and good points, maybe giving a score). Feedback is nice. But I am not in favour of grades for a class.
Although taking some kind of weighted average of the scores of individual tests is more or less trivial, I resent it. Perhaps having one final test is a better option? I don't know.
"Strangely, I find myself in favour of marking exams (i.e. pointing out flaws and good points, maybe giving a score), but not so much in favour of grades."
How does one fail a class if there are no grades? We can't let everyone pass (especially in something like medical school).
It's pass/fail, so you can still fail. Something like 70% or below is failing. You still get graded on work, but the classes themselves don't result in a GPA, just pass or fail.
I know the idea at Yale is that the students have proven they're hard-working, smart, and dedicated to being doctors, so they don't need to be run through that gauntlet again. Yale would rather them focus on really learning rather than gaming the grading system for a better class rank.
I guess in general we should disentangle teaching from accreditation. I.e., the institution that grades you should not be the one that collects your tuition money, just to prevent conflicts of interest.
(Letting somebody pass in medical school might be a different topic, I agree.)
A few schools, like Evergreen in Washington State, have done precisely this.
The other problem is that grades still solve some practical problems, which I discuss here: http://jseliger.com/2010/02/17/the-validity-of-grades . Not having grades will probably encourage system gaming too, just in a different (and probably worse) way than one has now.
Note: the effects may or may not be worse; however, in the interim (say, the next 40 years) our teaching staff would have no way to understand how the new system is being gamed. You'd be changing the sport without changing the referee, and an umpire can't referee a football game.
Reforming the grading system, with teachers who've learnt graded systems and inherently learnt their pitfalls, is a much better compromise than changing to no grades at all with teachers who've only learnt graded systems and have no knowledge of the pitfalls of an ungraded one.
We could decrease the overall ability to game the system, but we wouldn't be able to adapt our teachers as fast as we could change the system. Our students would be as knowledgeable about the system as our teachers, which is like turning our entire body of educators into fresh-faced Teacher's School graduates. It would quite possibly be the death of education for the next decade.
There are deficiencies in certain grading schemes, but "eliminating grades" is too radical and impractical a solution. How do you think we should evaluate students at the end of the semester?
If I wanted to, I could probably learn more in a month or so of home schooling using the Internet than I could in an entire semester at a traditional college. Roughly speaking, YMMV, etc. I could pick my own hours, topics, and pace, with no need to cram for tests, no worry about grades, and a ton less money spent.
Most of what they have most students "study" is academic trivia that (1) won't be retained for very long after exams are done anyway, and (2) has no use after leaving college. Note I said 'most' and 'most'. The tech fields tend to have more stickability and usefulness in what is taught (though they're not perfect either). Most of the "studying" in college, from what I remember, was short-term memorization cramming in the days or weeks before a test, and very often just the night or hours before.
University is not entirely about studying. You can freely study on your own as much as you like; simply buy the textbooks and read them. Indeed, in that way one could, in perhaps five years, learn about all the major subjects: economics, physics, literature, et al.
The lectures are nothing more than a fanciful way of restating what is in the reading, and if you have done the reading beforehand you will be really bored; it is boring enough as it is.
The seminars can be interesting, and that is when everyone learns. You do need to do the reading beforehand, but that takes no more than an hour. That is how long I used to study: one hour of reading before the class, and I would be very well prepared. Then I was able to engage in the class discussions and do some real thinking.
That, however, does not include the assignments, especially in the final year, or the exams, where, at some point around February or March, you need to study almost all day and things can very easily get out of hand if you do not.
This is also where real knowledge is acquired: not from reading the class book, but from preparing assignments, which requires reading some 20 journal articles, many cases (I did law and psychology), some books, and then actually writing the paper itself. It is a very successful model of learning, that of learning by doing.
Now to go back to my point. University teaches you how to live independently for a start, to have the confidence to stand in front of your peers or argue with them, to manage and organise your time, but most important of all, to actually learn who you are and who you want to be.
The real story from this article is not that students are studying less; the reasons for that are obvious. We hardly need to take notes in class; we have PowerPoint presentations which we can read at any time (though we do not, because they are too condensed). Everyone now has their own textbook, which might not have been true 50 years ago, and those textbooks see little use because we can access real research instead, which happens in pockets of the year rather than throughout.
Thus the question for the researchers behind the polls mentioned in the article is: when were the polls conducted? From the article mentioning that the number one finding is that students do not know how to study, I would guess the polls were taken around October, November, or maybe December. This is the quietest time in university life. Even if the polls were taken at the same time each year since the 1960s, there have still been many changes and things are very different. Thus, perhaps students are not really studying less. They might, and probably do, underestimate the time they spend studying, and perhaps some might not consider certain activities, such as researching for an assignment, to be studying at all.
The real story from the article is that No. 2 on the list is that the students are depressed. Why? Perhaps because they are disappointed and bored. It is too easy: you read the book, then you go to the lecture and are told what was in the book, then you go to the seminar and again are told what is in the book. Unless your memory is inferior, you will probably have a very good understanding of what is in the book by the end of it - until February comes, that is.
So why are the students depressed? Because they are disappointed and kind of angry. They want to be creative, opinionated, original, to discover some groundbreaking way of thinking - not rigid, objective, professional. They feel a bit regimented.
However, by the end they perhaps (and probably) realise that their expectations were simply wrong and not very effective or useful when it comes to engaging with practice. They instead probably appreciate that they now have a very good idea of the methods of operating within their field. If that is not success, and if a system which achieves the goal of turning kids into adults is not worthwhile, then what on earth is?
University works. Through the troubles and ordeals, adolescents are turned into adults. They are given the time to discover what they like, what they think, what their principles and values are, what the principles and values of society are, whether they agree with them, and whether they would like to change them. They are given the skills required to communicate effectively, to find knowledge and synthesise it with the aim of coming to a conclusion on the entire matter, et cetera.
I therefore disagree with the article that people are studying less. They are just studying differently. I also disagree with many commentators who suggest that our educational system is a shambles, especially at the university level. I think it is the best system humankind has found for a smooth transition from a confused adolescent to a very able adult.