Ask HN: Things you wish you'd learned about programming in college?
44 points by wooby on Nov 29, 2009 | 62 comments
A friend and I will be giving a talk to a group of computer science students in a few weeks. We're both students ourselves, but are a bit older and have more working experience than the average attendee will probably have.

Here are some of the ideas we've had so far:

- Present on "using and knowing the tools" from OS to editor to libraries.

- Something interactive, like working as a group through one of the problems from the New Turing Omnibus.

- Talk about the social aspects of coding, like group dynamic and collaboration tools, and maybe introduce pair programming.

The audience will be mostly undergraduate CS students. My friend and I are 25, and have worked freelance and at startups and medium-sized companies on mostly Rails, PHP, and Java web apps for 5+ years. We have a lot to learn ourselves, and see this speaking opportunity as a way to better ourselves as presenters and learn more about the topic we pick.

So, what's a good topic? What's something every CS grad wishes they had learned or heard about? Thanks in advance for your suggestions!




Don't waste your expensive and valuable college time on software engineering tools, source control, and other mundane crap that any monkey can learn quickly. Every programming tool that I learned about in college is now either unused (CVS, RCS, Motif), or dismissed by the l33t rock-stars as dinosaur technology (C++, Java, Perl). Today's l33t tools will be just as dead in ten years. Learn enough to do your assignments well, but view the time invested as a sunk cost.

If you want to learn how to be a coder, go to DeVry, or read some O'Reilly books and hack away. Your career will be mercifully short and uneventful.

If you want to be a computer scientist, spend your time learning math and theory, and learn it inside out. Then take business classes, chemistry classes, language classes, art classes -- anything to make you marketable in a non-technical way. The only way you're going to survive to old age in software (if that's even possible) is by acquiring talents that grow more valuable with age and experience -- skills that can't be cheaply exported to the next younger guy who is willing to work 80 hours a week for low wages and free soda.


If you want to learn how to be a coder, go to DeVry

It's worth noting that neither a DeVry "degree" nor a Computer Science degree will make you a programmer. If you just get the DeVry education, you will know what functions you can call in the Java core library. If you just get a CS degree, you will know what B+ trees are. But neither of those alone gets you a working database management system; that requires excellent coding skill and an excellent understanding of the underlying mathematics. "The real world" requires that you have both sets of skills; having only half the education will make you less than half the programmer.

I think this is a wider problem in our field; the "academics" don't want to admit that the "industrial" aspect of programming is important, and the practitioners don't want to admit that the academics are quite often onto some very good ideas. Turns out that both sides are important. The practitioner doesn't want to waste time reinventing simple concepts (which is why we have code organization techniques that were popular in academia long before they were popular in industry), and the academic doesn't want to reinvent industry (turns out that knowing about automatic testing and source control makes writing your new researchy programming language a whole lot easier).

So it's clear that to be an effective programmer, you need to be well-versed in both the practical and academic aspects of the field. Dismissing the practical aspects by saying they are a waste of time and that some dude from a trade school can handle them for you is ... short-sighted.

(Oh, and if Perl is a "dinosaur technology", Java, C++, and C are "pre-life self-replicating molecule" technologies. Just sayin'.)


I agree about needing both sides.

The people I've worked with who are heavy on the academic side (MS, PhD, etc.) with little real-world experience have, without fail, been very mediocre and have had a hard time getting stuff done. They are either architecture astronauts or they try to turn everything into a thesis paper. Bonus: I once saw someone with an MS in some computer-related field remove RAM from a running computer.

My experiences with people who have had no academic experience but lots of real-world experience have been just as bad. I recently worked at a place where, out of 14 programmers, only myself and three others had a degree. To some extent, they all shared the same qualities: they could get stuff done (on small projects), but usually in really dumb and expensive ways; they displayed a poor aptitude for learning new skills and techniques; and they couldn't think abstractly.

I don't mean to say that a degree is everything or will automatically make you a better programmer, but in getting one, you gain a foundation that you wouldn't otherwise get from work experience alone.


"So it's clear that to be an effective programmer, you need to be well-versed in both the practical and academic aspects of the field. Dismissing the practical aspects by saying they are a waste of time and that some dude from a trade school can handle them for you is ... short-sighted."

That's not what I said, though. My opinion is that if you want to have longevity in your career, you need to focus on the theory. That doesn't mean that you don't learn the other stuff -- you just learn what's necessary, and move on. More to the point: when you're spending big bucks on college classes, you'd better be devoting your time to learning stuff that you can't learn from a few hours of quality time in a coffee shop with an O'Reilly book and a laptop.

For what it's worth, though, I've known many CS professors and many industry programmers, and I wouldn't put the average of one group ahead of the other in terms of programming skill. The idea that academics don't know "industry" is a myth (but there are plenty of coders who don't know anything about algorithms).

Also for the record: I like Perl. I'm not the one calling it a dinosaur technology. I also like C++, though, so maybe I'm just old.


Why can't you learn data structures from a book in a coffee shop? I did.

I would argue that you can't learn the theory until you have enough practical experience to actually implement and play with the theoretical concepts you are learning about. By not learning "the trivial stuff you can learn from an O'Reilly book in a coffee shop", you are wasting your own time. The sooner you stop wasting your own time, the less of it you will waste.


If you divide the sticker price of my college education by the number of class hours, the implication is that one hour of instruction costs about $80. In the harsh light of that fact, I would still have paid a few hundred bucks to learn CVS or SVN in college rather than learning bad habits. My first two jobs programming (academic and quasi-academic) didn't use source control, and I kept my bad habits until I got into industry and was dragged kicking and screaming into professionalism. I think source control should be taught starting the first day of CS101. If the exact tool changes in 5 years, oh well, you can learn the new tool. But it should be an automatic, instantaneous, ingrained part of your process from day one. (Ditto IDEs and basic Unix system administration.)

Four courses that have been worth substantially more than $80 an hour to me: Japanese, Technical Writing, AI (mostly because it really should have been called Introduction To Scripting Languages), and (weirdly enough) my single course on assembly. That was entirely due to a twenty minute discussion with my professor that had an effect on me, the general gist of which was "Any performance problem can be solved by caching, if you do it right." I haven't programmed a single line of assembly in my professional career but every time a performance problem comes up I cache and the problem goes away. (And is replaced by a cache expiration problem.)
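To illustrate with a toy sketch in C (my own example, not anything from that lecture): memoizing a slow function is the textbook version of the trick, trading memory for time.

    #include <stdio.h>
    #include <stdint.h>

    /* Naive recursive Fibonacci: exponential time. */
    static uint64_t fib_slow(unsigned n) {
        return n < 2 ? n : fib_slow(n - 1) + fib_slow(n - 2);
    }

    /* Same function with a cache: linear time, plus the new
       problem of deciding when cached entries go stale. */
    static uint64_t fib_cached(unsigned n) {
        static uint64_t cache[94]; /* fib(93) is the largest that fits in 64 bits */
        if (n < 2) return n;
        if (cache[n] == 0)
            cache[n] = fib_cached(n - 1) + fib_cached(n - 2);
        return cache[n];
    }

    int main(void) {
        printf("%llu\n", (unsigned long long)fib_cached(90));
        return 0;
    }

Real-world caches are messier (HTTP caches, query caches, memcached), but the shape is always the same: keep the answer around, and pay for it later with an expiration problem.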


I think source control should be taught starting the first day of CS101.

I agree with the general sentiment, but I think this should instead be taught the first time students need to work in large groups (more than 2 people). Until they've suffered from not having version control, no one will care. Now, I'm not saying we should only focus on what students think they need, but if there is a way to force them to understand the value of something, all the better.


"Any performance problem can be solved by caching, if you do it right."

Terje Mathisen, or a follower of his? His axiom is that all optimization, or at least almost all of it, is an exercise in caching.


> If you want to learn how to be a coder, go to DeVry, or read some O'Reilly books and hack away. Your career will be mercifully short and uneventful.

What a condescending thing to say.

I went to art school.

My career has been going strong for 15+ years.


Sorry, but there's a difference between programmers and computer scientists. He's talking about the latter.


Oh yeah, what's the difference?


In my opinion, the computer scientist is focused more on the academic side and completes his work in the form of a paper. The programmer, on the other hand, actually builds a product that is intended to be used. They ignore this distinction at my school (UMass Boston). The general consensus seems to be that programming is a lowly task.

Perhaps your experiences are different, but at the end of the day there is a sharp conflict of interest between the computer scientist and the programmer. Both want to solve problems, but the computer scientist seems more interested in some math problem and the programmer is more interested in building cool shit.


Computer science is as much about analyzing the properties of computation as it is about writing programs. Graph theory, computational complexity, data structures and algorithm design and analysis, etc. are all computer science topics.

Implementing solutions in code is programming.


wtf?


Knowing the "theory" of computer systems (operating systems, networks, compilers, etc.) isn't worth a damn thing if you can't MAKE and MODIFY one. Software engineering tools matter because you'll never be able to make or modify a system of any complexity without first engaging with them (to some extent).

The broken assumption here is that you can become a competent computer scientist by doing assignments.


Seriously. Learn to think creatively and abstractly about problems, which doesn't necessarily come from being in over your eyeballs in code. Your ability to think outside the box will help you more than any computer science class ever could.


I find this so true.


Everything useful I learned about programming came either before college, after college, or when I got home from classes.

I'm obviously from the pro-dropout sector, but I can tell you that the one thing I've learned about learning is that for anyone successful (by pretty much any definition) I've met, most learning has taken place outside of a classroom setting.

Grab something hard, confusing, foreign, etc... Master it. Then move on. Learn things that are ostensibly completely useless to your job. You will do your job better than those who do not.

I still do take classes to help with things occasionally (there was a great Arduino class at the Hacker Dojo recently), but even in that case, I've learned more about it on my own since the class than I did during it.

So here's my advice: Don't wait for other people to teach you things. Learn stuff on your own now. Maintain the expectation that you only know 10% of what you need to know.

And if anyone ever hears me say that I know everything worth knowing, someone please shoot me.


This might be the best advice I've heard in a while:

"Grab something hard, confusing, foreign, etc... Master it. Then move on... Don't wait for other people to teach you things. Learn stuff on your own now. Maintain the expectation that you only know 10% of what you need to know."

I feel like I should print this out and post it on my wall. Thanks, disspy.


Learning by yourself when you want to is just so much more efficient than learning in a class.

However I will give credit to school for introducing me to new stuff that I would have never crossed on my own. Perhaps that's how education should be in the future: easy introductions to various things.


> easy introductions to various things.

right on! imho, one of the main points of a diverse undergraduate education is to give you exposure to introductory-level courses in lots of different fields. it's unreasonable to imagine that you can master a field just by taking a few undergrad courses in it, but that's not the point of those courses. that's why i think that general-education and non-technical elective courses are such important parts of the curriculum, even though they don't directly teach you anything applicable in your future job


"Take initiative. You're not going to learn to be a programmer in college. Do open source, internships, summer jobs... Make sure you work with people better than you."

And then focus on getting the most out of the stuff that college is good at teaching: fundamentals and theory. Do you really want to learn how to be a practitioner from a career academic?


I'm in my final year of CS undergrad, and I've learned a couple of things outside of class that help me, all stuff I wish they would teach (well) in class:

* Learn how to write a Makefile; your builds should be one button/command away.

* Learn how to test your code using assert.h, JUnit, whatever, just test all the fucking time (see the sketch after this list).

* Learn GDB; stop using printf to debug.

* Use source control for everything, but especially group work. The groups in your classes emailing code around are doing it wrong, and by the time they find out, it will be too late.

* Spend some time in a functional language, and apply what you've learned to your Java/C/C++ class work. Minimizing side effects will minimize errors.
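Since people rarely see what "test all the time" looks like, here's a minimal sketch in C; count_vowels is just a made-up stand-in, but the pattern (a pure function plus a batch of asserts that run on every build) is the whole idea. It also ties into the last bullet: a function with no side effects is trivial to test.

    #include <assert.h>
    #include <string.h>

    /* Pure function: the output depends only on the input, no side
       effects, which is what makes it trivial to test. */
    static int count_vowels(const char *s) {
        int n = 0;
        for (; *s; s++)
            if (strchr("aeiouAEIOU", *s))
                n++;
        return n;
    }

    int main(void) {
        assert(count_vowels("") == 0);
        assert(count_vowels("xyz") == 0);
        assert(count_vowels("aeiou") == 5);
        assert(count_vowels("Hello, World") == 3);
        return 0; /* wire this into a `make test` target so it runs constantly */
    }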

Most of all though, get involved with your faculty or department, and try to effect some positive change. The connections you make with other students and faculty could change your programming career more than anything else you do in school.


- Learn how to write a Makefile; your builds should be one button/command away.

- Learn how to test your code using assert.h, JUnit, whatever, just test all the fucking time.

- Learn GDB; stop using printf to debug.

One of these things is not like the other. The first two pieces of advice suggest that doing things automatically is better than doing them manually. Then, you throw all that away for debugging? Instead of having your program just print the results of intermediate steps, you want to load the program inside another program, set breakpoints, and type "continue" as you hit them all, requiring your constant attention to perform a tedious task?

Debugging with printf is fine.
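In fact, printf debugging can be made nearly as automatic as the rest. A common trick, sketched here in C (the DBG macro name is my own), is a logging macro that stamps each message with its source location and compiles away entirely in release builds:

    #include <stdio.h>

    /* Compile with -DDEBUG to enable; costs nothing otherwise. */
    #ifdef DEBUG
    #define DBG(...) \
        (fprintf(stderr, "%s:%d: ", __FILE__, __LINE__), \
         fprintf(stderr, __VA_ARGS__))
    #else
    #define DBG(...) ((void)0)
    #endif

    int main(void) {
        int x = 42;
        DBG("x = %d\n", x); /* prints e.g. "foo.c:14: x = 42" when DEBUG is on */
        return 0;
    }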


I think he should amend his statement to be: learn how to use your debugger of choice.

I think printf is a fine tool for some things, GDB for others, and your IDE's debugger for certain tasks; rereading the code in question yet again is another way. It shouldn't be a holy war; there is never a single correct way to do a multifaceted task like debugging. Knowing the strengths and weaknesses of the tool at hand is the best way to increase your effectiveness.

I am hacking on a kernel driver right now, and printf (well, IOLog) is the easiest way to debug. But if I were running in userspace, I would probably use the Xcode interface to GDB, because it lets you zero in on what you want to look at with state. Sometimes, though, printf is all I need.


Who says you can't use GDB on the kernel? You'll need two machines to use it, but there's support in Darwin for debugging over FireWire.


I never said you couldn't. I said the easiest (least hassle) way is to use IOLog and my brain for debugging purposes.


Source control, continuous integration, automatic testing. All of those things you might learn on a software engineering course but for some reason were conspicuously absent on my CS course.


Despite a trend for some colleges to teach computer science as software engineering (my college included), they should be considered two different things. However, they're not, so we usually wind up with a mix that is some CS, some SE, and satisfies very few people.


Agreed. I think most universities would rather err on the side of CS, lest they be construed as something like a trade school. Unfortunately, that doesn't mesh well with the reality that the majority of students will go into software engineering, not computer science. It's clearly difficult to serve the divergent goals of teaching people core abstract concepts, while still adequately preparing them for a career.

In most of my "CS" classes, it was sort of assumed that tools like version control, test suites, etc were just part of getting things done, and they were up to you to learn on your own time. Lots of TAs/LAs had quick tutorials to teach things like CVS and the like, but it wasn't really considered course material. The same was true for processes/development models, and for tools in other areas of engineering, like Solidworks, SPICE, etc.


+1 for each of these.

My university's CS & Soft Eng courses had them, but they were bullet points in the curriculum that the lecturers were neither interested in nor competent with.

So, source control meant "use RCS on your local machine for one assignment (individual, not group work)."

Automatic testing meant writing some scripts that produced specified predictable results, for the lecturers to run the assignments against. No result verification, no structure, no test design, definitely no TDD.

Continuous integration was simply missing, although this was seven or eight years ago so I guess it was a newish concept back then.


I think teaching them half-assed like this is even worse than not teaching them at all. Students' initial impressions are that version control is a waste of time and that testing is pointless busywork, best done at the end.

It seems like those attitudes can persist well into many working programmers' careers.


How to research.

How to ask a question.

How to communicate.

How to come back to a program you've written after a year. Or after ten.

Distributed source code control.

Incremental development.

Integrated debugging. Planning for failure. For support.

Finding and using libraries or frameworks.

Backup.

Source code archeology; how to understand and extend and maintain complex applications.

How to work within a team of programmers.

Upward-compatibility and code longevity; the multi-edged sword of an installed base.

The game (tools, operating systems, languages) changes at least every five to ten years, or more often.

You're an integrator first, and a programmer second. Or third. When you need to.


It'd be interesting to teach a semester-long course building up a single complex application, and to require each student to swap their source code with another (randomly selected) student for each new assignment.


They do that at my university in one of the project courses (where teams of 5-6 build a single application over one semester). Later on, they make teams trade code with each other. I'm not there yet, so I can't comment much.

My program is computer engineering, not CS, though.


The first three are so important and so often glossed over. It drives me crazy when I hear classmates look for the easy way out or avoid a problem because it's "hard." There's no professor in the real world and if something's hard, tough luck. It's important to start learning to teach yourself as early as possible and during school is a great time to do it. Learning to independently handle difficult problems is something best done in a nurturing environment, not somewhere like a job where you have more to lose.


The whole purpose of learning about technology is to make technology do something for people. Yet somehow we got the idea to focus exclusively on the technology and hope the people part takes care of itself.

Smart people can learn all sorts of interesting technical stuff. In fact, if you're smart, you're going to learn all sorts of stuff like SDLC, functional programming, OOAD, ER Modeling, etc.

What they don't teach you is the part about interacting with regular people -- people who have no idea how to make technical things happen.

Interviewing. Negotiation. Conflict resolution.

I know it sounds fluffy, but I see smart guys all day long. What separates the truly successful from the rest of the pack is how well they interact and solve problems for people, not tech smarts. Technical smarts [and a willingness to develop them more] are a given by the time you get into the field.


Highly relevant stackoverflow.com thread: "What is the most important thing you weren't taught in school?": http://stackoverflow.com/questions/258548


Compilers.

Somehow I got a BS and MS in computer science without taking a single compilers course, and I've regretted it.

I recently saw someone make a comment along the lines of "I'm a web developer, why would I ever want to learn about compilers?" Many people seem to think compilers are just programs like gcc that turn a language like C into machine code, but that's really just one kind of compiler.
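To make that concrete: a compiler is any program that translates one formal language into another, so regex engines, template engines, and query planners all qualify. Here's a deliberately tiny sketch in C (entirely my own toy, not anyone's real course material) that "compiles" arithmetic expressions over single digits into instructions for an imaginary stack machine:

    #include <stdio.h>

    /* Grammar: expr = term {'+' term} ; term = factor {'*' factor} ;
       factor = digit. A recursive descent parser that emits stack
       machine code instead of evaluating. */
    static const char *p;

    static void factor(void) { printf("PUSH %c\n", *p); p++; }

    static void term(void) {
        factor();
        while (*p == '*') { p++; factor(); printf("MUL\n"); }
    }

    static void expr(void) {
        term();
        while (*p == '+') { p++; term(); printf("ADD\n"); }
    }

    int main(void) {
        p = "1+2*3";
        expr(); /* emits: PUSH 1, PUSH 2, PUSH 3, MUL, ADD */
        return 0;
    }

Note there's no machine code anywhere; a web developer who turns a template into HTML-producing code is doing the same job.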


Functional programming.


My biggest nitpick about Georgia Tech was the lack of web-geared CS courses. The most "advanced" web-geared class was an 8-person class taught by a grad student about basic MySQL/PHP/Ajax. Of course I had the regular C, Smalltalk, Java, Matlab, et cetera courses, but none of that interested me like LAMP, RoR, and other stacks do.

Also, not many CS students have played with version control early on, and when they do, it's CVS or something that was required by a class. SVN and Git should be more common in curricula, IMHO.

So I guess I'm suggesting you tell these students where they can get started in web app development, because my university sure didn't help much when it came to that. Every semester I asked my academic advisor about the status of adding a Rails CS course, and it was usually "same as last time... looking for a professor who can teach it."


How to collaborate with a large team of others. We had group projects, sure, but they were small enough that usually one or sometimes two guys would do all the coding and everybody else would just fuck around.

The stuff you have to learn to effectively collaborate -- including task estimation, task splitting, source-management, communicating progress effectively, knowing when to ask for help, and more -- is basically what you spend the first year or two of your working career learning.

I don't know if it can be effectively taught in an academic context, but I really wish it had been. It's a timeless skill that transcends the actual technology used to get it done.


Don't forget about all the non-technical stuff. No one programs in a vacuum...

The Systems Development Life Cycle: analysis, design, development, testing, security, deployment, project management

Underlying systems theory, regardless of language: architecture, databases, frameworks, MVC, properties, methods

Underlying logic, regardless of language: iteration, branching and conditions, functions and routines, variable typing and scope

Anyone can go to webmonkey and learn how to write a "Hello World" program. But these basic underlying concepts that you use for the rest of your life: that's what college should be for.


I wish I took a software engineering class earlier than my junior year. I started programming my freshman year, and didn't realize how badly I was doing it till that class.

I also wish I hadn't learned a compiled language first. I almost think there should be two beginner classes: Lisp or Python, and then a Unix tools class where you learn scripting.

Operational things like getting a box setup, code deployed, etc. would also have been nice.

I guess I'm saying I learned enough theory and math, and wanted more practical knowledge.


> I started programming my freshman year, and didn't realize how badly I was doing it till that class.

Now there's an interesting idea: a college doesn't have to devote huge amounts of time to teaching about version control, collaboration, and so forth. They just have to cover it enough to make the students realize that they have a problem.


This will obviously vary depending on how much the school focuses on CS versus CE, but I'd stress mastering the basics of several programming languages in different domains (e.g. C, Java, PHP, Python, Lisp). There is a lot to be taken from each of these, even if you think they suck on their own.

I don't think you can give a talk about "using and knowing the tools" without getting too specific, and if you're specific, you fail. Instead, I'd focus on simply drilling it into people's minds that they should feel incredibly comfortable using their editor. It's the only thing they will always be able to control, and it's the only place they'll be modifying their code. I'd also encourage them to pick an OS and a standard library for each programming language and get super familiar with them.

I'm not sure interaction in the form of social coding would work in a single presentation. If this were a regular thing, sure. I think you can provide greater value doing something else.

One thing that I feel is often missing from my fellow students is confidence. They're all a bunch of babies and school is torture for them. I think an interesting talk (and judging by your experience, I'm sure you could make it awesome) would be to emphasize the confidence and empowerment that comes with mastering a programming language and doing some cool shit with it (a side project). There are too many people in my classes that are always sinking, who have not yet realized that they can actually do cool things because they're too busy sucking at class. They need to learn something really cool on their own, which will drive them to learn and finish school because they'll finally see dots starting to connect.


First, congratulations on being offered the opportunity to talk to your peers. I would echo many of the comments here to go outside academia and CS and focus on what you have learned by coding for pay and pleasure.

How to build your personal online coder reputation and begin contributing to the community (open source, blogging) - focus on what you personally have done or not done, and your experiences, good and bad, with sharing and collaborating. And what it's like starting, or working at, a startup.

My point here is: focus on experiences that have made you learn. Come in as peers who have been out in the real world beyond college, and tell them about that world (calibrate their expectations).

I think most of the students are expecting a standard tech briefing, and I challenge you to switch it up on them. Promise some soup-du-jour tech topic, but change it to talking about the real business world (change it after the audience is seated, so you get the students who would probably have skipped a real-world/business-oriented lecture).

One of the most valuable lectures I've ever crashed was one for chemical engineering seniors on how to get the job you really, really want. I directly credit that lecture with giving me enough real knowledge to approach interviewing with confidence.


Thank you for your input, I really like the idea of "turning things around."


Good luck! You're welcome. I don't think anyone will walk out of your lecture after the bait-and-switch. Maybe you can hedge the bait-and-switch by making the main topic 20% of the total time and the rest 80% (Q&A plus the switched topic). Please try to video (or at least audio) record it for your records; you seem to be someone who will want to do this sort of thing again, better. If you can get pizza (even if it's "who wants pizza? pitch in $1 or $2"), that might help.


I wish I had learned C++, personally. A number of the places I was hoping to work for required many years' experience in it. Like or hate the language, it's used frequently. Or at least it was.

Some other things I wish I had done/learned:

* Working with external libraries in C

* Working in groups on LARGE projects

Something I did learn through extracurricular activities, not school work for the most part:

* SOLVING PROBLEMS!!! UVa, TopCoder, Google Code Jam, ACM Programming Contest. Learn how to take a given problem and convert it into code.


First, never confuse schooling with education.

Second, learn the stuff that's in this book: http://www.amazon.com/Introduction-General-Systems-Thinking-...

Third, the general idea of a good education is to prepare you for the future rather than teach you about the past.


Make up (or find) as much gnarly, twisted data as needed to throw at your code to convince yourself that it will respond gracefully no matter what comes along. (Then if you can, get someone else to.) If it can't - or you can't test all conceivable ways it can fail - then you need to either rethink your design or switch languages.
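A minimal sketch of that loop in C, with a made-up toy function standing in for whatever you're testing: generate random hostile inputs from a fixed seed (so failures are reproducible) and assert the invariants that must survive any input.

    #include <assert.h>
    #include <ctype.h>
    #include <stdlib.h>
    #include <string.h>

    /* Toy function under test: skips leading whitespace in place. */
    static char *trim_leading(char *s) {
        while (isspace((unsigned char)*s))
            s++;
        return s;
    }

    int main(void) {
        char buf[64];
        srand(12345); /* fixed seed: failures are reproducible */
        for (int i = 0; i < 100000; i++) {
            int len = rand() % (int)(sizeof buf - 1);
            for (int j = 0; j < len; j++)
                buf[j] = (char)(rand() % 255 + 1); /* any byte but '\0' */
            buf[len] = '\0';
            char *out = trim_leading(buf);
            /* Invariants that must hold no matter what came in: */
            assert(out != NULL);
            assert(strlen(out) <= (size_t)len);
        }
        return 0;
    }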


I used to be quite concerned that all my major programming projects were for my own business, and as a result I figured that I had to be missing some sort of 'process' step that would mean I was developing software the right way.

Being walked through a few different development lifecycle models would have helped to put my mind at ease.


Consider yourself lucky. You could have been taught ISO-9000 or something from the Software Engineering Institute. The way things move now, three classes in and you'd already have missed a release cycle.


Experience is the best teacher. Just build programs! The more you build, the better you'll be. Note how I said build, not write. Think about it.

Oh yeah, and contributing to open source is the bomb. You learn so much about real-world applications of your knowledge.


I went through an "Information Science and Technology" program rather than a "Computer Science" program, but the one thing I really felt I missed out on was gaining experience using svn/git/etc. in teams.


Code versioning and unit testing are the only two things I wish I had known from the start (though I still avoid them, god knows why).


I wish I had learned more about testing, source control, and using and integrating with existing code and libraries.



I'd suggest you know the difference between

    selling software &
    selling consulting


Will your talk be recorded? I'd be interested in seeing it.


Yes, it's on December 11th, and will be recorded and available here: http://www.vimeo.com/user1295234 sometime in mid-December. Thanks for your interest! I'm looking forward to it.





