This is so true and so awful. I ran into the same thing. I'd used a debugger before and couldn't believe people were resistant to it when our TA suggested one. They were like, "Nah, I'll just 'printf' everything." And then the debugger they tried to get us to use was the god-awful command line version of gdb. I seriously think that turned several people off from ever using a debugger again because it was so bad.



Sadly, printf debugging still captures something about how humans actually investigate, something debuggers don't handle well at first, not until you're used to them and know how to map your questions onto their features. It's blurry in my mind, but what you want is to manage sets of breakpoints and custom IO/formatting for your structures over a debugging session, and debuggers don't offer a way to smooth this out (I'm no expert though, grain of salt required), so printf(...) is still a strong reflex.
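
For what it's worth, that "custom structure IO/formatting" reflex usually congeals into a helper macro. A minimal sketch in C, where struct packet and its fields are made up for illustration:

    #include <stdio.h>

    /* Made-up example type: stands in for whatever structure you
     * keep needing to inspect during a session. */
    struct packet { int id; int len; };

    /* The ad-hoc formatter printf debugging tends to produce: each
     * dump is tagged with the file and line it came from, which is
     * exactly the mapping a debugger doesn't hand you for free. */
    #define DUMP_PACKET(p) \
        fprintf(stderr, "%s:%d: packet{id=%d, len=%d}\n", \
                __FILE__, __LINE__, (p)->id, (p)->len)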


In a debugger you have to observe the changing state.

If you print values you get a debugging log which (for me) is often more helpful, because I can more easily map it to the offending line of code.

But it's good to have both tools available.


printfs are usually good. However, when you deal with complex realtime multithreaded processes, sometimes a high-priority thread just preempts the printf, so you need some kind of flush statement... and that delays the overall operation. In such cases debuggers are good, 'to some extent'. I still haven't found the best way to debug hard realtime multithreaded applications.
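
One partial workaround (a sketch, not gospel): format the whole line into a local buffer, then emit it with a single unbuffered write() instead of buffered printf. In C, with log_line being a made-up name, and the syscall still costing latency on the hot path:

    #include <stdio.h>
    #include <unistd.h>

    /* Build the complete line first, then push it out with one
     * write(). A single write() is typically delivered in one piece,
     * so lines don't interleave mid-write when a higher-priority
     * thread cuts in, and there's no stdio buffer left to flush. */
    static void log_line(int value)
    {
        char buf[128];
        int n = snprintf(buf, sizeof buf, "worker: value=%d\n", value);
        if (n > 0)
            write(STDERR_FILENO, buf,
                  (size_t)n < sizeof buf ? (size_t)n : sizeof buf - 1);
    }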


Those are exactly the places I find myself relying on printfs. In real-time, multithreaded code, you can't block on a thread and inspect things for a couple minutes and expect it to resume working afterwards.
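
One middle ground for that case, offered as an assumption rather than anything from the comments above: don't print on the hot path at all, just stamp events into a preallocated in-memory ring and dump it afterwards. Minimal C sketch; trace/trace_dump are invented names, and records can be torn if you dump while tracing is still running:

    #include <stdatomic.h>
    #include <stdio.h>

    #define TRACE_SLOTS 1024  /* power of two so the index mask works */

    struct trace_rec { int tag; int value; };
    static struct trace_rec ring[TRACE_SLOTS];
    static atomic_uint head;

    /* Hot path: one atomic slot claim plus two plain stores.
     * No locks, no I/O, so real-time threads keep their timing. */
    static void trace(int tag, int value)
    {
        unsigned i = atomic_fetch_add(&head, 1) & (TRACE_SLOTS - 1);
        ring[i].tag = tag;
        ring[i].value = value;
    }

    /* Off the real-time path (low-priority thread, or after the run):
     * this is where the printf finally happens. */
    static void trace_dump(void)
    {
        for (unsigned i = 0; i < TRACE_SLOTS; i++)
            printf("%u: tag=%d value=%d\n", i, ring[i].tag, ring[i].value);
    }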



