
My debugger is

jdb: 2% (19 votes)
jswat: 1% (8 votes)
other standalone app: 2% (13 votes)
part of my IDE: 59% (481 votes)
System.out.println(): 30% (242 votes)
don't need one: 6% (48 votes)

Total votes: 811

Comments

favourite debugger

gdb, as it allows me to inspect native code easily as well.

cheers, dalibor topic

About unit tests...

Ah, the good old "unit tests have magical powers" theory. A unit test tells you that something is wrong; the process of debugging is figuring out what went wrong, and why. If your unit tests are telling you what went wrong and why (which they would have to, for adding more unit tests to solve the problem), you would have to have planned for exactly that thing to go wrong (unless magic is involved).

But unit tests (at least in TDD) are about exploring requirements. Therefore adding more tests means adding more requirements (or perhaps adding tests for requirements that don't already have tests). And in any case, surely adding requirements (or tests) doesn't solve the problem of the current test failing?

The 'test infected' people I've worked with still appear to go through a normal debug cycle to figure out what is going wrong. Once they've figured out what caused the bug/defect, only then do they add a test (for the technical requirement they have unearthed in the process). Otherwise you'd have some kind of bizarre, potentially infinite recursion: my test failed, so I write test', but that fails, so I write test'', but that fails, so I write test''' (etc). The only way out would be to write perfect tests every time.

So no, adding extra unit tests as a substitute for the debugging step is... bizarre.

About unit tests...

I am not attempting to suggest that unit testing prevents all bugs (although it is a good start). I am saying that the presence of bugs tends to indicate that your unit tests are insufficient.

The bug may be simple, allowing you to insert the tests first (and perhaps check them in, if you don't mind giving colleagues a fright) and then fix the code. If the bug is the sort of thing that would drive you to the debugger, then it tends to indicate that your lower-level tests are incomplete. This is an opportunity to patch those holes, and it also gives you an idea of the sorts of things you might have missed across your system. It's not infinite recursion, because you are drilling down a level (or perhaps inserting a level, if your code is too entangled to be easily understood).

A big exception is when you come up against a third-party API or a poorly documented protocol. It typically won't be productive to write tests for other people's systems; then it becomes suck it and see.

To be clear about my personal point of view, Test First/TDD is rotten. I'm from the make-your-design-coherent-first school: make it coherent, make sure it does the job, then write good tests with good coverage. Writing tests may make you change your design (not always for the better, nor always for the worse), as they represent a new use for your production code.
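For concreteness, here is a minimal JUnit sketch of what "drilling down a level" can look like; the Csv class and the shifted-columns bug are hypothetical, invented only to illustrate the idea:

import static org.junit.Assert.assertArrayEquals;
import org.junit.Test;

// Hypothetical code under test: a report came out with its columns shifted,
// and debugging traced the problem to this low-level splitter, which the
// existing tests only exercised indirectly through the report generator.
class Csv {
    static String[] splitLine(String line) {
        // split with a negative limit so trailing empty fields are preserved
        return line.split(",", -1);
    }
}

public class CsvSplitTest {
    @Test
    public void keepsEmptyTrailingField() {
        // The lower-level test added once the bug showed where coverage was thin.
        assertArrayEquals(new String[] {"a", "b", ""}, Csv.splitLine("a,b,"));
    }
}

The point is not this particular splitter, but that the new test sits one level below the test that originally failed, so the next regression in that area fails fast and close to the cause.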

An alternative

A minority point of view says that instead of rushing for the debugger, you should see the situation as an opportunity to improve the unit testing. If the code is doing something unexpected, for no obvious reason, then clearly the unit tests need some work before resorting to debugging. Perhaps that attitude should become mainstream.

An alternative

No. Don't waste time. First whip out the debugger and debug the problem. Then create the unit test that captures the problem. Then fix the bug and walk over it one more time in the debugger.

Note that modern debuggers allow resuming your code after code changes, so you don't need to recompile and rerun; just continue your walk-through as you repair the code.
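As a sketch of step two of that workflow (debug first, then capture the problem in a test), assume a hypothetical Pager class whose off-by-one was found by stepping through it in the debugger:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical fixed version; the original truncated the division and
// silently dropped the partial last page.
class Pager {
    static int pageCount(int items, int pageSize) {
        return (items + pageSize - 1) / pageSize;  // round up
    }
}

public class PagerRegressionTest {
    @Test
    public void partialLastPageIsCounted() {
        // 21 items at 10 per page was reported as 2 pages; the debugger showed
        // integer division truncating the remainder. The correct answer is 3.
        assertEquals(3, Pager.pageCount(21, 10));
    }
}

Written against the unfixed code this test fails, which is exactly what makes it worth keeping: it documents the bug and guards against its return.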