Posted by tball
on September 1, 2008 at 9:04 PM PDT
Developers often get so involved with design abstractions that those concepts can block understanding of what's really happening during execution. Regularly reminding yourself that "it's just bits" can keep you on top of whatever problem you are investigating.
I was recently interviewing, and found that while some things have changed in the fourteen years since I last jumped companies, a lot is still the same. Recruiters still can barely spell Java or C++, let alone adequately screen a developer with those skills. Still, they (and the pointy-haired bosses working with them) are able to dream up a litany of all the hot technologies they just read about in Business Week magazine. Good luck actually finding someone with all of those qualifications (and who is able and willing to do the work for $50K ;-).
I've never been able to pass this recruiter hurdle, however, because I tend to work on whatever project is most needed at the time, rather than on what up-and-coming technologies might beef up my resume. What's kept me employed is a network of people who know that an engineer who can contribute to a team creating solid products is much more important than having someone who knows the hot technology of the month. And a belief we share is that it's much easier for an engineer to learn a new language, platform, framework, whatever, than it is to turn some buzzword-compliant tech wannabe into an engineer. Happily, my new employer, Google, understands this difference; that's why they are looking for engineers who are good bit-slingers since, let's face it, they have a LOT of bits to sling.
I think the difference is that real engineers know what's under the hood for any digital technology: at the lowest levels, it's just bits, ones and zeroes, no mystery. No matter how esoteric some platform abstractions are, at the end of the day they run on processors twiddling bits in interesting ways. Those layers of abstraction we all so carefully craft are essential for managing the increasing complexity of today's software, but it's easy to forget that when a layer is posing a seemingly unsolvable problem, we can always reach beneath it to piece together a solution.
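To make that concrete, here's a small sketch in Java (my choice of language for illustration, not something from the original post) that peels back one familiar abstraction: a `float` is, underneath, just 32 bits in IEEE 754 layout, and the standard JDK method `Float.floatToIntBits` lets you see them directly.

```java
// Even an abstraction like a float is, underneath, just 32 bits (IEEE 754).
public class JustBits {
    public static void main(String[] args) {
        float f = -0.15625f;                // -1.01 (binary) * 2^-3
        int bits = Float.floatToIntBits(f); // reinterpret the raw bits as an int
        // Prints the sign bit, 8 exponent bits, and 23 fraction bits:
        System.out.println(Integer.toBinaryString(bits));
        // -> 10111110001000000000000000000000
    }
}
```

Once you can read that bit pattern, "mysteries" like why `0.1f + 0.2f != 0.3f` stop being mysteries; they're just the consequences of how those 32 bits are carved up.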
That's why I don't agree with the current "testing rocks, debugging sucks" meme floating through our community: both skills are essential for a good engineer. Yes, testing is essential for verifying software in an easily repeatable manner, and the best first step in fixing a bug is to write a regression test that duplicates the problem. But good debugging skills, which let you look under the hood when necessary, are essential for truly understanding what the problem and the right fix are.
I'm probably biased because one of my early learning experiences used a Heathkit 6810 microprocessor trainer, which required solving the problems not in C, not in assembler, but in raw machine code. Firmware, linkers, and loaders are all easier to understand once you've distilled a program down to a list of hexadecimal numbers. Robotwar was another fun learning experience; it may not have had me writing real assembler, but I quickly learned to debug all new code on its test bench or risk watching my mistakes blow my robot up in the arena.
My suggestion to new developers: no matter what technology you are working with, regularly dig into how it is implemented, because what you don't know now may bite you later. Use a debugger regularly, and keep the source code for the libraries your code depends on handy, so that it's easy to step through the hard-to-understand areas. And when that layer is understood, dig deeper when you can. In the end it's all just bits, and good engineers regularly remind themselves of that fact.