More often than not, logging fails to garner the respect it deserves. Many applications interleave megabytes of debug information from thousands of successful and failed requests into a single log file. Tools can filter the result to a degree, but they can't help much when the production application forgoes debug-level logging entirely for the sake of performance.
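One reason teams turn debug logging off in production is the cost of building messages that are never written. A common defensive pattern is to guard message construction behind a level check; here's a minimal sketch using `java.util.logging` (the `OrderService` class and its counter are hypothetical, added only to make the guard visible):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

class OrderService {

    private static final Logger logger =
        Logger.getLogger(OrderService.class.getName());

    // Counts how many debug messages we actually built; exists only
    // to demonstrate that the guard skips the work when FINE is off.
    int messagesBuilt = 0;

    void process(String orderId) {
        // Guard the message construction so production runs, which
        // typically log at INFO or above, pay no concatenation cost.
        if (logger.isLoggable(Level.FINE)) {
            messagesBuilt++;
            logger.fine("processing order " + orderId);
        }
    }
}
```

With the guard in place, leaving debug statements in the code costs little more than a level comparison per call when the level is disabled.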
I use Apple Keynote for my conference presentations (Keynote competes with PowerPoint, for those of you stuck running a legacy operating system). I like to think pretty presentations somewhat distract from my horrid speaking abilities.
In a recent thread on TheServerSide.com, readers voiced concern over JDBC drivers and connection pools that fail to close Statements and ResultSets when you call Connection.close().
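If you can't trust the driver or pool to cascade the close, the safe habit is to close each JDBC resource explicitly. A minimal sketch (the `UserDao` class and its query are hypothetical; with Java 7+, try-with-resources guarantees the closes even on exception):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

class UserDao {

    private final DataSource dataSource;

    UserDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    String findName(long id) throws SQLException {
        // Close the ResultSet, Statement, and Connection explicitly
        // rather than trusting Connection.close() to clean up the rest.
        try (Connection connection = dataSource.getConnection();
             PreparedStatement statement = connection.prepareStatement(
                 "SELECT name FROM users WHERE id = ?")) {
            statement.setLong(1, id);
            try (ResultSet resultSet = statement.executeQuery()) {
                return resultSet.next() ? resultSet.getString(1) : null;
            }
        }
    }
}
```

Each resource gets its own close, so a misbehaving driver can at worst leak nothing but the objects it already closed.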
Simon Brunning solicited advice on Data Access Object (DAO) design. I have a few tricks up my sleeve that don't demand much effort (no need to mock InitialContext, for example) but that still yield performant, maintainable tests.
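The core trick is to hide the DAO behind a plain interface so tests can substitute an in-memory implementation; no JNDI lookup, no container, no database. A sketch under those assumptions (the `AccountDao` interface and fake are hypothetical names):

```java
import java.util.HashMap;
import java.util.Map;

// Client code depends only on this interface, never on JDBC or JNDI.
interface AccountDao {
    String findOwner(long accountId);
}

// An in-memory fake stands in for the JDBC implementation in tests;
// tests run fast and need no InitialContext mocking.
class InMemoryAccountDao implements AccountDao {

    private final Map<Long, String> owners = new HashMap<>();

    void put(long accountId, String owner) {
        owners.put(accountId, owner);
    }

    public String findOwner(long accountId) {
        return owners.get(accountId);
    }
}
```

The real JDBC-backed implementation then gets its DataSource through its constructor, so even its own tests can hand it a test database directly instead of going through JNDI.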
In the months since Bitter EJB came out, the reviews alone have made the hard work and social sacrifice feel worthwhile. If you'll pardon the shameless plug, I couldn't be more pleased by the warm reception:
"Well written... not just a catalog of antipatterns... will really give the
reader more insight into EJB."
When we started our project six months ago, we didn't consider AOP. After digging deeper into the design, we recognized the unwelcome presence of crosscutting concerns and couldn't deny the urge to reduce dependencies and eliminate rote, error-prone code. With an ideal AOP framework in mind, I assessed the landscape.
If you've ever diagnosed a bug in a web application, you've undoubtedly experienced the annoyance of digging through fifteen exception stack traces to identify the one you're interested in (if it's even present), or the sinking feeling of tailing the web server log only to find: