Why is Java on my server 8x slower than on my laptop?
I have 1.6.0_10 installed on my Vista laptop, which has a 2 GHz Sempron and 1 GB of RAM.
I have 1.6.0_13 installed on my Linux server, which has a 3.2 GHz dual-core Athlon X2 and 4 GB of RAM.
I run a small benchmark that consists mostly of calculating the dot product of a single vector with itself. The vector holds doubles, so most of the work is double additions, double multiplications, and loop overhead. There is very little allocation or garbage collection.
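The core of the benchmark looks roughly like this (a minimal sketch; the class name, vector length, and iteration count are placeholders, not the actual benchmark parameters):

```java
public class DotBench {
    // Dot product of a double vector with itself: sum of v[i] * v[i].
    static double dot(double[] v) {
        double sum = 0.0;
        for (int i = 0; i < v.length; i++) {
            sum += v[i] * v[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        // Placeholder sizes: the real benchmark runs far longer.
        double[] v = new double[10_000];
        for (int i = 0; i < v.length; i++) {
            v[i] = i * 0.5;
        }

        long start = System.nanoTime();
        double result = 0.0;
        for (int iter = 0; iter < 10_000; iter++) {
            result = dot(v); // repeat to measure steady-state (post-JIT) speed
        }
        long elapsed = System.nanoTime() - start;
        System.out.println("dot = " + result + ", elapsed = " + elapsed / 1e9 + " s");
    }
}
```

The loop body is pure double arithmetic on a preallocated array, so the timing should reflect floating-point and JIT performance rather than GC behavior.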
The benchmark runs in 500 sec on the laptop but takes 4200 sec on the server. On the server I have tried:

- the amd64 release of the HotSpot JVM
- the i586 HotSpot JVM
- OpenJDK
- -d32
- -client (as the first option)
- -XX:CompileThreshold=1

Memory settings are definitely not the issue - the process doesn't get anywhere near the memory limits of the machine. Some of these changes had a minor impact on the server's run time, but it is always above 4000 sec. That is over 8x slower on a machine that should be substantially faster.
What am I missing?