Posted by mikijavinator
on July 10, 2010 at 12:46 AM PDT
I implemented a socket server where each connected client gets its own thread. The server executes queries and sends the result back to the client.
Then I ran a stress test like the following:
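For reference, a minimal sketch of a thread-per-client socket server like the one described (the server language, protocol, and function names here are my assumptions, not the actual implementation — the handler just counts to the number the client sends, as a stand-in for the real query):

```python
import socket
import threading

def handle_client(conn):
    """One thread per client: read a limit, 'execute the query'
    (count up to the limit), send the last number back."""
    with conn:
        limit = int(conn.recv(64).decode().strip())
        last = 0
        for n in range(1, limit + 1):  # stand-in for the XQuery loop
            last = n
        conn.sendall(str(last).encode())

def serve(host="127.0.0.1", port=0):
    """Create a listening socket; port 0 lets the OS pick a free port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen()
    return srv

def accept_loop(srv, max_clients):
    """Accept connections and hand each one its own thread."""
    for _ in range(max_clients):
        conn, _ = srv.accept()
        threading.Thread(target=handle_client, args=(conn,)).start()
```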
query: for $n in 1 to 1000000 return lastnumber...
so the server counts to 1000000 and returns 1000000.
The problem is: when I run 1 client, the query takes 0.141192231 sec to execute.
When I run 1000 clients, it takes 135.732144 sec in total; dividing
that by 1000 gives 0.135732... sec per client.
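The arithmetic behind that per-client figure (numbers taken from the measurements above, the division is the naive total-divided-by-clients average):

```python
single_client = 0.141192231          # seconds measured with 1 client
total_1000 = 135.732144              # total seconds measured with 1000 clients
per_client = total_1000 / 1000       # naive average time per client

# The puzzling result: the average per client is LOWER than the
# single-client time.
assert per_client < single_client
```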
The test machine is a 3.4 GHz dual-core with 1 GB of RAM.
How is it possible that a client is faster when there are more clients?
Any thoughts on that would be great...