I'm tuning my Akka HTTP server and seeing very poor results when loading it with concurrent requests. Since I wasn't sure whether I had a hidden blocking IO call somewhere, I figured it was worth testing the example project from the Akka HTTP site:
Alternatively, you can bootstrap a new sbt project with Akka HTTP already configured using the Giter8 template:
sbt -Dsbt.version=0.13.15 new https://github.com/akka/akka-http-scala-seed.g8
I've gone ahead and bootstrapped it as per the instructions and run the server on localhost:
/path/to/bootstrap/sbt run
[info] Running com.example.QuickstartServer
Server online at http://127.0.0.1:8080/
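For reference, the endpoint being hit is the template's GET /users route. It behaves roughly like the minimal sketch below (my own simplified reconstruction, not the generated QuickstartServer code; the object name and the hard-coded body are placeholders): the handler completes immediately in memory and does no blocking work.

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer

object MinimalUsersServer extends App {
  implicit val system: ActorSystem = ActorSystem("minimal-users")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  // GET /users answers immediately with a small fixed body
  // (roughly matching the 12-byte response reported by ab below).
  val route =
    path("users") {
      get {
        complete("""{"users":[]}""")
      }
    }

  Http().bindAndHandle(route, "127.0.0.1", 8080)
  println("Server online at http://127.0.0.1:8080/")
}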
I ran some very trivial tests with "ab
" tool:
Simple test performing sequential requests:
ab -n 1000 http://127.0.0.1:8080/users
Server Software: akka-http/10.1.5
Server Hostname: 127.0.0.1
Server Port: 8080
Document Path: /users
Document Length: 12 bytes
Concurrency Level: 1
Time taken for tests: 0.880 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 165000 bytes
HTML transferred: 12000 bytes
Requests per second: 1136.74 [#/sec] (mean)
Time per request: 0.880 [ms] (mean)
Time per request: 0.880 [ms] (mean, across all concurrent requests)
Transfer rate: 183.17 [Kbytes/sec] received
We see that the "time per request" is 0.880 ms [mean]
in this case
Now I bumped the concurrency up to 5:
ab -n 1000 -c 5 http://127.0.0.1:8080/users
Concurrency Level: 5
Time taken for tests: 0.408 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 165000 bytes
HTML transferred: 12000 bytes
Requests per second: 2450.39 [#/sec] (mean)
Time per request: 2.040 [ms] (mean)
Time per request: 0.408 [ms] (mean, across all concurrent requests)
Transfer rate: 394.84 [Kbytes/sec] received
Now the "Time per request" has increased quite sharply to 2.040 ms (mean), although throughput is much higher.
And again, bumping up to 50 concurrent requests:
ab -n 1000 -c 50 http://127.0.0.1:8080/users
Concurrency Level: 50
Time taken for tests: 0.277 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 165000 bytes
HTML transferred: 12000 bytes
Requests per second: 3607.35 [#/sec] (mean)
Time per request: 13.861 [ms] (mean)
Time per request: 0.277 [ms] (mean, across all concurrent requests)
Transfer rate: 581.26 [Kbytes/sec] received
Here the latency is extremely high, at 13.861 ms vs. 0.880 ms in the first case (latency increased by roughly a factor of 16).
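Side note on how I read these numbers: as far as I can tell, ab derives both "Time per request" figures directly from the totals, so the "mean" value scales with the concurrency level by construction. A quick sketch of that arithmetic (my reconstruction from the output above, not ab's source code):

object AbTimings {
  // Returns (mean time per request, mean across all concurrent requests), both in ms:
  // total time divided by completed requests, with the first figure scaled by the
  // concurrency level.
  def timings(concurrency: Int, totalSeconds: Double, requests: Int): (Double, Double) = {
    val acrossAllMs = totalSeconds / requests * 1000
    val meanMs      = acrossAllMs * concurrency
    (meanMs, acrossAllMs)
  }

  def main(args: Array[String]): Unit = {
    println(timings(1,  0.880, 1000))  // ≈ (0.88, 0.88)
    println(timings(5,  0.408, 1000))  // ≈ (2.04, 0.408)
    println(timings(50, 0.277, 1000))  // ≈ (13.85, 0.277) -- ab prints 13.861 from its unrounded totals
  }
}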
This simple server has no blocking I/O.
What should I configure in order to keep the latency as low as possible?