I have developed a REST API with Laravel 8, with PostgreSQL/PostGIS as the backend. My goal is to handle at least 200 concurrent requests per second, with every one of those 200 requests returning within one second. I have listed my configuration below, but I still cannot reach that target: under load, responses take about 3 seconds to complete.
Technical Workflow
NGINX 1.18.0 -> PHP-FPM -> Laravel 8 -> PGBouncer -> PostgreSQL 12/PostGIS 2.4
AWS instance - m6g.8xlarge (32 vCPUs, 128 GiB RAM)
Operating System - Ubuntu 20.04 LTS
NGINX Configuration
user www-data;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;
# Load module section
load_module "modules/ngx_http_brotli_filter_module.so";
load_module "modules/ngx_http_brotli_static_module.so";
events {
    worker_connections 1024;
}

http {
    ##
    # Basic Settings
    ##

    client_body_buffer_size 10K;
    client_header_buffer_size 1k;
    client_max_body_size 8m;
    large_client_header_buffers 4 4k;

    client_body_timeout 12;
    client_header_timeout 12;
    keepalive_requests 2000;
    send_timeout 10;

    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    # server_tokens off;

    ...
    ...
}
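For completeness, this is roughly how requests reach PHP-FPM. The socket path, PHP version, and docroot below are assumptions for illustration, not necessarily my exact values:

upstream php_fpm {
    # Assumed socket path for PHP 7.4 on Ubuntu 20.04
    server unix:/run/php/php7.4-fpm.sock;
    # Keep FastCGI connections open between requests
    keepalive 32;
}

server {
    listen 80;
    root /var/www/app/public;   # assumed docroot

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass php_fpm;
        # Needed for the upstream keepalive above to apply to FastCGI
        fastcgi_keep_conn on;
    }
}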
PHP-FPM Configuration
I tried the static, dynamic, and ondemand process manager modes; the current settings are:
pm = ondemand
pm.max_children = 5200
pm.process_idle_timeout = 5s
pm.max_requests = 2000
request_terminate_timeout = 600
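For scale, the m6g.8xlarge has 32 vCPUs, so only a small fraction of 5200 children can ever run at once. A static pool sized nearer to the hardware is sketched below; the numbers are assumptions I have not validated, not recommendations:

pm = static
; Roughly 4x the vCPU count, assuming requests mostly wait on the database
pm.max_children = 128
; Recycle each worker after 2000 requests to bound memory growth
pm.max_requests = 2000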
PGBouncer Configuration
I tried the session, transaction, and statement pooling modes; the current settings are:
pool_mode = transaction
max_client_conn = 50000
default_pool_size = 50
reserve_pool_size = 25
reserve_pool_timeout = 3
server_lifetime = 300
server_idle_timeout = 15
server_connect_timeout = 5
server_login_retry = 2
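On the Laravel side, the connection points at PgBouncer. A sketch of the relevant config/database.php entry follows, with the PDO options that transaction pooling generally requires; the port and option values are assumptions for illustration, not necessarily my exact file:

'pgsql' => [
    'driver'   => 'pgsql',
    'host'     => env('DB_HOST', '127.0.0.1'),
    'port'     => env('DB_PORT', '6432'),   // PgBouncer port (assumed)
    'database' => env('DB_DATABASE'),
    'username' => env('DB_USERNAME'),
    'password' => env('DB_PASSWORD'),
    'options'  => [
        // Skip server-side prepared statements, which break under
        // PgBouncer transaction pooling
        PDO::PGSQL_ATTR_DISABLE_PREPARES => true,
        // Reuse the connection across requests within an FPM worker
        PDO::ATTR_PERSISTENT => true,
    ],
],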
PostgreSQL Configuration
shared_buffers = 32GB
effective_cache_size = 96GB
work_mem = 33MB
maintenance_work_mem = 2GB
min_wal_size = 512MB
max_wal_size = 2GB
checkpoint_completion_target = 0.9
wal_buffers = 16MB
listen_addresses = '*'
max_connections = 1000
random_page_cost = 1.1
effective_io_concurrency = 200
max_worker_processes = 32
max_parallel_workers_per_gather = 16
max_parallel_workers = 32
parallel_setup_cost = 1000.0
parallel_tuple_cost = 0.1
min_parallel_table_scan_size = 100MB
min_parallel_index_scan_size = 5MB
parallel_leader_participation = on
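While the JMeter test is running, sampling wait events should show whether requests are actually queueing inside the database. This is a generic PostgreSQL 12 query, nothing here is specific to my schema:

SELECT wait_event_type, wait_event, state, count(*)
FROM pg_stat_activity
WHERE pid <> pg_backend_pid()
GROUP BY 1, 2, 3
ORDER BY count(*) DESC;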
Per request, the Laravel REST API hits the database for the following needs (sketched below the list):
- A SELECT that calls a Postgres function, marked PARALLEL SAFE, to produce the output
- An INSERT that adds a new record to a functional table
- An INSERT that adds a new record to an audit table
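In sketch form, each request does roughly the following; the function, table, and column names are placeholders, not my real schema:

SELECT * FROM my_geo_function(:param1, :param2);  -- marked PARALLEL SAFE

INSERT INTO functional_table (payload, created_at)
VALUES (:payload, now());

INSERT INTO audit_table (action, details, created_at)
VALUES (:action, :details, now());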
Even with the above configuration I only get 65-70 requests/sec of throughput when testing 200 parallel requests from JMeter, with the requests completing in about 3 seconds. Tested individually in Postman, the API responds within 300-400 ms on average.
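For reference, I run the test headless, roughly like this (plan.jmx is a placeholder name; the real plan is configured with 200 concurrent threads):

jmeter -n -t plan.jmx -l results.jtl

Please help me resolve this issue.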