1

I'm trying to send a bunch of POST requests to a server. I need to send the maximum possible number of requests per second, and I have tried cURL on Linux and a few tools like axios on Node.js, but they are not what I'm looking for: the response time is high. I used the IP address directly to avoid the NS lookup (DNS) time, but there is still a TCP connection setup that is repeated for every request, which adds overhead.

The other problem I'm facing is that cURL in a loop doesn't iterate very fast; it seems unable to run the requests in parallel and waits for one request to complete before starting the next. So I used the & and wait syntax in my bash script (I also know how to use nohup), but it's still not efficient enough.
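
Roughly what the script looks like at the moment (the URL and payload here are just placeholders):

#!/bin/bash
# Fire the POSTs in the background, then wait for all of them to finish.
# Each curl invocation still opens its own TCP connection, which is the overhead described above.
URL="http://192.0.2.10/endpoint"      # placeholder target
BODY='{"example":"payload"}'          # placeholder JSON body
for i in $(seq 1 1000); do
    curl -s -o /dev/null -X POST \
         -H 'Content-Type: application/json' \
         --data-raw "$BODY" \
         "$URL" &
done
wait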

How can I keep my connections alive to avoid that TCP connection overhead? Is there a tool out there for this purpose? How can I send 1000 or more requests at once? I don't care about the response; I just need to get my requests to the server within a certain time window, faster than anyone else.

ItsJay
  • 13
  • 1
  • 4

2 Answers

3

You can use Apache ab to load test your server. Use it like this:

ab -k -c 350 -n 20000 -p content.json -T application/json example.com/

For example, this command will open 350 concurrent connections and keep issuing requests until 20,000 requests have completed; the -k flag enables HTTP keep-alive so connections are reused.

Henrik Pingel
  • 9,380
  • 2
  • 28
  • 39
  • Thanks Henrik. So all the headers and flags I use with curl, like --data-raw .. Proxy-Authorization... Cookie and ..., can be used with this stress tester too? @henrik-pingel – ItsJay May 11 '20 at 15:48
  • I think so. Check the linked documentation; I believe it has all these features (see the sketch below). – Henrik Pingel May 11 '20 at 15:52
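
For what it's worth, ab does have flags for this: -H appends an arbitrary request header and -C adds a Cookie: line. A minimal sketch combining them with the command from the answer (the header and cookie values are placeholders):

ab -k -c 350 -n 20000 \
   -p content.json -T application/json \
   -H "Proxy-Authorization: Basic placeholder-token" \
   -C "sessionid=placeholder-value" \
   example.com/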
0

Basically you need an HTTP benchmark / load-testing tool. Take a look at this list, for example: https://gist.github.com/denji/8333630
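
As a concrete example, here is roughly how a POST run could look with hey, one of the Go-based tools that usually appears on such lists (the target URL and the content.json body file are placeholders):

hey -n 20000 -c 350 -m POST \
    -T "application/json" \
    -D content.json \
    https://example.com/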

NStorm
  • 1,312
  • 7
  • 18