
So I'm quite deep into this monitoring implementation, and I'm curious how to calculate the theoretical maximum packet rate it can handle.

I know Python is not the most efficient language, and I'm honestly not too worried about missing a packet here or there - but how can I figure out how fast it's going?

My network isn't corporate-large, but the sniffer can keep up with an nmap scan (or so it seems).

Its output matches Wireshark's, so I'm curious about its limitations on a network with thousands of computers. The Scapy documentation doesn't seem to go into this, but I admit I may have missed something.

I create an async sniff with a callback that just puts the desired information into a hashtable/dictionary with the srcMac as the key, in case that affects anything.
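That pattern can be sketched roughly as below. The dict-update callback is kept separate from the capture wiring so it can be exercised without root or live traffic; `AsyncSniffer`, `prn`, and `store=False` are real Scapy APIs, while the handler's fields (`count`, `last_seen`) are illustrative assumptions.

```python
from collections import defaultdict

def make_handler(table):
    """Build a sniff callback that records per-source-MAC info.

    The callback only needs a packet-like object exposing .src (the
    source MAC, as on Scapy's Ether layer) and .time, so the logic
    can be tested without a live capture.
    """
    def handle(pkt):
        entry = table[pkt.src]
        entry["count"] = entry.get("count", 0) + 1
        entry["last_seen"] = pkt.time
        # Keep this function cheap: every extra operation here lowers
        # the maximum packet rate the sniffer can sustain.
    return handle

table = defaultdict(dict)

# Live wiring (requires Scapy and usually root privileges):
#   from scapy.all import AsyncSniffer
#   sniffer = AsyncSniffer(prn=make_handler(table), store=False)
#   sniffer.start()
#   ...
#   sniffer.stop()
```

`store=False` matters at scale: without it, Scapy keeps every sniffed packet in memory in addition to calling your callback.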

  • You might experience some issues scaling a Scapy solution to a corporate network: it does not handle bursts of packets arriving at once well, does not have fast throughput, and does not use memory frugally. You might consider processing batches of tshark logs instead of having Scapy sniff and process. If you are curious about a quantitative analysis of where exactly Scapy can't keep up, I would recommend doing some testing to find out :) – cmacboyd May 30 '22 at 19:36
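The batch approach suggested in the comment could look roughly like this: let tshark do the capture (e.g. `tshark -w capture.pcap`), then post-process it with `tshark -r … -T fields` (real tshark flags) and aggregate in Python. The field choices (`eth.src`, `frame.len`) and the per-MAC aggregation shape are assumptions chosen to match the question's table.

```python
import subprocess
from collections import defaultdict

def summarize_tshark_fields(lines):
    """Aggregate tab-separated `tshark -T fields` output: eth.src, frame.len."""
    totals = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for line in lines:
        parts = line.rstrip("\n").split("\t")
        if len(parts) != 2 or not parts[0]:
            continue  # skip malformed lines or frames with no Ethernet source
        src, length = parts
        totals[src]["packets"] += 1
        totals[src]["bytes"] += int(length)
    return totals

def summarize_pcap(path):
    """Run tshark over a capture file and aggregate per source MAC.

    Requires tshark on PATH; `path` is a capture file written earlier
    (e.g. by `tshark -w` or tcpdump).
    """
    out = subprocess.run(
        ["tshark", "-r", path, "-T", "fields",
         "-e", "eth.src", "-e", "frame.len"],
        capture_output=True, text=True, check=True,
    )
    return summarize_tshark_fields(out.stdout.splitlines())
```

Because the capture and the Python processing are decoupled, a slow callback can no longer cause packet drops; it only delays the batch summary.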

1 Answer


In my case, sending a 4 MB file from Host A to Host B using Python sockets, I was getting 2.3 Gb/s. Scapy was roughly 100x slower than the network speed, depending on what kind of operations you do after sniffing the packets. When I was doing too many operations on the sniffed packets, I could capture only 15 of the 1000 packets (the 4 MB transfer). After optimizing my code, the maximum I could sniff was still fewer than 200 packets out of 1000. If you need more precise measurements, feel free to tag me; happy to share knowledge.
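One way to put a rough number on the ceiling the question asks about: time the callback alone over synthetic packets and invert. This gives only an upper bound, since it ignores libpcap capture and Scapy's per-packet dissection, which in practice dominate (and are broadly why captures like the one above lose most packets under load). `FakePkt` and the handler are illustrative stand-ins, not Scapy objects.

```python
import time
from collections import namedtuple

# Stand-in for a sniffed packet; a real Scapy packet is far more
# expensive to build and dissect.
FakePkt = namedtuple("FakePkt", ["src", "time"])

def estimate_max_pps(callback, n=100_000):
    """Estimate an upper bound on packets/sec from callback cost alone.

    Ignores capture and dissection overhead, so the real sustainable
    rate will be (much) lower than this figure.
    """
    pkts = [FakePkt("aa:bb:cc:dd:ee:%02x" % (i % 256), float(i))
            for i in range(n)]
    start = time.perf_counter()
    for pkt in pkts:
        callback(pkt)
    elapsed = time.perf_counter() - start
    return n / elapsed

table = {}

def handle(pkt):
    table[pkt.src] = pkt.time  # same shape as the question's callback

ceiling = estimate_max_pps(handle)
```

Comparing this figure against the actual rate of a known traffic burst (e.g. 1000 packets sent over loopback) shows how much of the loss comes from capture overhead rather than from your own code.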

Nagmat