
We are observing strange behavior with a video streaming application over UDP on the same LAN.

The sender sends an H264 video over UDP through an Ethernet link configured as 1000BaseT-FD, and the receiver receives the video over the same 1000BaseT-FD Ethernet link, but the received video is severely corrupted due to packet loss (the average video bandwidth is 40 Mbps).

To remove ambiguity we have tried different protocols and frameworks: RTP, MPEG-TS, and RTSP, using GStreamer or FFmpeg. We have also tried connecting the two devices via two different Ethernet switches or directly with a UTP Cat 5 cable, but nothing changed.

Mysteriously, if we set the sender's Ethernet card to 100BaseTX-FD, the packet loss disappears.

Do you have any suggestions?

gib

1 Answer


Did you try connecting the devices directly with the same cable that you have been using throughout your tests? It is possible that the cable has a damaged wire pair, which would cause large packet drops.

I would test with a brand new cable (preferably Cat 5e or above).

Otherwise, you may have a bad NIC on either side. Try a diagnostic tool such as iperf3 to test throughput over the link and see what speeds, latency, and loss you can achieve.
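For example, you could run iperf3 -s on the receiver and iperf3 -c <receiver-ip> -u -b 40M on the sender to push roughly the same 40 Mbps of UDP traffic and read the loss figure iperf3 reports. If you prefer something self-contained, below is a minimal sketch of a sequence-numbered UDP loss probe in Python; the address, port, payload size, and packet count are assumptions, not values from your setup, so adjust them before running.

    import socket
    import struct
    import sys
    import time

    # Hypothetical values -- adjust to your own network.
    HOST = "192.168.1.20"   # receiver's IP address (assumed)
    PORT = 5000             # any free UDP port
    PAYLOAD = 1316          # 7 x 188-byte TS packets, a common UDP payload size
    COUNT = 50_000          # number of probe packets to send
    RATE_MBPS = 40          # roughly match the video bitrate

    def sender():
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        gap = (PAYLOAD * 8) / (RATE_MBPS * 1_000_000)  # seconds between packets
        for seq in range(COUNT):
            # 4-byte sequence number, padded up to PAYLOAD bytes
            sock.sendto(struct.pack("!I", seq).ljust(PAYLOAD, b"\0"), (HOST, PORT))
            time.sleep(gap)  # crude pacing; sleep granularity limits accuracy

    def receiver():
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", PORT))
        sock.settimeout(5.0)  # stop 5 s after the last packet arrives
        received, highest = 0, -1
        try:
            while True:
                data, _ = sock.recvfrom(65535)
                received += 1
                highest = max(highest, struct.unpack("!I", data[:4])[0])
        except socket.timeout:
            pass
        sent = highest + 1
        if sent > 0:
            loss = 100.0 * (sent - received) / sent
            print(f"received {received} of ~{sent} packets (~{loss:.2f}% loss)")
        else:
            print("no packets received")

    if __name__ == "__main__":
        receiver() if "recv" in sys.argv[1:] else sender()

Start the receiver side first (pass recv as the argument), then run the sender on the other machine. If this raw probe also shows heavy loss at 1000BaseT-FD but is clean at 100BaseTX-FD, the problem lies in the NIC, driver, or cabling rather than in GStreamer/FFmpeg.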

Barnabas Busa