
I have a Debian server that is used primarily as a very low-traffic web server. Recently it began having trouble with its network connection. I can use SFTP to upload a file to this server from an outside connection and the speed is just fine. However, if I am logged into the server via SSH and try to download a file using wget, curl, or even "apt-get update", the connection stalls out after it has downloaded 100-200 kilobytes of data. I can't even update the system at the moment because of this problem.

What baffles me, though, is that uploading files to it via FTP or SFTP from a completely separate outside connection works just fine, so I know the server's upstream and downstream bandwidth are fine. The server itself can also upload to outside sources at normal speeds. Does anyone have an idea of what's going on? If you need more information, just ask. I need to fix this. :)

Thanks!

norova

1 Answer


The reasons a file transfer can fail are legion. Why treat it like a black box when you don't have to? Fire up tcpdump, trigger the error, and take a look at what's actually happening. If it doesn't make sense to you, post a link where the good folks here at ServerFault can download the pcap file.
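For example, something along these lines would grab the full packets while you reproduce the stall; the interface name (eth0) and the port 80 filter are only assumptions, so adjust them to your setup:

    # Capture whole packets (-s 0) on the public interface and write them to a file
    tcpdump -i eth0 -s 0 -w stall.pcap port 80

    # In a second SSH session, reproduce the failure, then stop the capture with Ctrl-C
    wget -O /dev/null http://ftp.debian.org/debian/ls-lR.gz

The resulting stall.pcap is what's worth sharing, rather than tcpdump's text output.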

Also, have you verified that this is a general problem with downloads from multiple locations? Or just the apt mirror?
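If you haven't, try pulling a few hundred kilobytes from a couple of unrelated hosts; the URLs below are only examples, and any reachable sites will do:

    # Throw the data away; the only question is whether each transfer stalls at ~100-200 KB
    wget -O /dev/null http://ftp.debian.org/debian/ls-lR.gz
    curl -o /dev/null http://ftp.us.debian.org/debian/ls-lR.gz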

Insyte
  • I've uploaded a tcpdump log to this drop.io box: http://drop.io/k44hbis And yes, it's just a general download problem; changing the apt repository has no effect. – norova Aug 31 '09 at 18:33
  • I assume it's starting to die at around the 12:01:58.128819 mark. Unfortunately, it's really hard to work from tcpdump's text output. What we really need is the actual pcap file. For example, "tcpdump -w serverfault.cap port 80" would capture only WWW traffic and would write the full output to "serverfault.cap" in a format that can be imported into Wireshark or other tools (see the example below). – Insyte Sep 01 '09 at 00:38
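For reference, a capture taken along the lines of that comment (the filename and filter are just the examples given there) can be sanity-checked on the server before uploading it anywhere:

    # Write a full capture of web traffic, then read the first few packets back
    tcpdump -s 0 -w serverfault.cap port 80
    tcpdump -nn -r serverfault.cap | head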