
I have created an application that remote users access through RDP (Remote Desktop Protocol).

Application platform: Windows Forms, C# (.NET Framework 3.5)
Server: Windows Server 2008 R2

I am new to online servers, and I want to know: does RAM consumption affect network bandwidth usage? I am asking because when my application performs a heavy calculation that consumes a decent amount of RAM (with no UI changes), my network bandwidth consumption spikes at that moment, and once the calculation's data is removed from memory, network consumption goes back to normal.

Can anyone tell me whether there is any relation between RAM and network bandwidth usage?

Agent_Spock
  • Are you using any network filesystems? – David Schwartz Jun 23 '16 at 13:46
  • @DavidSchwartz Yes, I am sure, because it was showing my app using the maximum amount of network usage. – Agent_Spock Jun 24 '16 at 04:56
  • Something that is not obvious from your question is how you are measuring your network bandwidth usage, and the bandwidth of what? – Ian Murphy Jun 24 '16 at 10:12
  • @IanMurphy I am using Resource Monitor (accessible from Task Manager) to see the network bandwidth usage of every app. – Agent_Spock Jun 24 '16 at 14:02
  • @Agent_Spock If you are using network filesystems, then this makes perfect sense. When you use a lot of RAM, cache gets discarded to make room in RAM. Then the data has to be acquired from the filesystem, using network bandwidth. – David Schwartz Jun 24 '16 at 17:17
  • @DavidSchwartz As far as I know, when we use RDP all the processing happens on the server and only the screen image is transferred to the client, so why is network bandwidth affected by RAM? – Agent_Spock Jun 25 '16 at 00:08
  • @Agent_Spock Because when the server's RAM consumption increases, as much cache as needed is discarded to increase free RAM. When the data that was discarded is needed again, it has to be fetched from the network filesystem. (This assumes you are in fact using a network filesystem.) – David Schwartz Jun 25 '16 at 00:36
  • @DavidSchwartz The theory in your last comment does not seem to hold in this case: my single application was consuming 550 MB of RAM and kept using it until the relevant page was closed, yet during that whole time the network bandwidth remained high. Why is that? The system cannot be caching anything while the RAM is being heavily consumed, right? – Agent_Spock Jun 25 '16 at 05:45
  • @Agent_Spock No, wrong. That's when it will have to replace discarded cache items the most because it has the least working space. When RAM is being heavily consumed, both cache discard rates and replacement rates will be the highest. Some of the discarded data will be needed again while the RAM consumption is still ongoing. – David Schwartz Jun 25 '16 at 17:55
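If the cache theory from the comments above is what is happening here, the system file cache should visibly shrink while the heavy calculation runs. A minimal C# sketch to watch for that, assuming the standard English-locale performance counter names on Server 2008 R2 ("MyRdpApp" is a hypothetical process name; substitute your own):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class CacheWatch
    {
        static void Main()
        {
            // System file cache size vs. the app's working set, sampled once per second.
            // "Memory\Cache Bytes" and "Process\Working Set" are standard Windows counters;
            // "MyRdpApp" is a hypothetical process name - substitute your own.
            var cache = new PerformanceCounter("Memory", "Cache Bytes");
            var workingSet = new PerformanceCounter("Process", "Working Set", "MyRdpApp");

            for (int i = 0; i < 60; i++)
            {
                Console.WriteLine("cache = {0:N0} bytes, working set = {1:N0} bytes",
                    cache.NextValue(), workingSet.NextValue());
                Thread.Sleep(1000);
            }
        }
    }

If the cache counter drops as the working set climbs, and network traffic rises at the same time, that is consistent with cached file data being discarded and re-fetched over the network.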

2 Answers


Not directly, no. If you can get a packet capture of the network traffic during the calculation, you'll see what's using the bandwidth. Perhaps the server has to grab a bunch of data from a backend DB box to perform said calculation; that would be my guess.
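Short of a full packet capture, one way to see whether the application itself is driving the traffic is to sample its I/O rate next to the NIC's total throughput. A rough C# sketch, assuming the standard "Process" and "Network Interface" performance counters are available ("MyRdpApp" is again a hypothetical process name; note that "IO Data Bytes/sec" counts file, network, and device I/O together):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class IoWatch
    {
        static void Main()
        {
            // Pick the first NIC instance; instance names vary per machine.
            string nic = new PerformanceCounterCategory("Network Interface").GetInstanceNames()[0];

            // "MyRdpApp" is a hypothetical process name - substitute your own.
            var appIo = new PerformanceCounter("Process", "IO Data Bytes/sec", "MyRdpApp");
            var nicTotal = new PerformanceCounter("Network Interface", "Bytes Total/sec", nic);

            appIo.NextValue();    // the first NextValue() of a rate counter is always 0
            nicTotal.NextValue();

            for (int i = 0; i < 60; i++)
            {
                Thread.Sleep(1000);
                Console.WriteLine("app I/O = {0:N0} B/s, NIC total = {1:N0} B/s",
                    appIo.NextValue(), nicTotal.NextValue());
            }
        }
    }

If the app's I/O rate tracks the NIC spike, the bytes are coming from something the process reads or writes, such as a backend database or a network share.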

vigilem

My answer is that network bandwidth does get consumed when an application uses a lot of memory (even with no effect on the UI). This might not sound right to many IT professionals, but in my case, when I reduced the heavy memory consumption by automatically removing images from memory, the network bandwidth consumption dropped as well.

Statistics

  • Before removing the images from memory, network bandwidth consumption was 5.2 MB/s.
  • After removing the images (108 images) from memory, network bandwidth consumption was 720 KB/s.

Note: All of the images were held only in memory (stored in string format) and were never displayed on the UI.
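For illustration, the fix amounted to dropping the cached image strings so the garbage collector could reclaim the memory. A simplified sketch of that idea (the ImageCache class and its contents are hypothetical, not the actual application code):

    using System;
    using System.Collections.Generic;

    class ImageCache
    {
        // Hypothetical cache of images held as strings, as described above.
        static readonly Dictionary<string, string> Images = new Dictionary<string, string>();

        public static void Clear()
        {
            // Dropping the references lets the GC reclaim the memory, which in turn
            // leaves more RAM free for the system file cache.
            Images.Clear();
            GC.Collect();                    // optional: collect immediately instead of waiting
            GC.WaitForPendingFinalizers();
        }
    }

Clearing the references is what matters; the explicit GC.Collect() call just makes the memory come back immediately instead of at the next collection.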

Agent_Spock