I have absolutely zero experience with server-related problems, but one has arisen at work.
We have a server with 64 CPUs (I don't know the number of logical cores), 150GB+ of RAM, and 30TB of disk space. Five clients are attached to the server; each client has the application installed locally and references the server for the data used within the application. The application has a very useful feature in that it can either:
- Use available GPUs to perform calculations
- Use the CPU(s) to perform calculations
Each client has a single GPU and a single CPU (32 logical cores). If we push the work to the GPU, it runs but isn't timely; further, we can pretty much kiss doing any other work goodbye, since the GPU can't render the displays and run calculations at the same time. If we push the work to the CPU, we hit the same issue, just with less of a delay in rendering.
The question came up: why not run the application on the server to leverage its specs?
The server runs Red Hat Linux, and the application supports it. Obviously nobody who reads this will know the nitty-gritty details, but in theory:
Is it possible/feasible to install an application on a server and run it from clients?
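To make the question concrete, here's roughly the workflow I'm picturing, assuming the clients have SSH access to the server. The hostname, paths, and the application's command line ("appsolver") are all invented for illustration; I don't know yet whether the application actually exposes anything like this.

```python
# Rough sketch: a client triggers the heavy calculation on the server
# over SSH instead of running it on its own CPU/GPU, then copies the
# results back. All names below are hypothetical placeholders.
import subprocess

SERVER = "compute-server.example.local"                     # hypothetical RHEL server
REMOTE_CMD = "appsolver --input /data/job01 --threads 64"   # hypothetical app CLI

def run_on_server() -> None:
    # Run the calculation remotely; stdout/stderr stream back to the client.
    subprocess.run(["ssh", SERVER, REMOTE_CMD], check=True)

    # Pull the finished results back down to the client machine.
    subprocess.run(
        ["scp", f"{SERVER}:/data/job01/results.out", "./results.out"],
        check=True,
    )

if __name__ == "__main__":
    run_on_server()
```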
I'm not sure what general practice is here; just doing some research.