For the vast majority of server workloads, no - a graphics card is not used. This is true for web servers and NAS devices.
There are a few exceptions - some specific algorithms can be sped up massively with GPUs, and there are servers designed specifically for these workloads. Examples include some types of coin mining, password cracking, artificial intelligence, and some kinds of graphics rendering (google CUDA for a platform used to program GPUs). These workloads tend to be simple sets of instructions that can be massively parallelized, as the sketch below shows.
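To give a flavour of what "simple instructions, massively parallelized" means, here's a minimal CUDA sketch (a toy example, not any particular server workload) that adds two large arrays. Instead of one CPU core looping over a million elements, the GPU launches a million lightweight threads, each doing one trivial addition:

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread handles exactly one array element - thousands of
// these run concurrently, which is the "massively parallel" part.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) arrays.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Allocate device (GPU) arrays and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough blocks of 256 threads to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);          // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

The kernel body is just one addition - no branching to speak of, no dependencies between elements - which is exactly why it maps so well onto a GPU's thousands of simple cores. Typical server work (parsing requests, hitting databases, serving files) doesn't decompose like this, which is why most servers skip the GPU entirely.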