From Techopedia:
Hadoop Common refers to the collection of common utilities and libraries that support the other Hadoop modules. It is an essential module of the Apache Hadoop framework, along with the Hadoop Distributed File System (HDFS), Hadoop YARN, and Hadoop MapReduce.
Like all other modules, Hadoop Common assumes that hardware failures are common and should be handled automatically in software by the Hadoop framework.
Hadoop Common is also known as Hadoop Core.
The Hadoop client libraries help you load data into the cluster, submit MapReduce jobs describing how that data should be processed, and then retrieve or view the results once the job is finished. Have a look at this article.
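As a rough illustration of that workflow, here is a minimal WordCount-style driver written against the MapReduce client API. The local file name, HDFS paths, and job name are placeholders for this sketch, not anything prescribed by the article:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {

    // Mapper: emit (word, 1) for every token in each input line.
    public static class TokenizerMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reducer: sum the counts emitted for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Load data into the cluster: copy a local file into HDFS.
        // "input.txt" and the /user/demo/... paths are placeholders.
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path("input.txt"),
                             new Path("/user/demo/input/input.txt"));

        // Describe how that data should be processed.
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/user/demo/input"));
        FileOutputFormat.setOutputPath(job, new Path("/user/demo/output"));

        // Submit the job and wait for it to finish; the results can then be
        // read back from the output directory in HDFS (e.g. `hdfs dfs -cat`).
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```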
This Apache link provides the list of dependencies of the Hadoop client library.
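For example, in a Maven project the client library is usually pulled in with a single dependency; the version below is only a placeholder, so use the release that matches your cluster:

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <!-- placeholder version; pick the release matching your cluster -->
  <version>3.3.6</version>
</dependency>
```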