Shannon defined the (differential) entropy of a random variable X with density f as:

H(X) = -∫ f(x) · ln f(x) dx

Now apply this formula to the Student-t distribution, whose density already contains the degrees-of-freedom parameter v:

f(x) = Γ((v+1)/2) / (√(vπ) · Γ(v/2)) · (1 + x²/v)^(-(v+1)/2)

Carrying out the integration, the result involves both the Beta function and the digamma function. If you work through the calculation (honestly, I couldn't finish it by hand), you will see that these functions take v as an input purely as a consequence of the integration; v is not part of their definitions. The closed-form result is:

H(X) = ((1+v)/2) · [ψ((1+v)/2) - ψ(v/2)] + ln(√v · B(v/2, 1/2))
v ranges from 1 (the Cauchy distribution) to infinity (in the limit, the normal distribution).
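A quick sanity check of those two limits. This is my own sketch, not code from the article; I wrap the formula in a hypothetical helper `student_t_entropy` and use `scipy.special.betaln` instead of `beta` so the computation stays stable for very large v. The Cauchy entropy is known in closed form, ln(4π), and the standard normal entropy is ½·ln(2πe):

```python
import numpy as np
import scipy.special as sc

def student_t_entropy(v):
    """Differential entropy (in nats) of a Student-t with v degrees of freedom."""
    v1, v2 = (1 + v) / 2, v / 2
    # ln(sqrt(v) * B(v/2, 1/2)) computed as 0.5*ln(v) + betaln for stability
    return v1 * (sc.digamma(v1) - sc.digamma(v2)) + 0.5 * np.log(v) + sc.betaln(v2, 0.5)

# v = 1 is the Cauchy distribution, whose entropy is ln(4*pi)
print(student_t_entropy(1.0), np.log(4 * np.pi))            # both ~2.5310

# Very large v approaches the standard normal entropy 0.5*ln(2*pi*e)
print(student_t_entropy(1e6), 0.5 * np.log(2 * np.pi * np.e))  # both ~1.4189
```

The entropy shrinks as v grows, which matches the intuition that the heavy Cauchy tails carry more uncertainty than the thin normal tails.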
To simplify the calculations, I used the code below:
import numpy as np
import scipy.special as sc

# Degrees of freedom of the Student-t distribution
v = float(input('Degrees of freedom: '))
v1 = (1 + v) / 2
v2 = v / 2

# Closed-form differential entropy (in nats)
Entropy_of_Variable_X = v1*(sc.digamma(v1)-sc.digamma(v2))+np.log(np.sqrt(v)*sc.beta(v2,0.5))
print('Entropy of the variable X with', v, 'degrees of freedom:', Entropy_of_Variable_X)
To compute the entropy for multiple distributions at once, you can vectorize the formula over an array of v values instead of reading a single value from input.
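A minimal sketch of that vectorized version, assuming NumPy arrays as input (the helper name `student_t_entropies` is my own; `digamma`, `betaln`, `log`, and `sqrt` all broadcast elementwise, so no loop is needed):

```python
import numpy as np
import scipy.special as sc

def student_t_entropies(v):
    """Differential entropies (in nats) for an array of degrees of freedom."""
    v = np.asarray(v, dtype=float)
    v1, v2 = (1 + v) / 2, v / 2
    # Same closed form as the scalar version, applied elementwise
    return v1 * (sc.digamma(v1) - sc.digamma(v2)) + 0.5 * np.log(v) + sc.betaln(v2, 0.5)

dofs = np.array([1.0, 2.0, 5.0, 30.0])
for dof, h in zip(dofs, student_t_entropies(dofs)):
    print(f'v = {dof:>5.1f}: entropy = {h:.4f}')
```

The printed entropies decrease monotonically as v grows, from the Cauchy value ln(4π) ≈ 2.531 toward the normal limit ≈ 1.419.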