For dimensionality reduction:
In Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), we can visualize the data projected onto the new reduced dimensions by taking the dot product of the (centered) data with the eigenvectors.
How can this be done for Quadratic Discriminant Analysis (QDA)?
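For reference, this is the kind of projection I mean (a minimal sketch on the iris data; for PCA the rows of `components_` are the eigenvectors, so the dot product reproduces `transform`):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
pca = PCA(n_components=2).fit(X)

# Project by hand: center the data, then dot with the eigenvectors.
X_proj = (X - pca.mean_) @ pca.components_.T

# This matches pca.transform(X), which can then be scatter-plotted.
```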
I am using sklearn's QuadraticDiscriminantAnalysis ( https://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.QuadraticDiscriminantAnalysis.html ) and took the dot product of each class's samples with the scalings obtained for that class.
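Roughly, my attempt looks like the sketch below (note that in sklearn's QDA the fitted model exposes a separate basis per class: `rotations_[k]` holds the eigenvectors of class k's covariance and `scalings_[k]` the variances along them, so I projected each class with its own rotation matrix):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
qda = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X, y)

# Per-class "projection": center each class on its own mean, then dot
# with the eigenvectors of that class's covariance matrix.
projected = {}
for k, cls in enumerate(qda.classes_):
    Xk = X[y == cls] - qda.means_[k]          # center on class mean
    projected[cls] = Xk @ qda.rotations_[k]   # eigenvector basis of class k
```

Each class ends up in its own coordinate system, which is what makes me unsure this is a meaningful joint visualization.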
Is this correct? If not, then please suggest how to visualize the projected data for QDA.