I have deployed an ML model on IBM Cloud. When I call its API from my Flask app, only the `.predict()` step works, which is fine for the model itself. My problem is that the input data is text and has to be transformed with a `TfidfVectorizer` first. How can I do that through the API alone, without keeping a .pkl file locally? Alternatively, what can I change while training the model so that the deployed model accepts raw text directly? The data is fairly large.
Right now I have to transform the text to TF-IDF locally and then call the API of the deployed model, but this is not what I want.
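For context, this is roughly what my current (unwanted) workflow looks like. It is only a minimal sketch: the scoring URL, token handling, file names, and the exact payload shape are placeholders for my actual deployment, not the real values.

```python
import pickle
import requests

# Load the vectorizer that was fitted during training.
# This local .pkl file is exactly what I want to get rid of.
with open("tfidf_vectorizer.pkl", "rb") as f:
    vectorizer = pickle.load(f)

def predict(texts, scoring_url, token):
    # Transform raw text into the TF-IDF features the deployed model expects.
    # Densifying the sparse matrix gets expensive because the data is fairly large.
    features = vectorizer.transform(texts).toarray().tolist()

    # Send the numeric feature vectors to the deployed model's scoring endpoint.
    # The payload shape below is just an illustration of what I send now.
    payload = {"input_data": [{"values": features}]}
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    response = requests.post(scoring_url, json=payload, headers=headers)
    response.raise_for_status()
    return response.json()
```

I would like the deployed endpoint to take the raw text itself, so that neither the vectorizer .pkl nor the transformation step lives on my side.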