I would like to set up a prediction task, but the data preprocessing step requires tools outside of Python's data science ecosystem, though Python has APIs to work with those tools (e.g. a compiled Java NLP toolset). I first thought about creating a Docker container to have an environment with those tools available, but a commenter has said that this is not currently supported. Is there some other way to make such tools available to the Python prediction class needed for AI Platform? I don't have a clear sense of what's happening on the backend with AI Platform, or how much ability a user has to modify or set that up.
2 Answers
Not possible today. Is there a specific use case you are targeting that isn't satisfied today? Cloud AI Platform offers multiple prediction frameworks (TensorFlow, scikit-learn, XGBoost, PyTorch, custom prediction) in multiple versions.

gogasca
- Thanks for your response; I updated the question to clarify what I'm trying to do. – Chris Ivan Aug 26 '19 at 02:28
- Hi Chris, what would the list of requirements be, so we can better understand your use case? – gogasca Aug 26 '19 at 04:09
- We would like to use fastText for NLP, and we require a good Japanese parser, which in our case most likely means MeCab. MeCab has a Python library, but it requires a dictionary that must be built with a C++ compiler. fastText, I believe, can also be built into a Python package. If a dependency is pip-installable/buildable, is it enough just to list it in the setup.py file? – Chris Ivan Aug 26 '19 at 06:55
- Correct, if it's pip-installable or pip-buildable, we can add it to the preprocessing step. – gogasca Dec 11 '19 at 19:15
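For reference, declaring pip-installable dependencies in the custom package's setup.py might look like the sketch below. The package name and version are illustrative; `mecab-python3` and `fasttext` are the PyPI distributions of the libraries mentioned in the comments, assuming those are the ones intended.

```python
from setuptools import setup, find_packages

# Illustrative setup.py for a custom prediction package.
# Anything listed in install_requires is pip-installed in the
# serving environment when the version is deployed.
setup(
    name='my_predictor',   # hypothetical package name
    version='0.1',
    packages=find_packages(),
    install_requires=[
        'mecab-python3',   # Python bindings for the MeCab parser
        'fasttext',        # fastText Python package
    ],
)
```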
After looking into the requirements, you can use the new AI Platform feature, custom prediction routines: https://cloud.google.com/ml-engine/docs/tensorflow/custom-prediction-routine-keras
To deploy a custom prediction routine to serve predictions from your trained model, do the following:
- Create a custom predictor to handle requests
- Package your predictor and your preprocessing module; this is where you include your custom libraries.
- Upload your model artifacts and your custom code to Cloud Storage
- Deploy your custom prediction routine to AI Platform

gogasca
- Thanks for looking into this. I had already encountered that page, though, and I don't really understand how I would package a compiled C library. First I would need to build it locally, I gather, but for the parser to have access to that library it needs to know where to look for it, which requires setting up things like environment variables, and there are a host of files in the library that get built into the environment. So I don't think it's that simple, though I have to admit I know very little about actual software development. – Chris Ivan Sep 02 '19 at 00:58
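For the packaging and deployment steps in the answer above, the commands typically look like the sketch below. The bucket, model, package, and class names are all placeholders, and the runtime/Python versions are examples, not prescriptions.

```shell
# Build a source distribution of the predictor package
# (assumes a setup.py in the current directory).
python setup.py sdist --formats=gztar

# Upload the package and the model artifacts to Cloud Storage
# (bucket and file names are illustrative).
gsutil cp dist/my_predictor-0.1.tar.gz gs://your-bucket/predictor/
gsutil cp model.pkl tokenizer.pkl gs://your-bucket/model/

# Create a model version that uses the custom prediction routine.
gcloud beta ai-platform versions create v1 \
  --model my_model \
  --origin gs://your-bucket/model/ \
  --package-uris gs://your-bucket/predictor/my_predictor-0.1.tar.gz \
  --prediction-class predictor.JapaneseTextPredictor \
  --runtime-version 1.14 \
  --python-version 3.5
```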