I'm building a Flask test predictor using AllenNLP.

I'm passing 'passage' and 'question' from a .json file to the predictor.

However, when I pass the json file using curl, it doesn't return a response. Is there a special return in Flask to get it?

Code looks like:

from allennlp.predictors.predictor import Predictor as AllenNLPPredictor

from flask import Flask
from flask import request

app = Flask(__name__)

@app.route("/", methods=['GET','POST'])
def hello():
    return "<h1>Test app!</h1>"


class PythonPredictor:
    def __init__(self, config):
        self.predictor = AllenNLPPredictor.from_path(
            "https://storage.googleapis.com/allennlp-public-models/bidaf-elmo-model-2018.11.30-charpad.tar.gz"
        )

    def predict(self, payload):
        if request.method == "POST":
            prediction = self.predictor.predict(
                passage=payload["passage"], question=payload["question"]
            )
            return prediction["best_span_str"]

Curl command looks like: curl http://127.0.0.1:5000 -X POST -H "Content-Type: application/json" -d @sample.json
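For reference, a sample.json carrying the two fields the predictor reads might look like this (the passage and question values are placeholders, not from my actual file):

```json
{
  "passage": "Flask is a lightweight web framework for Python.",
  "question": "What is Flask?"
}
```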

Doug
  • Looks like an incorrect URL. Can you try `http://127.0.0.1:5000/predict` – v25 Apr 01 '20 at 23:21
  • Thanks for that catch. That was just a typo in my curl as I was trying a different route. I've edited it above now. If you look at the route now, everything is under "/". – Doug Apr 01 '20 at 23:27

1 Answer

Unless I've misunderstood (I'm guessing you're asking how to obtain the JSON submission in your route and return the result), it sounds like you need something like:

p = PythonPredictor(config=None)  # __init__ takes a config argument (currently unused), so pass a placeholder

@app.route("/", methods=['POST'])
def hello():
    data = request.get_json()
    result = p.predict(data)
    return result

This effectively runs the data in your sample.json through your PythonPredictor.predict method, and returns that prediction to the client.

Notice this code creates the instance p outside the route function, so the NLP model is loaded only once, when your Flask app starts, rather than on every request. However, it looks like this may re-download the archive each time the app starts, unless AllenNLPPredictor.from_path does some caching, so it would probably be advisable to download that file to your own storage first and load it from there in the PythonPredictor.__init__ function.
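One way to do that, as a sketch: cache the archive on local disk the first time, then pass the local path to from_path (which accepts a local archive path as well as a URL). The ensure_local_copy helper and the models/bidaf-elmo.tar.gz path here are just illustrative names, not part of AllenNLP:

```python
import os
import urllib.request

def ensure_local_copy(url, path):
    """Download url to path once; later runs reuse the cached file."""
    if not os.path.exists(path):
        os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
        urllib.request.urlretrieve(url, path)
    return path

# Then in PythonPredictor.__init__, something like:
#   model_path = ensure_local_copy(
#       "https://storage.googleapis.com/allennlp-public-models/bidaf-elmo-model-2018.11.30-charpad.tar.gz",
#       "models/bidaf-elmo.tar.gz",
#   )
#   self.predictor = AllenNLPPredictor.from_path(model_path)
```

That way the ~600 MB-class download only happens on the very first start, and you control where the file lives.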

Let me know if any of this needs clarification, or if I've misunderstood.

v25
  • Thanks so much @v25. That was precisely it. I think overall I was missing a concept here that your code snippet provided. To load the model in the PythonPredictor init do you have a link that I can study how to load it from my own server (azure hosted)? – Doug Apr 02 '20 at 03:13
  • Great answer, thank you so much for sharing your views. It's gonna help me alot as well – Aar Man Jan 28 '21 at 19:26