I'm trying to load the decomposable attention model described on the demo website as "The decomposable attention model (Parikh et al, 2017) combined with ELMo embeddings trained on SNLI", and I used the code the demo website suggests:
from allennlp.predictors.predictor import Predictor
import allennlp_models.pair_classification  # registers the "textual_entailment" predictor (separate allennlp-models package)

predictor = Predictor.from_path("https://storage.googleapis.com/allennlp-public-models/decomposable-attention-elmo-2020.04.09.tar.gz", "textual_entailment")
predictor.predict(
    hypothesis="Two women are sitting on a blanket near some rocks talking about politics.",
    premise="Two women are wandering along the shore drinking iced tea."
)
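To compare with the demo's numbers I capture the output and print the probabilities; I'm assuming the returned dict exposes them under "label_probs" (alongside "label_logits"), as the SNLI entailment models do:

result = predictor.predict(
    hypothesis="Two women are sitting on a blanket near some rocks talking about politics.",
    premise="Two women are wandering along the shore drinking iced tea."
)
# "label_probs" is an assumed key: the entailment / contradiction / neutral probabilities shown on the demo page
print(result["label_probs"])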
I see this warning in the log:
Did not use initialization regex that was passed: .*token_embedder_tokens\._projection.*weight
and the prediction is also different from what the demo website returns (which is what I'm trying to reproduce). Did I miss anything here?
Also, I tried the two other versions of the pretrained model, decomposable-attention-elmo-2018.02.19.tar.gz and decomposable-attention-elmo-2020.02.10.tar.gz.
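I load them the same way as above, roughly like this (I'm not certain of the exact download URLs for the older archives; I'm assuming the same allennlp-public-models prefix as above):

# Assumed URLs: same bucket/prefix as the 2020.04.09 archive, only the filename changed
for url in [
    "https://storage.googleapis.com/allennlp-public-models/decomposable-attention-elmo-2018.02.19.tar.gz",
    "https://storage.googleapis.com/allennlp-public-models/decomposable-attention-elmo-2020.02.10.tar.gz",
]:
    predictor = Predictor.from_path(url, "textual_entailment")  # this call raises the error below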
Neither of them works and I got this error:
ConfigurationError: key "token_embedders" is required at location "model.text_field_embedder."
What do I need to do to get exactly the same output as the demo website shows?