Questions tagged [elmo]

A tag for ELMo, a method for producing deep contextualized word representations. It was developed at the Allen Institute for AI and is distributed as part of the AllenNLP library.

ELMo is a deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). These word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus. They can be easily added to existing models and significantly improve the state of the art across a broad range of challenging NLP problems, including question answering, textual entailment and sentiment analysis.
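The phrase "learned functions of the internal states" can be made concrete: for each token, ELMo takes a softmax-weighted sum of the biLM's per-layer hidden states, scaled by a task-specific scalar γ. A minimal plain-Python sketch of that mixing step (the layer vectors, weights, and γ below are illustrative toy values, not real model outputs):

```python
import math

def softmax(xs):
    # Normalize raw scalars into weights that sum to 1.
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def elmo_vector(layer_states, raw_weights, gamma):
    """Collapse the biLM's per-layer hidden states for one token into a
    single ELMo vector: gamma * sum_j s_j * h_j, where s = softmax(raw_weights)."""
    s = softmax(raw_weights)
    dim = len(layer_states[0])
    out = [0.0] * dim
    for s_j, h_j in zip(s, layer_states):
        for i in range(dim):
            out[i] += s_j * h_j[i]
    return [gamma * v for v in out]

# Toy example: 3 biLM layers, 4-dimensional hidden state for one token.
layers = [[1.0, 0.0, 0.0, 0.0],
          [0.0, 1.0, 0.0, 0.0],
          [0.0, 0.0, 1.0, 0.0]]
# Equal raw weights give each layer 1/3, then gamma scales the sum by 2.
vec = elmo_vector(layers, raw_weights=[0.0, 0.0, 0.0], gamma=2.0)
```

In a downstream model the raw weights and γ are trained jointly with the task; here they are fixed only to show the arithmetic.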

75 questions
0
votes
0 answers

Can't install allennlp 0.5.0 in Colab

I want to use ElmoEmbedder from ELMo. Requirements: Python 3.6 (lower versions of Python do not work), AllenNLP 0.5.1 (to compute the ELMo representations), Keras 2.2.0 (for the BiLSTM-CNN-CRF architecture). When I install allennlp 0.5.0, I…
0
votes
1 answer

NLP ELMo model pruning input

I am trying to retrieve embeddings for words based on the pretrained ELMo model available on TensorFlow Hub. The code I am using is modified from here:…
0
votes
1 answer

Why am I getting `Highway.forward: input must be present` when running `from elmoformanylangs import Embedder`?

I am trying to use ELMoForManyLangs programmatically, via its Embedder Python object: from elmoformanylangs import Embedder; e = Embedder('/content/drive/MyDrive/ColabNotebooks/158', batch_size=64). When I run this, I get the following…
0
votes
3 answers

`Highway.forward: input must be present` in ELMo embedding?

I use ELMo embeddings for my NLP task. The pretrained model is for the Indonesian language, from this git repository. Importing the library with from elmoformanylangs import Embedder causes the following error: TypeError: Highway.forward: input must be…
Sarada
0
votes
1 answer

Iterating through multiple files with BERT for QA returns nothing

I am trying to ease my job. I need to do some analysis on the answers BERT gives me for thousands of files. My main objective is to iterate through every file and ask a question. I have been trying to automate it with the following code: import…
DarknessPlusPlus
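The file-iteration part of this question can be separated from the model call itself. Below is a minimal sketch: `answer_question` is a hypothetical stand-in for a real BERT question-answering call (e.g. a transformers pipeline), so only the loop-and-collect logic is actually shown:

```python
from pathlib import Path

def answer_question(question, context):
    # Hypothetical stand-in for a real QA model call; for illustration
    # it simply returns the first line of the context.
    return context.splitlines()[0] if context else ""

def ask_all(folder, question):
    """Ask the same question against every .txt file in a folder,
    collecting {filename: answer}. Sorted for deterministic order."""
    results = {}
    for path in sorted(Path(folder).glob("*.txt")):
        context = path.read_text(encoding="utf-8")
        results[path.name] = answer_question(question, context)
    return results
```

Keeping the loop separate from the model call makes it easy to spot whether "returns nothing" comes from the iteration (no files matched, wrong glob pattern) or from the model itself.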
0
votes
0 answers

Is it OK to combine domain-specific word2vec embeddings and off-the-shelf ELMo embeddings for a downstream unsupervised task?

I am wondering if I am using word embeddings correctly. I have combined contextualised word vectors with static word vectors because: my domain corpus is too small to effectively train a model from scratch; my domain is too specialised to use…
s2134
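Mechanically, the usual way to combine a static (word2vec-style) vector with a contextual (ELMo-style) vector is per-token concatenation, so the downstream task sees one fused vector per token. A plain-Python sketch with toy dimensions (real word2vec/ELMo vectors would be 100–1024 dimensional):

```python
def combine_embeddings(static_vecs, contextual_vecs):
    """Concatenate a static vector with a contextual vector for each
    token, giving one fused vector per token. Both inputs are lists of
    per-token vectors aligned by position; a length mismatch means the
    two tokenizations disagree and should be fixed upstream."""
    if len(static_vecs) != len(contextual_vecs):
        raise ValueError("token sequences are not aligned")
    return [s + c for s, c in zip(static_vecs, contextual_vecs)]

# Toy example: 2 tokens, static dim 2 + contextual dim 3 -> fused dim 5.
static = [[0.1, 0.2], [0.3, 0.4]]
contextual = [[1.0, 1.1, 1.2], [2.0, 2.1, 2.2]]
fused = combine_embeddings(static, contextual)
```

Concatenation keeps both signals intact and lets the downstream model weight them; averaging or summing would require the two spaces to share a dimension and a scale, which they generally do not.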
0
votes
1 answer

Which word embedding does TF-Hub ELMo concatenate with character embeddings in the Highway layer?

I understand that ELMo uses a CNN over characters for character embeddings. However, I do not understand how the character embeddings are concatenated with word embeddings in the Highway network. In the ELMo paper, most of the evaluations use GloVe for…
phoenix
0
votes
1 answer

Confidence score of answer extracted using ELMo BiDAF model and AllenNLP

I'm working on a deep learning project where I use a bidirectional attention flow model (an AllenNLP pretrained model) to build a question answering system. It uses the SQuAD dataset. The BiDAF model extracts the answer span from the paragraph. Is there any way to…
Linu Bajy
0
votes
0 answers

Training ELMo on TPU for generating embeddings from a custom dataset

Is it possible to train an ELMo model from scratch on TPU instead of GPU? I want to generate Turkish ELMo embeddings from a large custom corpus.
hazal
0
votes
1 answer

ValueError: Error when checking input: expected input_1 to have shape (50,) but got array with shape (1,) with ELMo embeddings and LSTM

I'm trying to reproduce the example at this link: https://www.depends-on-the-definition.com/named-entity-recognition-with-residual-lstm-and-elmo/ In a few words, I'm trying to use the ELMo embeddings for the sequence tagging task. I'm following this…
Paolopast
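The shape error in this title typically means the model's input layer expects every sequence padded or truncated to a fixed length (50 here), but received variable-length input. The padding step itself is independent of ELMo and Keras; a minimal sketch in plain Python (the `__PAD__` token name is illustrative, not from the tutorial):

```python
def pad_tokens(tokens, maxlen=50, pad_token="__PAD__"):
    """Truncate or right-pad a token list to exactly `maxlen` entries,
    matching a model whose input layer expects shape (maxlen,)."""
    padded = tokens[:maxlen]          # truncate if too long
    padded += [pad_token] * (maxlen - len(padded))  # pad if too short
    return padded

sentence = ["John", "lives", "in", "London"]
padded = pad_tokens(sentence)  # always exactly 50 tokens long
```

Feeding one raw sentence without this step produces an array of shape (1,) or (n,) instead of (50,), which is exactly the mismatch the ValueError reports.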
0
votes
0 answers

Training loss and validation loss not decreasing while using ELMo embeddings with Keras

I am building an LSTM network using ELMo embeddings with Keras. My objective is to minimize the RMSE. The ELMo embeddings are obtained using the following code segment: def ElmoEmbedding(x): return elmo_model(inputs={ …
0
votes
1 answer

Reference text for pre-training with ELMo/BERT

How-to issue: spaCy mentions that ELMo/BERT are very effective in NLP tasks when you have little data, as these two have very good transfer-learning properties. My question: transfer learning relative to what model? If you have a language model for…
user9165100
0
votes
1 answer

ModuleNotFoundError: No module named 'elmoformanylangs' when I installed ELMo in Colab

I followed these steps to install ELMoForManyLangs: ! git clone https://github.com/HIT-SCIR/ELMoForManyLangs.git ; cd ELMoForManyLangs/ ; ! python setup.py install ; then from elmoformanylangs import Embedder raises ModuleNotFoundError …
Sougen Bai
0
votes
1 answer

pyspark pandas object as dataframe - TypeError

Edit: RESOLVED. I think the problem was with the multi-dimensional arrays generated by ELMo inference. I averaged all the vectors and then used the final averaged vector for all words in the sentence as output, and it now works for converting to a…
androboy
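The fix described in that edit (averaging all per-token vectors into one fixed-length sentence vector, so each DataFrame row holds a flat list instead of a ragged 2-D array) can be sketched without the ELMo inference itself. A plain-Python mean pool over aligned vectors:

```python
def mean_pool(token_vectors):
    """Average a list of equal-length token vectors into a single
    sentence vector. The result has the same dimensionality regardless
    of sentence length, which is what makes it DataFrame-friendly."""
    if not token_vectors:
        raise ValueError("no token vectors to pool")
    dim = len(token_vectors[0])
    n = len(token_vectors)
    return [sum(vec[i] for vec in token_vectors) / n for i in range(dim)]

# Toy example: 2 tokens with 3-dimensional vectors.
sent_vec = mean_pool([[1.0, 2.0, 3.0], [3.0, 4.0, 5.0]])
# sent_vec -> [2.0, 3.0, 4.0]
```

Mean pooling loses word order and per-token detail, but it is a common baseline when a downstream tool (here, a Spark DataFrame column) needs one fixed-length vector per sentence.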
-2
votes
1 answer

Fine-tune ELMo for the Russian language

How can I fine-tune the ELMo model for the Russian language? If I want to do this with AllenNLP, I need an options.json file, but I don't know where to get it.