
I am trying to use the French tokenizer and lemmatizer (LefffLemmatizer) for spaCy. I am using Python 3.6 in a Conda environment.

However, when I try to install the French model with `python -m spacy download fr`, I get the error `ModuleNotFoundError: No module named 'cymem'`. So I checked via `pip list` (to see if I had it), and I do. I also ran `conda update --all` just to be sure the issue was not coming from my setup.

I thought I could work around this by loading the model from spaCy directly. This is my current attempt:

import spacy
from spacy import fr
from spacy.fr import French

nlp = French()
tokenizer = nlp.Defaults.create_tokenizer(nlp)

from spacy_lefff import LefffLemmatizer, POSTagger

pos = POSTagger()
french_lemmatizer = LefffLemmatizer(after_melt=True, default=True)
nlp.add_pipe(pos, name='pos', after='parser')
nlp.add_pipe(french_lemmatizer, name='lefff', after='pos')
doc = nlp("Apple cherche a acheter une startup anglaise pour 1 milliard de dollard")
for d in doc:
    print(d.text, d.pos_, d._.melt_tagger, d._.lefff_lemma, d.tag_, d.lemma_)

This is not working either; I get the following error:

AttributeError                            Traceback (most recent call last)
<ipython-input-43-6802b928c839> in <module>()
----> 3 pos = POSTagger()
...
--> 109         if not tk.get_extension(self.name):
...
AttributeError: type object 'spacy.tokens.token.Token' has no attribute 'get_extension'

How can I fix this?

Thanks to anyone who can help me.

Bianca

  • I think this isn't the correct way to load the French nlp model. Could you try this one: `nlp = spacy.load('fr')`? – Anwarvic May 15 '20 at 20:06
  • I tried it... and it still does not work. Now, when I try to import spacy, I get `----> 3 import spacy ---> 12 from . import pipeline ... ----> 4 from .pipes import Tagger, DependencyParser, EntityRecognizer, EntityLinker ModuleNotFoundError: No module named 'cymem'` – Bianca Martin May 15 '20 at 20:14
  • Try uninstalling spacy with `pip uninstall spacy`. Then try installing it via conda by running `conda install -c conda-forge spacy` – Anwarvic May 15 '20 at 20:22
  • Thanks! (You are my hero.) I am now able to run spaCy. However, I still get an error when I try to import spacy_lefff: `from spacy_lefff import LefffLemmatizer, POSTagger ----> 1 from .lefff import LefffLemmatizer ----> 7 from spacy.tokens import Token ImportError: cannot import name 'Token'` – Bianca Martin May 15 '20 at 20:34
  • what is the version of `spacy_lefff` that you're using? – Anwarvic May 15 '20 at 21:06
  • I am using version 0.3.6 – Bianca Martin May 15 '20 at 21:11
  • It doesn't show that error on my machine... it shows another error, though :) – Anwarvic May 15 '20 at 21:34
  • What can I do from there? I am about to call it quits. I have been trying to tag and lemmatize a corpus for a week now – Bianca Martin May 15 '20 at 21:45
  • Have you followed [this answer](https://stackoverflow.com/a/44478073/5612363)? – Anwarvic May 16 '20 at 13:55
  • I ended up completely uninstalling Anaconda and reinstalling it, and that fixed the issue. Something probably went wrong when I updated Conda. – Bianca Martin May 17 '20 at 13:27
  • Nice to hear that everything worked out eventually – Anwarvic May 17 '20 at 13:28
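
For anyone landing on this question later, here is the environment-repair sequence suggested in the comment thread, collected in one place. This is a sketch of the commenters' suggestions, not a guaranteed fix; adjust package names and channels for your own setup.

```shell
# Remove the pip-installed spacy that is missing its 'cymem' dependency
pip uninstall spacy

# Reinstall spacy through conda so its compiled dependencies
# (cymem, preshed, etc.) come from the same channel
conda install -c conda-forge spacy

# Then fetch the French model again
python -m spacy download fr
```

If errors persist after this (as they did for the asker), a full uninstall and reinstall of Anaconda resolved the issue in this case.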
