I want to obtain BERT word embeddings to use in a downstream task later. I have a corpus for my custom dataset and want to further pre-train the pre-trained Hugging Face BERT base model on it; I believe this is called post-training (or continued pre-training). How can I do this with Hugging Face Transformers? Can I use transformers.BertForMaskedLM?
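A minimal sketch of what continued MLM pre-training with `transformers.BertForMaskedLM` and the `Trainer` API could look like. The corpus path `corpus.txt` (one text per line), the output directory, and all hyperparameters are placeholder assumptions, not fixed values.

```python
# Sketch of continued masked-language-model (MLM) pre-training of
# bert-base-uncased on a custom corpus. All paths and hyperparameters
# below are illustrative assumptions.
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    LineByLineTextDataset,
    Trainer,
    TrainingArguments,
)


def build_mlm_trainer(corpus_path: str, output_dir: str = "bert-post-trained") -> Trainer:
    """Build a Trainer that continues BERT pre-training with the MLM objective."""
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForMaskedLM.from_pretrained("bert-base-uncased")

    # One training example per line of the corpus file, truncated to 128 tokens.
    dataset = LineByLineTextDataset(
        tokenizer=tokenizer, file_path=corpus_path, block_size=128
    )

    # Randomly masks 15% of input tokens, matching BERT's original MLM setup.
    collator = DataCollatorForLanguageModeling(
        tokenizer=tokenizer, mlm=True, mlm_probability=0.15
    )

    args = TrainingArguments(
        output_dir=output_dir,
        num_train_epochs=3,
        per_device_train_batch_size=16,
        save_steps=10_000,
    )
    return Trainer(
        model=model, args=args, data_collator=collator, train_dataset=dataset
    )


# Usage (downloads the checkpoint and trains, so not run here):
# trainer = build_mlm_trainer("corpus.txt")
# trainer.train()
# trainer.save_model("bert-post-trained")
```

The saved directory can then be loaded with `from_pretrained` like any other checkpoint.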
You might find this useful: https://huggingface.co/course/chapter3/3?fw=pt – Bill Dec 02 '21 at 16:49
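For the downstream goal of getting word embeddings, here is a hedged sketch of extracting contextual token vectors from a post-trained encoder. `bert-post-trained` is a placeholder directory name; any saved or hub checkpoint works in its place.

```python
# Sketch of extracting contextual word embeddings from a (post-trained)
# BERT encoder; "bert-post-trained" is a placeholder path assumption.
import torch
from transformers import BertModel, BertTokenizerFast


def word_embeddings(text: str, model_dir: str = "bert-post-trained") -> torch.Tensor:
    """Return one hidden vector per subword token: shape (seq_len, hidden_size)."""
    tokenizer = BertTokenizerFast.from_pretrained(model_dir)
    model = BertModel.from_pretrained(model_dir)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # last_hidden_state holds the contextual embedding of every subword.
    return outputs.last_hidden_state.squeeze(0)


# Usage (requires the model files, so not run here):
# vecs = word_embeddings("post-training improves domain fit")
```

Note these are subword-level vectors; word-level embeddings would require pooling the subword pieces of each word.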