Questions tagged [roberta]

Roberta is a graphical, open-source IDE designed for multiple robot systems, such as the Calliope mini, LEGO Mindstorms, or the NAO. Its main audience is children taking their first steps in programming.

37 questions
1
vote
1 answer

NER classification DeBERTa tokenizer error: You need to instantiate DebertaTokenizerFast

I'm trying to perform an NER classification task using DeBERTa, but I'm stuck on a tokenizer error. This is my code (my input sentence must be split word by word on ","): from transformers import AutoTokenizer tokenizer =…
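This error typically appears when pre-split words are fed to a DeBERTa tokenizer that was not created with `add_prefix_space=True`. A minimal sketch of the idea (the checkpoint name and the commented tokenizer call are assumptions, not from the question):

```python
# The sentence arrives as one comma-separated string; split it into words first.
sentence = "John,lives,in,New,York"
words = sentence.split(",")

# For pre-tokenized input, the fast DeBERTa tokenizer must be instantiated with
# add_prefix_space=True (checkpoint name assumed here):
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(
#     "microsoft/deberta-base", add_prefix_space=True)
# encoding = tokenizer(words, is_split_into_words=True)

print(words)  # -> ['John', 'lives', 'in', 'New', 'York']
```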
1
vote
1 answer

Questions when training language models from scratch with Huggingface

I'm following the guide here (https://github.com/huggingface/blog/blob/master/how-to-train.md, https://huggingface.co/blog/how-to-train) to train a RoBERTa-like model from scratch (with my own tokenizer and dataset). However, when I run run_mlm.py…
1
vote
1 answer

pretrained roberta relation extraction attribute error

I am trying to get the following pretrained huggingface model to work: https://huggingface.co/mmoradi/Robust-Biomed-RoBERTa-RelationClassification I use the following code: from transformers import AutoTokenizer, AutoModel tokenizer =…
Tomaž Bratanič
  • 6,319
  • 2
  • 18
  • 31
1
vote
0 answers

PyTorch RoBERTa kernel died immediately when running "out = model(inputs)"

I have a text dataset, on which I trained a tokenizer, called "bert_tokenizer". Then I try to give it a new word and get the word embedding out. from transformers import RobertaConfig config = RobertaConfig( vocab_enter code…
1
vote
0 answers

Error received after loading RoBERTa and XLM-RoBERTa models from Hugging Face

I am enjoying experimenting with different transformers from the excellent Hugging Face library. However, I receive the following error message when I attempt to use any kind of 'roberta'/'xlm' transformers. My Python code seems to work just fine…
1
vote
0 answers

How to obtain a list in a list (aka matrix)?

How can I make an n-dimensional data structure within lab.open-roberta.org using the Calliope as the system? There seems to be no way to declare a variable as a list within a list. As a workaround one could think of making a list of String and then read the…
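When an environment offers only flat lists, the usual workaround is to simulate an n×m matrix with a single list plus index arithmetic (`row * cols + col`). A minimal Python sketch of the idea — the same arithmetic could be built from NEPO's list blocks (the function names here are illustrative, not part of Open Roberta):

```python
# Simulate a 2D matrix with one flat list and index arithmetic.
ROWS, COLS = 3, 4

def make_matrix(rows, cols, fill=0):
    # A flat list of rows*cols cells stands in for the nested structure.
    return [fill] * (rows * cols)

def get(matrix, row, col, cols=COLS):
    # Cell (row, col) lives at flat index row*cols + col.
    return matrix[row * cols + col]

def set_cell(matrix, row, col, value, cols=COLS):
    matrix[row * cols + col] = value

m = make_matrix(ROWS, COLS)
set_cell(m, 1, 2, 42)
print(get(m, 1, 2))  # -> 42
```

The same trick generalizes to n dimensions by multiplying out the strides (e.g. `i*rows*cols + j*cols + k` for a 3D structure).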
dr0i
  • 2,380
  • 2
  • 19
  • 36
1
vote
1 answer

Bug in the if statement of the Roberta language when programming a Calliope mini?

I am trying to program on the newly released Calliope mini computer platform (https://calliope.cc/) using one of the offered editors Roberta, a graphical interface (https://lab.open-roberta.org/). A simple program which checks whether a key is…
tfv
  • 6,016
  • 4
  • 36
  • 67
0
votes
0 answers

RoBERTa CSV sentiment analysis

I am analyzing a CSV of Amazon reviews using RoBERTa and I keep receiving an exception. From what I can tell the reviews are simply too large for the model to analyze. The exception thrown is "The expanded size of the tensor (546) must match the…
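This exception usually means a review tokenizes to more tokens (546 here) than the model's maximum sequence length (512 for most RoBERTa checkpoints). The simplest fix is to tokenize with `truncation=True, max_length=512`; if the tail of the review matters, the sequence can instead be split into overlapping windows and scored per window. A minimal, model-free sketch of the windowing idea (parameter values are assumptions):

```python
def chunk_tokens(token_ids, max_len=512, stride=50):
    """Split a long token-id sequence into windows of at most max_len tokens,
    with `stride` tokens of overlap between consecutive windows."""
    chunks = []
    step = max_len - stride  # advance by this much per window
    for start in range(0, len(token_ids), step):
        chunks.append(token_ids[start:start + max_len])
        if start + max_len >= len(token_ids):
            break  # the last window already reaches the end
    return chunks

# A 546-token review becomes two windows that together cover every token.
windows = chunk_tokens(list(range(546)))
print([len(w) for w in windows])  # -> [512, 84]
```

Each window can then be fed to the model separately and the per-window scores averaged (or max-pooled) into one review-level score.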
0
votes
0 answers

Need help running SHAP Documentation code

I wanted to understand the logic behind how SHAP works by using custom functions and tokenizers. I tried running it and this part kept giving me errors: method = "custom tokenizer" # build an explainer by passing a transformers tokenizer if method…
abdz_128
  • 31
  • 8
0
votes
0 answers

Open Roberta - what is the meaning of the "drive" block without specifying the distance?

I made a simple program (image) using Open Roberta EV3 with only 2 instructions (start and drive forwards). Question: is the drive forwards block without any specified distance equivalent to driving forwards for a distance of 1 cm? The test I made…
Victorqedu
  • 484
  • 4
  • 20
0
votes
1 answer

Token indices sequence length warning while using pretrained Roberta model for sentiment analysis

I am presently using a pretrained RoBERTa model to identify the sentiment scores and categories for my dataset. I am truncating the length to 512 but I still get the warning. What is going wrong here? I am using the following code to achieve…
0
votes
0 answers

SequenceClassifierOutput has generator as loss instead of a tensor

I'm doing Distillation from a Roberta with an Adapter, I'm following this tutorial and in the function distill_roberta_weights() I just change teacher_model.config.to_dict() to student.load_state_dict(teacher.state_dict(), strict=False), so the…
0
votes
0 answers

TypeError: 'float' object is not iterable on RoBERTa tokenization

I tried to tokenize my sentences to generate the input for my RoBERTa model, but when I try to tokenize the sentences, which are in the form of a np array, it won't tokenize them, resulting in the following error. TypeError: 'float' object is not…
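This `TypeError` commonly surfaces when a text column contains missing values: empty CSV cells are read in as `float('nan')`, and the tokenizer then receives a float instead of a string. A small sketch of cleaning the array before tokenizing (the variable names are illustrative):

```python
import math

def clean_sentences(raw):
    """Drop NaN/None entries and coerce everything else to str,
    so every item handed to the tokenizer is a real string."""
    cleaned = []
    for s in raw:
        if s is None:
            continue
        # Empty CSV cells typically arrive as float('nan').
        if isinstance(s, float) and math.isnan(s):
            continue
        cleaned.append(str(s))
    return cleaned

sentences = ["good movie", float("nan"), None, "terrible plot"]
print(clean_sentences(sentences))  # -> ['good movie', 'terrible plot']
```

If the rows must stay aligned with other columns (labels, IDs), replace the `continue` with `cleaned.append("")` instead of dropping the row.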
Rikhu
  • 1
0
votes
0 answers

Using roberta for sentiment analysis on multiple tweets from csv file

Good evening: I am using Python 3 and a Jupyter notebook to conduct sentiment analysis using RoBERTa. I have multiple tweets in my CSV file but I can only obtain a score for one tweet. How might I find the sentiment score for each tweet and post it…
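Scoring only one tweet usually means the model call sits outside the loop over rows. A minimal sketch of iterating a CSV and applying a scoring function to every tweet — `score_fn` here is a stand-in for the actual RoBERTa pipeline call, and the column name `tweet` is an assumption:

```python
import csv
import io

def score_all(csv_text, score_fn, text_column="tweet"):
    """Apply score_fn to the text of every row, not just the first one."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(row[text_column], score_fn(row[text_column])) for row in reader]

# In the real notebook, score_fn would wrap the model, e.g. (assumed):
# score_fn = lambda t: sentiment_pipeline(t)[0]["score"]
demo = "tweet\ngreat day\nawful service\n"
print(score_all(demo, lambda t: len(t)))  # -> [('great day', 9), ('awful service', 13)]
```

With a real file, replace `io.StringIO(csv_text)` with `open("tweets.csv", newline="")`, and write the `(tweet, score)` pairs back out with `csv.writer` if they need to be posted to a new column.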
IDK
  • 15
  • 3
0
votes
1 answer

Target size (torch.Size([8])) must be the same as input size (torch.Size([8, 15])), multi-class classification using hugging face Roberta

I am using the Hugging Face RoBERTa to classify a multi-class dataset, but now I get the error "Target size (torch.Size([8])) must be the same as input size (torch.Size([8, 15]))". I am not sure what I should do now, could anyone provide some…
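This shape mismatch typically means the model is computing a multi-label loss (BCE with logits, which wants `(batch, num_labels)` float targets) while the labels are plain class indices of shape `(batch,)`. Two usual fixes: set `problem_type="single_label_classification"` on the model config so cross-entropy is used with index targets, or one-hot encode the labels to match the logits. A minimal sketch of the one-hot option (batch size 8 and 15 classes are taken from the error message):

```python
import numpy as np

def one_hot(labels, num_classes):
    """Turn class indices of shape (batch,) into (batch, num_classes)
    float targets, the shape a BCE-with-logits loss expects."""
    out = np.zeros((len(labels), num_classes), dtype=np.float32)
    out[np.arange(len(labels)), labels] = 1.0
    return out

targets = one_hot([3, 0, 14], num_classes=15)
print(targets.shape)  # -> (3, 15)
```

The same encoding works for a torch tensor via `torch.nn.functional.one_hot(labels, num_classes).float()`.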