
Here is the model: https://huggingface.co/PrimeQA/t5-base-table-question-generator

Hugging Face says I should use the following code to load the model with transformers:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("PrimeQA/t5-base-table-question-generator")

model = AutoModelForSeq2SeqLM.from_pretrained("PrimeQA/t5-base-table-question-generator")

The model card also links to this notebook in the PrimeQA repository: https://github.com/primeqa/primeqa/blob/main/notebooks/qg/tableqg_inference.ipynb

The notebook contains the following code:

from primeqa.qg.models.qg_model import QGModel
from tabulate import tabulate # only used to visualize table



model_name = 'PrimeQA/t5-base-table-question-generator'
table_qg_model = QGModel(model_name, modality='table')

table_list = [
    {"header": ["Player", "No.", "Nationality", "Position", "Years in Toronto", "School Team"],
      "rows": [
            ["Antonio Lang", 21, "United States", "Guard-Forward", "1999-2000", "Duke"],
            ["Voshon Lenard", 2, "United States", "Guard", "2002-03", "Minnesota"],
            ["Martin Lewis", 32, "United States", "Guard-Forward", "1996-97", "Butler CC (KS)"],
            ["Brad Lohaus", 33, "United States", "Forward-Center", "1996", "Iowa"],
            ["Art Long", 42, "United States", "Forward-Center", "2002-03", "Cincinnati"]
        ]
    }
]
# [optional] include an id_list aligned with table_list
id_list = ["abcID123"]

print(tabulate(table_list[0]['rows'], headers=table_list[0]['header'], tablefmt='grid'))



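# As best I can tell from the PrimeQA notebook, these keyword arguments steer
# how SQL queries are sampled from the table before being verbalized into
# questions: agg_prob is a distribution over aggregation operators (roughly
# none/max/min/count/sum/avg), num_where_prob is a distribution over the
# number of where clauses, and ineq_prob is the probability of drawing an
# inequality condition.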
table_qg_model.generate_questions(
    table_list,
    num_questions_per_instance=5,
    agg_prob=[1., 0, 0, 0, 0, 0],
    num_where_prob=[0, 1., 0, 0, 0],
    ineq_prob=0.0,
    id_list=id_list,
)

How do I combine these two snippets? I tried the following, but I get an error:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("PrimeQA/t5-base-table-question-generator")

model_name = AutoModelForSeq2SeqLM.from_pretrained("PrimeQA/t5-base-table-question-generator")

table_list = [
    {"header": ["Player", "No.", "Nationality", "Position", "Years in Toronto", "School Team"],
      "rows": [
            ["Antonio Lang", 21, "United States", "Guard-Forward", "1999-2000", "Duke"],
            ["Voshon Lenard", 2, "United States", "Guard", "2002-03", "Minnesota"],
            ["Martin Lewis", 32, "United States", "Guard-Forward", "1996-97", "Butler CC (KS)"],
            ["Brad Lohaus", 33, "United States", "Forward-Center", "1996", "Iowa"],
            ["Art Long", 42, "United States", "Forward-Center", "2002-03", "Cincinnati"]
        ]
    }
]

model_name.generate_questions(
    table_list,
    num_questions_per_instance=5,
    agg_prob=[1., 0, 0, 0, 0, 0],
    num_where_prob=[0, 1., 0, 0, 0],
    ineq_prob=0.0,
)

It gives me the following error:

AttributeError: 'T5ForConditionalGeneration' object has no attribute 'generate_questions'

1 Answer

You can load the PrimeQA/t5-base-table-question-generator model directly with the Hugging Face transformers library. However, you cannot call generate_questions on it, because that method is defined on the QGModel class in the primeqa package. QGModel is a wrapper around the Hugging Face AutoModelForSeq2SeqLM class that adds extra functionality on top, such as reducing hallucinations; none of those methods exist on AutoModelForSeq2SeqLM, which is exactly what the AttributeError is telling you.

So you can load the model with AutoModelForSeq2SeqLM and run plain generation with model.generate(), but functions like generate_questions() and prune_hallucinations() are only available through QGModel.
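If you want generate_questions, the practical fix is to install the primeqa package (e.g. pip install primeqa) and use the QGModel snippet from the notebook as-is; QGModel(model_name, modality='table') loads this same checkpoint internally and exposes generate_questions.

If you only have transformers available, a minimal raw-generation sketch looks like the following. Be aware that the input string here is a placeholder I made up for illustration: the checkpoint was trained on inputs produced by PrimeQA's own table/SQL serialization, so calling generate() on an ad-hoc string will not reproduce what generate_questions does.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("PrimeQA/t5-base-table-question-generator")
model = AutoModelForSeq2SeqLM.from_pretrained("PrimeQA/t5-base-table-question-generator")

# Placeholder input for illustration only; the real input format is produced
# by PrimeQA's preprocessing, which samples an SQL-like query from the table.
input_text = "select Player from table where Position equals Guard"

inputs = tokenizer(input_text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))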
