Questions tagged [seq2seq]

Seq2Seq is a sequence-to-sequence learning add-on for Keras, the Python deep learning library; the tag is also used for sequence-to-sequence (encoder-decoder) models in general.

318 questions
0
votes
0 answers

Why does the output of a T5 model contain ", ..." when the input is not masked?

I fine-tuned mT5 on a new dataset for a summarization task. In the inference phase, mT5 generates outputs containing ", ..." even when the input is not masked. I use the below code to encode the input: `tokenized_inputs =…
0
votes
1 answer

Trying to save history in tokenizer for seq2seq transformer chat model (GODEL base)

I'm fine-tuning a transformer seq2seq model (GODEL base), but can't seem to save history in the tokenizer quite well. Here's the code: context = list(df['Context']) knowledge = list(df['Knowledge']) response = list(df['Response']) # Initialize…
0
votes
0 answers

Which model structure should be used for building a NER-like classification model on sensor data?

I am working on a seq2seq model for sensor data attached to an athlete's leg, which should mark start and end indexes in the sensor stream, such as "step start" and "step end" for each step taken. I'll also apply the same procedure to kicks…
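One common framing for this kind of task is per-timestep sequence labelling, as in NER: each sensor sample gets a tag derived from the annotated start/end indexes. A minimal sketch of that conversion, assuming inclusive indexes and BIO-style tag names (both are assumptions, not taken from the question):

```python
def spans_to_bio(n_samples, spans):
    """Convert (start, end, label) spans into per-timestep BIO tags.

    `spans` uses inclusive start/end indexes into the sensor stream.
    """
    tags = ["O"] * n_samples
    for start, end, label in spans:
        tags[start] = f"B-{label}"          # beginning of the event
        for i in range(start + 1, end + 1):
            tags[i] = f"I-{label}"          # inside the event
    return tags

# Ten sensor samples with one annotated step from index 2 to 5.
print(spans_to_bio(10, [(2, 5, "STEP")]))
# ['O', 'O', 'B-STEP', 'I-STEP', 'I-STEP', 'I-STEP', 'O', 'O', 'O', 'O']
```

A token-classification head (one tag prediction per timestep) then fits this target format more naturally than a free-form seq2seq decoder.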
0
votes
0 answers

Seq2Seq model in text-to-text

I tried to extract tables from a PDF, but if a table has no borders, it doesn't get extracted, so I want to extract the text and have a Seq2Seq model predict the columns and rows. How can I do that? I'm trying to…
0
votes
0 answers

Adding attention to a Keras encoder-decoder seq2seq LSTM model

I have built a TensorFlow Keras encoder-decoder seq2seq LSTM model. Its purpose is to predict answers to sentences, essentially a chatbot. I have successfully created the model, along with the inference models, and managed to train it and generate…
egaga1
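A common way to bolt attention onto an encoder-decoder like this is Luong-style dot-product attention: score each encoder output against the current decoder state, softmax the scores, and take the weighted sum as a context vector. The mechanics can be sketched framework-free with NumPy (the shapes below are illustrative assumptions):

```python
import numpy as np

def dot_product_attention(decoder_state, encoder_outputs):
    """Luong-style dot-product attention.

    decoder_state:   (hidden,)           current decoder hidden state
    encoder_outputs: (timesteps, hidden) one vector per source timestep
    Returns the context vector (hidden,) and the attention weights.
    """
    scores = encoder_outputs @ decoder_state    # (timesteps,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # softmax over timesteps
    context = weights @ encoder_outputs         # (hidden,)
    return context, weights

rng = np.random.default_rng(0)
enc = rng.normal(size=(6, 4))   # 6 source timesteps, hidden size 4
dec = rng.normal(size=4)
context, weights = dot_product_attention(dec, enc)
print(context.shape, weights.shape)  # (4,) (6,)
```

In the Keras model the same idea is typically expressed with the built-in `Attention` layer applied to the decoder and encoder output sequences, with the context concatenated to the decoder outputs before the final Dense layer.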
0
votes
0 answers

I don't know what to do with the decoder input of the time series-based seq2seq model

I'm wondering about the seq2seq model's architecture (without attention), and also how to make decoder_inputs data. While studying, I looked at the structure of seq2seq. Sometimes RepeatVector is used and sometimes not. Are these both seq2seq? As far…
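In the classic seq2seq setup without attention, the decoder is trained with teacher forcing: decoder_inputs is the target sequence shifted right by one step with a start token prepended, and decoder_outputs is the target sequence itself. A minimal sketch (the start-token value of 0.0 is an assumption):

```python
def make_decoder_io(target_seq, start_token=0.0):
    """Build teacher-forcing decoder inputs/outputs from a target sequence."""
    decoder_inputs = [start_token] + list(target_seq[:-1])  # shifted right
    decoder_outputs = list(target_seq)                      # what to predict
    return decoder_inputs, decoder_outputs

dec_in, dec_out = make_decoder_io([10, 20, 30, 40])
print(dec_in)   # [0.0, 10, 20, 30]
print(dec_out)  # [10, 20, 30, 40]
```

The RepeatVector variant is a different but equally legitimate seq2seq design: instead of a shifted target, the encoder's final state is repeated as the input to every decoder step, so no separate decoder_inputs array is needed. The two differ in whether the decoder sees ground-truth history during training.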
0
votes
0 answers

What should be inputted as decoder_inputs and decoder_outputs in a seq2seq model for time series?

I am trying to predict the power (label) generated from solar intensity (feature) through time series data. So, I am trying to create a seq2seq model without an attention mechanism. (I'm trying to predict the 11th label with the past 10 days'…
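For the "predict day 11 from the past 10 days" setup, the series can be windowed so each encoder input is 10 consecutive feature values and each target is the label at the following step. A framework-free sketch of that windowing (the window length and the toy label relationship are assumptions):

```python
def make_windows(features, labels, window=10):
    """Slice aligned series into (encoder_input, target) pairs.

    Each sample: `window` consecutive feature values -> the label
    at the step immediately after the window.
    """
    samples = []
    for i in range(len(features) - window):
        samples.append((features[i:i + window], labels[i + window]))
    return samples

feats = list(range(100, 115))      # 15 days of solar intensity (toy values)
labs = [f * 2 for f in feats]      # toy power labels
pairs = make_windows(feats, labs)
print(len(pairs))                  # 5 windows
print(pairs[0])                    # ([100, ..., 109], 220)
```

The windows become the encoder inputs; the decoder side can then be built from the targets with the usual shifted start-token scheme for teacher forcing.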
0
votes
0 answers

Is there a way to train Seq2Seq for user click behavior?

Our system can collect data on user queries and domain clicks based on their behavior. We aim to enhance this system by predicting the domain corresponding to a user query. For instance, if a user enters a query not included in our list, such as…
phuongdo
0
votes
1 answer

Sending a tf.data.Dataset to model.fit for a seq2seq model gives various formatting errors

I am trying to update the steps in https://www.tensorflow.org/text/guide/word_embeddings to accept a sequence-to-sequence setup, in which the input sequence is the context, question, and answer concatenated together, and the output is the answer sequence. I tried…
0
votes
0 answers

I can't train an encoder RNN for a seq2seq model

When I want to train an EncoderRNN, I get this error: Expected hidden size (2, 1, 300), got [1, 1, 300]. My code: [code posted as images]. When I set hidden = zeros(2, 1, 300), I get this error: RuntimeError: For…
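That mismatch usually means the RNN's layer/direction configuration disagrees with the initial hidden state: PyTorch RNN/GRU/LSTM modules expect the hidden state to have shape (num_layers * num_directions, batch, hidden_size). The rule can be sketched without torch:

```python
def expected_hidden_shape(num_layers, bidirectional, batch, hidden_size):
    """Shape a PyTorch RNN/GRU/LSTM expects for its initial hidden state."""
    num_directions = 2 if bidirectional else 1
    return (num_layers * num_directions, batch, hidden_size)

# Both a 2-layer unidirectional encoder and a 1-layer bidirectional one
# expect the (2, 1, 300) that the error message reports.
print(expected_hidden_shape(2, False, 1, 300))  # (2, 1, 300)
print(expected_hidden_shape(1, True, 1, 300))   # (2, 1, 300)
```

So a hidden state of shape (1, 1, 300) fails against an encoder that is either 2-layer or bidirectional; the fix is to size the initial hidden state from the encoder's actual num_layers and bidirectional settings rather than hard-coding it.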
0
votes
1 answer

Can't Initialise Two Different Tokenizers with Keras

For a spelling-correction task, I built a seq2seq model including LSTM and an attention mechanism. I do char-level tokenization with Keras. I initialized two different tokenizers, one for the typo sentence and the other for the corrected sentence. After testing,…
marisami7
0
votes
0 answers

Need help implementing seq2seq models with Pytorch DataLoader

I am currently working on a side project that attempts to predict a numerical response from a set of features encoded as time series data. For the following sample code shown below: import numpy as np import torch import torch.utils.data as…
0
votes
0 answers

unclear transformers prepare_seq2seq_batch deprecation

With this code: from transformers import AutoTokenizer, AutoModelForSeq2SeqLM # https://huggingface.co/Helsinki-NLP/opus-mt-fr-en # https://huggingface.co/Helsinki-NLP/opus-mt-en-fr tokenizer_fr_en =…
LeMoussel
0
votes
0 answers

Huggingface Saving `VisionEncoderDecoderModel` to `TorchScript` problem

Python version: 3.9.12, transformers: 4.26.0, torch: 2.0.0, pillow: 9.2.0. I want to save a local checkpoint of a Hugging Face transformers.VisionEncoderDecoderModel to TorchScript via torch.jit.trace with the code below: import torch from PIL import…
Weber Huang
0
votes
0 answers

Tensor decoder_inputs may not be reachable given the provided output tensor decoder_outputs

# Define the maximum sequence lengths max_input_len = 1000 max_target_len = 100 # Define the input and output shapes encoder_inputs = Input(shape=(max_input_len,)) decoder_inputs = Input(shape=(max_target_len,)) # Define the embedding…