
I am trying to create a recipe generator on Kaggle using TensorFlow and an LSTM, but I am totally stuck on something related to dimensions. Can someone point me in the right direction?

https://www.kaggle.com/pablocastilla/d/kaggle/recipe-ingredients-dataset/ingredients-recomender-using-lstm-with-tensorflow/run/1066831

Thanks so much!

Pablo Castilla
  • Could you please consider posting the relevant parts of your code directly in the post (links get broken/updated over time which is not useful for future readers) and add what you tried to solve the problem? – kafman Apr 07 '17 at 13:51

2 Answers


I think the issue is that

training_batches[0][1]

is a list, not a numpy.array; you should modify create_datasets accordingly.
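A minimal sketch of the fix, assuming create_datasets currently appends plain Python lists (the variable names here are illustrative, not from the linked kernel): convert each batch to a numpy array before it reaches the feed dict, since TensorFlow infers shapes reliably from arrays but not from nested lists.

```python
import numpy as np

# Hypothetical batch as create_datasets might build it: nested Python lists of token ids.
inputs = [[1, 2, 3], [4, 5, 6]]
targets = [[2, 3, 4], [5, 6, 7]]

# Convert to numpy arrays with an explicit integer dtype before appending
# the pair to training_batches.
batch = (np.array(inputs, dtype=np.int32),
         np.array(targets, dtype=np.int32))

print(type(batch[1]))   # numpy.ndarray, not list
print(batch[1].shape)   # (2, 3) — batch_size x sequence_length
```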

Pietro Tortella

Here's an excerpt from the implementation of seq2seq.sequence_loss(logits, targets, weights), which you use in your code:

with ops.name_scope(name, "sequence_loss", [logits, targets, weights]):
    num_classes = array_ops.shape(logits)[2]
    logits_flat = array_ops.reshape(logits, [-1, num_classes])
    targets = array_ops.reshape(targets, [-1])
    if softmax_loss_function is None:
      crossent = nn_ops.sparse_softmax_cross_entropy_with_logits(
          labels=targets, logits=logits_flat)

I believe the error you see is stemming from the last line in that code. The error message is self-explanatory:

InvalidArgumentError: logits and labels must have the same first dimension, got logits shape [8,6714] and labels shape [2]

I.e. the size of the first dimension of logits_flat and targets must be the same. This translates directly to your input to seq2seq.sequence_loss: the first two dimensions of your targets and logits variables must be equal. So either you are not feeding the same batch size for the two variables, or somehow your sequence length changed (which would be odd).
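To illustrate the shape contract, here is a small numpy-only sketch of the flattening that sequence_loss performs (the dimension sizes are taken from the error message; nothing here is from the linked kernel). With batch_size * sequence_length = 8, both flattened tensors agree on their first dimension; the reported error means the flattened labels had length 2 instead.

```python
import numpy as np

# Dimensions matching the error message: logits_flat came out as [8, 6714].
batch_size, seq_len, num_classes = 2, 4, 6714

logits = np.zeros((batch_size, seq_len, num_classes), dtype=np.float32)
targets = np.zeros((batch_size, seq_len), dtype=np.int64)

# sequence_loss flattens both tensors before the cross-entropy op:
logits_flat = logits.reshape(-1, num_classes)   # shape (8, 6714)
targets_flat = targets.reshape(-1)              # shape (8,)

# sparse_softmax_cross_entropy_with_logits requires these to match:
assert logits_flat.shape[0] == targets_flat.shape[0]
```

If your targets placeholder only has shape [batch_size] instead of [batch_size, seq_len], the flattened labels end up with length 2, which reproduces the mismatch in the error.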

kafman