I get the following error:

ValueError: Cannot feed value of shape (1, 251, 5) for Tensor u'vector_rnn_1/Placeholder_1:0', which has shape '(1, 117, 5)'

when running code from here https://github.com/tensorflow/magenta-demos/blob/master/jupyter-notebooks/Sketch_RNN.ipynb

The error occurs in this method:

def encode(input_strokes):
  strokes = to_big_strokes(input_strokes).tolist()
  strokes.insert(0, [0, 0, 1, 0, 0])
  seq_len = [len(input_strokes)]
  draw_strokes(to_normal_strokes(np.array(strokes)))
  return sess.run(eval_model.batch_z, feed_dict={eval_model.input_data: [strokes], eval_model.sequence_lengths: seq_len})[0]

I have to mention I trained my own model following the instructions here:

https://github.com/tensorflow/magenta/tree/master/magenta/models/sketch_rnn

Can someone help me understand and solve this issue?

Thanks and regards.

2 Answers

In my case, the problem is caused by the to_big_strokes() function. If you do not modify to_big_strokes() in sketch_rnn/utils.py, it will by default pad the input_strokes sequence to a length of 250.
All you need to do is modify the max_len parameter of that function: change its value to the maximum sequence length of your own dataset (21 in my case), as shown in the line marked "change" below.

import numpy as np  # needed if you copy this function outside sketch_rnn/utils.py


def to_big_strokes(stroke, max_len=21):  # change: 250 -> 21
  """Converts from stroke-3 to stroke-5 format and pads to given length."""
  # (But does not insert special start token).
  result = np.zeros((max_len, 5), dtype=float)
  l = len(stroke)
  assert l <= max_len
  result[0:l, 0:2] = stroke[:, 0:2]    # copy the (dx, dy) offsets
  result[0:l, 3] = stroke[:, 2]        # p2: pen is lifted after this point
  result[0:l, 2] = 1 - result[0:l, 3]  # p1: pen is touching the paper
  result[l:, 4] = 1                    # p3: padding / end-of-sketch marker
  return result
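
As a quick sanity check (a sketch only: the 10-point input array below is hypothetical, and 21 is this answer's dataset maximum), the padded result should then match the model's input shape once encode() prepends the start token:

import numpy as np

# Hypothetical stroke-3 input with 10 points: columns are (dx, dy, pen_lifted).
input_strokes = np.zeros((10, 3))

big = to_big_strokes(input_strokes)   # now pads to max_len=21 rows
print(big.shape)                      # (21, 5)

strokes = big.tolist()
strokes.insert(0, [0, 0, 1, 0, 0])    # start token, exactly as encode() does
print(np.array([strokes]).shape)      # (1, 22, 5) -> must equal the model's input shape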
  • Um, I also think that is not an elegant way. Later I found that the solution I posted days ago is not exactly the cure for this bug. Instead, you don't need to modify the method at all; just pass an extra parameter when calling the to_big_strokes function. For this case, you can call it as: `strokes = to_big_strokes(input_strokes, 21)`, and the strokes array will have shape (1, 22, 5) inside the encode() function (defined in that Jupyter sketch_rnn notebook), which is the same shape the model can accept. **Note that 21 is still the max_seq_len of all input samples; you need to know this beforehand.** – YokkaW Apr 11 '19 at 09:34
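
A minimal sketch of that idea, with one further assumption: instead of hard-coding 21, the padding length is read from the loaded model's hyperparameters (eval_model.hps.max_seq_len, assuming the sketch_rnn Model object exposes its hparams as .hps; its input placeholder expects max_seq_len + 1 rows once the start token is prepended). Only the encode() cell of the notebook changes:

def encode(input_strokes):
  # Assumption: eval_model.hps.max_seq_len is the sequence length the checkpoint
  # was trained with (e.g. 116 for a placeholder of shape (1, 117, 5)).
  max_len = eval_model.hps.max_seq_len
  strokes = to_big_strokes(input_strokes, max_len).tolist()
  strokes.insert(0, [0, 0, 1, 0, 0])
  seq_len = [len(input_strokes)]
  draw_strokes(to_normal_strokes(np.array(strokes)))
  return sess.run(eval_model.batch_z,
                  feed_dict={eval_model.input_data: [strokes],
                             eval_model.sequence_lengths: seq_len})[0]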
The problem was that the size of the strokes array did not match the array size expected by the model, so adapting the strokes array fixed the issue.
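
If it helps others, a small sketch of that adaptation: find the longest sequence in your own stroke-3 data and pass it to to_big_strokes (the dataset variable names here are assumptions, not from the notebook):

# Assumed: train_set, valid_set and test_set are lists of stroke-3 numpy arrays.
def max_sequence_length(*stroke_sets):
  # Longest stroke-3 sequence across all splits.
  return max(len(seq) for dataset in stroke_sets for seq in dataset)

max_len = max_sequence_length(train_set, valid_set, test_set)
strokes = to_big_strokes(input_strokes, max_len).tolist()  # pad to your dataset's maximum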
