
I have spent days trying to figure out what is going on, and I am still getting this error. Here is the error I get:

ValueError: Variable rnn/multi_rnn_cell/cell_1/basic_lstm_cell/weights does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?

And here is my sample code. Does anyone know what I am doing wrong?

import tensorflow as tf
from tensorflow.contrib import rnn

x = tf.placeholder(tf.float32,[None,n_steps,n_input])
y = tf.placeholder(tf.float32,[None,n_classes])
weights = {
    'out': tf.Variable(tf.random_normal([n_hidden, n_classes]))
}
biases = {
    'out': tf.Variable(tf.random_normal([n_classes]))
}

def RNN(x, weights, biases):

    x = tf.unstack(x, n_steps, 1)

    lstm_cell = rnn.MultiRNNCell([cell() for y in range(2)], state_is_tuple=True)


    # Get lstm cell output
    outputs, states = rnn.static_rnn(lstm_cell, x, dtype=tf.float32)

    # Linear activation, using rnn inner loop last output
    return tf.matmul(outputs[-1], weights['out']) + biases['out']

def cell():        
    return rnn.BasicLSTMCell(n_hidden,forget_bias=0.1, reuse=True)

pred = RNN(x, weights, biases)
Zardaloop
  • What are you trying to do with `tf.variable_scope(...)`? Are you trying to reuse the weights in both LSTM cells? In that case, you should just reuse the same cell object. Otherwise, just remove the `tf.variable_scope`, since it is confusing TF (you are creating ops under two instantiations of the same variable scope, which results in inconsistent naming, e.g. see [here](https://github.com/tensorflow/tensorflow/issues/6007#issuecomment-315030061)). – jdehesa Oct 19 '17 at 10:54
  • Thanks @jdehesa, I have tried it without that but it is still complaining about the same issue. I have edited my question. – Zardaloop Oct 19 '17 at 10:58
  • Ah, wait, I didn't see you were using `reuse=True` in the cell. That expects you to create the necessary variables before hand (or have them created with a prior call to `static_rnn` / `dynamic_rnn`). If you really want to reuse the weights, it is generally easier to reuse the same cell object if possible (i.e. create one and pass a list with two references to it to `MultiRNNCell`). If you don't want to reuse the weights, then do _not_ pass `reuse=True`. – jdehesa Oct 19 '17 at 12:17
  • Could you please answer the question by amending the code the way you think it will work? – Zardaloop Oct 19 '17 at 13:40
  • I'm still not clear whether or not you actually want to reuse the weights in the LSTM cells, the answer is different in each case. – jdehesa Oct 19 '17 at 14:03
  • Yes, I want to reuse them. – Zardaloop Oct 19 '17 at 14:04

2 Answers


If you don't need to reuse the weights, just use the following (without `reuse=True`):

def cell():        
    return rnn.BasicLSTMCell(n_hidden, forget_bias=0.1)

Otherwise, if you need to reuse the weights, you can follow the Reusing Variable of LSTM in Tensorflow post, which has a nice explanation.
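For reference, here is a minimal sketch of the pattern that post describes (my illustration, assuming the TF 1.x API and the question's `n_hidden`; `inputs_a` and `inputs_b` are hypothetical lists of per-timestep `[batch, n_input]` tensors, e.g. produced by `tf.unstack`):

import tensorflow as tf
from tensorflow.contrib import rnn

def build_rnn(inputs, reuse):
    # reuse=None on the first call creates the LSTM variables;
    # reuse=True on later calls looks the same variables up again
    with tf.variable_scope('rnn_block', reuse=reuse):
        stack = rnn.MultiRNNCell(
            [rnn.BasicLSTMCell(n_hidden, forget_bias=0.1, reuse=reuse)
             for _ in range(2)],
            state_is_tuple=True)
        outputs, states = rnn.static_rnn(stack, inputs, dtype=tf.float32)
    return outputs[-1]

out_a = build_rnn(inputs_a, reuse=None)  # first call: creates the weights
out_b = build_rnn(inputs_b, reuse=True)  # second call: reuses them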

Nipun Wijerathne

If you want to reuse the weights, then the easiest way is to create a single cell object and pass it multiple times to MultiRNNCell:

import tensorflow as tf
from tensorflow.contrib import rnn

n_steps = 20
n_input = 10
n_classes = 5
n_hidden = 15

x = tf.placeholder(tf.float32,[None,n_steps,n_input])
y = tf.placeholder(tf.float32,[None,n_classes])
weights = {
    'in': tf.Variable(tf.random_normal([n_input, n_hidden])),
    'out': tf.Variable(tf.random_normal([n_hidden, n_classes]))
}
biases = {
    'in': tf.Variable(tf.random_normal([n_hidden])),
    'out': tf.Variable(tf.random_normal([n_classes]))
}

def RNN(x, weights, biases):

    # Initial input layer: project each time step from n_input to n_hidden,
    # so that both (shared) LSTM layers see inputs of the same size
    inp = (tf.matmul(x, weights['in'][tf.newaxis, ...]) +
           biases['in'][tf.newaxis, tf.newaxis, ...])
    inp = tf.nn.sigmoid(inp)
    # Unstack along the time axis into a list of n_steps tensors of
    # shape [batch, n_hidden], as expected by static_rnn
    inp = tf.unstack(inp, axis=1)

    # Create ONE cell object and pass two references to it, so both
    # layers of the stack share the same weights
    my_cell = cell()
    lstm_cell = rnn.MultiRNNCell([my_cell for _ in range(2)], state_is_tuple=True)

    # Get lstm cell output
    outputs, states = rnn.static_rnn(lstm_cell, inp, dtype=tf.float32)

    # Linear activation, using rnn inner loop last output
    return tf.matmul(outputs[-1], weights['out']) + biases['out']

def cell():        
    return rnn.BasicLSTMCell(n_hidden, forget_bias=0.1)

pred = RNN(x, weights, biases)

However, you have to make sure that sharing the variables makes sense dimension-wise, or it will fail. In this case, I have added an extra layer before the LSTM cells to make sure every LSTM input has the same size (n_hidden).
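As a quick sanity check of the graph above, something like the following should print the expected output shape (a minimal sketch; the random NumPy batch and the session boilerplate are my additions, assuming the TF 1.x session API):

import numpy as np

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Random dummy batch of 4 sequences, just to exercise the graph
    batch = np.random.rand(4, n_steps, n_input).astype(np.float32)
    print(sess.run(pred, feed_dict={x: batch}).shape)  # (4, n_classes)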

jdehesa