
I've recently started to get into Deep Learning with TensorFlow. I've read a lot of online resources and took a Udacity course for beginners, which gave me a good grasp of the basic principles of Deep Learning. But one thing I'm struggling with right now wasn't covered by any of those resources: I have no idea how to get my own labeled data (in CSV format) into my Python program to train my net on. I came across this post and implemented the input pipeline the way it shows, and it has worked so far. But when I try to train my model by feeding my features and labels with

sess.run(optimizer, feed_dict={
        x: features,
        y: labels,
    })

TensorFlow throws the following error:

TypeError: The value of a feed cannot be a tf.Tensor object. Acceptable feed values include Python scalars, strings, lists, numpy ndarrays, or TensorHandles.

x and y are defined as placeholders, and the session runs tf.global_variables_initializer():

x = tf.placeholder(tf.float32, shape=[10, batch_size])
y = tf.placeholder(tf.float32)

But the objects returned by tf.train.shuffle_batch are Tensors, right? Should/can I convert them back to a valid data type like a NumPy array, or is there another, more efficient way of reading my data?
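
For reference, my input pipeline looks roughly like this (the file name, column layout, and batch size here are simplified stand-ins for my real data):

    import tensorflow as tf

    batch_size = 32  # illustrative value

    # Queue of CSV files to read from.
    filename_queue = tf.train.string_input_producer(["train.csv"])
    reader = tf.TextLineReader()
    _, row = reader.read(filename_queue)

    # One default per CSV column: here 10 float features plus 1 float label.
    record_defaults = [[0.0]] * 11
    columns = tf.decode_csv(row, record_defaults=record_defaults)
    feature_column = tf.stack(columns[:10])
    label_column = columns[10]

    # Shuffled mini-batches of features and labels (these are Tensors).
    features, labels = tf.train.shuffle_batch(
        [feature_column, label_column],
        batch_size=batch_size,
        capacity=1000,
        min_after_dequeue=100)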

1 Answer


You can do it in two ways.

  1. With feed_dict:

    features_numpy, labels_numpy = sess.run([features, labels])
    sess.run(optimizer, feed_dict={x: features_numpy, y: labels_numpy})
    

    Run the session to get NumPy values for the features and labels, then feed them to the model using feed_dict. This is inefficient, though, because you're copying data out of TensorFlow into NumPy and then feeding it back into TensorFlow. See option 2 for the efficient way.

  2. Avoid feed_dict

    Do not use feed_dict or placeholders. Connect the features and labels tensors directly to the model, or simply do this:

    x = features
    y = labels
    

    Then call sess.run on your model output, loss, or training op to get predictions or to train the model (see the sketch below).
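
Here is a rough sketch of option 2, assuming features and labels come from tf.train.shuffle_batch and that model_output, loss, optimizer, and num_steps are defined the same way as in your existing graph (those names are just placeholders for your own):

    import tensorflow as tf

    # Build the model directly on top of the pipeline tensors,
    # so no placeholders or feed_dict are needed.
    x = features   # batch of feature values from tf.train.shuffle_batch
    y = labels     # batch of label values from tf.train.shuffle_batch

    # ... define model_output, loss and optimizer from x and y as usual ...

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())

        # The queue-based input pipeline only produces data once its
        # queue runners have been started.
        coord = tf.train.Coordinator()
        threads = tf.train.start_queue_runners(sess=sess, coord=coord)

        try:
            for step in range(num_steps):
                # No feed_dict: each run pulls a fresh batch from the queues.
                _, loss_value = sess.run([optimizer, loss])
        finally:
            coord.request_stop()
            coord.join(threads)

The key point is that every sess.run that touches x or y dequeues a fresh batch from the shuffle queue, so nothing is copied out to NumPy and back.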

I hope this helps.

Harsha Pokkalla