I am working on training different models for different human pose estimation problems. What I actually need is to get separate regression outputs for the different joints of the human body. After searching around, I concluded that I have two options:
- training separate models and combining their final results;
- training the models in a chain (the input of the second model is the output of the first model, and so on).
I know Keras has a Concatenate layer that can merge the outputs of several models. But if I don't want to use Keras, is it possible to have 6 models and then merge them in such a way that the final trained model estimates all of their outputs at once?
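To make the question concrete, here is a rough sketch of what I mean by merging without Keras, in plain TensorFlow 1.x graph style. The helper single_joint_head, the 9216-dimensional features, the 6 heads and the Adam optimizer are just placeholders standing in for my real per-joint models, not working code:

import tensorflow as tf

def single_joint_head(features, name):
    # one small regression head, standing in for one per-joint model
    with tf.variable_scope(name):
        w = tf.get_variable('w', [int(features.shape[1]), 1],
                            initializer=tf.truncated_normal_initializer(stddev=0.1))
        b = tf.get_variable('b', [1], initializer=tf.zeros_initializer())
        return tf.matmul(features, w) + b            # shape [batch, 1]

features = tf.placeholder(tf.float32, [None, 9216])  # e.g. shared, flattened conv features
heads = [single_joint_head(features, 'joint_%d' % i) for i in range(6)]
all_joints = tf.concat(heads, axis=1, name='output_node')  # shape [batch, 6], one column per joint

ys = tf.placeholder(tf.float32, [None, 6])
loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - all_joints), axis=1))
train_step = tf.train.AdamOptimizer(1e-4).minimize(loss)   # trains all 6 heads jointly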
My models look something like this (they differ depending on the dataset I use):
import tensorflow as tf  # TF 1.x graph code; weight_func, bias_func, conv2d, x_image, keep_prob and ys are defined elsewhere

## conv1 layer
W_conv1 = weight_func([3, 3, 1, 32])
b_conv1 = bias_func([32])
h_conv1 = tf.nn.relu(conv2d(x_image, W_conv1) + b_conv1)
# h_pool1 = max_pool_2x2(h_conv1)
#h_drop1 = tf.nn.dropout(h_conv1, keep_prob)
## conv2 layer
W_conv2 = weight_func([3, 3, 32, 64])  # 3x3 filter, 32 input channels, 64 output channels
b_conv2 = bias_func([64])
h_conv2 = tf.nn.relu(conv2d(h_conv1, W_conv2) + b_conv2)
#h_drop2 = tf.nn.dropout(h_conv2, keep_prob)
## conv3 layer
W_conv3 = weight_func([3, 3, 64, 128])
b_conv3 = bias_func([128])
h_conv3 = tf.nn.relu(conv2d(h_conv2, W_conv3) + b_conv3)
#h_drop3 = tf.nn.dropout(h_conv3, keep_prob)
## conv4 layer
W_conv4 = weight_func([3, 3, 128, 256])  # 3x3 filter, 128 input channels, 256 output channels
b_conv4 = bias_func([256])
h_conv4 = tf.nn.relu(conv2d(h_conv3, W_conv4) + b_conv4)
#h_drop4 = tf.nn.dropout(h_conv4, keep_prob)
## fc1 layer
W_fc1 = weight_func([6 * 6 * 256, 9216])
b_fc1 = bias_func([9216])
h_conv4_flat = tf.reshape(h_conv4, [-1, 6 * 6 * 256])  # flatten the 6x6x256 conv4 feature map
h_fc1 = tf.nn.relu(tf.matmul(h_conv4_flat, W_fc1) + b_fc1)
h_fc1_drop = tf.nn.dropout(h_fc1, keep_prob)
# fc2 layer (single regression output per example)
W_fc2 = weight_func([9216, 1])
b_fc2 = bias_func([1])
prediction = tf.add(tf.matmul(h_fc1_drop, W_fc2), b_fc2, name='output_node')
# despite the name, this is a mean sum-of-squared-errors loss, not a cross-entropy
cross_entropy = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), axis=[1]))
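One alternative I am considering is simply widening the last layer so a single model predicts all the joints at once, roughly like the sketch below. num_joints is hypothetical, it reuses the same weight_func / bias_func helpers, and ys would then need shape [batch, num_joints]:

num_joints = 6                                   # hypothetical: one output per joint value I need
W_fc2 = weight_func([9216, num_joints])
b_fc2 = bias_func([num_joints])
prediction = tf.add(tf.matmul(h_fc1_drop, W_fc2), b_fc2, name='output_node')  # [batch, num_joints]
loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), axis=1))      # same squared-error loss

But I would still like to know whether merging 6 separately trained models into one graph like this is possible without Keras.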