
I'm trying to take a vanilla autoencoder built with Keras (TensorFlow backend) and stop training when the loss converges to a specific value. After the last epoch, I want to use a sigmoid function to perform classification. Would you know how to go about doing this (or at least point me in the right direction)?

The code below is quite similar to the vanilla autoencoder at http://wiseodd.github.io/techblog/2016/12/03/autoencoders/. (I'm using my own data, but feel free to use the MNIST example in the link to demonstrate what you are talking about.)

from keras.layers import Input, Dense, Activation
from keras.models import Model

NUM_ROWS = len(x_train)
NUM_COLS = len(x_train[0])

inputs = Input(shape=(NUM_COLS, ))
h = Dense(64, activation='sigmoid')(inputs)
outputs = Dense(NUM_COLS)(h)

# trying to add last sigmoid layer
outputs = Dense(1)
outputs = Activation('sigmoid')

model = Model(input=inputs, output=outputs)

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train,
    batch_size=batch,
    epochs=epochs,
    validation_data=(x_test, y_test))
quil
  • Hi, what is your x_train and what is your y_train? This doesn't look like an autoencoder – CAFEBABE Jul 09 '17 at 19:11
  • Just look at the example in the link. If I get it working on that, I'll be able to get it working on my data. – quil Jul 10 '17 at 02:12
  • Let me put the same question differently: what do you want to achieve with the autoencoder? You could use a standard classification model. – CAFEBABE Jul 10 '17 at 19:03

1 Answer


I have an interpretation of what you are aiming at; however, you don't seem to have a very clear picture of it yourself. I guess it will become clearer once you prepare the necessary dataset yourself.

One possible solution would be as below:

from keras.layers import Input, Dense, Activation
from keras.models import Model

NUM_ROWS = len(x_train)
NUM_COLS = len(x_train[0])

inputs = Input(shape=(NUM_COLS, ))
h = Dense(64, activation='sigmoid')(inputs)
outputs = Dense(NUM_COLS)(h)

model = Model(inputs=inputs, outputs=outputs)

# phase 1: train the autoencoder to reconstruct its own input,
# so the targets (and validation targets) are the inputs themselves
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x_train, x_train,
    batch_size=batch,
    epochs=epochs,
    validation_data=(x_test, x_test))

# h is a tensor, not a layer, so freeze the encoder layer itself
# (layers[0] is the Input layer, layers[1] is the Dense(64) encoder)
model.layers[1].trainable = False

# phase 2: add the last sigmoid layer on top of the encoded representation
outputs = Dense(1)(h)
outputs = Activation('sigmoid')(outputs)

model2 = Model(inputs=inputs, outputs=outputs)
model2.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model2.fit(x_train, y_train,
    batch_size=batch,
    epochs=epochs,
    validation_data=(x_test, y_test))
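
The question also asked about stopping once the loss reaches a specific value. One way to sketch that is a custom Keras Callback that sets stop_training when the epoch loss drops below a threshold (the class name StopAtLoss and the LOSS_THRESHOLD value are placeholders, not anything from the code above):

from keras.callbacks import Callback

LOSS_THRESHOLD = 0.05  # placeholder; pick whatever loss value counts as "converged"

class StopAtLoss(Callback):
    # stop training once the epoch's training loss reaches the chosen value
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        loss = logs.get('loss')
        if loss is not None and loss <= LOSS_THRESHOLD:
            self.model.stop_training = True

# pass the callback when fitting the autoencoder
model.fit(x_train, x_train,
    batch_size=batch,
    epochs=epochs,
    validation_data=(x_test, x_test),
    callbacks=[StopAtLoss()])
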
CAFEBABE
  • Is there a reason you're applying sigmoid twice? `h = Dense(64, activation='sigmoid')(inputs)` `outputs = Activation('sigmoid')` – Kenan Oct 19 '17 at 02:23
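
As a side note on the comment above: the two sigmoids act on different layers, one as the encoder's hidden activation and one as the classifier's output. If you only want the output sigmoid, the last two classifier layers could be collapsed into one, e.g. (reusing the encoded tensor h from the model above):

from keras.layers import Dense

# equivalent to Dense(1)(h) followed by Activation('sigmoid')
outputs = Dense(1, activation='sigmoid')(h)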