I have a simple CNN with Conv2D, MaxPooling2D, Flatten, and Dense layers; the input shape of my data is (8, 8, 1). I want to use the output features from the Flatten layer as inputs to an XGBoost classifier. For that, I need to train the CNN with all of its layers, then load the model with the trained weights but without the Dense layer, use that truncated model to predict on the training data, and feed the results (i.e., features_train) into the XGBoost classifier. Does anyone know how I can load the trained model with its learned weights but without the Dense layer? (After my code I've added a sketch of what I have in mind, but I'm not sure it's right.)
Below is my code:
import numpy as np
import tensorflow as tf
import xgboost as xgb
from tensorflow.keras.layers import Conv2D, Dense, Flatten, Input, MaxPooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

# Build and train the CNN (X, y and the hyperparameters are defined elsewhere)
input_shape_cnn = (8, 8, 1)
input_cnn = Input(shape=input_shape_cnn)
layer = Conv2D(filters=32, kernel_size=3, activation='tanh')(input_cnn)
layer = MaxPooling2D()(layer)
layer = Flatten()(layer)
layer = Dense(1, activation='sigmoid')(layer)  # sigmoid for a single-unit binary output
model = Model(inputs=input_cnn, outputs=layer)
model.compile(loss='binary_crossentropy',
              optimizer=Adam(learning_rate=1e-4),
              metrics=['acc', tf.keras.metrics.AUC()])  # output is already a probability, so no from_logits
history = model.fit(X, y, epochs=epochs, batch_size=batch_size, verbose=0)

# This currently returns the Dense output, not the Flatten features I actually want
features_train = model.predict(X)

xgb_model = xgb.XGBClassifier(max_depth=max_depth, learning_rate=learning_rate,
                              gamma=gamma, reg_lambda=reg_lambda,
                              scale_pos_weight=scale_pos_weight)
xgb_model.fit(features_train, np.ravel(y), verbose=0)
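This is roughly what I have in mind for the feature-extraction step. It is just a sketch, assuming the Flatten layer is the second-to-last layer of the trained model (so model.layers[-2] points at it):

# Sketch: reuse the trained layers by pointing a second Model at the Flatten output.
# In the architecture above, Dense is the last layer, so model.layers[-2] is the Flatten layer.
feature_extractor = Model(inputs=model.input, outputs=model.layers[-2].output)

# The extractor shares the already-trained weights, so no extra training should be needed.
features_train = feature_extractor.predict(X)  # shape (n_samples, 288) for an (8, 8, 1) input
xgb_model.fit(features_train, np.ravel(y), verbose=0)

Would this correctly reuse the learned weights, or do I need to save the trained model and reload it without the Dense layer?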