I want to train my custom model on GPU devices. Will clients be able to use it on a CPU?
answered here: https://stackoverflow.com/questions/40980035/can-a-model-trained-on-gpu-used-on-cpu-for-inference-and-vice-versa – William D. Irons Mar 20 '19 at 13:35
I believe that after the model is trained it is just a set of ifs so - yes. – yossico Mar 20 '19 at 13:35
1 Answer
Yes. You do the heavy lifting of training on a GPU, save the weights, and then your CPU only has to do the matrix multiplications for predictions. The saved weights are plain arrays of numbers, so they are not tied to the device that produced them.
In TensorFlow and Keras you can train your model and save the neural network weights:
TensorFlow (1.x graph API):
import tensorflow as tf

# ... build your model graph here ...

# ON GPU
init = tf.global_variables_initializer()
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(init)
    # ... training steps ...
    save_path = saver.save(sess, "/tmp/saved_model.ckpt")

# ON CPU
with tf.Session() as sess:
    saver.restore(sess, "/tmp/saved_model.ckpt")
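If you want to guarantee the restored session never touches a GPU, you can hide all GPU devices from it. A minimal sketch using the TensorFlow 1.x ConfigProto API, reusing the saver and checkpoint path from above ('predictions' is a placeholder name for your output op):
# Hide every GPU so the restored graph runs purely on the CPU
config = tf.ConfigProto(device_count={'GPU': 0})
with tf.Session(config=config) as sess:
    saver.restore(sess, "/tmp/saved_model.ckpt")
    # run inference here, e.g. sess.run(predictions, feed_dict={...})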
Keras:
# ON GPU, after training
model.save_weights('your_model_weights.h5')

# ON CPU, after rebuilding the identical architecture
model.load_weights('your_model_weights.h5')
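Note that save_weights stores only the parameters, so the CPU machine has to rebuild the same architecture before loading them. A minimal sketch; the layers, sizes, and x_new are made up for illustration:
from keras.models import Sequential
from keras.layers import Dense

# Rebuild the exact architecture that was trained on the GPU
model = Sequential([
    Dense(32, activation='relu', input_shape=(10,)),
    Dense(1, activation='sigmoid'),
])
model.load_weights('your_model_weights.h5')
preds = model.predict(x_new)  # x_new: whatever data you predict on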
With sklearn-style estimators (the example uses XGBoost's XGBClassifier, which follows the sklearn API), you can persist the whole fitted model instead of just the weights:
from xgboost import XGBClassifier
import joblib  # sklearn.externals.joblib is deprecated in newer sklearn

model = XGBClassifier(max_depth=100, learning_rate=0.7, n_estimators=10,
                      objective='binary:logistic', booster='gbtree',
                      n_jobs=16, eval_metric="error")
# eval_set and verbose belong to fit(), not the constructor
clf = model.fit(x_train, y_train, eval_set=eval_set, verbose=True)

joblib.dump(clf, '/path/your_model.joblib')
model = joblib.load('/path/your_model.joblib')
model.predict(x_train)
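The same holds if the booster itself was trained on the GPU: the dumped model can still predict on a CPU-only machine. A hedged sketch using the XGBoost 1.x parameter names tree_method and predictor (XGBoost 2.x replaced these with the device parameter):
# Train with the GPU histogram algorithm, then switch prediction to CPU
gpu_model = XGBClassifier(tree_method='gpu_hist', n_estimators=10)
gpu_model.fit(x_train, y_train)
gpu_model.set_params(predictor='cpu_predictor')  # 1.x parameter name
gpu_model.predict(x_train)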

– razimbres