
I'm working through the Wine Classification Challenge and I'm not getting the summary I expect when training the model and printing its parameters:

    model = pipeline.fit(X_train, y_train)
    print(model)
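
For context, the pipeline itself is set up roughly like this (a sketch reconstructed from the printed summary; the column indices and the X_train/y_train split come from the challenge data):

    from sklearn.compose import ColumnTransformer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    # Numeric feature columns to scale (indices as they appear in the summaries below)
    numeric_features = [0, 1, 2, 3, 4, 5, 6]

    preprocessor = ColumnTransformer(
        transformers=[('preprocess',
                       Pipeline(steps=[('scaler', StandardScaler())]),
                       numeric_features)])

    pipeline = Pipeline(steps=[('preprocessor', preprocessor),
                               ('regressor', LogisticRegression())])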

For some reason I only get this abbreviated summary of the model (what I get):

    Pipeline(steps=[('preprocessor',
                     ColumnTransformer(transformers=[('preprocess',
                                                      Pipeline(steps=[('scaler',
                                                                       StandardScaler())]),
                                                      [0, 1, 2, 3, 4, 5, 6])])),
                    ('regressor', LogisticRegression())])

How can I get this more detailed summary printed instead (desired output)?

    Pipeline(memory=None,
             steps=[('preprocessor',
                     ColumnTransformer(n_jobs=None, remainder='drop',
                                       sparse_threshold=0.3,
                                       transformer_weights=None,
                                       transformers=[('preprocess',
                                                      Pipeline(memory=None,
                                                               steps=[('scaler',
                                                                       StandardScaler(copy=True,
                                                                                      with_mean=True,
                                                                                      with_std=True))],
                                                               verbose=False),
                                                      [0, 1, 2, 3, 4, 5, 6])],
                                       verbose=False)),
                    ('regressor',
                     LogisticRegression(C=1.0, class_weight=None, dual=False,
                                        fit_intercept=True, intercept_scaling=1,
                                        l1_ratio=None, max_iter=100,
                                        multi_class='auto', n_jobs=None,
                                        penalty='l2', random_state=None,
                                        solver='lbfgs', tol=0.0001, verbose=0,
                                        warm_start=False))],
             verbose=False)
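
My guess is that the short form is just scikit-learn hiding parameters that are still at their default values, so presumably there is a display setting that controls this. Something like the sketch below is what I have in mind, but I'm not sure it is the intended way (or whether there is a per-call alternative):

    import sklearn

    # Guess: switch the global display config so that the repr shows all
    # parameters, not only those changed from their defaults
    sklearn.set_config(print_changed_only=False)

    model = pipeline.fit(X_train, y_train)
    print(model)  # should hopefully print the verbose summary above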

I also don't want to fall back on the full model.get_params(deep=True), which prints a lot of extra stuff.
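
For illustration, this is the kind of dump I am trying to avoid (abridged; the keys use the usual double-underscore naming for nested parameters):

    params = model.get_params(deep=True)
    # {'memory': None,
    #  'steps': [...],
    #  'preprocessor__preprocess__scaler__with_mean': True,
    #  'regressor__C': 1.0,
    #  'regressor__max_iter': 100,
    #  ...}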
