Questions tagged [regularized]

Regularization involves introducing additional information in order to solve an ill-posed problem or to prevent over-fitting by shrinking the parameters of the model so that they stay close to zero.

In mathematics and statistics, particularly in the fields of machine learning and inverse problems, regularization involves introducing additional information in order to solve an ill-posed problem or to prevent overfitting. This information usually takes the form of a penalty for complexity, such as restrictions on smoothness or bounds on the vector space norm.

From http://en.wikipedia.org/wiki/Regularization_%28mathematics%29
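In symbols, the penalized objective described above can be sketched as follows (a generic formulation; the loss L, the penalty Ω, and the strength λ all depend on the problem):

    \min_{\theta}\; L(\theta; X, y) + \lambda\, \Omega(\theta),
    \qquad \Omega(\theta) = \lVert \theta \rVert_1 \ \text{(L1 / lasso)}
    \quad \text{or} \quad
    \Omega(\theta) = \lVert \theta \rVert_2^2 \ \text{(L2 / ridge)}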


195 questions
0
votes
1 answer

Does CatBoost support L1 regularization of the cost function?

Does CatBoost support L1 regularization? The hyperparameter l2_leaf_reg controls the L2 regularization term of the cost function, but is there any way to use the L1 norm?
Leo
  • 63
  • 1
  • 6
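For reference, a minimal sketch of how the L2 term mentioned in this question is set in the CatBoost Python API; l2_leaf_reg is the documented parameter, and this sketch does not assume an L1 analogue exists (that is exactly what the question asks):

    # Minimal sketch: the L2 leaf regularization term in CatBoost's Python API.
    # l2_leaf_reg penalizes the L2 norm of the leaf values of each tree.
    from catboost import CatBoostRegressor

    model = CatBoostRegressor(
        iterations=200,
        depth=6,
        l2_leaf_reg=3.0,   # L2 coefficient on leaf values (3.0 is the default)
        verbose=False,
    )
    # model.fit(X_train, y_train)  # X_train / y_train are placeholders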
0
votes
1 answer

Regularization of activations once the model is initialized

I would like to add regularization of activations in Tensorflow.keras on a pretrained network, using a loop over layers. If I want to regularize weights or biases, I can do: l1=0.001 l2=0.001 for layer in model.layers: if isinstance(layer,…
shora
  • 131
  • 11
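A minimal sketch of the loop pattern this question describes, extended from weights to activations via the activity_regularizer attribute. It assumes a tf.keras model named model; note that regularizers set after a layer is built typically only take effect once the model is rebuilt from its config, hence the round-trip at the end:

    import tensorflow as tf
    from tensorflow.keras import regularizers

    l1 = 0.001

    # Attach an activity (activation) regularizer to every layer that supports one.
    for layer in model.layers:
        if hasattr(layer, "activity_regularizer"):
            layer.activity_regularizer = regularizers.l1(l1)

    # Rebuild the model from its config so the new regularizers become part of
    # the training graph, then restore the pretrained weights.
    weights = model.get_weights()
    model = tf.keras.models.model_from_json(model.to_json())
    model.set_weights(weights)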
0
votes
1 answer

Extracting Variables from Elastic Net in R

How do I extract variables from elastic net for modeling purposes? (if this is a stupid question and the answer can be found someplace please let me know and I'll look) I have already done cross validation and determined the alpha but I am trying to…
westiegirl
  • 11
  • 4
0
votes
1 answer

How does this regularization code affect loss?

I have seen some convolutional neural network training code, and I do not understand the following part of it. loss = tf.reduce_sum(tf.nn.l2_loss(tf.subtract(train_output, train_gt))) for w in weights: loss += tf.nn.l2_loss(w)*1e-4 The…
ddjfjfj djfiejdn
  • 131
  • 1
  • 12
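Reading the snippet quoted in this question, the first term is the data-fidelity loss and the loop adds an L2 weight penalty. A lightly annotated version follows; train_output, train_gt, and weights come from the question and are assumed to be defined elsewhere:

    import tensorflow as tf

    # Data term: tf.nn.l2_loss(t) computes sum(t**2) / 2, so this is half the
    # summed squared error between the network output and the ground truth.
    loss = tf.reduce_sum(tf.nn.l2_loss(tf.subtract(train_output, train_gt)))

    # L2 weight decay: add half the squared norm of every weight tensor,
    # scaled by 1e-4, which penalizes large weights and regularizes the model.
    for w in weights:
        loss += tf.nn.l2_loss(w) * 1e-4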
0
votes
1 answer

Cost Function Regularization in TensorFlow.js

I'm wondering if someone could help me. I'm new to TensorFlow.js (JavaScript version). I've built a neural network and want to add a regularization term to the cost function (loss function). I can see the regularizers in the JavaScript API…
Alanaj5
  • 37
  • 6
0
votes
1 answer

Why does the number of observations decrease when using model.matrix in ridge regression?

I'm using the glmnet package in R for ridge regression. I tried it on the Hitters dataset from the ISLR package. The problem is that when I use model.matrix to create the design matrix, the number of observations is reduced for an unknown reason. This is the…
Ashley
  • 67
  • 1
  • 7
0
votes
1 answer

Trying to perform LDA with LASSO

PenalizedLDA(x = train_x, y = train_y) returns: Error in sort.int(x, na.last = na.last, decreasing = decreasing, ...) : 'x' must be atomic. I'm trying to use linear discriminant analysis with lasso on the sampbase dataset from UCI. (I've added the…
Apocryphon
  • 13
  • 7
0
votes
1 answer

h2o glm regularization path values

[Python 3.5.2, h2o 3.22.1.1, JRE 1.8.0_201] I am running a GLM with lambda_search and using the regularization path to select a lambda. glm_h2o = H2OGeneralizedLinearEstimator(family='binomial', alpha=1., lambda_search=True,…
ironv
  • 978
  • 10
  • 25
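A minimal sketch of the pattern this question describes, assuming the estimator has been trained with lambda_search=True as in the quoted snippet; getGLMRegularizationPath is part of the h2o Python GLM API, though the exact keys of the returned dict should be checked against the installed h2o version:

    from h2o.estimators.glm import H2OGeneralizedLinearEstimator

    glm_h2o = H2OGeneralizedLinearEstimator(family='binomial', alpha=1.,
                                            lambda_search=True)
    # glm_h2o.train(x=predictors, y=response, training_frame=train)  # placeholders

    # Inspect the regularization path explored during lambda_search; the
    # 'lambdas' and 'coefficients' entries describe each step along the path.
    path = H2OGeneralizedLinearEstimator.getGLMRegularizationPath(glm_h2o)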
0
votes
1 answer

Performing lasso regularization with factors and numeric predictors?

I have a data set on which I want to perform lasso for feature elimination. I am currently following a guide online in R as I am new to R. The data is stored in a dataframe. The target has been removed from the dataframe and is stored in its own…
rmahesh
  • 739
  • 2
  • 14
  • 30
0
votes
1 answer

L1 penalty in sklearn's MLPRegressor?

Is there a way to introduce L1 regularisation in sklearn's MLPRegressor? I can only find the L2 parameter in the documentation at the moment.
mythicoat
  • 31
  • 1
  • 3
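For context, the L2 parameter this question refers to is alpha; scikit-learn's MLPRegressor exposes no built-in L1 option, so a minimal sketch can only show the L2 term:

    from sklearn.neural_network import MLPRegressor

    # alpha is the L2 penalty (weight decay) coefficient; MLPRegressor has no
    # built-in L1 counterpart, which is what the question is about.
    reg = MLPRegressor(hidden_layer_sizes=(64, 64), alpha=1e-4, max_iter=500)
    # reg.fit(X_train, y_train)  # X_train / y_train are placeholders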
0
votes
0 answers

How to add regularization in a CNN autoencoder model based on Keras

I am new to Keras and deep learning, and I am not quite sure of the right way to add regularization. I wrote a CNN autoencoder using the Model class API; right now I add the regularizer in each of the Keras "Conv2D" functions, but I am not sure if…
J. Zhao
  • 231
  • 1
  • 2
  • 3
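A minimal sketch of attaching regularizers inside the Conv2D layers of a small autoencoder built with the functional Model API, as this question describes; the architecture, input shape, and the 1e-4 coefficient are illustrative assumptions:

    from tensorflow.keras import Input, Model, layers, regularizers

    inp = Input(shape=(28, 28, 1))

    # kernel_regularizer penalizes the convolution weights; activity_regularizer
    # would instead penalize the layer's output activations.
    x = layers.Conv2D(16, 3, activation='relu', padding='same',
                      kernel_regularizer=regularizers.l2(1e-4))(inp)
    x = layers.MaxPooling2D(2, padding='same')(x)
    x = layers.Conv2D(16, 3, activation='relu', padding='same',
                      kernel_regularizer=regularizers.l2(1e-4))(x)
    x = layers.UpSampling2D(2)(x)
    out = layers.Conv2D(1, 3, activation='sigmoid', padding='same')(x)

    autoencoder = Model(inp, out)
    autoencoder.compile(optimizer='adam', loss='mse')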
0
votes
1 answer

Is the solution at tangent point an optimal solution?

From what I understood of this article, the blue circles are the level curves and the blue dot is the optimal solution that minimizes the cost function. The yellow circle is the L2-norm constraint. The solution that we need is the one that…
theateist
  • 13,879
  • 17
  • 69
  • 109
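The tangency this question asks about is the standard Lagrangian optimality condition; a brief sketch of the argument, with ŵ denoting the constrained minimizer:

    \min_{w}\; L(w) \ \text{s.t.}\ \lVert w \rVert_2^2 \le t
    \quad\Longrightarrow\quad
    \nabla L(\hat{w}) = -\lambda\, \nabla \lVert \hat{w} \rVert_2^2 = -2\lambda \hat{w},
    \qquad \lambda \ge 0.

At ŵ the gradient of the loss is anti-parallel to the gradient of the constraint, which is exactly the point where a level curve of the loss touches the L2 ball tangentially.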
0
votes
1 answer

Why might we get underfitting without regularization?

In this article the author says "...without applying regularization we also run the risk of underfitting...". Why might we get underfitting without regularization? Regularization "makes" the network simpler to avoid overfitting, not underfitting. So,…
theateist
  • 13,879
  • 17
  • 69
  • 109
0
votes
1 answer

Which kind of regularization to use, L2 regularization or dropout, in MultiRNNCell?

I have been working on a project related to a sequence-to-sequence autoencoder for time series forecasting. So, I have used tf.contrib.rnn.MultiRNNCell in the encoder and decoder. I am confused about which strategy to use in order to regularize my seq2seq…
dnovai
  • 137
  • 1
  • 8
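A minimal sketch of the dropout option mentioned in this question, using the TF 1.x contrib API the question names; the cell sizes and keep probability are illustrative assumptions:

    import tensorflow as tf

    keep_prob = 0.8  # typically a placeholder set to 1.0 at inference time

    def make_cell(num_units):
        # Wrap each LSTM cell with dropout on its outputs; this is the usual way
        # to regularize a stacked RNN, as opposed to an L2 penalty on the weights.
        cell = tf.contrib.rnn.LSTMCell(num_units)
        return tf.contrib.rnn.DropoutWrapper(cell, output_keep_prob=keep_prob)

    encoder_cell = tf.contrib.rnn.MultiRNNCell([make_cell(128) for _ in range(2)])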
0
votes
0 answers

How to regularize the intercept with glmnet

I know that glmnet does not regularize the intercept by default, but I would like to do it anyway. I was taking a look at this question and tried to do what whuber suggested (adding a constant variable and setting the parameter intercept to FALSE),…
gsmafra
  • 2,434
  • 18
  • 26