Questions tagged [regularized]

Regularization involves introducing additional information in order to solve an ill-posed problem or to prevent overfitting by shrinking the model's parameters toward zero.

In mathematics and statistics, particularly in the fields of machine learning and inverse problems, regularization involves introducing additional information in order to solve an ill-posed problem or to prevent overfitting. This information is usually of the form of a penalty for complexity, such as restrictions for smoothness or bounds on the vector space norm.

From http://en.wikipedia.org/wiki/Regularization_%28mathematics%29
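The penalty idea in the definition above can be made concrete with a minimal ridge (L2) sketch; the data, shapes, and λ values below are illustrative assumptions, not from any question on this page:

```python
import numpy as np

# Ridge regression: minimize ||Xw - y||^2 + lam * ||w||^2.
# Closed form: w = (X^T X + lam * I)^{-1} X^T y.
def ridge(X, y, lam):
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

w_small = ridge(X, y, 0.01)   # barely penalized
w_large = ridge(X, y, 100.0)  # heavily penalized: coefficients shrink toward zero
```

Increasing λ trades fit for a smaller coefficient norm, which is exactly the "bound on the vector space norm" the Wikipedia definition refers to.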


195 questions
4
votes
1 answer

Finding the maximum depth of a random forest given the number of features

How do we find the maximum depth of a random forest if we know the number of features? This is needed for regularizing a random forest classifier.
mach
  • 318
  • 1
  • 5
  • 13
4
votes
1 answer

How to use glmnet without regularization

I have read that glmnet can be used without regularization, i.e. it can be used as a regular glm. I am writing a thesis and trying to avoid using too many different packages, so it would be convenient to use glmnet to do a regular glm logistic…
Camilla
  • 113
  • 2
  • 12
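glmnet can indeed be run with `lambda = 0` (the package authors recommend plain `glm` for a single unpenalized fit). The underlying point, sketched in numpy under the assumption of a least-squares loss with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))
y = X @ np.array([0.7, -1.2]) + 0.05 * rng.normal(size=40)

# With the penalty weight set to zero, the "regularized" estimator
# collapses to the ordinary unpenalized least-squares fit.
w_ols  = np.linalg.lstsq(X, y, rcond=None)[0]
w_lam0 = np.linalg.solve(X.T @ X + 0.0 * np.eye(2), X.T @ y)  # ridge, lam = 0
```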
3
votes
2 answers

Updating Python sklearn Lasso(normalize=True) to Use Pipeline

I am new to Python. I am trying to practice basic regularization by following along with a DataCamp exercise using this…
TMo
  • 435
  • 4
  • 11
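For context on this question: `normalize=True` was deprecated in scikit-learn 1.0 and removed in 1.2, and the suggested replacement is a `Pipeline` with `StandardScaler`. Note the two are not bit-identical (`normalize=` scaled by the l2 norm, `StandardScaler` by the standard deviation). A minimal sketch with made-up data:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))
y = X[:, 0] * 3.0 + 0.1 * rng.normal(size=100)  # only feature 0 matters

# Scaling and the Lasso fit travel together, so cross-validation
# refits the scaler on each training fold (the point of the Pipeline).
model = make_pipeline(StandardScaler(), Lasso(alpha=0.1))
model.fit(X, y)
coef = model.named_steps["lasso"].coef_
```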
3
votes
1 answer

Regularization in Modelica

I wonder whether other regularization techniques (e.g. hyperbolic tangent) besides the smoothStep function exist in Modelica. I work on a complex model in which I used smoothStep several times to avoid chattering; however, I have seen some side…
AJodeiri
  • 314
  • 2
  • 8
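The tanh alternative mentioned in the question is language-agnostic; here is a sketch in Python, with the transition width `eps` as an assumed parameter:

```python
import math

def tanh_step(x, eps=1e-3):
    # Smooth approximation of a unit step: ~0 for x << -eps,
    # ~1 for x >> eps, with an infinitely differentiable transition
    # in between -- the same role smoothStep plays in Modelica.
    return 0.5 * (1.0 + math.tanh(x / eps))
```

Shrinking `eps` sharpens the transition at the cost of stiffer derivatives, which is the usual trade-off when regularizing discontinuities in simulation code.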
3
votes
1 answer

Why do I get probabilities outside 0 and 1 with my regularized logistic glmnet code?

library(tidyverse) library(caret) library(glmnet) creditdata <- read_excel("R bestanden/creditdata.xlsx") df <- as.data.frame(creditdata) df <- na.omit(df) df$married <- as.factor(df$married) df$graduate_school <-…
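A common cause here: for a binomial glmnet fit, `predict()` returns the linear predictor (log-odds) unless `type = "response"` is given. The distinction, sketched in Python with an illustrative value:

```python
import math

def sigmoid(z):
    # Inverse link for logistic regression: maps log-odds to (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

logit = 2.7            # unbounded, like glmnet's default type = "link" output
prob = sigmoid(logit)  # bounded in (0, 1), like type = "response"
```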
3
votes
1 answer

Why does regularization in PyTorch not match my scratch code, and what formula does PyTorch use for regularization?

I have been trying to do L2 regularization on a binary classification model in PyTorch, but the results from PyTorch and my scratch code don't match. PyTorch code: class LogisticRegression(nn.Module): def…
Rest1ve
  • 105
  • 8
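One frequent source of this mismatch: PyTorch's optimizer `weight_decay` adds `lam * w` directly to the gradient, which corresponds to a penalty of `(lam / 2) * ||w||²` with no `1/m` factor and no bias exclusion, while scratch implementations (e.g. course-style code) often use `(lam / (2m)) * Σ w²`. A numpy sketch with an illustrative stand-in gradient:

```python
import numpy as np

m, lam = 50, 0.1
w = np.array([0.5, -1.5, 2.0])
grad_loss = np.array([0.1, 0.2, -0.3])   # stand-in data-term gradient

# PyTorch-style weight decay: gradient gets lam * w added verbatim.
grad_pytorch = grad_loss + lam * w
# Course-style penalty (lam / (2m)) * sum(w**2): gradient gets (lam / m) * w.
grad_scratch = grad_loss + (lam / m) * w
```

The two only agree if the scratch `lam` is `m` times the PyTorch `lam` (and the bias is handled the same way on both sides).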
3
votes
1 answer

Lasso regression won't remove 2 features which are highly correlated

I have two features, say F1 and F2, which have a correlation of about 0.9. When I built my model, I first considered all the features to go into my regression model. Once I had my model, I then ran Lasso regression on it, with the hope that…
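What the lasso typically does with near-duplicate features can be seen in a small coordinate-descent sketch (synthetic data, illustrative λ): it keeps one of the correlated pair and zeroes the other, rather than removing both.

```python
import numpy as np

def soft_threshold(rho, lam):
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def lasso_cd(X, y, lam, iters=500):
    # Coordinate descent for (1/2n)||y - Xw||^2 + lam * ||w||_1.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]          # partial residual
            rho = X[:, j] @ r / n
            w[j] = soft_threshold(rho, lam) / (X[:, j] @ X[:, j] / n)
    return w

rng = np.random.default_rng(3)
f1 = rng.normal(size=200)
f2 = f1 + 0.01 * rng.normal(size=200)   # nearly perfectly correlated pair
X = np.column_stack([f1, f2])
y = 2.0 * f1
w = lasso_cd(X, y, lam=0.1)             # one coefficient large, one ~0
```

With correlation around 0.9 (rather than ~1 as above), the lasso may keep both with shrunken weights; the elastic net is the usual remedy when grouped selection is wanted.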
3
votes
2 answers

Regularized Logistic Regression in Python (Andrew Ng course)

I'm starting the ML journey and I'm having trouble with this coding exercise. Here is my code: import numpy as np import pandas as pd import scipy.optimize as op # Read the data and give it labels data = pd.read_csv('ex2data2.txt', header=None,…
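For reference, the regularized cost and gradient from that course have a detail that trips many people up: the intercept θ₀ is excluded from the penalty. A sketch with a tiny made-up dataset:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_grad(theta, X, y, lam):
    # J = -(1/m) sum[y log h + (1-y) log(1-h)] + (lam/(2m)) sum(theta[1:]^2)
    # The intercept theta[0] is NOT penalized.
    m = len(y)
    h = sigmoid(X @ theta)
    reg = (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    J = -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h)) + reg
    grad = X.T @ (h - y) / m
    grad[1:] += (lam / m) * theta[1:]
    return J, grad

X = np.array([[1.0, 0.5], [1.0, -0.5], [1.0, 1.5], [1.0, -1.5]])
y = np.array([1.0, 0.0, 1.0, 0.0])
J0, g0 = cost_grad(np.zeros(2), X, y, lam=1.0)   # at theta = 0, J = log(2)
```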
3
votes
1 answer

How to add L1 regularization in Python?

I am trying to code logistic regression from scratch. In this code I have, I thought my cost derivative was my regularization, but I've been tasked with adding L1-norm regularization. How do you add this in Python? Should this be added where I have…
DN1
  • 234
  • 1
  • 13
  • 38
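For the question above: the L1 term is added to the cost derivative as a subgradient, alongside the data term. A sketch (the `(lam/m)` scaling is one common convention, and the bias is skipped):

```python
import numpy as np

def l1_grad_term(theta, lam, m):
    # Subgradient of (lam/m) * ||theta[1:]||_1.
    # At theta_j = 0 we use 0, a valid subgradient choice.
    # Add the result to the data-term gradient; theta[0] (bias) is skipped.
    g = np.zeros_like(theta)
    g[1:] = (lam / m) * np.sign(theta[1:])
    return g

g = l1_grad_term(np.array([3.0, 2.0, -1.0, 0.0]), lam=2.0, m=4)
```

Plain subgradient descent with this term rarely produces exact zeros; proximal methods (soft-thresholding) are preferred when sparsity itself is the goal.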
3
votes
1 answer

L2 regularization in caffe, conversion from lasagne

I have some Lasagne code. I want to create the same network in Caffe. I could convert the network, but I need help with the hyperparameters in Lasagne. They look like: lr = 1e-2 weight_decay = 1e-5 prediction =…
user27665
  • 673
  • 7
  • 27
3
votes
1 answer

How can I use regularization in TensorFlow-Slim?

I want to use regularization in my code. I used slim to create a conv2d like this: slim.conv2d(input, 256, [1, 1], stride=1, padding='SAME', scope='conv1') How can I add regularization to this, and how can I use it to regularize my loss?
Tavakoli
  • 1,303
  • 3
  • 18
  • 36
3
votes
2 answers

Optimization and regularization

I am trying to use total variation minimization for an image reconstruction problem. Essentially, I am trying to penalize differences in intensity between neighboring pixels in the reconstructed image. For this I minimize |Ax - b| + λ|F(x)|, where F(x) =…
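The excerpt is truncated, but the total-variation penalty F(x) it refers to is standard; in one dimension it is the sum of absolute differences between neighbors, sketched below:

```python
import numpy as np

def tv_1d(x):
    # Discrete total variation of a 1-D signal:
    # F(x) = sum_i |x[i+1] - x[i]|,
    # the penalty term in the objective |Ax - b| + lam * F(x).
    return np.sum(np.abs(np.diff(x)))

smooth = tv_1d(np.ones(5))                     # flat signal: zero penalty
jumpy  = tv_1d(np.array([0.0, 1.0, 1.0, 3.0]))  # |1| + |0| + |2| = 3
```

Penalizing this quantity favors piecewise-constant reconstructions, which is why TV is popular for denoising while preserving edges.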
2
votes
1 answer

Why do the coefficients from ordinal regularized regression (ordinalNet) have the wrong sign?

I have data with an ordered factor response ("low", "medium", "high") and a large number of predictor variables. I'm pursuing an ordinal regularized regression model using the ordinalNet package. This is the first time I've used this package, so I…
Arthur
  • 1,248
  • 8
  • 14
2
votes
1 answer

Logistic regression with dropout regularization in PyTorch

I want to implement a logistic regression with dropout regularization, but so far the only working example is the following: class logit(nn.Module): def __init__(self, input_dim = 69, output_dim = 1): super(logit, self).__init__() …
Marco Repetto
  • 336
  • 2
  • 15
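`torch.nn.Dropout` implements inverted dropout: activations are zeroed with probability p during training and survivors rescaled by 1/(1-p), so no rescaling is needed at eval time. A framework-free numpy sketch (array size and drop probability are illustrative):

```python
import numpy as np

def dropout(a, p_drop, rng, training=True):
    # Inverted dropout: zero each activation with probability p_drop and
    # rescale survivors by 1/(1 - p_drop) so E[output] == input.
    # At eval time the layer is the identity (as in torch.nn.Dropout).
    if not training or p_drop == 0.0:
        return a
    keep = 1.0 - p_drop
    mask = (rng.random(a.shape) < keep) / keep
    return a * mask

rng = np.random.default_rng(4)
a = np.ones(100_000)
out = dropout(a, 0.5, rng)   # mean stays ~1.0 despite half the entries being zero
```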
2
votes
1 answer

How can I unscale and understand glmnet coefficients while using tidymodels?

I'm a bit confused about how I should interpret the coefficients from the elastic net model that I'm getting through tidymodels and glmnet. Ideally, I'd like to produce unscaled coefficients for maximum interpretability. My issue is that I'm honestly…
Evan O.
  • 1,553
  • 2
  • 11
  • 20
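The arithmetic behind unscaling is the same regardless of package: if the model was fit on standardized predictors, divide each slope by the predictor's standard deviation and fold the means into the intercept. A numpy sketch with noise-free synthetic data so the round trip is exact:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(loc=5.0, scale=2.0, size=(200, 3))
y = X @ np.array([1.0, -0.5, 2.0]) + 3.0          # true coefs and intercept

# Fit on standardized predictors (with an explicit intercept column).
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
Z1 = np.column_stack([np.ones(len(y)), Z])
b = np.linalg.lstsq(Z1, y, rcond=None)[0]         # scaled-space estimates

# Unscale: b_orig = b_std / sd; the intercept absorbs the centering.
coef_orig = b[1:] / sd
intercept_orig = b[0] - np.sum(b[1:] * mu / sd)
```

Caveat for glmnet specifically: with the default `standardize = TRUE`, glmnet already reports coefficients on the original scale, so this transformation is needed only when you standardized the data yourself (e.g. in a tidymodels recipe step).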