Questions tagged [mini-batch]

Use this tag for questions about mini-batches when working with neural networks, in particular with mini-batch stochastic gradient descent.

A mini-batch is one of the smaller subsets that the full input data (the batch) is split into; the network's parameters are updated after processing each mini-batch rather than after the whole batch.
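A minimal sketch of the idea in NumPy (the array shape and mini-batch size below are illustrative assumptions):

```python
import numpy as np

# Hypothetical training set: 1000 examples, 20 features each
X = np.random.randn(1000, 20)
mini_batch_size = 64  # illustrative choice

# Split the full batch into consecutive mini-batches of (at most) 64 examples
mini_batches = [X[i:i + mini_batch_size]
                for i in range(0, len(X), mini_batch_size)]
# Parameters are updated once per mini-batch instead of once per full pass
```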


84 questions
0 votes • 1 answer

Performing L1 regularization on a mini batch update

I am currently reading Neural Networks and Deep Learning and I am stuck on a problem: updating the code the author gives to use L1 regularization instead of L2 regularization. The original piece of code that uses L2 regularization…
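Not the book's exact code, but a sketch of how the weight-update line typically changes when L2 weight decay is swapped for L1; the names eta, lmbda, n and the list-of-arrays layout mirror common conventions and are assumptions here:

```python
import numpy as np

def update_weights_l1(weights, nabla_w, eta, lmbda, n, m):
    """One mini-batch step with L1 regularization.

    L2 update:  w -> (1 - eta*lmbda/n) * w          - (eta/m) * grad_w
    L1 update:  w -> w - (eta*lmbda/n) * np.sign(w) - (eta/m) * grad_w

    weights, nabla_w: lists of per-layer NumPy arrays,
    eta: learning rate, lmbda: regularization strength,
    n: training-set size, m: mini-batch size.
    """
    return [w - (eta * lmbda / n) * np.sign(w) - (eta / m) * nw
            for w, nw in zip(weights, nabla_w)]
```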
0 votes • 1 answer

How to implement mini-batch k-means using Apache Spark MLlib?

I have implemented k-means using Spark. But as my data is huge and the feature count is very large, I want to implement mini-batch k-means using Apache Spark MLlib. Is there any example or documentation on how to implement it?
Rahul • 645 • 1 • 9 • 21
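To my knowledge MLlib does not ship a MiniBatchKMeans, so answering this means implementing the update yourself. Below is a plain NumPy sketch of the mini-batch k-means update (per-center learning rates, as in Sculley's algorithm); the function name and defaults are illustrative, and porting it to Spark would mean replacing the sampling and assignment steps with DataFrame/RDD operations:

```python
import numpy as np

def mini_batch_kmeans(X, k, batch_size=100, n_iter=100, seed=0):
    """Mini-batch k-means on a NumPy array X of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    counts = np.zeros(k)                       # how often each center was updated
    for _ in range(n_iter):
        batch = X[rng.choice(len(X), batch_size, replace=False)]
        # Assign each batch point to its nearest center
        d2 = ((batch[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        # Move each winning center toward the point with a shrinking step size
        for x, c in zip(batch, labels):
            counts[c] += 1
            centers[c] += (x - centers[c]) / counts[c]
    return centers
```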
0 votes • 1 answer

TensorFlow: How can I use the MNIST dataset with a full batch?

I'm studying machine learning and found TensorFlow CNN code that uses the MNIST dataset. Here's the code I want to ask about: cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y_conv), reduction_indices=[1])) train_step =…
YOON • 47 • 1 • 1 • 5
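A hedged TensorFlow 1.x sketch of the difference between mini-batch and full-batch feeding, with a minimal softmax model standing in for the tutorial's CNN (the placeholder names follow the excerpt, but the model itself is an assumption):

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Minimal softmax model in place of the tutorial's CNN
x = tf.placeholder(tf.float32, [None, 784])
y_ = tf.placeholder(tf.float32, [None, 10])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, W) + b)

cross_entropy = tf.reduce_mean(
    -tf.reduce_sum(y_ * tf.log(y + 1e-10), reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Mini-batch training: 50 examples per gradient step
    for _ in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(50)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

    # Full batch: feed the entire training set in a single step
    # (only feasible if it fits in memory; with a CNN it usually does not)
    sess.run(train_step,
             feed_dict={x: mnist.train.images, y_: mnist.train.labels})
```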
-1 votes • 1 answer

Colon operator in List Slicing

mini_batch_X = shuffled_X[:, k * mini_batch_size:(k + 1) * mini_batch_size] What are the semantics of the above line? What does the first colon mean?
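A small NumPy illustration of the same slice (it follows the common convention where examples are stored as columns; the array contents are made up):

```python
import numpy as np

shuffled_X = np.arange(12).reshape(3, 4)   # 3 features (rows) x 4 examples (cols)
mini_batch_size, k = 2, 1

# ":" on its own selects every index along that axis, here all rows (features).
# The second slice keeps columns k*size .. (k+1)*size - 1, i.e. columns 2 and 3.
mini_batch_X = shuffled_X[:, k * mini_batch_size:(k + 1) * mini_batch_size]
print(mini_batch_X)
# [[ 2  3]
#  [ 6  7]
#  [10 11]]
```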
-1 votes • 1 answer

PyTorch minibatch training very slow

When training my model on the Adult income data set with mini-batches, training is very slow regardless of whether I use PyTorch's DataLoader or a basic hand-rolled implementation of mini-batch training. Is there a problem with my code, or is there another way to…
joni • 35 • 2 • 10
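Without the asker's code it is hard to say, but a common cause is taking one optimizer step per example (or per tiny batch). A hedged sketch with synthetic stand-in data, showing one step per DataLoader mini-batch:

```python
import torch
from torch import nn
from torch.utils.data import TensorDataset, DataLoader

# Synthetic stand-in for the Adult income data: 32k rows, 14 features, binary label
X = torch.randn(32_000, 14)
y = torch.randint(0, 2, (32_000,)).float()

loader = DataLoader(TensorDataset(X, y), batch_size=256, shuffle=True)

model = nn.Sequential(nn.Linear(14, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):
    for xb, yb in loader:              # one optimizer step per 256-row mini-batch,
        opt.zero_grad()                # not one per individual example
        loss = loss_fn(model(xb).squeeze(1), yb)
        loss.backward()
        opt.step()
```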
-1 votes • 1 answer

When implementing mini-batch gradient descent, is it better to choose the training examples randomly?

When implementing mini-batch gradient descent, is it better to choose the training examples (on which the derivatives are computed) at random? Or would it be better to shuffle the whole training set, iterate through it, and reshuffle every epoch? The first…
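A sketch of the usual compromise (shuffle once per epoch, then walk through contiguous slices); the helper name below is made up:

```python
import numpy as np

def epoch_mini_batches(n_examples, mini_batch_size, rng):
    """Yield index arrays: shuffle once per epoch, then take contiguous slices.

    Every example is visited exactly once per epoch, in a fresh random order
    each epoch; sampling each mini-batch independently at random would not
    guarantee that coverage.
    """
    perm = rng.permutation(n_examples)
    for start in range(0, n_examples, mini_batch_size):
        yield perm[start:start + mini_batch_size]

rng = np.random.default_rng(0)
for epoch in range(3):
    for idx in epoch_mini_batches(1000, 64, rng):
        pass  # compute gradients on X[idx], y[idx] and update the parameters
```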
-1 votes • 1 answer

How to deal with the randomness of NN training process?

Consider the training process of a deep feed-forward neural network using mini-batch gradient descent. As far as I understand, at each epoch of training we have a different random set of mini-batches. Then, iterating over all mini-batches and computing the…
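If the goal is simply to make the run-to-run randomness (mini-batch order, weight initialization) reproducible, fixing the seeds is the usual first step. A hedged sketch assuming a PyTorch/NumPy setup, since the excerpt does not name a framework:

```python
import random
import numpy as np
import torch

def set_seed(seed: int = 42):
    """Fix the seeds that drive mini-batch shuffling and weight initialization."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    # Trade some speed for more deterministic GPU kernels
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_seed(0)
```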
-2 votes • 1 answer

What is the right way of mini-batching the validation set while training?

I am training a neural network. For training I take 80% of my data and divide it into a number of mini-batches. I train on each mini-batch, updating the parameters, until all the data has been visited, and I repeat the whole procedure for a number of epochs. The…
user25004 • 1,868 • 1 • 22 • 47
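One common answer: mini-batch the validation set only to bound memory, perform no parameter updates, and average the per-batch losses so the result does not depend on the batch size chosen. A hedged PyTorch sketch (the framework is an assumption; the excerpt does not name one):

```python
import torch

@torch.no_grad()
def validate(model, loader, loss_fn):
    """Evaluate on the validation set in mini-batches, without updating parameters."""
    model.eval()
    total_loss, total_n = 0.0, 0
    for xb, yb in loader:
        loss = loss_fn(model(xb), yb)
        total_loss += loss.item() * len(xb)   # weight each batch by its size
        total_n += len(xb)
    return total_loss / total_n
```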
-2 votes • 1 answer

Pytorch minibatching keeps model from training

I am trying to classify sequences by a binary feature. I have a dataset of sequence/label pairs and am using a simple one-layer LSTM to classify each sequence. Before I implemented minibatching, I was getting reasonable accuracy on a test set (80%),…
glarik • 1 • 1
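A frequent culprit in this situation is padding: once variable-length sequences are batched, the LSTM treats the pad positions as real input unless they are packed. A hedged sketch of padding plus packing (dimensions and names are illustrative, not the asker's model):

```python
import torch
from torch import nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence

# Three variable-length sequences of 8-dimensional feature vectors
seqs = [torch.randn(5, 8), torch.randn(3, 8), torch.randn(7, 8)]
lengths = torch.tensor([len(s) for s in seqs])

padded = pad_sequence(seqs, batch_first=True)     # (batch=3, time=7, features=8)
packed = pack_padded_sequence(padded, lengths,    # LSTM skips the padded steps
                              batch_first=True, enforce_sorted=False)

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
_, (h_n, _) = lstm(packed)            # h_n: final hidden state per sequence, (1, 3, 16)
logits = nn.Linear(16, 1)(h_n[-1])    # one binary logit per sequence in the batch
```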