The number 1407 does not refer to the number of samples; it is the number of steps per epoch. For example, suppose you have 1000 training samples. If you set batch_size=100, it takes 10 steps per epoch to go through the entire data set. If you do not specify a batch_size, model.fit defaults to 32. 45000/32 = 1406.25, which is rounded up to 1407. 1407 x 32 = 45024, so in each epoch you go through your entire training set once, plus 24 additional samples.

For validation data it is best to go through the validation set exactly once per epoch. Therefore try to select the validation batch size so that validation_samples/validation_batch_size is an integer, and pass that quotient as validation_steps in model.fit.

Here is a handy little function that determines the largest usable batch size and the corresponding number of steps, where length is the number of samples in the data set and b_max is the maximum batch size your memory capacity allows.
def get_bs(length, b_max):
    # largest divisor of length that does not exceed b_max
    batch_size = sorted([int(length / n) for n in range(1, length + 1)
                         if length % n == 0 and length / n <= b_max], reverse=True)[0]
    return batch_size, int(length / batch_size)
# example
batch_size, steps = get_bs(1000, 80)
print(batch_size, steps)
# results in batch_size=50 and steps=20
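The steps-per-epoch rounding described above can also be checked directly; a minimal sketch using Python's math module (the 45000-sample count and the default batch size of 32 are taken from the explanation above):

```python
import math

samples = 45000
default_batch = 32  # model.fit's default batch_size

# the partial final batch is rounded up to a full step
steps_per_epoch = math.ceil(samples / default_batch)
print(steps_per_epoch)                   # 1407
print(steps_per_epoch * default_batch)   # 45024 -> 24 extra samples per epoch
```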
This function can also serve as a crude primality test: if length is prime, its only divisors are 1 and itself, so calling it with b_max=length-1 returns a batch_size of 1.
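To illustrate the primality trick, here is a minimal self-contained sketch (a compact rewrite of the same divisor search, not the original function; 997 is prime):

```python
def get_bs(length, b_max):
    # largest divisor of length that does not exceed b_max
    batch_size = max(n for n in range(1, b_max + 1) if length % n == 0)
    return batch_size, length // batch_size

# 997 is prime, so with b_max = length - 1 the only divisor found is 1
print(get_bs(997, 996))   # (1, 997)
# a composite length still yields a useful batch size
print(get_bs(1000, 80))   # (50, 20)
```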