Explain how gradient descent helps minimize loss functions.
Ans: Gradient descent is an optimization algorithm used to minimize a loss function by iteratively updating the parameters in the direction of steepest descent, which is the direction opposite the gradient. At each step the parameters are updated as w <- w - lr * grad(L(w)), where lr is the learning rate; repeating this moves the parameters toward a (local) minimum of the loss.
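The update rule above can be sketched in a few lines of Python. This is a minimal illustration, not a production optimizer: the quadratic loss (w - 3)^2, the learning rate 0.1, the starting point 0.0, and the step count are all illustrative choices.

```python
# Minimal gradient descent sketch on a toy loss: loss(w) = (w - 3)**2.
# Its gradient is 2 * (w - 3); the minimum is at w = 3.
# Learning rate, start point, and step count are illustrative assumptions.

def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient: w <- w - lr * grad(w)."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

loss = lambda w: (w - 3) ** 2
grad = lambda w: 2 * (w - 3)   # derivative of the toy loss

w_min = gradient_descent(grad, w0=0.0)
print(w_min)  # converges toward 3, the minimizer of the loss
```

If the learning rate is too large the iterates can overshoot and diverge; too small and convergence is slow. Tuning it is a standard part of using gradient descent in practice.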