Experiment with different learning rates and see how they affect the number of steps required to reach the minimum of the loss curve. Try the exercises below the graph.
*[Interactive graph: controls to set the learning rate, execute a single gradient descent step, and reset the graph.]*
Depending on the learning rate you choose, the exercise reports one of three outcomes:

- ✓ This model has reached minimal loss. Is it possible to achieve similar results with fewer steps?
- ⚠ Gradient descent is taking too long to reach the minimum loss. Try again with a different learning rate.
- ⚠ Gradient descent will never reach the minimum loss. Try again with a different learning rate.
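If you don't have access to the interactive graph, the following minimal sketch reproduces the same three outcomes in code. It runs gradient descent on an assumed quadratic loss, L(w) = (w − 3)², whose minimum is at w = 3; the loss function, starting point, and step budget are illustrative assumptions, not part of the original exercise.

```python
def gradient_descent(learning_rate, start=0.0, max_steps=100, tolerance=1e-6):
    """Minimize the assumed loss L(w) = (w - 3)^2 by gradient descent.

    Returns the number of update steps taken and the final weight.
    """
    w = start
    for step in range(max_steps):
        gradient = 2 * (w - 3)        # dL/dw for L(w) = (w - 3)^2
        if abs(gradient) < tolerance: # close enough to the minimum
            return step, w
        w -= learning_rate * gradient # take one gradient descent step
    return max_steps, w               # ran out of steps before converging

# Illustrative learning rates: too small (slow), reasonable, large but
# still convergent (oscillates), and too large (diverges).
for lr in [0.01, 0.1, 0.9, 1.1]:
    steps, w = gradient_descent(lr)
    print(f"learning rate {lr:>4}: {steps} steps, final w = {w:.4f}")
```

With these assumed values, a learning rate of 0.1 reaches the minimum in roughly 70 steps, 0.01 exhausts the step budget before converging ("taking too long"), and 1.1 overshoots further with every step and never reaches the minimum.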