From e27850b87fd745863e2ca1b42161932f3ad4ce0f Mon Sep 17 00:00:00 2001
From: agnuspaul98
Date: Wed, 4 Feb 2026 13:46:27 +0530
Subject: [PATCH] Fix broken link to Autograd tutorial

---
 beginner_source/basics/optimization_tutorial.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/beginner_source/basics/optimization_tutorial.py b/beginner_source/basics/optimization_tutorial.py
index 82bfaa8f07c..3dfd60ecc6c 100644
--- a/beginner_source/basics/optimization_tutorial.py
+++ b/beginner_source/basics/optimization_tutorial.py
@@ -15,7 +15,7 @@
 Now that we have a model and data it's time to train, validate and test our model by optimizing its parameters on
 our data. Training a model is an iterative process; in each iteration the model makes a guess about the output, calculates
 the error in its guess (*loss*), collects the derivatives of the error with respect to its parameters (as we saw in
-the `previous section `_), and **optimizes** these parameters using gradient descent. For a more
+the `previous section `_), and **optimizes** these parameters using gradient descent. For a more
 detailed walkthrough of this process, check out this video on `backpropagation from 3Blue1Brown `__.
 
 Prerequisite Code