
PyTorch RuntimeError: "grad can be implicitly created only for scalar outputs" Solution 📂Machine Learning


Example 1


This error occurs when `loss.backward()` is called on a non-scalar tensor: without an explicit `gradient` argument, `backward()` can only be called on a scalar (0-dimensional) output. If you have set the loss as `loss = sum(a, b)`, Python's built-in `sum()` treats `a` as an iterable and `b` as the start value, reducing `a` only along its first dimension, so the resulting loss is still a tensor and backpropagation fails. Switching to `torch.sum` (e.g. `loss = torch.sum(a + b)`), which by default reduces over every element and returns a scalar, prevents the error.
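A minimal sketch reproducing the error and the fix, assuming `a` and `b` are example 2-D tensors (the shapes here are illustrative, not from the original post):

```python
import torch

a = torch.randn(3, 4, requires_grad=True)
b = torch.randn(3, 4, requires_grad=True)

# Python's built-in sum() iterates over the tensor's first dimension,
# so sum(a, b) adds each row of `a` to the start value `b`
# and returns a (3, 4) tensor, not a scalar.
loss = sum(a, b)
print(loss.shape)  # torch.Size([3, 4])

try:
    loss.backward()  # fails: implicit grad needs a scalar output
except RuntimeError as e:
    print(e)  # grad can be implicitly created only for scalar outputs

# torch.sum() with no dim argument reduces over every element,
# producing a 0-dim (scalar) tensor that backward() accepts.
loss = torch.sum(a + b)
loss.backward()
print(a.grad.shape)  # torch.Size([3, 4])
```

Alternatively, if you genuinely need a non-scalar loss, you can keep it and pass an explicit gradient, e.g. `loss.backward(gradient=torch.ones_like(loss))`.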