The demo program uses the Adam ("adaptive moment estimation") training optimizer. Adam often works better than basic SGD ("stochastic gradient descent") for regression problems. PyTorch 1.7 supports 11 different optimization algorithms.
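
Below is a minimal sketch of creating an Adam optimizer for a small regression network in PyTorch. The network class, layer sizes, and learning rate are illustrative assumptions, not values taken from the demo program.

# minimal sketch -- Net, layer sizes, and lr are illustrative assumptions
import torch

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.hid = torch.nn.Linear(8, 10)   # hypothetical input and hidden sizes
        self.out = torch.nn.Linear(10, 1)   # single output for regression

    def forward(self, x):
        z = torch.tanh(self.hid(x))
        return self.out(z)

net = Net()

# Adam adapts a per-parameter step size from running estimates of the
# gradient's first and second moments.
optimizer = torch.optim.Adam(net.parameters(), lr=0.01)

# Basic SGD, by comparison, applies one fixed learning rate to all parameters:
# optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

Either optimizer is then used the same way inside the training loop: call optimizer.zero_grad(), compute the loss, call loss.backward(), and then optimizer.step().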