Using AdaBelief Optimizer in PyTorch
Description
AdaBelief, introduced by J. Zhuang et al. in 2020, is a variant of Adam. Since PyTorch does not provide this optimizer natively, it must be installed separately.
Code
Installation
The following command can be used to install it from the command line.
pip install adabelief-pytorch==0.2.0
Usage
The code below imports the optimizer and attaches it to a model's parameters.
from adabelief_pytorch import AdaBelief

# model is any torch.nn.Module; AdaBelief recommends a much smaller eps than Adam's default
optimizer = AdaBelief(model.parameters(), lr=1e-3, eps=1e-16, betas=(0.9, 0.999), weight_decouple=True, rectify=False)
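Since AdaBelief follows the standard torch.optim interface, it drops into a normal training loop. Below is a minimal sketch; the toy linear model, loss, and synthetic data are illustrative assumptions, and only the AdaBelief call itself comes from the snippet above.

import torch
from adabelief_pytorch import AdaBelief

model = torch.nn.Linear(10, 1)          # toy model for illustration
criterion = torch.nn.MSELoss()
optimizer = AdaBelief(model.parameters(), lr=1e-3, eps=1e-16, betas=(0.9, 0.999),
                      weight_decouple=True, rectify=False)

x = torch.randn(32, 10)                 # synthetic input batch
y = torch.randn(32, 1)                  # synthetic targets

for epoch in range(5):
    optimizer.zero_grad()               # clear accumulated gradients
    loss = criterion(model(x), y)       # forward pass
    loss.backward()                     # backward pass
    optimizer.step()                    # AdaBelief parameter update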
Environment
- OS: Windows 11
- Versions: Python 3.11.5, torch==2.0.1+cu118, adabelief-pytorch==0.2.0