4 Commits (31d6d30e6a1921f1a715ff40068b760e49d1d500)

Author                SHA1        Message                                                      Date
wangnan39@huawei.com  172728a6a6  support weight decay for sparse optimizer                    5 years ago
panyifeng             3c2057297e  support multi param for tuple grad                           5 years ago
wangnan39@huawei.com  d4e3d69f37  support loss scale for sparse situation                      5 years ago
wangnan39@huawei.com  4042f16ce4  add lazy adam optim and support sparse adam & ftrl for cpu backend  5 years ago