18 Commits (052d8e2c9940aa8a7b147b881958642bfcc33fe7)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| danishfarid | c34e52c3d6 | first commit | 4 years ago |
| simson | 90d594a357 | code check for ops | 4 years ago |
| mindspore-ci-bot | ab0f23a90e | !9491 GPU add trace for error/excpt | 5 years ago |
| mindspore-ci-bot | 6acf699302 | !9422 Add dynamic supports to op allreduce and reducesum on gpu | 5 years ago |
| VectorSL | 6c6e2e5478 | add trace for gpu error/excpt log | 5 years ago |
| zhouyuanshen | 458f0e7c58 | dynamic shape adapting for allreduce and reducesum | 5 years ago |
| Jonathan Yan | d7bb5cd058 | Fix CI Alarms | 5 years ago |
| ZPaC | db3a2d60cb | GPU supports p2p nccl interfaces | 5 years ago |
| Yi Huaijie | d7faa77b5e | support int64 shape | 5 years ago |
| baihuawei | 9f75418001 | fix GPU BroadCast | 5 years ago |
| baihuawei | 572a7c4741 | fix nccl broadcast | 5 years ago |
| lizhenyu | fcaf86f5d9 | fix nccl kernel memory align bug | 5 years ago |
| lizhenyu | 839ec02542 | Add FusedBatchEx support | 5 years ago |
| baihuawei | b9ebd9c280 | add gpu nccl broadcast | 5 years ago |
| ZPaC | 0bc74f28c5 | Enable get rank id and size by group | 5 years ago |
| liubuyu | 76dc80e7b7 | Unified code style | 5 years ago |
| ZPaC | ab23776f5f | GPU supports to create groups for auto parallel. | 5 years ago |
| liubuyu | 43c79eb853 | mindspore path adjust | 5 years ago |