26 Commits (a64920c46802a3030bfdf8754870ffb1e5c6ba78)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| huangxinjing | f17f3325c0 | Fix cco and api comment | 4 years ago |
| lilei | f8827a09fb | modify parallel API note for master | 4 years ago |
| huangxinjing | 8c2dec7fe2 | Fix coo layers | 4 years ago |
| huangxinjing | d1123c9b20 | fix linear merge error for pangu model | 4 years ago |
| huangxinjing | 7bd77e549c | Fix number of the input params error | 4 years ago |
| huangxinjing | a3ee85ede4 | Fix reshape error | 4 years ago |
| huangxinjing | 827597f6fe | Add transformer example | 4 years ago |
| huangxinjing | 9ddd2b7669 | Add past none check | 4 years ago |
| linqingke | acde7febef | update pangu reshape and softmax performance. | 4 years ago |
| huangxinjing | b787c5c8c8 | Fix spell error | 4 years ago |
| huangxinjing | 31b3b46852 | Replace TensorAdd with Add | 4 years ago |
| i-robot | cdbe9b9a64 | !23596 [Auto parallel] Move the MoE-related staff to an isolated file | 4 years ago |
| Xiaoda Zhang | 615be06ec8 | move moe-related staff to an isolated file | 4 years ago |
| huangxinjing | 0b89d5c9c4 | fix batch size error | 4 years ago |
| huangxinjing | c3a98bab2b | Add code check | 4 years ago |
| i-robot | 7828fb0028 | !23328 Fix error message | 4 years ago |
| yao_yf | 3d1f74b9d0 | fixed sparse atten add strategy for div | 4 years ago |
| huangxinjing | 7932c88aaf | fix message error | 4 years ago |
| huangxinjing | e02f553010 | Fix spell error and add mode check | 4 years ago |
| Xiaoda Zhang | 5613c0b974 | add a moe implementation: | 4 years ago |
| huangxinjing | 6cea07f749 | Add args check | 4 years ago |
| huangxinjing | 035eca2485 | Fix performance degrades | 4 years ago |
| huangxinjing | c7ec8c5f9f | Add value check and fix sparse brop | 4 years ago |
| yao_yf | 82889ec56b | fixed sparse attention | 4 years ago |
| zhihenghu | ce12c02343 | Add Sparse Attention | 4 years ago |
| huangxinjing | d777742904 | 1. Move the class to mindspore.parallel, support activation sharding | 4 years ago |