21 Commits (7afcdfd211dabbf10d80cdcff69aad0e246306c0)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| zhoufeng | f49b195c39 | extract common as an independent shared library | 4 years ago |
| yao_yf | e21f878e14 | adasum ut fix | 4 years ago |
| i-robot | 14393503b7 | !30431 allreduce allgather fusion | 4 years ago |
| jiahongQian | 8a2151d8bb | allgather reducescatter fusion | 4 years ago |
| yao_yf | 19236b1a70 | auto parallel adasum support data parallel and hybrid parallel | 4 years ago |
| yao_yf | 4b79d4c425 | auto parallel adasum uts and checks | 4 years ago |
| yao_yf | 20f642a014 | auto parallel support adasum cpp part | 4 years ago |
| b00518648 | 2ff3425c76 | 1. consider comm cost inside an op when selecting a strategy; 2. deal with ops that share the same param; 3. add UT to fix the performance at pangu_alpha | 4 years ago |
| liuluobin | 7c1ea41934 | Clearing code check alarm for parallel | 4 years ago |
| lichenever | 78839a2413 | parallel_support_make_tuple | 4 years ago |
| yangzhenzhang | 2a0b528084 | support opt parallel for adafactor | 4 years ago |
| lilei | 05189459ab | auto insert VirtualDataset node for master | 4 years ago |
| He Wei | 41dcac9c49 | Replace std::unordered_map/set with robin-hood-hashing | 4 years ago |
| yao_yf | 01dc4bbdf9 | fix fault recover in optimizer shard | 4 years ago |
| yao_yf | 501b978d16 | find data parallel common group in auto parallel | 4 years ago |
| huangxinjing | f354ab22a3 | add pipeline shard interface | 4 years ago |
| b00518648 | ea50695cae | pclint | 4 years ago |
| i-robot | 1de156bec0 | !23635 [AutoParallel] fix_DTS | 4 years ago |
| lichenever | 84899159a4 | fix_DTS | 4 years ago |
| He Wei | bffa1e6a39 | Optimize ordered_map/set performance | 4 years ago |
| yangzhenzhang | 7ca64d2235 | auto parallel support adafactor opt | 4 years ago |