21 Commits (07dcd6e0cc5f07d40b9052d9934b7bf3b9524e59)

Author SHA1 Message Date
  ch-l f806b72447 use DeviceMemory for memory control 5 years ago
  zhoufeng c2b3360d69 update clang format rule 5 years ago
  mindspore-ci-bot 46acf23825 !405 [AutoParallel] Adapt rec-prog generator to new parser 5 years ago
  mindspore-ci-bot 5b6b1ad727 !394 [AutoParallel] Simplify rec-prog parser mechanism 5 years ago
  ch-l c71234f383 improve rec-prog str generator 6 years ago
  Ziyan 0d208e00bd Model ALLTOALL as a single operator in cost model; scale the ALLTOALL, 6 years ago
  Chong b1f5e44cd4 improve parser 6 years ago
  Xiaoda Zhang 79de8f4bdf Adjusting backward communication cost of some operators 6 years ago
  yangzhenzhang 6d522f0a4f add parallel op for layernorm 6 years ago
  Xiaoda Zhang ffb2cb03a4 Change 'NOT_FULLY_USE_DEVICES' to 'FULLY_USE_DEVICES' and make ALL-1 user-specified-strategy valid in auto-parallel 6 years ago
  Xiaoda Zhang 0ac50a19f5 Model the memory cost in auto-parallel. It is calculated by the output of operators, plus the parameters. Additionally, modify the graph-operations in auto_parallel to include memory_cost. 6 years ago
  c00425699 d62f560b50 add_bool_type_check_in_comm_op 6 years ago
  c00425699 c8cdb6b331 support distributed GatherV2 operator 6 years ago
  yangzhenzhang b34c0e7a17 add parallel op for dropoutdomask 6 years ago
  c00425699 b413638f23 refactor OperatorCostPtr in OperatorInfo 6 years ago
  Xiaoda Zhang a153fad874 This commit is to separate the computation cost and memory cost in auto_parallel. Some related memory correction is removed. 6 years ago
  Su Teng 60b68a1470 sort include file in parallel dir 6 years ago
  Xiaoda Zhang 3d35792877 change_star_elimination: make the non-identity triangle_elimination exact 6 years ago
  Xiaoda Zhang c080ec7874 change star elimination: remove some redundant checking work 6 years ago
  lichenever f946aea10d fix graph mode loop sink bug in auto parallel 6 years ago
  zhunaipan 930a1fb0a8 initial version 6 years ago