335 Commits (a17f76dd1d83728cdc8ffcb52de694dfd3fcf12e)

Author SHA1 Message Date
  mindspore-ci-bot 8945884137 !8990 close BatchMatmul and ReduceSum in graph kernel 5 years ago
  mindspore-ci-bot 22d683a805 !8920 Adapt ops LinSpace for Ascend. 5 years ago
  mindspore-ci-bot ddff3c4277 !8969 [bug_fix]GPU distributed training causes a core dump when memory is insufficient 5 years ago
  mindspore-ci-bot 42cbdfcafc !8903 add trace info when mindspore error 5 years ago
  looop5 1b36f454b8 close BatchMatmul and ReduceSum in graph kernel 5 years ago
  liuxiao93 584e241e29 Adapt ops LinSpace for Ascend. 5 years ago
  mindspore-ci-bot 08dc1481c7 !8918 checkcircle should consider the edges coming from controldepend nodes 5 years ago
  mindspore-ci-bot 661b6073a4 !8921 fix return scalar 5 years ago
  jjfeing 27257b9901 add trace when mindspore error 5 years ago
  lizhenyu 6f6a0dfd7a [bug_fix]GPU distributed training causes a core dump when memory is insufficient 5 years ago
  lingyunli63 e6a5fc0739 consider controldepend edges in checkcircle 5 years ago
  jjfeing a607890256 fix_return_scalar 5 years ago
  mindspore-ci-bot 3f75f13556 !8648 PyNative Performance Optimization 5 years ago
  caifubi c7d6997819 pynative host device parallel 5 years ago
  mindspore-ci-bot 420a4dc162 !8857 fix bug of change axis of reduce kernel when format is 6hd or fracz 5 years ago
  lizhenyu 094f0b2a07 bugfix:fused batch norm op's input channel nums should be a multiple of 4 5 years ago
  LianLiguang 29d585385e fix bug of change axis reduce 5 years ago
  laiyongqiang 978b7e2e18 fix codex and review bot warning 5 years ago
  fangzehua 69ce58425d fix reshape dynamic and emb 5 years ago
  mindspore-ci-bot fb0e866ad1 !8269 forward unique dynamic shape 5 years ago
  yao_yf 31819bb4a7 support forward unique 5 years ago
  mindspore-ci-bot 286f5b05f7 !8493 [GraphKernel] Fuse composite ops separated by GetItem nodes 5 years ago
  mindspore-ci-bot 9969c83f75 !8689 [GraphKernel] Split shape ops for more fusion opportunity. 5 years ago
  liuxiao93 2aaf5e2e1b Adapt DynamicGRUV2Grad for Ascend new backend. 5 years ago
  dayschan 8e6d92eac9 Fuse composite ops separated by GetItem nodes 5 years ago
  mindspore-ci-bot 381455638a !8600 make gaps not lifelong: constraints from neighbor 5 years ago
  mindspore-ci-bot 05f858e3d6 !8597 remove semi-lifelong for communication op's input's memory 5 years ago
  mindspore-ci-bot dcff06a5d6 !8576 [GraphKernel] Disable all simplify patterns except SimplifyReduce in arithmetic_simplify. 5 years ago
  tronzhang 9d7494f4df split shape ops for more fusion opportunity. 5 years ago
  mindspore-ci-bot 66c6fe3d6a !8603 handle getnext output tensor as normal lifelong tensor 5 years ago
  mindspore-ci-bot 07633f2a2a !8601 independent node use the somas's memory 5 years ago
  dayschan a8bb28437c Temporarily disable all simplify patterns except SimplifyReduce, because some bizarre errors occur in arithmetic simplify. 5 years ago
  liubuyu 5bf70b24bb adjust dynamic_rnn_grad_fission_v2 position 5 years ago
  mindspore-ci-bot b411c05de0 !8495 Adapt DynamicGRUV2 forward for Ascend new backend. 5 years ago
  mindspore-ci-bot df1d5333db !8519 Migrate current graph kernel passes to Ascend back-end 5 years ago
  mindspore-ci-bot 24d04b1cb1 !8578 dynamic shape check 5 years ago
  lingyunli63 a51465c78b add graphkerneloptimize pass 5 years ago
  Ioannis Lamprou 7e72605fc8 make gaps not lifelong: constraints from neighbor 5 years ago
  laiyongqiang b8821bb2f3 handle getnext output tensor as normal lifelong tensor 5 years ago
  laiyongqiang 222159599a independent node use the somas's memory 5 years ago
  laiyongqiang 42c3830938 remove lifelong for communication output 5 years ago
  mindspore-ci-bot 3c6b91f6fa !8548 handle getNextOutput and contiguous conflict 5 years ago
  wilfChen 2291b7f2e6 dynamic shape check 5 years ago
  liuxiao93 d471ac491e Adapt DynamicGRUV2 forward for Ascend new backend. 5 years ago
  laiyongqiang 0fdc6b547d handle getNextOutput and contiguous conflict 5 years ago
  mindspore-ci-bot dbe5229c56 !8492 expand maximum_grad minimum_grad and dropout_grad 5 years ago
  mindspore-ci-bot e65c68a723 !8475 DynamicRNNGrad run error: cast Int32Imm 5 years ago
  zengzitao 28f1db74dd expand maximum_grad minimum_grad dropout_grad op 5 years ago
  mindspore-ci-bot 7b70c17fc0 !8449 【GraphKernel】Add Transpose into fusible list; Update akg submodule. 5 years ago
  mindspore-ci-bot b078954667 !8389 remove multiple circles 5 years ago