630 Commits (6fcd6cab684bd2d2ae2da5613efa3aa4e2cd7d5a)

Author SHA1 Message Date
mindspore-ci-bot f3aaf9b20f !16103 [MS][LITE] move mindir proto to core 4 years ago
jianghui58 d5ecee383a move mindir proto to core 4 years ago
Ziyan 2a752f24bf enable not fully use opt shard 5 years ago
zuochuanyong e7ea343738 add format transform pass on cpu 4 years ago
mindspore-ci-bot ac912672bf !15871 Support pass args or|and kwargs for OP CreateInstance. 4 years ago
mindspore-ci-bot 1e7866eb14 !15852 [GraphKernel]Disable GraphKernel in PynativeMode 4 years ago
Zhang Qinghua cd7f7d40fb Support pass args or|and kwargs for OP CreateInstance. 4 years ago
dayschan 490c0521ac Disable GraphKernel in PynativeMode 4 years ago
jjfeing 88c92cd263 clear parameter when param_info clone 4 years ago
huangbingjian 302746d6a3 fix code check 4 years ago
mindspore-ci-bot 78469f6083 !15356 Support mem reuse in control flow and multi-call subgraphs 4 years ago
liangzelang 052a803c63 adapt to mem reuse 5 years ago
mindspore-ci-bot ade65bac93 !15312 add the actor link by auto monad 4 years ago
limingqi107 c937a22bda add the actor link by auto monad 4 years ago
tronzhang 8ff3c16778 add swtich for parallel fusion and default is off 4 years ago
mindspore-ci-bot cd002cb7f7 !14893 enable stitch fusion on bert 5 years ago
r1chardf1d0 3b32995936 enable stitch fusion on bert 5 years ago
zhoufeng f8248d61b9 init plog when opentsd 5 years ago
luopengting 727dc08bfa fix TAINTED_SCALAR and capitalize constants 5 years ago
mindspore-ci-bot 5d96d0f7e9 !14583 3d graph format select reconstruct 5 years ago
yepei6 ca03a24083 correct the grammar error 5 years ago
liubuyu 40f34b0d90 3d graph reconstruct 5 years ago
mindspore-ci-bot 0ef2d78411 !14133 tensorprint_debug 5 years ago
mindspore-ci-bot 75fdaaa6aa !14304 [GraphKernel] Dump GraphKernel split info as text; dump akg kernel launch fail message 5 years ago
yepei6 5da7fb36c5 modify the tensorprint handle create process 5 years ago
dayschan 3c6c30024c dump graph_kernel_split info 5 years ago
mindspore-ci-bot 7f4994af7c !14186 Support while bprop 5 years ago
mindspore-ci-bot e9ada9fd1d !14192 add the force transform to avoid the utf8 error 5 years ago
mindspore-ci-bot ad140a8bf4 !14084 [GraphKernel] support matmul on D 5 years ago
yepei6 ce3597b727 add the force transform to avoid utf8 error 5 years ago
liangzelang ba65fb9f3c Support non-tail recursive graphs 5 years ago
lingyunli63 4b966ed40d support matmul on D 5 years ago
mindspore-ci-bot 2d73a35793 !14056 tensorprint segmentation 5 years ago
yepei6 0f28c1aa19 add the force cast to avoid the segmentation 5 years ago
mindspore-ci-bot 18e98c6a0b !13720 【GraphKernel】Add context graph_kernel_flags 5 years ago
dayschan 11ee3b1624 add context graph_kernel_flags 5 years ago
liuxiao93 723bbac438 revert nn.BatchNorm3d. 5 years ago
mindspore-ci-bot efe95ebbce !13724 optimize execute order for commops 5 years ago
kswang dc543f3f1e optimize execute order for commops 5 years ago
mindspore-ci-bot cf5eaf8590 !13050 Don't insert UpdateState for HyperMap func graph call, move auto monad eliminator out from CSE, and eliminate auto monad nodes for output node. 5 years ago
Zhang Qinghua e853df4ecd Don't insert UpdateState for HyperMap func graph call. 5 years ago
dingpeifei 87e41aaeee IR operators of GPU and CPU are unified as batchnorm 5 years ago
ms_yan 92e86804e1 init add acltdt handle create and destory 5 years ago
mindspore-ci-bot 2013e3f370 !13216 If data_format is NCDHW, BatchNorm to BatchNorm3D. 5 years ago
mindspore-ci-bot 654771df13 !13080 fix embeddinglookup infer 5 years ago
liuxiao93 d44c706baf batchnorm to batchnorm3d. 5 years ago
fangzehua dadbd54f0e add embedding infer 5 years ago
l00591931 bbdb050fc7 Change switch to Switch 5 years ago
mindspore-ci-bot 54c37bcd61 !12947 Add MaxPool3D,MaxPool3DGrad,MaxPool3DGradGrad ops for Ascend. 5 years ago
mindspore-ci-bot c69142fdc1 !12968 update reshape type for 3d nodes 5 years ago