99 Commits (693b4cfdc8aaffd69014e94f5909115cd2b7bcff)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| dayschan | a2967330ea | Normalize the Reduce nodes' axis in GraphKernel | 5 years ago |
| tronzhang | 9530904ef7 | only pass through target depend | 5 years ago |
| mindspore-ci-bot | bc38590e53 | !12926 [GraphKernel] Process for UpdateState node | 5 years ago |
| mindspore-ci-bot | 54fc5e0d2b | !12234 [GraphKernel] Support pipeline optimization for parallel fusion. | 5 years ago |
| dayschan | 49f78d5424 | Bugfix about execution order after GraphKernelSplitter | 5 years ago |
| tronzhang | 7252ffb66b | pipeline optimization for parallel fusion | 5 years ago |
| LianLiguang | 4acab81599 | use cpp infer first | 5 years ago |
| tronzhang | 36e65601d1 | absorb real scalar tensor | 5 years ago |
| dayschan | c165ab5bb1 | Combine the GraphKernelOptimization of Gpu and Ascend | 5 years ago |
| dayschan | 9d572f3963 | Refactor GraphKernelExpander (2nd submission) | 5 years ago |
| mindspore-ci-bot | aa71118a99 | !12281 fix exec order bug about monad and add test case in CI | 5 years ago |
| zengzitao | ef3507e973 | fix exec order bug about monad | 5 years ago |
| mindspore-ci-bot | 30005c9c64 | !12301 [GraphKernel] Eliminate redundant split nodes. | 5 years ago |
| tronzhang | be2b9978be | exclude special nodes from expand or basic fusion | 5 years ago |
| tronzhang | e953705521 | eliminate redundant split ops | 5 years ago |
| He Wei | 7d9a783993 | [auto-monad] Support side-effects by auto-monad | 5 years ago |
| dayschan | e0e6c39eae | Refactor GraphKernelExpander (1st submission) | 5 years ago |
| mindspore-ci-bot | 5bbf009829 | !12108 [GraphKernel] Set attribute to node safely. | 5 years ago |
| jinyaohui | 30a27b2adb | modify Gelu and FastGelu to GeLU and FastGeLU | 5 years ago |
| tronzhang | c1e63d4824 | set attr safely | 5 years ago |
| jinyaohui | d9be0c102d | add some ops | 5 years ago |
| mindspore-ci-bot | f9d9bba927 | !12006 [GraphKernel][Gpu] enable GraphKernel for layernorm and layernormGrad (sync from r1.1) | 5 years ago |
| mindspore-ci-bot | 0ff27ef3b4 | !11930 [GraphKernel] Replace Assign with InplaceAssign | 5 years ago |
| hanhuifeng2020 | c5f261d894 | enable GraphKernel for layernorm and layernormGrad (sync from r1.1) | 5 years ago |
| mindspore-ci-bot | a24ff36d9c | !11777 stitch fusion | 5 years ago |
| mindspore-ci-bot | 9efbef72fc | !11622 [GraphKernel] Moved ShapeOpsSplitter before GraphKernelSplitter | 5 years ago |
| l00591931 | 9ec100d069 | Change TensorAdd to Add, from r1.1 to master | 5 years ago |
| dayschan | 08345c54ea | [GraphKernel] Replace Assign with InplaceAssign | 5 years ago |
| dayschan | 8a09279ec3 | Moved ShapeOpsSplitter before GraphKernelSplitter; changed it to process sub func_graph only. | 5 years ago |
| r1chardf1d0 | 9d6392c5c5 | stitch info | 5 years ago |
| tronzhang | d078cbfa99 | support parallel fusion | 5 years ago |
| dayschan | 27b4e1653a | Raise akg ReduceSum precision | 5 years ago |
| dayschan | b9b4a5e5f7 | Add a restriction for getitem in basic_ops_fusion. | 5 years ago |
| dayschan | 8af78cd5ce | Added ExpandDims into GPU fusion list | 5 years ago |
| looop5 | 0a62d42d65 | add reorder_ops pass in graph kernel | 5 years ago |
| dayschan | 26ac9167f8 | Enhance the fusion capacity for getitem nodes. | 5 years ago |
| mindspore-ci-bot | 481a95cade | !9714 [GraphKernel] When the atomic clean node list is not empty, clean the batch at once. | 5 years ago |
| mindspore-ci-bot | 037a121e05 | !9691 expand ClipByNormNoDivSum in graph kernel | 5 years ago |
| tronzhang | 68868ab438 | clean batch when node list is not empty | 5 years ago |
| tronzhang | 056d7ffc56 | clean batch buffer at once | 5 years ago |
| looop5 | fa519433ef | expand ClipByNormNoDivSum | 5 years ago |
| mindspore-ci-bot | 7b311f7d2a | !9570 Modifications for GraphKernel | 5 years ago |
| tronzhang | 2b88731417 | change atomic add struct and add new condition for ControlDepend | 5 years ago |
| dayschan | 6be3cc6f0d | consider atomic_add strategy in graph splitter; fix bugs; fuse and inline single op | 5 years ago |
| looop5 | 848be9b07c | add Tile to expand list | 5 years ago |
| dayschan | e5306b913d | GraphKernel Fuser | 5 years ago |
| tronzhang | 13126653ec | process Cast when graph kernel is activated in amp | 5 years ago |
| tronzhang | 2190da9946 | support atomic clean and change package for akg | 5 years ago |
| huanghui | e17dd84c0b | add trace manager around backend opt | 5 years ago |
| mindspore-ci-bot | ebef1df00b | !8994 split dropout op and expand dropout | 5 years ago |