32 Commits (c94dea6a512eddb6cbe8b591268d82d7b9aa3209)

Author SHA1 Message Date
laiyongqiang 33d1427a14 optimize is all nop node detect in mem reuse 5 years ago
laiyongqiang eb37669e3b fix bug to remove reshape when reshape is depend's input 5 years ago
mindspore-ci-bot a663f2066c !2285 [Code Review] code review fix 5 years ago
jjfeing c26274f324 fix code review bug 5 years ago
limingqi107 0f4397cece fix all nop node graph execute 5 years ago
mindspore-ci-bot 4642df207a !2210 gpu optimize the max device memory config 5 years ago
limingqi107 55b3557c0d gpu optimize the max device memory config 5 years ago
limingqi107 20083679a0 gpu memreuse supports summary node 5 years ago
limingqi107 b83f90a8d8 gpu optimize Nop node 5 years ago
mindspore-ci-bot 5da171a735 !2076 fix summary nodes memory reuse refcount 5 years ago
laiyongqiang 09d5a7227e fix summary nodes memory reuse refcount 5 years ago
lizhenyu 2d16316c55 fix code review warnings 5 years ago
lizhenyu 0a55ebf6e9 fix code review defects 5 years ago
laiyongqiang 79843c3528 use VisitKernelWithReturnType instead of VisitKernel to get node's input 5 years ago
mindspore-ci-bot 3d28ba10ef !1631 fix reviewbot of mem_reuse 5 years ago
lizhenyu 9df3a8613c fix get nullptr when use graph manager 5 years ago
yangjie159 37904cea01 fix reviewBot of mem_reuse 5 years ago
yangjie159 cbf5390b34 refactor memreuse allocator 5 years ago
lizhenyu 23a57476da change gpu kernel runtime to support memory swap 5 years ago
lizhenyu d97f849484 add memory swap manager module 5 years ago
lizhenyu c5ac2cc38c add memory copy module 5 years ago
lizhenyu af3307ec6e add mem swap module header file 5 years ago
yangjie159 72f9fe22d4 modify ascend device max memoryGB to 26 6 years ago
limingqi107 d9197b591a gpu optimize the use of reference count 6 years ago
limingqi107 63f3a2caac gpu optimize some return values of dynamic memory pool 6 years ago
kswang 1e8997f4cb optimize sort for mem reuse and fix memreuse bug 6 years ago
limingqi107 2891f0d20d gpu dynamic memory pool supports multi-allReduce 6 years ago
zhoufeng c2b3360d69 update clang format rule 6 years ago
gukecai f8208c7c52 Support GetNext Parallel 6 years ago
limingqi107 99f12f9105 gpu uses dynamic memory pool by default 6 years ago
kswang 04be6a37f0 add getptr for memreuse 6 years ago
zhunaipan 930a1fb0a8 initial version 6 years ago