zuochuanyong
e7ea343738
add format transform pass on cpu
4 years ago
jjfeing
88c92cd263
clear parameter when param_info clone
4 years ago
mindspore-ci-bot
78469f6083
!15356 Support mem reuse in control flow and multi-call subgraphs
From: @liangzelang
Reviewed-by: @zhoufeng54,@kisnwang
Signed-off-by: @kisnwang
4 years ago
liangzelang
052a803c63
adapt to mem reuse
5 years ago
limingqi107
c937a22bda
add the actor link by auto monad
4 years ago
luopengting
727dc08bfa
fix TAINTED_SCALAR and capitalize constants
1. fix the tainted variable 'path'
2. capitalize constants in env_config_parser
3. move attribute-related constants to common utils.h
5 years ago
liubuyu
40f34b0d90
3d graph reconstruct
5 years ago
mindspore-ci-bot
7f4994af7c
!14186 Support while bprop
From: @liangzelang
Reviewed-by: @kisnwang,@jjfeing
Signed-off-by: @jjfeing
5 years ago
liangzelang
ba65fb9f3c
Support non-tail recursive graphs
5 years ago
lingyunli63
4b966ed40d
support matmul on D
5 years ago
liuxiao93
723bbac438
revert nn.BatchNorm3d.
5 years ago
mindspore-ci-bot
efe95ebbce
!13724 optimize execute order for commops
From: @kisnwang
Reviewed-by: @zhoufeng54,@jjfeing
Signed-off-by: @jjfeing
5 years ago
kswang
dc543f3f1e
optimize execute order for commops
5 years ago
mindspore-ci-bot
cf5eaf8590
!13050 Don't insert UpdateState for HyperMap func graph call, move auto monad eliminator out from CSE, and eliminate auto monad nodes for output node.
From: @zh_qh
Reviewed-by:
Signed-off-by:
5 years ago
Zhang Qinghua
e853df4ecd
Don't insert UpdateState for HyperMap func graph call.
Move auto monad eliminator out from CSE.
Eliminate auto monad nodes for output node.
5 years ago
dingpeifei
87e41aaeee
IR operators of GPU and CPU are unified as batchnorm
5 years ago
mindspore-ci-bot
2013e3f370
!13216 If data_format is NCDHW, BatchNorm to BatchNorm3D.
From: @liu_xiao_93
Reviewed-by: @liangchenghui,@wuxuejian
Signed-off-by: @liangchenghui
5 years ago
mindspore-ci-bot
654771df13
!13080 fix embeddinglookup infer
From: @fangzehua
Reviewed-by:
Signed-off-by:
5 years ago
liuxiao93
d44c706baf
batchnorm to batchnorm3d.
5 years ago
fangzehua
dadbd54f0e
add embedding infer
5 years ago
l00591931
bbdb050fc7
Change switch to Switch
5 years ago
mindspore-ci-bot
54c37bcd61
!12947 Add MaxPool3D,MaxPool3DGrad,MaxPool3DGradGrad ops for Ascend.
From: @liu_xiao_93
Reviewed-by: @liangchenghui
Signed-off-by: @liangchenghui
5 years ago
mindspore-ci-bot
c69142fdc1
!12968 update reshape type for 3d nodes
From: @liubuyu
Reviewed-by:
Signed-off-by:
5 years ago
mindspore-ci-bot
54fc5e0d2b
!12234 [GraphKernel] Support pipeline optimization for parallel fusion.
From: @tronzhang
Reviewed-by:
Signed-off-by:
5 years ago
liuxiao93
35f6ba9011
Add MaxPool3D,MaxPool3DGrad,MaxPool3DGradGrad ops for Ascend.
5 years ago
liubuyu
518818fbef
reshape type for 3d nodes
5 years ago
mindspore-ci-bot
e00d8cd1d6
!13020 Not save InitDatasetQueue and GetNext op in PyNative Mode
From: @HulkTang
Reviewed-by: @zhoufeng54,@chujinjin
Signed-off-by: @chujinjin
5 years ago
tronzhang
7252ffb66b
pipeline optimization for parallel fusion
5 years ago
tanghuikang
6102202abd
Not save InitDatasetQueue and GetNext op in PyNative Mode
5 years ago
mindspore-ci-bot
fa4c19f938
!13002 3d format bug fix
From: @liubuyu
Reviewed-by: @zhoufeng54,@kisnwang
Signed-off-by: @kisnwang
5 years ago
liubuyu
62aa7d0e87
bug fix for 3d format
5 years ago
He Wei
891fd7df92
[auto-monad] Refactor ascend_auto_monad
1. Remove output parameter pool for ascend control flow;
2. Remove duplicate code for Switch and SwitchLayer;
3. Add 'return' attribute to label_goto that is used for return;
4. Disable tail call optimize for graphs with 'recursive' flag.
5 years ago
Zhang Qinghua
df866f7248
Add TopoSort Rhs First attribute for special CNode, such as Depend CNode with isolated nodes.
5 years ago
mindspore-ci-bot
423dcfc917
!12836 Change return in core_ops
From: @liangzhibo
Reviewed-by: @kingxian,@jpc_chenjianping
Signed-off-by: @kingxian
5 years ago
mindspore-ci-bot
2f312dac66
!12091 Performance optimization for PyNative AllReduce
From: @jojobugfree
Reviewed-by:
Signed-off-by:
5 years ago
mindspore-ci-bot
4365c332e6
!12813 unify AvgPoolGrad's MindIR
From: @yuchaojie
Reviewed-by: @kisnwang
Signed-off-by:
5 years ago
mindspore-ci-bot
c529cfa427
!12754 auto tune step one construct json
From: @liubuyu
Reviewed-by:
Signed-off-by:
5 years ago
l00591931
cf7c5840e3
Change return
5 years ago
yuchaojie
d2cb3aa1c2
unify AvgPoolGrad
5 years ago
caifubi
171b468bb3
PyNative AllReduce Bucket
5 years ago
liubuyu
2d97244741
auto tune stage one: construct json
5 years ago
simson
c29d8f66d8
fix precision error after cache modification
5 years ago
zhupuxu
b15d182cd2
fix bug for dynamic_shape_depends
Signed-off-by: zhupuxu <zhupuxu@huawei.com>
5 years ago
mindspore-ci-bot
a063d7633d
!12241 [auto-monad] Support side-effects by auto-monad
From: @hwhewei
Reviewed-by: @zhunaipan,@zh_qh
Signed-off-by: @zh_qh
5 years ago
He Wei
7d9a783993
[auto-monad] Support side-effects by auto-monad
The basic idea is to exploit data dependencies to control the execution order
of side-effect operations while keeping the semantics of ANF unchanged.
The ControlDepend primitive is removed and there are two primitives added:
1. UpdateState:
```
a = Assign(para, value)
```
became:
```
a = Assign(para, value, u)
u = UpdateState(u, a)
```
2. Load:
```
x = Add(para, value)
```
became:
```
p = Load(para, u)
x = Add(p, value)
u = UpdateState(u, p)
```
5 years ago
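The UpdateState/Load rewrite above orders side effects purely through data dependencies: the state token `u` is an input of each side-effecting op and an input of the next UpdateState, so any valid topological order of the graph must run Assign before the Load that observes it. A minimal sketch of that idea (a hypothetical mini-IR, not MindSpore's actual node types or scheduler):

```python
# Mini-IR: node name -> (op, list of input node names).
# The state tokens u0/u1/u2 thread a data dependency from Assign to Load,
# mirroring the Assign/UpdateState/Load example in the commit message.
nodes = {
    "u0": ("state", []),                   # initial state token
    "a":  ("assign", ["u0"]),              # a = Assign(para, value, u0)
    "u1": ("update_state", ["u0", "a"]),   # u1 = UpdateState(u0, a)
    "p":  ("load", ["u1"]),                # p = Load(para, u1)
    "u2": ("update_state", ["u1", "p"]),   # u2 = UpdateState(u1, p)
    "x":  ("add", ["p"]),                  # x = Add(p, value)
}

def topo(nodes):
    """Depth-first topological sort: inputs are emitted before users."""
    order, seen = [], set()
    def visit(n):
        if n in seen:
            return
        seen.add(n)
        for dep in nodes[n][1]:
            visit(dep)
        order.append(n)
    for n in nodes:
        visit(n)
    return order

order = topo(nodes)
# The threaded state token forces the write before the read:
assert order.index("a") < order.index("p")
```

Without the token edges, `a` and `p` would be unrelated in the graph and a scheduler could legally reorder them; the UpdateState chain is what makes the ordering a plain data dependency.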
liu_xiao_93
fabc25538e
Add BCEWithLogitsLoss
5 years ago
mindspore-ci-bot
a24ff36d9c
!11777 stitch fusion
From: @r1chardf1d0
Reviewed-by:
Signed-off-by:
5 years ago
l00591931
9ec100d069
Change TensorAdd to Add, from r1.1 to master
5 years ago
r1chardf1d0
9d6392c5c5
stitch info
5 years ago
mindspore-ci-bot
ce89cc5e8b
!11761 Change GatherV2 to Gather (merge from r1.1 to master)
From: @liangzhibo
Reviewed-by:
Signed-off-by:
5 years ago