wangshuide2020
8da6d65222
fix the validation of Softmax, Tanh, Elu operators.
5 years ago
wangshuide2020
30f99f2722
add raises description for BCELoss, ReLU, BatchNorm1d, etc. operators.
5 years ago
l00591931
9ec100d069
Change TensorAdd to Add, from r1.1 to master
5 years ago
lihongkang
ae325f2e53
fix bugs
5 years ago
lihongkang
5bd1a198ad
fix bugs
5 years ago
wangshuide2020
4b693377dc
update documentation of warmup_lr, F1, RMSProp, BatchNorm2d and add some pictures of links of activation function.
5 years ago
lihongkang
9b2265a66f
fix bugs
5 years ago
zhangz0911gm
b7be94fafa
Fixing errors in classes' notes
5 years ago
lihongkang
79333916ed
fix bugs
5 years ago
zhangz0911gm
0ffec7acf9
Fixing some tiny faults in notes of classes' examples
5 years ago
zhangyi
13e2aee0ea
fix some format errors in comments.
5 years ago
lihongkang
0fa0fd39bb
fix bugs
5 years ago
shibeiji
cd850bd210
register activation operator fast_gelu
5 years ago
JunYuLiu
1eaa4a30dd
Add labels to python files
5 years ago
hedongodng
a5016a0ead
fix: I24U3E, I24U50, I24U7V and I24TZT. Correct operator descriptions
5 years ago
mindspore-ci-bot
bcc6e1ca28
!8655 add output of example of some operations.
From: @wangshuide2020
5 years ago
wangshuide2020
195f25cc07
add output of example of some operations.
5 years ago
zhangz0911gm
dda18138c1
# This is a combination of 2 commits.
updating notes of pynative in nn_layer
5 years ago
lihongkang
0bf0862112
fix bugs
5 years ago
chenzomi
d5ae6fdd84
code format for nn.layer
5 years ago
lihongkang
c421fc1bad
fix bugs
5 years ago
simson
7cc48a9af8
Third round of enhancement of API comment & README_CN
5 years ago
wangdongxu
b59e5b60fe
fix HSigmoid comment
5 years ago
chenzomi
d383ade6f9
add mobilenetV2 quant export
5 years ago
mindspore-ci-bot
c8f26f799b
!2436 fix nn.PReLU example
Merge pull request !2436 from jiangjinsheng/issue_fix4
5 years ago
jiangjinsheng
157ee1ca16
fix nn.PReLU example
5 years ago
gong chen
a6dfa281ea
Init GraphKernel.
- It provides a unified style to express graph and kernel for user.
- It provides a unified IR to represent graph and kernel for developer.
- It breaks the boundary between graph and kernel.
- It provides more opportunities to do compile optimization.
6 years ago
simson
ca988e9e69
fix the condition when activation name is 0
5 years ago
jiangjinsheng
6c92282e5e
fixed LeakyReLU
5 years ago
jiangjinsheng
eb4571a67f
fixed LeakyReLU, Optimizer
5 years ago
jiangjinsheng
f17c070fb6
fixed doc for GELU
6 years ago
peixu_ren
99fda6f431
Add logsigmoid and reduce_logsumexp
6 years ago
mindspore-ci-bot
b9d3495c4d
!1081 format nn. and change quant.py
Merge pull request !1081 from SanjayChan/formast
6 years ago
chenzomi
1397326c46
add quant.py and change the format of __init__
6 years ago
jiangjinsheng
d12976da6b
add example for hsigmoid
6 years ago
zhouneng
cf38d93eb8
Add Examples for operators in the mindspore.nn package
6 years ago
chenzomi
c0229fa951
change hswish and hsigmoid according to primitive
6 years ago
chenzomi
652ab6c386
add test case for quantization aware training
6 years ago
chenzomi
d64f662c76
define quantization aware training frontend operators.
6 years ago
万万没想到
605d980305
1. add Note: refer to nn.SGD for details
2. delete default value of stat
3. delete examples
4. fix some comment errors from wangting's review
5. modify comments per jinyaohui's review
6. modify examples per wanghao's review
7. modify Select operation examples
6 years ago
zhunaipan
930a1fb0a8
initial version
Signed-off-by: leonwanghui <leon.wanghui@huawei.com>
6 years ago