Huawei_Technology / mindspore
fix bugs
tags/v1.2.0-rc1
lihongkang committed 5 years ago
parent 1d7951fb11
commit ae325f2e53
2 changed files with 2 additions and 2 deletions
+1 -1 mindspore/nn/layer/activation.py
+1 -1 mindspore/nn/layer/normalization.py
mindspore/nn/layer/activation.py (+1 -1)

@@ -392,7 +392,7 @@ class GELU(Cell):
 class FastGelu(Cell):
     r"""
-    fast Gaussian error linear unit activation function.
+    Fast Gaussian error linear unit activation function.
 
     Applies FastGelu function to each element of the input. The input is a Tensor with any valid shape.
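The hunk above is a docstring capitalization fix in FastGelu. For context, a minimal sketch of what a "fast GELU" computes, using the widely cited sigmoid-based approximation in plain NumPy (an assumption for illustration; MindSpore's nn.FastGelu kernel may use a different exact formula):

```python
import numpy as np

def fast_gelu(x):
    """Sigmoid-based fast GELU approximation: x * sigmoid(1.702 * x).

    Approximates the Gaussian error linear unit by replacing the
    Gaussian CDF with a scaled logistic sigmoid. Illustrative only;
    not MindSpore's exact implementation.
    """
    return x * (1.0 / (1.0 + np.exp(-1.702 * x)))

x = np.array([-3.0, 0.0, 3.0])
y = fast_gelu(x)  # near-zero for negative inputs, near-identity for large positive inputs
```

For large positive inputs the sigmoid saturates at 1, so the function approaches the identity; for large negative inputs it approaches zero, which is the characteristic GELU shape.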
mindspore/nn/layer/normalization.py (+1 -1)

@@ -284,7 +284,7 @@ class BatchNorm1d(_BatchNorm):
         Tensor, the normalized, scaled, offset tensor, of shape :math:`(N, C_{out})`.
 
     Supported Platforms:
-        ``Ascend``
+        ``Ascend`` ``GPU``
 
     Examples:
         >>> net = nn.BatchNorm1d(num_features=4)
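The second hunk only extends the BatchNorm1d docstring to list GPU support. As a hedged sketch of what 1-D batch normalization computes on an (N, C) input (plain NumPy, not MindSpore's implementation; the eps value is an assumed default for illustration):

```python
import numpy as np

def batch_norm_1d(x, gamma, beta, eps=1e-5):
    """Normalize each of the C features of an (N, C) batch to zero mean
    and unit variance across the batch axis, then scale by gamma and
    shift by beta. Training-mode batch statistics only; running-mean
    tracking is omitted for brevity."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.randn(8, 4) * 3.0 + 1.0  # batch of 8 samples, 4 features
out = batch_norm_1d(x, gamma=np.ones(4), beta=np.zeros(4))
```

With gamma=1 and beta=0, each output column has approximately zero mean and unit variance, matching the "normalized, scaled, offset tensor" described in the docstring.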