
!11343 Fix bugs of ops BatchNorm1d and FastGelu in the API

From: @lihongkang1
Reviewed-by: @liangchenghui, @wuxuejian
Signed-off-by: @liangchenghui
Tag: v1.2.0-rc1
Committed by mindspore-ci-bot 4 years ago
commit e82c53b5ae
2 changed files with 2 additions and 2 deletions:
  1. mindspore/nn/layer/activation.py (+1, -1)
  2. mindspore/nn/layer/normalization.py (+1, -1)

mindspore/nn/layer/activation.py (+1, -1)

@@ -392,7 +392,7 @@ class GELU(Cell):
 
 class FastGelu(Cell):
     r"""
-    fast Gaussian error linear unit activation function.
+    Fast Gaussian error linear unit activation function.
 
     Applies FastGelu function to each element of the input. The input is a Tensor with any valid shape.
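
For reference, a minimal FastGelu usage sketch in the docstring's own doctest style (the input values here are illustrative assumptions, not part of the commit):

    >>> import numpy as np
    >>> import mindspore.nn as nn
    >>> from mindspore import Tensor
    >>> # FastGelu is applied elementwise, so the output keeps the input shape.
    >>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32))
    >>> fast_gelu = nn.FastGelu()
    >>> output = fast_gelu(x)
    >>> print(output.shape)
    (2, 3)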



mindspore/nn/layer/normalization.py (+1, -1)

@@ -267,7 +267,7 @@ class BatchNorm1d(_BatchNorm):
     Tensor, the normalized, scaled, offset tensor, of shape :math:`(N, C_{out})`.
 
     Supported Platforms:
-        ``Ascend``
+        ``Ascend`` ``GPU``
 
     Examples:
         >>> net = nn.BatchNorm1d(num_features=4)
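
A minimal sketch of the GPU path this hunk now documents (the device-target call and the input values are assumptions for illustration; BatchNorm1d expects 2-D input of shape (N, C)):

    >>> import numpy as np
    >>> import mindspore.nn as nn
    >>> from mindspore import Tensor, context
    >>> # Assumed setup: select the newly documented GPU backend.
    >>> context.set_context(device_target="GPU")
    >>> net = nn.BatchNorm1d(num_features=4)
    >>> x = Tensor(np.ones([2, 4]).astype(np.float32))  # shape (N, C)
    >>> output = net(x)
    >>> print(output.shape)
    (2, 4)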

