
!13768 Update the documentation of the Mish operator.

From: @wangshuide2020
Reviewed-by: @liangchenghui,@wuxuejian
Signed-off-by: @liangchenghui
pull/13768/MERGE
mindspore-ci-bot Gitee 5 years ago
parent
commit
91397125f6
1 changed file with 5 additions and 2 deletions
  1. +5
    -2
      mindspore/ops/operations/nn_ops.py

mindspore/ops/operations/nn_ops.py  +5 −2

@@ -372,7 +372,7 @@ class ReLU(PrimitiveWithCheck):

class Mish(PrimitiveWithInfer):
r"""
- Computes MISH of input tensors element-wise.
+ Computes MISH (A Self Regularized Non-Monotonic Neural Activation Function) of input tensors element-wise.

The function is shown as follows:

@@ -380,6 +380,9 @@ class Mish(PrimitiveWithInfer):

\text{output} = x * \tanh(\log(1 + \exp(x)))

See more details in `A Self Regularized Non-Monotonic Neural Activation Function
<https://arxiv.org/abs/1908.08681>`_.

Inputs:
- **x** (Tensor) - The input tensor. Only support float16 and float32.

@@ -390,7 +393,7 @@ class Mish(PrimitiveWithInfer):
``Ascend``

Raises:
- TypeError: If num_features data type not float16 and float32 Tensor.
+ TypeError: If dtype of `x` is neither float16 nor float32.

Examples:
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
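The example in the diff is truncated before any output. As an illustrative aside (not part of the commit), the activation from the docstring formula, output = x · tanh(log(1 + exp(x))), can be sketched in plain NumPy; the `mish` helper below is a hypothetical reference implementation, not the MindSpore operator itself.

```python
import numpy as np

def mish(x):
    # Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)));
    # log1p keeps the softplus term accurate for small exp(x).
    return x * np.tanh(np.log1p(np.exp(x)))

# Same input as the docstring example.
input_x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
print(mish(input_x))
```

A quick sanity check against known values: mish(0) = 0 and mish(1) = tanh(ln(1 + e)) ≈ 0.865, which matches the self-regularized, non-monotonic shape described in the linked paper.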

