@@ -580,7 +580,7 @@ class RNN(_RNNBase):
For each element in the input sequence, each layer computes the following function:
.. math::
h_t = \tanh (W_{ih} x_t + b_{ih} + W_{hh} h_{(t-1)} + b_{hh})
h_t = activation(W_{ih} x_t + b_{ih} + W_{hh} h_{(t-1)} + b_{hh})
Here :math:`h_t` is the hidden state at time `t`, :math:`x_t` is
the input at time `t`, and :math:`h_{(t-1)}` is the hidden state of the
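A minimal NumPy sketch of this single-layer, single-step recurrence (the helper name `rnn_step`, the `tanh` activation, and the weight layouts are assumptions for illustration, not the library's implementation)::

    import numpy as np

    def rnn_step(x_t, h_prev, w_ih, b_ih, w_hh, b_hh):
        # x_t: (batch_size, input_size), h_prev: (batch_size, hidden_size)
        # w_ih: (hidden_size, input_size), w_hh: (hidden_size, hidden_size)
        # b_ih, b_hh: (hidden_size,)
        return np.tanh(x_t @ w_ih.T + b_ih + h_prev @ w_hh.T + b_hh)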
@@ -601,22 +601,22 @@ class RNN(_RNNBase):
Inputs:
- **x** (Tensor) - Tensor of data type mindspore.float32 or mindspore.float16 and
shape (seq_len, batch_size, `input_size`) or (batch_size, seq_len, `input_size`) .
shape :math:`(seq\_len, batch\_size, input\_size)` or :math:`(batch\_size, seq\_len, input\_size)` .
- **hx** (Tensor) - Tensor of data type mindspore.float32 or mindspore.float16 and
shape (num_directions * `num_layers`, batch_size, `hidden_size`). The data type of `hx` must be the same as
`x`.
shape :math:`(num\_directions * num\_layers, batch\_size, hidden\_size)` .
The data type of `hx` must be the same as `x`.
- **seq_length** (Tensor) - The length of each sequence in an input batch.
Tensor of shape :math:`(\text{ batch_size} )`. Default: None.
Tensor of shape :math:`(batch\_size)` . Default: None.
This input indicates the real sequence length before padding, to prevent padded elements from
being used to compute the hidden state and affecting the final output. It is recommended to
use this input when **x** has padding elements.
use this input when `x` has padding elements.
Outputs:
Tuple, a tuple containing (`output`, `hx_n`).
- **output** (Tensor) - Tensor of shape (seq_len, batch_size, num_directions * ` hidden_size` ) or
(batch_size, seq_len, num_directions * `hidden_size`) .
- **hx_n** (Tensor) - Tensor of shape (num_directions * `num_layers`, batch_size, `hidden_size`) .
- **output** (Tensor) - Tensor of shape :math:`(seq\_len, batch\_size, num\_directions * hidden\_size)` or
:math:`(batch\_size, seq\_len, num\_directions * hidden\_size)` .
- **hx_n** (Tensor) - Tensor of shape :math:`(num\_directions * num\_layers, batch\_size, hidden\_size)` .
Raises:
TypeError: If `input_size`, `hidden_size` or `num_layers` is not an int.
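A short usage sketch connecting the shapes documented above, assuming the standard `mindspore.nn.RNN` constructor arguments (`input_size`, `hidden_size`, `num_layers`, `has_bias`, `batch_first`, `bidirectional`); the concrete sizes are illustrative only::

    import numpy as np
    from mindspore import Tensor, nn

    # input_size=10, hidden_size=16, num_layers=2, one direction
    net = nn.RNN(10, 16, 2, has_bias=True, batch_first=True, bidirectional=False)
    # batch_first=True, so x has shape (batch_size, seq_len, input_size) = (3, 5, 10)
    x = Tensor(np.ones([3, 5, 10]).astype(np.float32))
    # hx has shape (num_directions * num_layers, batch_size, hidden_size) = (1 * 2, 3, 16)
    h0 = Tensor(np.ones([2, 3, 16]).astype(np.float32))
    output, hn = net(x, h0)
    # output: (batch_size, seq_len, num_directions * hidden_size) -> (3, 5, 16)
    # hn:     (num_directions * num_layers, batch_size, hidden_size) -> (2, 3, 16)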