@@ -284,6 +284,9 @@ class BCELoss(_Loss):
         \operatorname{sum}(L), & \text{if reduction} = \text{`sum'.}
         \end{cases}
 
+    Note that the predicted labels should always be the output of a sigmoid, and the true labels should be
+    numbers between 0 and 1.
+
     Args:
         weight (Tensor, optional): A rescaling weight applied to the loss of each batch element.
             And it must have same shape and data type as `inputs`. Default: None
@@ -296,7 +299,7 @@ class BCELoss(_Loss):
     Outputs:
         Tensor or Scalar, if `reduction` is 'none', then output is a tensor and has the same shape as `inputs`.
-        Otherwise, the output is a scalar. default: 'none'
+        Otherwise, the output is a scalar.
 
     Examples:
         >>> weight = Tensor(np.array([[1.0, 2.0, 3.0], [4.0, 3.3, 2.2]]), mindspore.float32)
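The docstring's note and reduction modes can be illustrated with a plain NumPy sketch of the element-wise formula `L_n = -w_n * (y_n * log(x_n) + (1 - y_n) * log(1 - x_n))`. This is only an illustration of the math, not MindSpore's implementation, and the helper name `bce_loss` is hypothetical:

```python
import numpy as np

def bce_loss(inputs, labels, weight=None, reduction='none'):
    """Sketch of binary cross-entropy, assuming `inputs` are sigmoid
    outputs in (0, 1) and `labels` are numbers in [0, 1]."""
    eps = 1e-12
    # Clip to avoid log(0) for inputs at the boundary.
    x = np.clip(inputs, eps, 1 - eps)
    # Element-wise loss: L_n = -(y_n * log(x_n) + (1 - y_n) * log(1 - x_n))
    loss = -(labels * np.log(x) + (1 - labels) * np.log(1 - x))
    if weight is not None:
        # Optional per-element rescaling weight, same shape as inputs.
        loss = weight * loss
    if reduction == 'mean':
        return loss.mean()
    if reduction == 'sum':
        return loss.sum()
    # reduction == 'none': return the per-element tensor unchanged.
    return loss
```

With `reduction='none'` the result keeps the shape of `inputs`; `'mean'` and `'sum'` collapse it to a scalar, matching the Outputs description above.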