Struct BCEWithLogitsLossImpl¶
Defined in File loss.h
Inheritance Relationships¶
Base Type¶
public torch::nn::Cloneable< BCEWithLogitsLossImpl >(Template Class Cloneable)
Struct Documentation¶
-
struct BCEWithLogitsLossImpl : public torch::nn::Cloneable<BCEWithLogitsLossImpl>¶
This loss combines a Sigmoid layer and the BCELoss in one single class. This version is more numerically stable than using a plain Sigmoid followed by a BCELoss because, by combining the operations into one layer, we take advantage of the log-sum-exp trick for numerical stability. See https://pytorch.org/docs/main/nn.html#torch.nn.BCEWithLogitsLoss to learn about the exact behavior of this module.

See the documentation for the torch::nn::BCEWithLogitsLossOptions class to learn what constructor arguments are supported for this module.

Example:
BCEWithLogitsLoss model(BCEWithLogitsLossOptions().reduction(torch::kNone).weight(weight));
Public Functions
-
explicit BCEWithLogitsLossImpl(BCEWithLogitsLossOptions options_ = {})¶
-
virtual void reset() override¶
reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
virtual void pretty_print(std::ostream &stream) const override¶
Pretty prints the BCEWithLogitsLoss module into the given stream.
-
Tensor forward(const Tensor &input, const Tensor &target)¶
Public Members
-
BCEWithLogitsLossOptions options¶
The options with which this Module was constructed.
-
Tensor weight¶
A manual rescaling weight given to the loss of each batch element.
-
Tensor pos_weight¶
A weight of positive examples.
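For reference, the roles of weight and pos_weight can be seen in the unreduced loss this module computes (paraphrasing the formula in the PyTorch documentation linked above), where x_n are the logits, t_n the targets, and sigma the sigmoid:

```latex
\ell_n = -w_n \left[ p_n \, t_n \log \sigma(x_n)
       + (1 - t_n) \log\bigl(1 - \sigma(x_n)\bigr) \right]
```

Here w_n is the entry of weight and p_n the entry of pos_weight; a p_n greater than 1 increases the penalty for false negatives on the positive class.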