Writing your own loss function/module for PyTorch

Yes, I am switching to PyTorch, and I am so far very happy with it.

Recently, I have been working on a multilabel classification problem where the evaluation metric is the macro F1 score. Ideally, we would want the loss function to be aligned with the evaluation metric, instead of using the standard binary cross-entropy (BCE).

Initially, I was using the following function:
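The original snippet did not survive extraction, so what follows is my reconstruction, not the post's exact code: a minimal sketch of a differentiable (soft) macro F1 loss, assuming `y_pred` holds per-label sigmoid probabilities and `y_true` holds 0/1 targets.

```python
import torch

def f1_loss(y_pred: torch.Tensor, y_true: torch.Tensor,
            eps: float = 1e-7) -> torch.Tensor:
    """Soft macro F1 loss for multilabel classification.

    y_pred: per-label probabilities (after sigmoid), shape (batch, num_labels).
    y_true: 0/1 targets, same shape.
    """
    # Soft counts: probabilities act as fractional TP/FP/FN per label.
    tp = (y_true * y_pred).sum(dim=0)
    fp = ((1 - y_true) * y_pred).sum(dim=0)
    fn = (y_true * (1 - y_pred)).sum(dim=0)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    f1 = 2 * precision * recall / (precision + recall + eps)
    f1 = f1.clamp(min=0, max=1)  # the clamp discussed in the comments below
    return 1 - f1.mean()         # macro average, inverted so lower is better
```

Because TP/FP/FN are computed from probabilities rather than thresholded predictions, the whole expression stays differentiable, so gradients flow back to the model.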

It is perfectly usable as a loss function in your typical training code:
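The training snippet is also missing from the extracted text; a hedged sketch follows, with hypothetical names (`model`, `loader`, the layer sizes) and a compact restatement of the soft F1 loss so it runs standalone:

```python
import torch
from torch import nn

def f1_loss(y_pred, y_true, eps=1e-7):
    # Soft macro F1: probabilities act as fractional TP/FP/FN counts.
    tp = (y_true * y_pred).sum(dim=0)
    fp = ((1 - y_true) * y_pred).sum(dim=0)
    fn = (y_true * (1 - y_pred)).sum(dim=0)
    f1 = 2 * tp / (2 * tp + fp + fn + eps)  # algebraically same as 2PR/(P+R)
    return 1 - f1.mean()

# Hypothetical model and data; names and sizes are illustrative, not from the post.
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(10, 5), nn.Sigmoid())  # 10 features -> 5 labels
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loader = [(torch.randn(8, 10), (torch.rand(8, 5) > 0.5).float())]

for x, y in loader:
    optimizer.zero_grad()
    loss = f1_loss(model(x), y)  # drop-in where you would otherwise use nn.BCELoss
    loss.backward()
    optimizer.step()
```

The only change from a BCE-based loop is the single line computing the loss.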

Better still, we can make it a PyTorch module, so that it is used like any built-in PyTorch loss:
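The module version is likewise missing from the extracted text; one plausible sketch (the class name `F1Loss` is my assumption) wraps the same computation in an `nn.Module`'s forward pass:

```python
import torch
from torch import nn

class F1Loss(nn.Module):
    """Soft macro F1 loss as a module (class name assumed, not from the post)."""

    def __init__(self, eps: float = 1e-7):
        super().__init__()
        self.eps = eps

    def forward(self, y_pred: torch.Tensor, y_true: torch.Tensor) -> torch.Tensor:
        # Soft counts per label, summed over the batch dimension.
        tp = (y_true * y_pred).sum(dim=0)
        fp = ((1 - y_true) * y_pred).sum(dim=0)
        fn = (y_true * (1 - y_pred)).sum(dim=0)
        f1 = 2 * tp / (2 * tp + fp + fn + self.eps)
        return 1 - f1.mean()

# Usage mirrors the built-in losses, and the module can be moved to the GPU:
criterion = F1Loss()
if torch.cuda.is_available():
    criterion = criterion.cuda()
```

Since the loss holds no parameters or buffers, moving it to the GPU is not strictly required, but wrapping it as a module keeps the call site consistent with losses like `nn.BCELoss`.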

That simply moves the original f1_loss computation into the forward pass of a small module. As a result, I can explicitly move the module to the GPU.


6 thoughts on "Writing your own loss function/module for PyTorch"

    1. admin Post author

      Hello Greg, I added clamp and found it worked slightly better on my dataset at the time, at least in the first few epochs, compared to no clamp.

      1. Greg

        Thanks for replying, Ren. Is there any intuition behind adding it, though? You must have had some sort of gut feeling?

