

How to use the Cross Entropy loss function in PyTorch

2025-01-15 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article explains how to use the cross-entropy loss function in PyTorch. It is quite detailed and should be a useful reference; if the topic interests you, read on!

First, the weight tensor must also be converted to CUDA format (a Tensor on the GPU, matching the inputs). Then class_weight is passed as the value of the weight parameter of the cross-entropy function.
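As a minimal sketch of those two steps (the weight values, batch size, and class count here are illustrative, not from the original article; the device fallback to CPU is added so the snippet runs anywhere):

```python
import torch
import torch.nn as nn

# Hypothetical per-class weights for a 3-class problem (illustrative values).
class_weight = torch.FloatTensor([1.0, 2.0, 3.0])

# Move the weights to the GPU when one is available, else stay on CPU;
# the weights must live on the same device as the logits.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
class_weight = class_weight.to(device)

# Pass the weight tensor via the `weight` parameter of the loss.
criterion = nn.CrossEntropyLoss(weight=class_weight)

logits = torch.randn(4, 3, device=device)            # (batch, num_classes)
targets = torch.randint(0, 3, (4,), device=device)   # class indices
loss = criterion(logits, targets)
print(loss.item())
```

The key point is only that the weights and the logits end up on the same device before the loss is called.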

Supplement: on the weight parameter of PyTorch's CrossEntropyLoss

First of all, this weight parameter behaves less straightforwardly than you might expect.

You can try the following code:

```python
import torch
import torch.nn as nn

inputs = torch.FloatTensor([0, 0, 0, 1, 0, 1])
outputs = torch.LongTensor([0, 1])
inputs = inputs.view((2, 3))
outputs = outputs.view((2,))
weight_CE = torch.FloatTensor([1, 1, 1])
ce = nn.CrossEntropyLoss(ignore_index=255, weight=weight_CE)
loss = ce(inputs, outputs)
print(loss)
# tensor(1.4803)
```

The manual calculation here is:

Loss1 = -0 + ln(e^0 + e^0 + e^0) = 1.098

Loss2 = -0 + ln(e^1 + e^0 + e^1) = 1.862

Average = (loss1 * 1 + loss2 * 1) / 2 = 1.4803
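The per-sample losses can also be checked directly in PyTorch with reduction='none'; a small sketch using the same logits and targets as above:

```python
import torch
import torch.nn.functional as F

inputs = torch.FloatTensor([[0, 0, 0],
                            [1, 0, 1]])
targets = torch.LongTensor([0, 1])

# Per-sample cross-entropy, with no reduction applied.
per_sample = F.cross_entropy(inputs, targets, reduction='none')
print(per_sample)          # tensor([1.0986, 1.8620])
print(per_sample.mean())   # tensor(1.4803)
```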

What about weighting?

```python
import torch
import torch.nn as nn

inputs = torch.FloatTensor([0, 0, 0, 1, 0, 1])
outputs = torch.LongTensor([0, 1])
inputs = inputs.view((2, 3))
outputs = outputs.view((2,))
weight_CE = torch.FloatTensor([1, 2, 3])
ce = nn.CrossEntropyLoss(ignore_index=255, weight=weight_CE)
loss = ce(inputs, outputs)
print(loss)
# tensor(1.6075)
```

Manual calculation shows that this is not a simple multiplication by the weights:

Loss1 = -0 + ln(e^0 + e^0 + e^0) = 1.098

Loss2 = -0 + ln(e^1 + e^0 + e^1) = 1.862

Average = (loss1 * 1 + loss2 * 2) / 2 = 2.4113, which does not match the 1.6075 that PyTorch reports.

But

Loss1 = -0 + ln(e^0 + e^0 + e^0) = 1.098

Loss2 = -0 + ln(e^1 + e^0 + e^1) = 1.862

Average = (loss1 * 1 + loss2 * 2) / 3 = 1.6075

Did you notice? After weighting, the result is divided by the sum of the weights of the target classes, not by the number of samples.
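A short sketch (same two-sample setup as above) that reproduces this weighted reduction by hand, dividing by the sum of the target-class weights:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

inputs = torch.FloatTensor([[0, 0, 0],
                            [1, 0, 1]])
targets = torch.LongTensor([0, 1])
weight_CE = torch.FloatTensor([1, 2, 3])

# What CrossEntropyLoss reports with class weights.
ce = nn.CrossEntropyLoss(weight=weight_CE)
loss = ce(inputs, targets)

# Manual reproduction: weighted sum of the unweighted per-sample
# losses, divided by the sum of the weights of the *target* classes.
per_sample = F.cross_entropy(inputs, targets, reduction='none')
w = weight_CE[targets]
manual = (w * per_sample).sum() / w.sum()

print(loss, manual)  # both tensor(1.6075)
```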

Let's verify it again:

```python
import torch
import torch.nn as nn

inputs = torch.FloatTensor([0, 0, 0, 1, 0, 1, 2, 0, 0, 0, 0, 0.5])
outputs = torch.LongTensor([0, 1, 2, 2])
inputs = inputs.view((4, 3))
outputs = outputs.view((4,))
weight_CE = torch.FloatTensor([1, 2, 3])
ce = nn.CrossEntropyLoss(ignore_index=255, weight=weight_CE)
loss = ce(inputs, outputs)
print(loss)
# tensor(1.5472)
```

Manual calculation:

Loss1 = -0 + ln(e^0 + e^0 + e^0) = 1.098

Loss2 = -0 + ln(e^1 + e^0 + e^1) = 1.862

Loss3 = -0 + ln(e^2 + e^0 + e^0) = 2.2395

Loss4 = -0.5 + ln(e^0.5 + e^0 + e^0) = 0.7943

Average = (loss1 * 1 + loss2 * 2 + loss3 * 3 + loss4 * 3) / 9 = 1.5472

Some readers may wonder how CE arrives at these per-sample values, so here is the cross-entropy calculation in detail, using loss4 from the last example. The fourth sample has logits [0, 0, 0.5] and target class 2. Softmax gives the target probability p2 = e^0.5 / (e^0 + e^0 + e^0.5) = 1.6487 / 3.6487 = 0.4519, and the loss is the negative log-probability of the target: loss4 = -ln(0.4519) = 0.7943, identical to the -0.5 + ln(e^0.5 + e^0 + e^0) form above.
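The same loss4 value can be reproduced step by step with an explicit softmax (a small sketch; the logits [0, 0, 0.5] and target class 2 are taken from the example above):

```python
import torch

# loss4 from the example: logits [0, 0, 0.5], target class 2.
x = torch.FloatTensor([0, 0, 0.5])
target = 2

p = torch.softmax(x, dim=0)     # probabilities over the three classes
loss4 = -torch.log(p[target])   # negative log-probability of the target
print(loss4)                    # tensor(0.7944)
```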

That is all the content of the article "How to use the Cross Entropy loss function in PyTorch". Thank you for reading! I hope what was shared here helps you; for more related knowledge, please follow the industry information channel!
