
I have a regression problem where the output y is a single probability, i.e., a real number that varies in the interval [0, 1].

While using an L1 or L2 loss would very likely work well, I feel they are not the most appropriate options given that the range [0, 1] is already well defined.

Is Binary Cross Entropy (BCELoss in PyTorch) the most appropriate loss in this case?

keiv.fly
Juan Leni

2 Answers


Predicting probabilities can be framed as beta regression.

That is a separate issue from adding a regularization term (i.e., L1 or L2 regularization).
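To make this concrete, here is a minimal, hand-rolled sketch of beta regression: targets are modeled as Beta(μφ, (1−μ)φ), and we fit only a constant mean μ by grid search under an assumed fixed precision φ. The data and φ value are toy assumptions, not anything from the question:

```python
import math

def beta_nll(mu: float, phi: float, ys) -> float:
    """Negative log-likelihood of targets ys under Beta(mu*phi, (1-mu)*phi)."""
    a, b = mu * phi, (1 - mu) * phi
    # log of the Beta function B(a, b), via log-gamma
    log_beta = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return sum(
        log_beta - (a - 1) * math.log(y) - (b - 1) * math.log(1 - y)
        for y in ys
    )

ys = [0.12, 0.25, 0.31, 0.40, 0.22]   # toy probability targets in (0, 1)
phi = 10.0                            # assumed fixed precision parameter
grid = [i / 1000 for i in range(1, 1000)]
mu_hat = min(grid, key=lambda mu: beta_nll(mu, phi, ys))
print(mu_hat)  # close to the sample mean (0.26), but not exactly equal to it
```

In a real model, μ (and possibly φ) would be predicted from features, e.g. μ = sigmoid(wᵀx), and optimized by gradient descent on this same likelihood.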

Brian Spiering

At first I was going to say:

It doesn't make sense to use cross-entropy loss in a regression problem!

See explanation here.

But then I realised that if you are really trying to do regression on probabilities, it could make some sense.

But still, why would you use it instead of L1 or L2? So maybe try it and let me know if it works better!
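One reason cross-entropy is defensible here: BCE with a "soft" target y ∈ [0, 1] is minimized exactly when the predicted probability equals the target. A small sketch with toy values (pure Python, no PyTorch needed):

```python
import math

def bce(p: float, y: float) -> float:
    """Binary cross-entropy between predicted probability p and soft target y in [0, 1]."""
    eps = 1e-12  # guard against log(0)
    p = min(max(p, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# The loss is smallest when the prediction matches the target:
target = 0.3
losses = {p: bce(p, target) for p in (0.1, 0.3, 0.5, 0.9)}
best = min(losses, key=losses.get)
print(best)  # 0.3
```

Note that, unlike with hard 0/1 labels, the minimum loss value is not zero for soft targets (it equals the entropy of the target), but the minimizer is still p = y, which is what matters for training.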

pcko1