Is there any possibility that the cost function might end up in a local minimum rather than the global minimum when implementing a neural network using TensorFlow?

deepguy

1 Answer

Most of the critical points of a neural network's loss function are not local minima (they are saddle points), as can be seen in this question. Although it is not impossible to fall into a local minimum, the probability of it happening is so low that in practice it does not happen, except in very special cases such as a single-layer perceptron.
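
To make that concrete, here is a minimal TensorFlow sketch of how one could tell a saddle point from a minimum by looking at the Hessian's eigenvalues. The two-parameter "loss" f(w) = w0² − w1² is a made-up toy, not a real network: the gradient vanishes at the origin, but the eigenvalues have mixed signs, so the critical point is a saddle rather than a minimum.

```python
import tensorflow as tf

# Toy two-parameter "loss" with a saddle point at the origin
# (an assumption for illustration only): f(w) = w0^2 - w1^2.
w = tf.Variable([0.0, 0.0])

def loss_fn(w):
    return w[0] ** 2 - w[1] ** 2

with tf.GradientTape() as outer:
    with tf.GradientTape() as inner:
        loss = loss_fn(w)
    grad = inner.gradient(loss, w)   # first derivatives
hess = outer.jacobian(grad, w)       # second derivatives (2x2 Hessian)

print("gradient:", grad.numpy())                                  # [0. 0.] -> critical point
print("Hessian eigenvalues:", tf.linalg.eigvalsh(hess).numpy())   # [-2. 2.] -> saddle
```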

Since local minima are so hard to come across, it is also highly unlikely that your optimization method will land on the global minimum. All we do in deep learning is decrease the loss function until we find reasonably good parameters; actually reaching a local or global minimum is extremely unlikely.
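
Training code reflects exactly that: we run gradient descent until the loss stops improving, without ever verifying any second-order condition at the final parameters. A minimal sketch along those lines (the toy data, layer sizes, and stopping threshold are all assumptions, not from the question):

```python
import tensorflow as tf

# Toy regression data and a small dense network, purely for illustration.
x = tf.random.normal((256, 10))
y = tf.random.normal((256, 1))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

prev_loss = float("inf")
for step in range(1000):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    # Stop when the loss plateaus: "good enough" parameters, not a certified minimum.
    if abs(prev_loss - float(loss)) < 1e-6:
        break
    prev_loss = float(loss)
```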

David Masip