
I am looking for a book that goes through the mathematical aspects of neural networks, from the simple forward pass of a multilayer perceptron in matrix form or the differentiation of activation functions, to backpropagation in CNNs or RNNs (to mention some of the topics).
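
(For concreteness, by the forward pass in matrix form I mean the standard recursion, in the usual notation,

$$a^{(0)} = x, \qquad z^{(l)} = W^{(l)} a^{(l-1)} + b^{(l)}, \qquad a^{(l)} = \sigma\big(z^{(l)}\big), \qquad l = 1, \dots, L,$$

with weight matrices $W^{(l)}$, bias vectors $b^{(l)}$, and an activation $\sigma$ applied componentwise; I would like to see this, together with its differentiation, developed rigorously.)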

Do you know of any book that covers this theory in depth? I've had a look at a couple (such as Pattern Recognition and Machine Learning by Bishop) but still have not found a rigorous one (one with exercises would be a plus). Do you have any suggestions?

Ile

7 Answers


I'd recommend Deep Learning by Goodfellow, Bengio and Courville. I don't know if I'd call it "purely mathematical", but it covers a good amount of math background in the first few chapters. No exercises, though.

Jair Taylor
Thank you - I've actually had a look at that one too, but while it is good at introducing the main mathematical tools needed for NNs, I found it a bit lacking when it came to properly developing the model mathematically. – Ile Mar 13 '19 at 03:17

For MLPs, there is a rigorous derivation in An Introduction to Optimization, the textbook by Edwin Chong and Stanislaw Zak, though it is notation-heavy, as all things related to neural networks must be.

This book is for some reason freely available online. See page 219 of https://eng.uok.ac.ir/mfathi/Courses/Advanced%20Eng%20Math/An%20Introduction%20to%20Optimization-%20E.%20Chong,%20S.%20Zak.pdf

I think there is essentially no good mathematical textbook on convolutional neural networks or RNNs in existence; people just base their intuition on MLPs. But it is not hard to construct a mathematically rigorous derivation of forward and backward propagation for a CNN or RNN.
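
To illustrate the point (a sketch in ad hoc notation of my own, not taken from any book, and ignoring boundary effects): a 1-D convolutional layer computes

$$y_i = \sum_k w_k\, x_{i+k},$$

so with upstream gradients $\delta_i = \partial L / \partial y_i$, the chain rule gives

$$\frac{\partial L}{\partial w_k} = \sum_i \delta_i\, x_{i+k}, \qquad \frac{\partial L}{\partial x_j} = \sum_k w_k\, \delta_{j-k}.$$

That is, the backward pass through a (cross-)correlation is itself a convolution, and the MLP derivation carries over layer by layer.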

    "This book is for some reason freely available online." That is probably a copyright violation by the webpage owner https://eng.uok.ac.ir/mfathi/. But I won't tell anyone if you won't ;) –  Mar 13 '19 at 12:11

Gilbert Strang (of MIT OCW Linear Algebra lectures and Introduction to Linear Algebra fame) has a new textbook on linear algebra for deep learning, Linear Algebra and Learning from Data.

It offers a decent course in linear algebra, some statistics and optimization, and the calculus needed for stochastic gradient descent, and then applies them all to neural network models.
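
For reference, the object all of that calculus feeds into is the stochastic gradient step

$$\theta_{t+1} = \theta_t - \eta\, \nabla_\theta\, \ell\big(\theta_t;\, x_{i_t}\big),$$

where $x_{i_t}$ is a randomly sampled training example and $\eta$ is the step size (notation mine, not necessarily the book's).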


One of my favorite books on the theoretical aspects of neural networks is Anthony and Bartlett's Neural Network Learning: Theoretical Foundations.

This book studies neural networks in the context of statistical learning theory. You will find loads of estimates of VC dimensions of sets of networks and all that fun stuff.
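
For example, results of the following flavor: a feedforward network of linear threshold units with $W$ weights has VC dimension $O(W \log W)$. (I am quoting the order of growth from memory; the book gives precise statements and analogues for other activation functions.)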

I should say that this book does not go into detail on CNNs and RNNs.

pcp

Not a book but maybe of some interest for a current perspective:

"Backprop as Functor: A compositional perspective on supervised learning" by Brendan Fong, David I. Spivak, and Remy Tuyeras (2018) gives a category-theoretic structural framework for the backpropagation algorithm:

https://arxiv.org/pdf/1711.10455.pdf

This is further discussed by David Spivak (2019) via:

https://www.reddit.com/r/math/comments/ahrar7/lectures_in_applied_category_theory_mit_2019/


This field is still in its infancy, so there is not much material yet for lovers of pure mathematics. Perhaps you would like to take a look at Stanford's STATS 385 course (Theories of Deep Learning).

JP Zhang

I find this book useful: Neural Networks - A Systematic Introduction by Raul Rojas.

Also, Perceptrons: An Introduction to Computational Geometry by Minsky & Papert (1969) is useful from a pure-mathematics perspective.

Dragutin