Reading: Hands-On ML - Quick notes (from Part II - NN & DL / Chapter 10)

Anh-Thi Dinh
This note serves as a reminder of the book's content, including additional research on the mentioned topics. It is not a substitute for the book. Most images are sourced from the book or referenced.
I've noticed that taking notes on this site while reading significantly extends the time it takes to finish the book. I've stopped noting everything as in previous chapters; instead, I keep reading and just highlight or take handwritten notes. I plan to return to the detailed style when I have more time.
This book contains 1007 pages of readable content. If you read at a pace of 10 pages per day, it will take you approximately 3.3 months (without missing a day) to finish it. If you aim to complete it in 2 months, you'll need to read at least 17 pages per day.


List of notes for this book

Chapter 10. Introduction to Artificial Neural Networks with Keras

  • The Perceptron: one of the simplest ANN (Artificial Neural Network) architectures
    • Figure 10-4. TLU (threshold logic unit): an artificial neuron that computes a weighted sum of its inputs plus a bias term, z = wᵀx + b, then applies a step function: h(x) = step(z)
  • The most common step function is the Heaviside step function; sometimes the sign function is used instead.
  • How is a perceptron trained? → it follows a variant of Hebb's rule: “cells that fire together, wire together” (the connection weight between two neurons tends to increase when they fire simultaneously).
  • Perceptrons have limitations (e.g., they cannot solve the XOR problem) → use a multilayer perceptron (MLP) instead.
  • Perceptrons do not output a class probability → use logistic regression instead.
  • When an ANN contains a deep stack of hidden layers → deep neural network (DNN)
  • In the old days, computers weren't powerful → training MLPs was a problem, even when using gradient descent.
  • Backpropagation: an algorithm to minimize the cost function of MLPs by computing all the gradients efficiently.
    • Forward propagation: from the input X, compute the predictions and the cost J.
    • Backward propagation: compute the derivatives of the cost with respect to the parameters → update the parameters.
    • → Read this note.
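A quick sketch of the perceptron bullets above (a toy NumPy implementation, not from the book; the learning rate and epoch count are arbitrary choices): a TLU with a Heaviside step, trained with the perceptron learning rule, learns the linearly separable AND function but can never fit XOR.

```python
import numpy as np

def heaviside(z):
    """Heaviside step function: 1 if z >= 0, else 0."""
    return (z >= 0).astype(int)

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Perceptron learning rule: w += lr * (y - y_hat) * x (a variant of Hebb's rule)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            y_hat = heaviside(np.dot(w, xi) + b)  # TLU output: step(w·x + b)
            w += lr * (yi - y_hat) * xi           # update only when the prediction is wrong
            b += lr * (yi - y_hat)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([0, 0, 0, 1])   # linearly separable -> learnable
y_xor = np.array([0, 1, 1, 0])   # not linearly separable -> a single TLU must fail

w, b = train_perceptron(X, y_and)
print(heaviside(X @ w + b))      # [0 0 0 1]

w_xor, b_xor = train_perceptron(X, y_xor)
print(heaviside(X @ w_xor + b_xor))  # never matches [0 1 1 0], whatever the training budget
```

The XOR failure is exactly the limitation that motivates stacking TLUs into an MLP.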
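The forward/backward steps can be sketched on a tiny MLP (again a toy NumPy version, not the book's Keras code; the 2-4-1 layer sizes, sigmoid activations, MSE cost, learning rate, and step count are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR, which a single perceptron cannot solve but an MLP can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters of a 2-4-1 MLP (layer sizes are illustrative).
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

lr, losses = 1.0, []
for step in range(5000):
    # Forward propagation: from X, compute activations and the cost J (MSE here).
    A1 = sigmoid(X @ W1 + b1)      # hidden layer
    A2 = sigmoid(A1 @ W2 + b2)     # output layer
    J = np.mean((A2 - y) ** 2)
    losses.append(J)

    # Backward propagation: chain rule, from the output layer back to the input.
    dA2 = 2 * (A2 - y) / len(X)            # dJ/dA2
    dZ2 = dA2 * A2 * (1 - A2)              # sigmoid'(z) = a * (1 - a)
    dW2, db2_ = A1.T @ dZ2, dZ2.sum(axis=0, keepdims=True)
    dZ1 = (dZ2 @ W2.T) * A1 * (1 - A1)
    dW1, db1_ = X.T @ dZ1, dZ1.sum(axis=0, keepdims=True)

    # Gradient descent update of all parameters.
    W1 -= lr * dW1; b1 -= lr * db1_
    W2 -= lr * dW2; b2 -= lr * db2_

print(losses[0], "->", losses[-1])   # the cost J decreases over training
pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())
```

In practice Keras computes these gradients automatically (reverse-mode autodiff); the manual version just makes the forward-then-backward structure explicit.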