AI-009: Study Notes 38-47 on Professor Andrew Ng's Machine Learning Course

These are my study notes for Andrew Ng's machine learning course series. The lecture videos are available at:

https://study.163.com/course/introduction.htm?courseId=1004570029#/courseDetail?tab=1

38. Neural Networks - Representation - Non-linear hypotheses

Why neural networks?

Simple linear or logistic regression, even with added quadratic or cubic features, is not a good way to learn complex nonlinear hypotheses when n is large, because you end up with far too many features.

For example: visual recognition.

The computation would be very expensive to find and represent all of these features.
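A rough count makes the point concrete (a small sketch; it counts all pairwise quadratic terms x_i * x_j, which grows as n*(n+1)/2, i.e. roughly O(n^2)):

```python
def num_quadratic_features(n):
    # All products x_i * x_j with i <= j: n*(n+1)/2 terms.
    return n * (n + 1) // 2

# With 100 raw features, the quadratic feature set is still manageable:
print(num_quadratic_features(100))    # 5050

# But a 50x50 grayscale image already has n = 2500 pixels:
print(num_quadratic_features(2500))   # 3126250
```

So even a small image pushes a quadratic-feature logistic regression into millions of features, which is why a different hypothesis class is needed.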

39. Neural Networks - Representation - neurons and the brain
 

Mimic the brain: the same brain tissue can take on different functions depending on the signals wired into it. For example, if the auditory area is fed visual signals, it learns to "see".

If the nerves from the ear or hand are cut and the optic nerve is rewired into that area, that part of the brain learns to see (neuro-rewiring experiments).

The goal: try to find out the brain's learning algorithm!

You can plug almost any sensor into the brain, and the brain's learning algorithm will figure out how to learn from and deal with that data.

40. Neural Networks - Representation - Model representation I
 

The parts of a neuron:

Nucleus
Dendrite (input wire)
Cell body
Node of Ranvier
Axon (output wire)
Myelin sheath
Schwann cell
Axon terminal

One neuron sends a little pulse of electricity via its axon to another neuron's dendrite.

Next, we show the computational steps represented by this diagram.

forward propagation
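Forward propagation for a small network can be sketched in a few lines of NumPy (a minimal illustration; the layer sizes and random weights are made-up assumptions, not values from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, Theta1, Theta2):
    """One forward pass through a 3-layer network."""
    a1 = np.concatenate(([1.0], x))            # add bias unit x0 = 1
    z2 = Theta1 @ a1
    a2 = np.concatenate(([1.0], sigmoid(z2)))  # hidden activations + bias
    z3 = Theta2 @ a2
    return sigmoid(z3)                         # output hypothesis h(x)

# Example: 2 inputs, 2 hidden units, 1 output (weights chosen arbitrarily)
rng = np.random.default_rng(0)
Theta1 = rng.standard_normal((2, 3))   # maps 2 inputs + bias -> 2 hidden units
Theta2 = rng.standard_normal((1, 3))   # maps 2 hidden + bias -> 1 output
print(forward_propagate(np.array([0.5, -1.2]), Theta1, Theta2))
```

Each layer is just "multiply by the weight matrix, apply the sigmoid", with a bias unit prepended at every layer.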

41. Neural Networks - Representation - examples and intuitions
 

Just put a large negative weight in front of the variable you want to negate.

This ends up with a nonlinear decision boundary.

Each layer computes ever more complex functions of the previous layer, which is how neural networks handle complex problems.
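The classic illustration: no single unit can compute XNOR, but composing AND, (NOT x1) AND (NOT x2), and OR across two layers does it. A sketch following the lecture's weight construction:

```python
import math

def g(z):
    return 1.0 / (1.0 + math.exp(-z))

def unit(w0, w1, w2, x1, x2):
    # One sigmoid unit, rounded to a 0/1 output.
    return round(g(w0 + w1 * x1 + w2 * x2))

def xnor(x1, x2):
    a1 = unit(-30, 20, 20, x1, x2)    # x1 AND x2
    a2 = unit(10, -20, -20, x1, x2)   # (NOT x1) AND (NOT x2)
    return unit(-10, 20, 20, a1, a2)  # a1 OR a2

# Print the XNOR truth table
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xnor(x1, x2))
```

The hidden layer computes two simple logical features, and the output layer combines them, exactly the "layers build complexity" idea above.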

41. Neural Networks - Representation - Multi-class classification
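For K classes the network gets K output units, and each label y is represented as a one-hot vector rather than an integer (a small sketch; the three-class labels below are an invented example in the spirit of the course's pedestrian/car/motorcycle illustration):

```python
import numpy as np

def one_hot(y, num_classes):
    """Convert integer labels into one-hot rows, one per output unit."""
    Y = np.zeros((len(y), num_classes))
    Y[np.arange(len(y)), y] = 1.0
    return Y

labels = np.array([0, 2, 1, 2])   # e.g. pedestrian / motorcycle / car / motorcycle
print(one_hot(labels, 3))
```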

42. Neural Networks - Learning- Cost function
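The cost function from this lecture generalizes regularized logistic regression to K output units and L layers (as presented in the course; m is the number of training examples, s_l the number of units in layer l, and Θ^(l) the weight matrix of layer l):

```latex
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}
\left[ y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k
     + \big(1-y_k^{(i)}\big)\log\Big(1-\big(h_\Theta(x^{(i)})\big)_k\Big) \right]
+ \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}}\big(\Theta_{ji}^{(l)}\big)^2
```

The first term sums the logistic cost over all K outputs; the regularization term sums over every weight except (by convention) the bias terms.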

42. Neural Networks - Learning - Back propagation algorithm
 

back propagation algorithm

The key is how to compute these partial derivative terms.
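For one training example and a 3-layer network, the computation can be sketched as follows (a minimal illustration of the lecture's δ-term recursion; shapes and variable names are my own assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_single(x, y, Theta1, Theta2):
    """Gradients of the unregularized cost for one example (sketch)."""
    # Forward pass
    a1 = np.concatenate(([1.0], x))
    z2 = Theta1 @ a1
    a2 = np.concatenate(([1.0], sigmoid(z2)))
    a3 = sigmoid(Theta2 @ a2)
    # Backward pass: delta ("error") terms per layer
    delta3 = a3 - y                                   # output layer
    # Propagate back through the weights (dropping the bias column),
    # times the sigmoid derivative g'(z2) = g(z2) * (1 - g(z2)):
    delta2 = (Theta2[:, 1:].T @ delta3) * sigmoid(z2) * (1 - sigmoid(z2))
    # Partial derivatives dJ/dTheta as outer products delta * activations
    D2 = np.outer(delta3, a2)
    D1 = np.outer(delta2, a1)
    return D1, D2

rng = np.random.default_rng(1)
Theta1 = rng.standard_normal((2, 3))
Theta2 = rng.standard_normal((1, 3))
D1, D2 = backprop_single(np.array([1.0, 0.0]), np.array([1.0]), Theta1, Theta2)
print(D1.shape, D2.shape)   # (2, 3) (1, 3)
```

In the full algorithm these per-example gradients are accumulated over the training set (and the regularization term added).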

43. Neural Networks - Learning - Implementation note: unrolling parameters 
 

How to convert back and forth between the matrix representation and the vector representation of the parameters.

The advantage of the matrix representation: when the parameters are stored as matrices, forward propagation and back propagation are more convenient, and it is easier to take advantage of vectorized implementations.

The advantage of the vector representation (thetaVec or DVec): the advanced optimization algorithms tend to assume that all of the parameters are unrolled into one big long vector.
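In NumPy the round trip looks like this (mirroring the course's Octave idiom `thetaVec = [Theta1(:); Theta2(:)]`; the shapes here are arbitrary examples):

```python
import numpy as np

Theta1 = np.ones((10, 11))    # e.g. layer 1 -> layer 2 weights
Theta2 = np.ones((1, 11))     # e.g. layer 2 -> output weights

# Unroll the matrices into one long vector for the optimizer
thetaVec = np.concatenate([Theta1.ravel(), Theta2.ravel()])
print(thetaVec.shape)         # (121,)

# Reshape back into matrices for forward/back propagation
Theta1_back = thetaVec[:110].reshape(10, 11)
Theta2_back = thetaVec[110:].reshape(1, 11)
assert (Theta1_back == Theta1).all() and (Theta2_back == Theta2).all()
```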

44. Neural Networks - Learning - Gradient checking
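Gradient checking approximates each partial derivative with a centered difference, (J(θ+ε) − J(θ−ε)) / (2ε), and compares it against the backprop gradient. A sketch, with a stand-in cost function whose true gradient is known:

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-4):
    """Centered-difference approximation of dJ/dtheta, one component at a time."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps
        grad[i] = (J(theta + step) - J(theta - step)) / (2 * eps)
    return grad

# Stand-in cost: J = sum(theta^2), whose true gradient is 2*theta
J = lambda t: float(np.sum(t ** 2))
theta = np.array([1.0, -2.0, 3.0])
print(numerical_gradient(J, theta))   # approximately [2, -4, 6]
```

Because it evaluates J twice per parameter, this check is far too slow for training; it is only used once to verify the backprop implementation, then turned off.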

45. Neural Networks - Learning - Random Initialization

Initializing all the parameters to 0 will not work.

After every update all of the hidden units would compute the same function, so we would effectively learn only one feature (the symmetry problem).

The epsilon here has no relationship to the epsilon used in gradient checking.
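Random initialization draws each weight uniformly from [-ε_init, ε_init] to break the symmetry (a minimal sketch; the layer sizes and ε_init = 0.12 are illustrative assumptions):

```python
import numpy as np

def rand_initialize_weights(rows, cols, epsilon_init=0.12):
    """Weights uniform in [-epsilon_init, epsilon_init], breaking symmetry."""
    return np.random.rand(rows, cols) * 2 * epsilon_init - epsilon_init

Theta1 = rand_initialize_weights(25, 401)   # e.g. 400 inputs + bias -> 25 hidden units
print(Theta1.min() >= -0.12, Theta1.max() <= 0.12)   # True True
```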

46. Neural Networks - Learning - Putting it together
 

47. Neural Networks - Learning - Autonomous driving example
 

Top left: the steering direction chosen by the human driver and the one predicted by the neural network.

Bottom left: the image of the road.

ALVINN (Autonomous Land Vehicle In a Neural Network)

It watches a human drive, and after about two minutes of training it can drive on its own.

It can automatically switch between the weights learned for one-lane and two-lane roads.