MATLAB Implementation of Regularized Logistic Regression from Andrew Ng's Machine Learning Course

The plotData.m file

This function separates the examples labeled 1 from those labeled 0 and plots each group with its own marker; the separation is done with the find function.

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure 
%   PLOTDATA(x,y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be a Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.
%

pos = find(y == 1); % indices of examples labeled 1
neg = find(y == 0); % indices of examples labeled 0

plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

legend('y = 1', 'y = 0'); % label the positive and negative series

% =========================================================================

hold off;

end
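For reference, a minimal usage sketch. The data file ex2data2.txt, its column layout (two feature columns followed by a 0/1 label), and the axis labels are assumptions based on the course exercise, not part of the function above:

% Assumed course data file: ex2data2.txt
data = load('ex2data2.txt');
X = data(:, [1, 2]);
y = data(:, 3);
plotData(X, y);
xlabel('Microchip Test 1'); % axis labels as in the exercise's plot
ylabel('Microchip Test 2');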

The costFunctionReg.m file

This file computes the regularized cost function and its gradient. Pay particular attention here: the formulas for the cost and the gradient differ from those of standard (unregularized) logistic regression. The cost function formula and the gradient formula are given below, in that order:

J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\Big[-y^{(i)}\log h_\theta(x^{(i)}) - \big(1 - y^{(i)}\big)\log\big(1 - h_\theta(x^{(i)})\big)\Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

As shown above, when summing over theta for the regularization term, do not start from index 0 (that is, the 0th theta value is discarded); in MATLAB this means leaving the first element of theta out of the sum.

\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_0^{(i)} \qquad (j = 0)

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_j^{(i)} + \frac{\lambda}{m}\theta_j \qquad (j \ge 1)

Likewise, when computing the gradient, the j = 0 case differs from the j ≥ 1 case: for j = 0, substituting directly into the unregularized formula gives the result, but for j ≥ 1, as above, the regularization term is added, starting from the second element of theta.
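Incidentally, an equivalent vectorized trick (a sketch, not part of the course template) handles both cases in one expression by zeroing the first element of a copy of theta:

% Sketch: temp is theta with its first element zeroed, so the lambda term
% vanishes for j = 0 and applies unchanged for j >= 1.
h = sigmoid(X * theta);
temp = theta;
temp(1) = 0;
grad = (X' * (h - y) + lambda * temp) / m;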

function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters. 

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

h = sigmoid(X * theta); % hypothesis values for all m examples

% Regularized cost: the penalty skips theta(1), i.e. the theta_0 term
J = sum(-y .* log(h) - (1 - y) .* log(1 - h)) / m ...
    + lambda / (2 * m) * sum(theta(2:end) .^ 2);

% Gradient: the intercept term (j = 0) is not regularized
grad(1) = X(:, 1)' * (h - y) / m;
grad(2:end) = X(:, 2:end)' * (h - y) / m + (lambda / m) * theta(2:end);

% =============================================================

end
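Finally, a minimal end-to-end sketch of how costFunctionReg is typically driven. The data file ex2data2.txt and the helpers sigmoid.m and mapFeature.m are assumed to come from the course exercise; the rest is illustrative, not a fixed interface:

% Assumed course files: ex2data2.txt, sigmoid.m, mapFeature.m
data = load('ex2data2.txt');
X = mapFeature(data(:, 1), data(:, 2)); % maps to polynomial features, adds intercept
y = data(:, 3);

initial_theta = zeros(size(X, 2), 1);
lambda = 1; % regularization strength

% fminunc uses the analytic gradient returned by costFunctionReg
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, J] = fminunc(@(t) costFunctionReg(t, X, y, lambda), initial_theta, options);

fprintf('Cost at optimum: %f\n', J);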