Machine Learning Exercise 2 (Coursera)

The second programming assignment consists of the following files:

sigmoid.m
costFunction.m
predict.m
costFunctionReg.m

sigmoid.m

This implements the hypothesis function of logistic regression, the sigmoid:

g(z) = 1 / (1 + e^(-z))

It must operate element-wise, so it works on scalars, vectors, and matrices alike.

% element-wise sigmoid; the scalar 1 broadcasts over z, so no ones(size(z)) temporary is needed
g = 1 ./ (1 + exp(-z));
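
A quick sanity check at the Octave prompt (a minimal sketch; it assumes the sigmoid.m above is on the path):

sigmoid(0)                 % ans = 0.5000
sigmoid([-100 0 100])      % ans ≈ [0 0.5 1]
sigmoid(eye(2))            % works element-wise on matrices too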

costFunction.m and costFunctionReg.m

These compute the cost function and its gradient.

Without regularization:

costFunction:

J(theta) = (1/m) * sum_{i=1..m} [ -y(i) * log(h(x(i))) - (1 - y(i)) * log(1 - h(x(i))) ]

gradient:

dJ/dtheta_j = (1/m) * sum_{i=1..m} (h(x(i)) - y(i)) * x_j(i)

where h(x) = sigmoid(theta' * x).

J = sum(-y .* log(sigmoid(X * theta)) - (1 - y) .* log(1 - sigmoid(X * theta))) / m;

p = X' * (sigmoid(X * theta) - y);   % vectorized: X' * (h - y)
grad = p / m;
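
As a smoke test, a toy call (the data below is made up for illustration, and it assumes costFunction.m uses the assignment's signature [J, grad] = costFunction(theta, X, y)):

X = [1 0; 1 1; 1 2];    % intercept column plus one feature, three examples
y = [0; 0; 1];
theta = zeros(2, 1);
[J, grad] = costFunction(theta, X, y);
% with theta = 0, sigmoid(X * theta) is 0.5 everywhere, so J = log(2) ≈ 0.6931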

With regularization:

costFunctionReg:

J(theta) = (1/m) * sum_{i=1..m} [ -y(i) * log(h(x(i))) - (1 - y(i)) * log(1 - h(x(i))) ] + (lambda / (2m)) * sum_{j=1..n} theta_j^2

gradient (the bias term theta_0 is not regularized, so it splits into two cases):

dJ/dtheta_0 = (1/m) * sum_{i=1..m} (h(x(i)) - y(i)) * x_0(i)
dJ/dtheta_j = (1/m) * sum_{i=1..m} (h(x(i)) - y(i)) * x_j(i) + (lambda/m) * theta_j,   for j >= 1

J = sum(-y .* log(sigmoid(X * theta)) - (1 - y) .* log(1 - sigmoid(X * theta))) / m;
J = J + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % penalty skips theta(1), the bias term


t = X' * (sigmoid(X * theta) - y) / m;                 % unregularized gradient
grad(1) = t(1);                                        % bias term: no regularization
grad(2:end) = t(2:end) + (lambda / m) * theta(2:end);  % penalized terms
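
With theta = 0 the penalty term vanishes, which gives a cheap consistency check against the unregularized cost (a sketch reusing the toy X and y above; lambda = 1 is an arbitrary choice):

lambda = 1;
[J_reg, grad_reg] = costFunctionReg(zeros(2, 1), X, y, lambda);
% J_reg equals the unregularized J here, because sum(theta(2:end) .^ 2) = 0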

predict.m

The prediction function: output p = 1 when sigmoid(X * theta) >= 0.5, and p = 0 otherwise.

p_medium = sigmoid(X * theta);    % hypothesis values in (0, 1)
pos = find(p_medium >= 0.5);      % indices predicted positive
p(pos, 1) = 1;                    % p is pre-initialized to zeros(m, 1) by the skeleton
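
The same logic also fits on one line, since a logical comparison already yields 0/1; training accuracy can then be reported the way ex2.m does:

p = sigmoid(X * theta) >= 0.5;    % logical vector doubles as the 0/1 prediction
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);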

The fminunc function

Use Octave's built-in optimizer to find the optimal theta and the corresponding cost; the 'GradObj' option tells it that our cost function also returns the gradient.

% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);
% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
	fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
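
The same call pattern works for the regularized cost in ex2_reg; only the anonymous function changes (a sketch, assuming the costFunctionReg above):

initial_theta = zeros(size(X, 2), 1);
lambda = 1;
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, J] = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);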

mapFeature.m

Feature mapping: expand the two input features into polynomial terms up to degree 6, so logistic regression can fit a nonlinear decision boundary.

degree = 6;
out = ones(size(X1(:,1)));                        % bias column of ones
for i = 1:degree
    for j = 0:i
        out(:, end+1) = (X1.^(i-j)) .* (X2.^j);   % append X1^(i-j) * X2^j
    end
end

end
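
For degree 6 the two features expand into 1 + (2 + 3 + ... + 7) = 28 columns. A quick shape check (mapFeature takes two column vectors, as in the exercise; the inputs here are made up):

X_mapped = mapFeature([0.5; -0.5], [1; 2]);   % two toy examples
size(X_mapped)                                % ans = 2 28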