Machine Learning Exercise 5 (Coursera)

Files to complete in programming assignment 5:

linearRegCostFunction.m
learningCurve.m
polyFeatures.m
validationCurve.m

linearRegCostFunction.m

Cost function and gradient for regularized linear regression.
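In the lecture's notation (with $\theta_0$ the unregularized bias term), the code below computes

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( \theta^T x^{(i)} - y^{(i)} \right)^2 + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2$$

$$\frac{\partial J}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \left( \theta^T x^{(i)} - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} \theta_j \qquad (j \geq 1;\ \text{omit the last term for } j = 0)$$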

% m = length(y) is set earlier in the course template.
% Squared-error cost, plus the regularization term; theta(1),
% the bias term, is not regularized.
J = sum((X * theta - y) .^ 2) / (2 * m);
J = J + sum(theta(2 : end) .^ 2) * lambda / (2 * m);

% Gradient of the unregularized cost for every component, then add
% the regularization term for all components except the bias.
grad = X' * (X * theta - y) / m;
grad(2 : end) = grad(2 : end) + (lambda / m) * theta(2 : end);
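As a quick sanity check (not part of the assignment), the analytic gradient can be compared against a central-difference estimate. The snippet below is a minimal sketch on a made-up toy dataset, assuming the standard [J, grad] = linearRegCostFunction(X, y, theta, lambda) signature:

% Toy data: 5 examples, bias column plus one feature
X = [ones(5, 1), (1:5)'];
y = [2; 4; 5; 4; 6];
theta = [1; 0.5];
lambda = 1;
[~, grad] = linearRegCostFunction(X, y, theta, lambda);

% Central-difference numerical gradient
h = 1e-4;
num_grad = zeros(size(theta));
for j = 1:numel(theta)
    e = zeros(size(theta));
    e(j) = h;
    num_grad(j) = (linearRegCostFunction(X, y, theta + e, lambda) ...
                 - linearRegCostFunction(X, y, theta - e, lambda)) / (2 * h);
end
disp([grad num_grad])   % the two columns should agree to several decimals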

learningCurve.m

Learning curves: training and cross-validation error as a function of the number of training examples.

% Helper that fits theta by minimizing the regularized cost with fmincg
function [theta] = trainLinearReg(X, y, lambda)
initial_theta = zeros(size(X, 2), 1); 
costFunction = @(t) linearRegCostFunction(X, y, t, lambda);
options = optimset('MaxIter', 200, 'GradObj', 'on');
theta = fmincg(costFunction, initial_theta, options);
end

function [error_train, error_val] = ...
    learningCurve(X, y, Xval, yval, lambda) % X is the training set, Xval the cross-validation set
m = size(X, 1);

error_train = zeros(m, 1);
error_val   = zeros(m, 1);

n = size(Xval, 1);
for i = 1:m
    % Train on the first i training examples only
    theta = trainLinearReg(X(1:i, :), y(1:i), lambda);
    % Training error over those same i examples (no regularization term)
    error_train(i) = sum((X(1:i, :) * theta - y(1:i)) .^ 2) / (2 * i);
    % Validation error is always measured on the full cross-validation set
    error_val(i) = sum((Xval * theta - yval) .^ 2) / (2 * n);
end
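To see the bias/variance picture, both curves are plotted against the number of training examples. A minimal sketch (the course's ex5.m script does the equivalent, passing lambda = 0 so both errors are plain unregularized costs; the bias columns are assumed to be added by the caller):

[error_train, error_val] = learningCurve([ones(m, 1) X], y, ...
    [ones(size(Xval, 1), 1) Xval], yval, 0);
plot(1:m, error_train, 1:m, error_val);
legend('Train', 'Cross Validation');
xlabel('Number of training examples');
ylabel('Error');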

polyFeatures.m

Map a single feature to polynomial features.

% p is the highest polynomial degree; column i holds the i-th power of X
X_poly = zeros(numel(X), p);
for i = 1:p
  X_poly(:, i) = X .^ i;
end
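Because the powers of X span wildly different scales, the mapped features should be normalized before training. A minimal sketch of that step, assuming the course-provided featureNormalize helper returns [X_norm, mu, sigma]:

% Map to degree-8 polynomial features, normalize, then prepend the bias column
X_poly = polyFeatures(X, 8);
[X_poly, mu, sigma] = featureNormalize(X_poly);
X_poly = [ones(size(X_poly, 1), 1), X_poly];

The same mu and sigma must be reused when mapping Xval and Xtest; otherwise the learned theta is applied to features on a different scale.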

validationCurve.m

This curve shows how lambda affects the training and cross-validation errors.

lambda_vec = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10]';
error_train = zeros(length(lambda_vec), 1);
error_val = zeros(length(lambda_vec), 1);
m = size(X, 1);
n = size(Xval, 1);
for i = 1:length(lambda_vec)
    lambda = lambda_vec(i);
    % Train with the current lambda, but report both errors
    % without the regularization term
    theta = trainLinearReg(X, y, lambda);
    error_train(i) = sum((X * theta - y) .^ 2) / (2 * m);
    error_val(i) = sum((Xval * theta - yval) .^ 2) / (2 * n);
end
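With both error vectors computed, the usual choice is the lambda that minimizes the cross-validation error. A small sketch (not part of the assignment code; lambda_vec, error_train, and error_val as computed above):

[~, idx] = min(error_val);
best_lambda = lambda_vec(idx);
fprintf('Best lambda: %g\n', best_lambda);
plot(lambda_vec, error_train, lambda_vec, error_val);
legend('Train', 'Cross Validation');
xlabel('lambda');
ylabel('Error');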