Machine Learning Exercise - Coursera

Tasks for the first programming assignment:

warmUpExercise.m
plotData.m
gradientDescent.m
computeCost.m
gradientDescentMulti.m
computeCostMulti.m
featureNormalize.m
normalEqn.m

warmUpExercise.m

The warm-up exercise is simple: return a 5x5 identity matrix:

A = eye(5);

plotData.m

Plotting task: plot the data and label what the x and y axes represent.

plot(x, y, 'rx', 'MarkerSize', 10); % Plot the data
ylabel('Profit in $10,000s'); % Set the y-axis label
xlabel('Population of City in 10,000s'); % Set the x-axis label

computeCost.m

The squared-error cost function, vectorized:

J = sum((X * theta - y).^2)/(2*m);
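The same vectorized cost can be sanity-checked in NumPy (an illustrative translation; the assignment itself uses Octave):

```python
import numpy as np

def compute_cost(X, theta, y):
    """Squared-error cost J(theta) = (1/2m) * sum((X @ theta - y).^2)."""
    m = y.shape[0]
    residual = X @ theta - y          # h_theta(x) - y for every sample
    return float(residual @ residual) / (2 * m)

# Tiny example: 3 samples, intercept column plus one feature.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])

# With theta = [0, 1] the fit is exact, so the cost is 0.
print(compute_cost(X, np.array([0.0, 1.0]), y))   # 0.0
```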

gradientDescent.m

Gradient descent for the coefficients. Both components are accumulated first, so the update is simultaneous:

temp1 = 0;
temp2 = 0;
for i = 1:m
        % prediction for sample i (plain assignment: += is Octave-only, not MATLAB)
        h = theta(1, 1) * X(i, 1) + theta(2, 1) * X(i, 2);
        temp1 = temp1 + (h - y(i, 1)) * X(i, 1);
        temp2 = temp2 + (h - y(i, 1)) * X(i, 2);
end

theta(1, 1) = theta(1, 1) - (alpha * temp1) / m;
theta(2, 1) = theta(2, 1) - (alpha * temp2) / m;
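The per-component loop computes the same step as the vectorized update theta := theta - (alpha/m) * X'(X*theta - y). A NumPy sketch (illustrative, not the assignment code) confirming the two agree:

```python
import numpy as np

def update_loop(X, y, theta, alpha):
    """One gradient-descent step, accumulating each component in a loop."""
    m = X.shape[0]
    temp = np.zeros_like(theta)
    for i in range(m):
        err = X[i] @ theta - y[i]       # prediction error for sample i
        temp += err * X[i]              # accumulate gradient contributions
    return theta - alpha * temp / m

def update_vectorized(X, y, theta, alpha):
    """The same step written as a single matrix expression."""
    m = X.shape[0]
    return theta - (alpha / m) * (X.T @ (X @ theta - y))

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 2.5, 3.5])
theta0 = np.zeros(2)

a = update_loop(X, y, theta0, alpha=0.1)
b = update_vectorized(X, y, theta0, alpha=0.1)
print(np.allclose(a, b))   # True: the two formulations match
```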

featureNormalize.m

Normalize the features with X = (X - mean(X)) ./ std(X), where mean returns the column means and std the column standard deviations.

mu = mean(X);      % column means
sigma = std(X);    % column standard deviations
for i = 1:size(X, 1)   % loop over rows; length(X) is wrong for matrices
  X(i, :) = (X(i, :) - mu) ./ sigma;   % ./= is not a valid operator
end
X_norm = X;
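The same normalization can be written without a row loop using broadcasting; a NumPy sketch for illustration (ddof=1 matches Octave's sample standard deviation):

```python
import numpy as np

def feature_normalize(X):
    """Return (X_norm, mu, sigma) with each column zero-mean, unit-std."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)   # ddof=1: sample std, as Octave's std
    X_norm = (X - mu) / sigma       # broadcasting applies mu/sigma to each row
    return X_norm, mu, sigma

X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])
X_norm, mu, sigma = feature_normalize(X)
print(X_norm.mean(axis=0))   # ~0 in every column
```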

computeCostMulti.m

Same as the univariate case above: the vectorized cost expression already works for any number of features.
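To see that the expression generalizes unchanged, here is a NumPy sketch (illustrative) evaluating the cost with two features plus the intercept:

```python
import numpy as np

def compute_cost_multi(X, theta, y):
    """Vectorized cost J = (1/2m) * (X@theta - y)' (X@theta - y)."""
    m = y.shape[0]
    r = X @ theta - y
    return float(r @ r) / (2 * m)

# Two features plus intercept column: the same function, no code changes.
X = np.array([[1.0, 2.0, 5.0],
              [1.0, 3.0, 6.0],
              [1.0, 4.0, 7.0]])
y = np.array([10.0, 13.0, 16.0])
theta = np.array([1.0, 1.0, 1.0])   # predictions: 8, 10, 12
print(compute_cost_multi(X, theta, y))
```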

gradientDescentMulti.m

Multivariate gradient descent, fully vectorized:

S = (1 / m) * (X' * (X * theta - y));   % gradient of the cost
theta = theta - alpha * S;
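Iterating this update drives the cost down; a small NumPy check (illustrative, with a made-up noiseless dataset) that repeated steps approach the least-squares coefficients:

```python
import numpy as np

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Run num_iters vectorized gradient-descent steps."""
    m = X.shape[0]
    for _ in range(num_iters):
        S = (X.T @ (X @ theta - y)) / m   # gradient of the cost
        theta = theta - alpha * S
    return theta

X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])             # exactly y = 1 + 2x

theta = gradient_descent_multi(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(theta)   # close to [1, 2]
```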

normalEqn.m

Closed-form solution via the normal equation, theta = pinv(X'*X) * X' * y (the doubled parentheses around X'*X in pinv are redundant):

theta = pinv(X' * X) * X' * y;
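A quick NumPy check (illustrative, on synthetic noiseless data with made-up coefficients) that the normal equation recovers theta directly, with no iteration or learning rate:

```python
import numpy as np

# Synthetic data generated from known coefficients theta_true.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
theta_true = np.array([3.0, -1.5, 0.5])
y = X @ theta_true                          # noiseless, so recovery is exact

theta = np.linalg.pinv(X.T @ X) @ X.T @ y   # theta = pinv(X'X) X' y
print(theta)   # ~[3.0, -1.5, 0.5]
```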