matlab - Computing cost function for linear regression with one variable without using matrix operations
I'm new to MATLAB and machine learning, and I tried to compute the cost function for gradient descent.

The function computecost takes 3 arguments:
- x: an m x 2 matrix
- y: an m-dimensional vector
- theta: a 2-dimensional vector

I have a working solution using matrix multiplication:
function j = computecost(x, y, theta)
    m = length(y);
    h = x * theta;
    serror = (h - y) .^ 2;
    j = sum(serror) / (2 * m);
end
But now I am trying to do the same thing without matrix multiplication:
function j = computecost(x, y, theta)
    m = length(y);
    s = 0;
    for i = 1:m
        h = x(i, 1) + theta(2) * x(i, 2);
        s = s + ((h - y(i)) ^ 2);
    end
    j = (1/2*m) * s;
end
However, I don't get the same result, and I'm sure the first version is correct (I have used it before).
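For example, if I rename the two versions to computecost_matrix and computecost_loop (so both can sit on the path at once) and call them on a small made-up dataset, they return different values of j:

% Toy data just for illustration; the first column of ones is the bias column
x = [1 1; 1 2; 1 3];
y = [2; 4; 6];
theta = [0.5; 1.5];

j1 = computecost_matrix(x, y, theta)   % renamed copy of the matrix version
j2 = computecost_loop(x, y, theta)     % renamed copy of the loop version - gives a different number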
You have two slight (but fundamental) errors. They're pretty simple errors, but they can easily be overlooked.
First, you forgot to include the bias term in the hypothesis:
h = x(i,1)*theta(1) + x(i,2)*theta(2);
%//       ^^^^^^^^^
Remember, the hypothesis when it comes to linear regression is equal to

theta^T * x

You didn't include the first of the theta terms - only the second term.

Second, your last statement that normalizes by (2*m) is off. You have it as:

j = (1/2*m) * s;

Because multiplication and division have the same order of operations, this is the same as (1/2)*m, and that's not what you want. Make sure (2*m) has brackets surrounding it to ensure that it evaluates as s / (2*m):

j = (1/(2*m)) * s;
This ensures that 2*m gets evaluated first, and the reciprocal of that is then multiplied by the sum of squared errors.
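If the precedence rule isn't obvious, a quick check at the MATLAB command prompt (with an arbitrary value for m) shows the difference:

m = 10;      % any value works for this check
1/2*m        % parsed as (1/2)*m, so this prints 5
1/(2*m)      % parsed as 1/(2*m), so this prints 0.0500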
Once you fix these problems, you will get the same results as the matrix formulation.
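Putting both fixes together, the corrected loop version would look something like this (same interface as your original function; just a sketch of one way to write it):

function j = computecost(x, y, theta)
    m = length(y);
    s = 0;
    for i = 1:m
        %// hypothesis now includes the bias term theta(1)
        h = x(i, 1) * theta(1) + x(i, 2) * theta(2);
        s = s + (h - y(i)) ^ 2;
    end
    %// brackets around (2*m) so the sum is divided by 2*m
    j = (1 / (2 * m)) * s;
end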
BTW, there is also a slight typo in your code around the m = length(y) line.