
lrCostFunction.m

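For reference, here is what the function below computes, restated in the standard course notation with h_theta(x) = sigmoid(theta' * x). The symbols are this note's assumptions, not part of the file itself:

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \Bigl[ -y^{(i)} \log h_\theta(x^{(i)}) - (1 - y^{(i)}) \log\bigl(1 - h_\theta(x^{(i)})\bigr) \Bigr] + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2

\frac{\partial J}{\partial \theta_0} = \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_0^{(i)}, \qquad \frac{\partial J}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_j^{(i)} + \frac{\lambda}{m}\,\theta_j \quad (j \ge 1)

Note that the intercept theta_0 (theta(1) in the code's 1-based indexing) is excluded from the regularization penalty.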
function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

m = length(y); % number of training examples

% Vectorized hypothesis: each row of X is one training example, so
% sigmoid(X * theta) yields the m x 1 vector of predictions.
hx = sigmoid(X * theta);

% Regularized cost. theta(1) corresponds to the intercept term (j = 0)
% and is excluded from the penalty, hence theta(2:end).
J = (1 / m) * (-y' * log(hx) - (1 - y)' * log(1 - hx)) ...
    + (lambda / (2 * m)) * (theta(2:end)' * theta(2:end));

% Gradient: the intercept component gets no regularization term; every
% other component gets an extra (lambda / m) * theta_j.
gradf = (1 / m) * (X(:, 1)' * (hx - y));                                  % intercept
gradb = (1 / m) * (X(:, 2:end)' * (hx - y)) + (lambda / m) * theta(2:end);

grad = [gradf; gradb];
grad = grad(:);

end
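A minimal usage sketch. The tiny dataset, the lambda value, and the finite-difference check are made up for illustration; sigmoid.m is assumed to be on the path, as in the course exercise (its body is the one-liner g = 1 ./ (1 + exp(-z))):

% Tiny made-up dataset: m = 3 examples, intercept column plus two features.
X = [1  0.5  1.5;
     1 -1.0  2.0;
     1  2.0 -0.5];
y = [1; 0; 1];
theta = zeros(3, 1);
lambda = 1;

[J, grad] = lrCostFunction(theta, X, y, lambda);
fprintf('Cost at theta = zeros(3,1): %f\n', J);  % log(2) ~= 0.6931, since sigmoid(0) = 0.5

% Sanity check: compare grad against a centered finite difference of J.
e = 1e-4;
numgrad = zeros(size(theta));
for j = 1:numel(theta)
  tp = theta;  tp(j) = tp(j) + e;
  tm = theta;  tm(j) = tm(j) - e;
  numgrad(j) = (lrCostFunction(tp, X, y, lambda) ...
                - lrCostFunction(tm, X, y, lambda)) / (2 * e);
end
fprintf('Max gradient difference: %g\n', max(abs(numgrad - grad)));  % should be ~1e-9 or smaller

Because the intercept gradient carries no regularization term, the finite-difference check also confirms that theta(1) is correctly excluded from the penalty.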

Machine Learning
