
costFunctionReg.m
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.

hx = sigmoid(X * theta); % hypothesis h_theta(x), an m x 1 vector

% Regularized cost; the intercept term theta(1) is not regularized
J = (1 / m) * (-y' * log(hx) - (1 - y)' * log(1 - hx)) ...
    + (lambda / (2 * m)) * (theta(2:end)' * theta(2:end));

% Gradient: no regularization term for the intercept, lambda/m * theta_j for the rest
gradf = (1 / m) * (X(:, 1)' * (hx - y));                                   % partial derivative w.r.t. theta(1)
gradb = (1 / m) * (X(:, 2:end)' * (hx - y)) + (lambda / m) * theta(2:end); % partial derivatives w.r.t. theta(2:end)
grad = [gradf; gradb];

% =============================================================

end
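The function returns the regularized cross-entropy cost together with its gradient, leaving the intercept term theta(1) unregularized. For a quick sanity check it can be called on a tiny hand-made dataset, as in the minimal sketch below. The sketch is only an illustration: it assumes the companion sigmoid.m from the same exercise is on the Octave path, and the data values are invented for the example. With an all-zero theta the regularization term vanishes and every hypothesis equals 0.5, so the cost reduces to log(2), about 0.6931, regardless of the labels, which makes a convenient check.

% Minimal usage sketch with made-up data. Assumes the companion sigmoid.m
% from the same exercise is on the path, i.e. a function equivalent to:
%   function g = sigmoid(z), g = 1 ./ (1 + exp(-z)); end
X = [ones(3, 1), [1 2; 2 3; 3 4]];   % 3 examples: intercept column plus 2 features
y = [0; 0; 1];                       % binary labels
theta = zeros(3, 1);                 % all-zero initial parameters
lambda = 1;                          % regularization strength

[J, grad] = costFunctionReg(theta, X, y, lambda);
fprintf('Cost at initial theta: %f\n', J);   % expected: log(2) ~ 0.693147
disp('Gradient at initial theta:');
disp(grad);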

Machine Learning
