
Logistic regression: minimizing J(theta) with the gradient descent algorithm

Hello everyone,

I want to minimize J(theta) for logistic regression using the gradient descent (GD) algorithm.

I have written code in both MATLAB and Python using GD, but the theta values I get are very small and quite different from those returned by MATLAB's fminunc function.

For example, for the given data set, running the GD algorithm with the following inputs:

num_iters=400; alpha=0.0001;
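For context, the GD update I am using looks roughly like this (a minimal Python sketch; the function and variable names here are illustrative, and X is assumed to carry a leading column of ones for the intercept):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.0001, num_iters=400):
    # X: (m, n) design matrix with a leading column of ones
    # y: (m,) vector of 0/1 labels
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        h = sigmoid(X @ theta)        # hypothesis, shape (m,)
        grad = (X.T @ (h - y)) / m    # gradient of J(theta)
        theta -= alpha * grad         # simultaneous update of all thetas
    return theta
```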

I got the following output:

  • theta:= [-0.002717 ; 0.010242; 0.000669]
  • For a student with scores 45 and 85, we predict an admission probability of 0.626000
  • Train Accuracy: 60.000000

while using fminunc (a built-in MATLAB function) with 400 iterations, I got the following output:

  • theta: = [-24.932761 ; 0.204406 ; 0.199616]
  • For a student with scores 45 and 85, we predict an admission probability of 0.774321
  • Train Accuracy: 89.000000
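One direct way to compare the two results is to evaluate the cost J(theta) on the training data for both theta vectors: the one with the lower cost is the better minimizer. A sketch of the cross-entropy cost (again assuming X has an intercept column; `eps` is just a guard I added against log(0)):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # Cross-entropy cost J(theta) for logistic regression
    h = sigmoid(X @ theta)
    eps = 1e-12  # avoid log(0) for saturated predictions
    return -np.mean(y * np.log(h + eps) + (1 - y) * np.log(1 - h + eps))
```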

Please help me decide which method is correct to adopt, and why.

I am uploading all the files of my code (both MATLAB and Python).

I am also unable to plot the decision boundary.
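For the boundary, my understanding is that with two features the linear decision boundary is the line where theta0 + theta1*x1 + theta2*x2 = 0, which can be solved for x2. A sketch of how I am attempting to plot it (function name and column layout are my assumptions):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line for interactive use
import matplotlib.pyplot as plt

def plot_decision_boundary(theta, X, y):
    # theta = [theta0, theta1, theta2]; X columns are [1, exam1, exam2]
    x1 = np.array([X[:, 1].min() - 2.0, X[:, 1].max() + 2.0])
    # Boundary: theta0 + theta1*x1 + theta2*x2 = 0  =>  solve for x2
    x2 = -(theta[0] + theta[1] * x1) / theta[2]
    plt.scatter(X[y == 1, 1], X[y == 1, 2], marker="+", label="Admitted")
    plt.scatter(X[y == 0, 1], X[y == 0, 2], marker="o", label="Not admitted")
    plt.plot(x1, x2, label="Decision boundary")
    plt.xlabel("Exam 1 score")
    plt.ylabel("Exam 2 score")
    plt.legend()
    return x1, x2
```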

Can anyone please help me to sort out my problem?

Thanks and regards




© 2018   Data Science Central ®
