# Logistic regression: minimize J(theta) by Gradient Descent algorithm

Hello everyone,

I want to minimize J(theta) for logistic regression using the Gradient Descent (GD) algorithm.

I have written code in both MATLAB and Python using GD, but the theta values I get are very small and quite different from those returned by MATLAB's fminunc function.
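For reference, this is a minimal sketch of the batch GD update I am using (names like `gradient_descent` are illustrative; `X` is assumed to already include a leading column of ones for the intercept):

```python
import numpy as np

def sigmoid(z):
    # Logistic function h(z) = 1 / (1 + e^-z)
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.0001, num_iters=400):
    # X: (m, n) design matrix with an intercept column of ones
    # y: (m,) vector of 0/1 labels
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        h = sigmoid(X @ theta)        # predicted probabilities
        grad = (X.T @ (h - y)) / m    # gradient of J(theta)
        theta -= alpha * grad         # simultaneous update of all theta_j
    return theta
```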

For example, on the given dataset, running the GD algorithm with the following input:

`num_iters = 400; alpha = 0.0001;`

produced the following output:

• theta = [-0.002717; 0.010242; 0.000669]
• For a student with scores 45 and 85, we predict an admission probability of 0.626000
• Train Accuracy: 60.000000

while fminunc (a built-in MATLAB function) with 400 iterations gave the following output:

• theta = [-24.932761; 0.204406; 0.199616]
• For a student with scores 45 and 85, we predict an admission probability of 0.774321
• Train Accuracy: 89.000000

I am uploading all the files of my code (both MATLAB and Python).

I am also unable to plot the decision boundary.
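For two features, the decision boundary is the line where theta0 + theta1\*x1 + theta2\*x2 = 0 (i.e. where the predicted probability is exactly 0.5), so it can be drawn by solving for x2 at two endpoint values of x1. A sketch of that calculation, using the fminunc theta from above (the endpoint values 30 and 100 are just illustrative exam scores):

```python
import numpy as np

def boundary_x2(theta, x1):
    # Solve theta0 + theta1*x1 + theta2*x2 = 0 for x2
    return -(theta[0] + theta[1] * x1) / theta[2]

theta = np.array([-24.932761, 0.204406, 0.199616])  # fminunc result from the post
x1 = np.array([30.0, 100.0])  # two endpoints spanning the score range
x2 = boundary_x2(theta, x1)
```

The boundary can then be overlaid on the scatter plot of the training data with `plt.plot(x1, x2)` from matplotlib.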

Thanks and regards
