I am trying to code up logistic regression in Python using the SciPy fmin_bfgs function, but am running into some issues. I wrote functions for the logistic (sigmoid) transformation and the cost function.
I was facing the same issues. When I experimented with the different algorithm implementations available through scipy.optimize.minimize, I found that the truncated Newton conjugate gradient method ('TNC') worked well for finding optimal logistic regression parameters on my data set. It can be called like this:
Result = scipy.optimize.minimize(
    fun=logLikelihoodLogit,
    x0=np.array([-.1, -.03, -.01, .44, .92, .53, 1.8, .71]),
    args=(mX, vY),
    method='TNC',
    jac=likelihoodScore)
optimLogit = Result.x
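For completeness, here is a self-contained sketch of what the `fun`/`jac` pair could look like. The bodies of `logLikelihoodLogit` and `likelihoodScore` below are my own assumptions (the standard negative log-likelihood of logistic regression and its gradient), since the answer only names them; the toy data is likewise made up for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def logLikelihoodLogit(vBeta, mX, vY):
    # Negative log-likelihood of logistic regression (negated because
    # minimize() minimizes): -sum(y*z - log(1 + exp(z))), z = X @ beta.
    vZ = mX @ vBeta
    return -np.sum(vY * vZ - np.log(1.0 + np.exp(vZ)))

def likelihoodScore(vBeta, mX, vY):
    # Gradient of the negative log-likelihood: X^T (sigmoid(X @ beta) - y).
    vP = 1.0 / (1.0 + np.exp(-(mX @ vBeta)))
    return mX.T @ (vP - vY)

# Toy data set: 100 samples, an intercept column plus 2 features.
rng = np.random.default_rng(0)
mX = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])
vTrue = np.array([0.5, 1.0, -1.5])
vY = (rng.random(100) < 1.0 / (1.0 + np.exp(-(mX @ vTrue)))).astype(float)

Result = minimize(fun=logLikelihoodLogit,
                  x0=np.zeros(3),
                  args=(mX, vY),
                  method='TNC',
                  jac=likelihoodScore)
optimLogit = Result.x
print(Result.success, optimLogit)
```

Supplying the analytic gradient via `jac` is what makes the Newton-type methods effective here; without it, `minimize` falls back to finite-difference approximations, which are slower and can be less stable.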