Currently I have successfully defined a custom kernel function (pre-computing the kernel matrix) with a plain `def`, and now I am using `GridSearchCV` to get the best hyperparameters.
Wrap the model and its parameter grid in a method:

    def GBC(self):
        model = GradientBoostingRegressor()
        p = {'learning_rate': [0.0005, 0.01, 0.02, 0.03],
             'n_estimators': list(range(1, 100)),
             'max_depth': [4]}
        return model, p
Then run the search in a separate method. GridSearchCV expands the parameter dict into the full grid internally, so you can pass it directly; wrapping it in ParameterGrid is unnecessary (GridSearchCV expects a dict or list of dicts, not a ParameterGrid object):

    def kernel(self, model, p):
        clf = GridSearchCV(model, p, cv=5,
                           scoring='neg_mean_squared_error', n_jobs=2)
        clf.fit(X, Y)
        return clf
With this approach you can keep each kind of model and its set of hyperparameters in a distinct method, and call them directly from main (note the * to unpack the (model, p) tuple):

    a = the_class()
    a.kernel(*a.GBC())
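Put together, the pattern above can be sketched as a minimal runnable example. The synthetic data from make_regression, the smaller grid, and the class name the_class are placeholders here so the search finishes quickly; substitute your own X, Y, and grid:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

class the_class:
    def GBC(self):
        # Model plus its hyperparameter grid, kept together in one place.
        model = GradientBoostingRegressor()
        p = {'learning_rate': [0.01, 0.1],
             'n_estimators': [10, 50],
             'max_depth': [4]}
        return model, p

    def kernel(self, model, p):
        # GridSearchCV builds the full grid from the dict by itself.
        clf = GridSearchCV(model, p, cv=3,
                           scoring='neg_mean_squared_error', n_jobs=2)
        clf.fit(X, Y)
        return clf

# Placeholder data; replace with your pre-computed kernel matrix and targets.
X, Y = make_regression(n_samples=60, n_features=5, random_state=0)

a = the_class()
clf = a.kernel(*a.GBC())
print(clf.best_params_)
```

After fitting, clf.best_params_ and clf.best_estimator_ hold the winning configuration and the refitted model.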
Attacking the problem from a slightly different angle: how about automated parameter tuning with auto-sklearn? It is a drop-in replacement for scikit-learn, and it frequently does a better job than manually tuned parameters.