Scikit-learn: finding the best C & gamma with GridSearchCV

I am using GridSearchCV to find the best parameters (C & gamma) for my SVR model, but when I run it I get stuck on an error. What is wrong with this code?

from sklearn.model_selection import KFold 
C_range = np.logspace(-2, 10, 13) 
gamma_range = np.logspace(-9, 3, 13) 
param_grid = dict(gamma=gamma_range, C=C_range) 
cv = KFold(n_splits=5, shuffle=False, random_state=None) 
grid = GridSearchCV(SVR(kernel='rbf'), param_grid=param_grid, cv=cv) 
grid.fit(X, y) 

print("The best parameters are %s with a score of %0.2f" 
    % (grid.best_params_, grid.best_score_)) 

Answer

n_splits is not a parameter accepted by sklearn.cross_validation.ShuffleSplit; instead, it is a parameter of sklearn.model_selection.ShuffleSplit.

From the sklearn.cross_validation code base:

class ShuffleSplit(BaseShuffleSplit): 
    """Random permutation cross-validation iterator. 

    .. deprecated:: 0.18 
     This module will be removed in 0.20. 
     Use :class:`sklearn.model_selection.ShuffleSplit` instead. 
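
As a side-by-side sketch of that difference (assuming scikit-learn 0.18, where the deprecated and the new modules still coexist): the old class takes the number of samples and an n_iter argument, while the new class takes n_splits and only sees the data when .split() is called.

# Deprecated API (sklearn.cross_validation): the constructor needs the
# number of samples up front, and the repetition count is called n_iter.
from sklearn.cross_validation import ShuffleSplit as OldShuffleSplit
old_cv = OldShuffleSplit(100, n_iter=5, test_size=0.2, random_state=0)

# Current API (sklearn.model_selection): the parameter is n_splits, and
# the data are only passed later via .split(X, y).
from sklearn.model_selection import ShuffleSplit
new_cv = ShuffleSplit(n_splits=5, test_size=0.2, random_state=0)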

I updated it and that error is fixed, but now I get another error... TypeError: 'KFold' object is not iterable –

The `cv` param cannot take a `KFold` object as input. If you want 5 splits, give `cv=5` as input; it will use `KFold` or `StratifiedKFold` to generate the splits. See the documentation [here](http://scikit-learn.org/stable/modules/cross_validation.html#computing-cross-validated-metrics). – tihom
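
For reference, a minimal sketch of the commenter's suggestion, assuming scikit-learn 0.18+ with GridSearchCV and SVR imported from their current locations; the X and y below are random placeholders standing in for the asker's data:

import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

# Placeholder data; the asker's real X and y are not shown in the question.
X = np.random.rand(50, 3)
y = np.random.rand(50)

C_range = np.logspace(-2, 10, 13)
gamma_range = np.logspace(-9, 3, 13)
param_grid = dict(gamma=gamma_range, C=C_range)

# Passing an integer lets GridSearchCV build the splitter internally,
# which sidesteps the "'KFold' object is not iterable" error.
grid = GridSearchCV(SVR(kernel='rbf'), param_grid=param_grid, cv=5)
grid.fit(X, y)

print("The best parameters are %s with a score of %0.2f"
      % (grid.best_params_, grid.best_score_))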