Using sklearn, how do I find depth of a decision tree?


Question


I am training a decision tree with sklearn. When I use:

dt_clf = tree.DecisionTreeClassifier()

the max_depth parameter defaults to None. According to the documentation, if max_depth is None, nodes are expanded until all leaves are pure or until all leaves contain fewer than min_samples_split samples.

After fitting my model, how do I find out what depth the tree actually reached? The get_params() function doesn't help: even after fitting, it still reports max_depth as None.

How can I get the actual number for max_depth?

Docs: https://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html


Answer 1:


Access the max_depth attribute of the underlying tree_ object:

from sklearn import tree

X = [[0, 0], [1, 1]]
Y = [0, 1]

clf = tree.DecisionTreeClassifier()
clf = clf.fit(X, Y)

# Depth of the fitted tree (1 for this toy dataset)
print(clf.tree_.max_depth)
# 1

You can see which other attributes the underlying tree object exposes with:

help(clf.tree_)

These include max_depth, node_count, and other lower-level attributes.
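
If you are on a newer scikit-learn release (0.21 or later, if I recall correctly), the classifier itself also exposes get_depth() and get_n_leaves() helpers, so you may not need to reach into tree_ at all. A minimal sketch, assuming such a version is installed:

from sklearn import tree

X = [[0, 0], [1, 1], [1, 0], [0, 1]]
Y = [0, 1, 1, 0]

clf = tree.DecisionTreeClassifier().fit(X, Y)

# Estimator-level helpers (available in newer scikit-learn versions)
print(clf.get_depth())       # depth the fitted tree actually reached
print(clf.get_n_leaves())    # number of leaves

# Equivalent low-level attributes on the Tree object
print(clf.tree_.max_depth)
print(clf.tree_.node_count)

Note the distinction: max_depth in get_params() is the constraint you passed to the constructor (still None), whereas tree_.max_depth and get_depth() report the depth the tree actually reached after fitting.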



Source: https://stackoverflow.com/questions/54499114/using-sklearn-how-do-i-find-depth-of-a-decision-tree
