LI Hua, HU Xiao-juan. Using Mutual Information for Selecting Continuous-Valued Attributes in Decision Tree Learning [J]. Journal of University of South China (Science and Technology), 2008, 22(2): 84-87.
Using Mutual Information for Selecting Continuous-Valued Attributes in Decision Tree Learning
Received: 2008-03-25
Keywords (Chinese): empirical concept learning; decision tree; minimum information entropy; mutual information; discretization
Keywords (English): decision trees; information entropy minimization; discretization; classification; mutual information
Author affiliations
LI Hua (1), HU Xiao-juan (2)
1. Department of Mathematics and Physics, Shijiazhuang Railway Institute, Shijiazhuang, Hebei 050043, China
2. School of Electrical and Electronic Engineering, Shijiazhuang Railway Institute, Shijiazhuang, Hebei 050043, China
Abstract (Chinese):
Fayyad's decision tree learning algorithm for continuous-valued attributes uses the rate of decrease of information entropy as the heuristic for selecting expansion attributes. To address its shortcomings, in particular its tendency to repeatedly select the same condition attributes, this paper introduces mutual information between attributes and proposes an improved algorithm: a decision tree learning algorithm for continuous-valued attributes based on mutual information. Its core idea is to use the rate of decrease of both information entropy and mutual information as the heuristic for selecting expansion attributes. Experimental results show that, compared with Fayyad's decision tree learning algorithm, the proposed algorithm lowers the rate at which the same expansion attribute is repeatedly selected in the decision tree, achieves a genuine reduction of information entropy, improves training and testing accuracy, and constructs better decision trees.
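As a point of reference for the Fayyad-style entropy heuristic mentioned above, the following is a minimal sketch in Python (not the authors' code; all names are illustrative): for a continuous-valued attribute, candidate thresholds are taken at midpoints between adjacent distinct sorted values, and the threshold minimizing the weighted class entropy of the induced binary split is returned.

import numpy as np

def entropy(labels):
    # Shannon entropy (in bits) of a 1-D array of class labels.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_cut_point(values, labels):
    # Return (threshold, weighted entropy) of the best binary split of a
    # continuous attribute; candidate thresholds are midpoints between
    # adjacent distinct sorted values.
    order = np.argsort(values)
    v, y = np.asarray(values)[order], np.asarray(labels)[order]
    best_t, best_e = None, np.inf
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue                        # no cut between equal values
        t = (v[i] + v[i - 1]) / 2.0         # candidate threshold
        left, right = y[:i], y[i:]
        e = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if e < best_e:
            best_t, best_e = t, e
    return best_t, best_e

The original Fayyad-Irani procedure additionally restricts candidate cut points to class boundary points and applies an MDL-based stopping criterion; the sketch above keeps only the entropy-minimization step that the abstract discusses.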
Abstract (English):
In this paper, we propose a learning algorithm that uses an information-entropy-minimization heuristic together with a mutual-information heuristic to select expansion attributes. For data sets whose condition attributes take continuous values, most current decision tree learning algorithms often select previously selected attributes again for branching. This repeated selection limits training and testing accuracy, and the structure of the decision tree may become unnecessarily complex. Therefore, when selecting an attribute, previously selected attributes, as well as other attributes that are highly correlated with them, should not be selected again. We use mutual information to avoid re-selecting such attributes during decision tree generation, and our test results show that the method achieves good performance.
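A minimal sketch of how the mutual-information idea in the abstract could be combined with the entropy heuristic, assuming a simple additive penalty (an assumed scoring rule, not the paper's exact criterion; select_attribute, split_entropy, used, and data are hypothetical names): a candidate attribute is penalized by its largest mutual information with the attributes already chosen on the current path, so that previously selected or strongly correlated attributes are unlikely to be chosen again.

import numpy as np

def mutual_information(x, y):
    # Mutual information (in bits) between two discrete 1-D arrays.
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    for i, j in zip(xi, yi):
        joint[i, j] += 1.0
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal distribution of x
    py = joint.sum(axis=0, keepdims=True)   # marginal distribution of y
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

def select_attribute(candidates, split_entropy, used, data):
    # Pick the candidate with the lowest split entropy after penalizing its
    # correlation (mutual information) with attributes already on the path.
    # split_entropy maps attribute name -> entropy of its best split;
    # data maps attribute name -> its (discretized) column of values.
    def score(a):
        penalty = max((mutual_information(data[a], data[u]) for u in used),
                      default=0.0)
        return split_entropy[a] + penalty   # lower is better
    return min(candidates, key=score)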