Niranjan41288/Ensemble-Methods-using-R

Language: R

git: https://github.com/Niranjan41288/Ensemble-Methods-using-R


Ensemble-Methods-using-R

This repository contains my individual project (dissertation) on ensemble methods. I first carried out a background study of different ensemble methods and then implemented Boosting, AdaBoost, Bagging, and Random Forest on top of underlying machine learning algorithms:

- Used boosting to improve the performance of weak learners such as decision stumps.
- Implemented bagging for decision trees (both regression and classification problems) and for the KNN classifier.
- Used random forests for classification trees.
- Implemented AdaBoost, a particular boosting algorithm, on top of logistic regression using different threshold values.
- Plotted graphs such as the error rate as a function of the number of boosting, bagging, and random forest iterations, and compared the results of bagging with those of boosting.
- Analysed classifier performance before and after applying ensemble methods.
- Evaluated the ensemble techniques with different model evaluation methods: cross-validation, MSE, PRSS, ROC curves, confusion matrices, and out-of-bag error estimation.
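The README does not include the project's code, so as an illustration, here is a minimal base-R sketch of the first technique listed above: AdaBoost with decision stumps as the weak learners. All function names (`fit_stump`, `adaboost`, etc.) are my own for this sketch, not taken from the project; labels are assumed to be coded as +1/-1.

```r
# Weak learner: a decision stump splitting one feature at one threshold.
# Exhaustively searches features, thresholds, and split direction for the
# stump with the lowest weighted error.
fit_stump <- function(X, y, w) {
  best <- list(err = Inf)
  for (j in seq_len(ncol(X))) {
    for (thr in unique(X[, j])) {
      for (sgn in c(1, -1)) {
        pred <- ifelse(X[, j] > thr, sgn, -sgn)
        err <- sum(w * (pred != y))
        if (err < best$err) best <- list(j = j, thr = thr, sgn = sgn, err = err)
      }
    }
  }
  best
}

predict_stump <- function(s, X) ifelse(X[, s$j] > s$thr, s$sgn, -s$sgn)

# AdaBoost: repeatedly fit a stump to the weighted data, then up-weight the
# examples the stump got wrong so the next stump focuses on them.
adaboost <- function(X, y, M = 20) {
  n <- nrow(X)
  w <- rep(1 / n, n)
  stumps <- vector("list", M)
  alpha <- numeric(M)
  for (m in seq_len(M)) {
    s <- fit_stump(X, y, w)
    err <- max(s$err, 1e-10)                 # clamp to avoid log(0)
    alpha[m] <- 0.5 * log((1 - err) / err)   # weight of this weak learner
    w <- w * exp(-alpha[m] * y * predict_stump(s, X))
    w <- w / sum(w)                          # renormalise example weights
    stumps[[m]] <- s
  }
  list(stumps = stumps, alpha = alpha)
}

# Final prediction: sign of the alpha-weighted vote over all stumps.
predict_adaboost <- function(model, X) {
  scores <- Reduce(`+`, Map(function(s, a) a * predict_stump(s, X),
                            model$stumps, model$alpha))
  sign(scores)
}
```

A quick usage example: on data where the label is the sign of the first feature, a handful of boosting rounds should fit the training set cleanly.

```r
set.seed(42)
X <- matrix(rnorm(200), ncol = 2)
y <- ifelse(X[, 1] > 0, 1, -1)
model <- adaboost(X, y, M = 5)
train_acc <- mean(predict_adaboost(model, X) == y)
```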
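Likewise, the bagging and out-of-bag estimation steps described above can be sketched with classification trees from the `rpart` package (which ships with standard R installations). The helper names and the `B` parameter are illustrative assumptions, not the project's actual code.

```r
# Minimal sketch of bagging classification trees with an out-of-bag (OOB)
# error estimate, using rpart for the base trees.
library(rpart)

# Fit B trees, each on a bootstrap sample; remember which rows each tree
# never saw (its out-of-bag rows).
bag_trees <- function(formula, data, B = 25) {
  n <- nrow(data)
  fits <- vector("list", B)
  oob <- vector("list", B)
  for (b in seq_len(B)) {
    idx <- sample(n, n, replace = TRUE)      # bootstrap sample
    fits[[b]] <- rpart(formula, data = data[idx, ], method = "class")
    oob[[b]] <- setdiff(seq_len(n), idx)     # rows left out of this sample
  }
  list(fits = fits, oob = oob, data = data)
}

# OOB error: each row is classified by majority vote over only those trees
# that did not see it during training, then compared with the true label.
oob_error <- function(bag, y) {
  n <- nrow(bag$data)
  votes <- matrix(NA_character_, n, length(bag$fits))
  for (b in seq_along(bag$fits)) {
    rows <- bag$oob[[b]]
    if (length(rows) > 0)
      votes[rows, b] <- as.character(predict(bag$fits[[b]],
                                             bag$data[rows, ], type = "class"))
  }
  pred <- apply(votes, 1, function(v) {
    v <- v[!is.na(v)]
    if (length(v) == 0) NA_character_ else names(which.max(table(v)))
  })
  mean(pred != as.character(y), na.rm = TRUE)
}
```

For example, on the built-in `iris` data the OOB error of a small bag of trees is typically a few percent, giving an honest performance estimate without a separate test set:

```r
set.seed(7)
bag <- bag_trees(Species ~ ., iris, B = 15)
err <- oob_error(bag, iris$Species)
```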