IJSDR | International Journal of Scientific Development and Research | ISSN: 2455-2631
Peer-reviewed, refereed, open-access, multidisciplinary monthly journal | Impact factor: 8.15 (calculated by Google Scholar and Semantic Scholar) | ESTD Year: 2016
Extreme Gradient Boosting using Squared Logistics Loss function
Authors:
Anju, Akaash Vishal Hazarika
Unique Id:
IJSDR1708010
Published In:
Volume 2 Issue 8, August-2017
Abstract:
Tree boosting has empirically proven to be a highly effective approach to predictive modeling, showing remarkable results on a vast array of problems. More recently, a tree boosting method known as XGBoost has gained popularity by winning numerous machine learning competitions. In this manuscript, we investigate how XGBoost differs from more traditional ensemble techniques. Moreover, we discuss the regularization techniques these methods offer and the effect they have on the models. In addition, we attempt to answer the question of why XGBoost seems to win so many competitions by providing some arguments for why tree boosting, and XGBoost in particular, is such an effective and versatile approach to predictive modeling. The core argument is that tree boosting can be seen to adaptively determine the local neighborhoods of the model, and thus to take the bias-variance tradeoff into consideration during model fitting. XGBoost introduces further improvements that allow it to handle the bias-variance tradeoff even more carefully. We also evaluate these techniques on data containing outliers. Additionally, we run XGBoost with a loss function called squared logistic loss (SqLL) and measure the resulting loss percentage, and we apply SqLL to other boosting algorithms as well.
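To make the SqLL idea concrete, the following is a minimal sketch of one common definition of the squared logistic loss, ℓ(y, f) = log²(1 + e^(−yf)) for labels y ∈ {−1, +1}, together with its first derivative with respect to the raw score f. The exact form used in the paper is not stated in this abstract, so the definition, the function names, and the idea of plugging the gradient into a boosting framework as a custom objective are all assumptions for illustration.

```python
import math

def sqll(y, f):
    """Squared logistic loss (assumed form): (log(1 + exp(-y*f)))**2.

    y is a label in {-1, +1}; f is the raw model score (margin).
    The loss is small when y*f is large and positive (confident, correct)
    and grows when the prediction disagrees with the label.
    """
    return math.log1p(math.exp(-y * f)) ** 2

def sqll_grad(y, f):
    """First derivative of sqll w.r.t. f.

    d/df [log(1 + e^{-yf})]^2 = -2*y*sigmoid(-y*f)*log(1 + e^{-yf}).
    A gradient like this (plus a Hessian) is what gradient-boosting
    frameworks such as XGBoost require for a custom objective.
    """
    s = 1.0 / (1.0 + math.exp(y * f))          # sigmoid(-y*f)
    return -2.0 * y * s * math.log1p(math.exp(-y * f))

# Sanity checks: the loss shrinks as the margin y*f grows.
print(sqll(1, 0.0))   # (log 2)^2, about 0.4805
print(sqll(1, 5.0))   # close to 0
```

Because the loss grows only poly-logarithmically in the negative margin (rather than linearly, as the plain logistic loss does asymptotically squared), such squared variants are sometimes motivated by robustness considerations, which would fit the abstract's mention of experiments on outliers.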
Keywords:
XGBoost, AdaBoost, Random Forest, Big Data, Boosting, Loss, Logistics Loss Function
Cite Article:
"Extreme Gradient Boosting using Squared Logistics Loss function", International Journal of Science & Engineering Development Research (www.ijsdr.org), ISSN:2455-2631, Vol.2, Issue 8, page no.54 - 61, August-2017, Available :http://www.ijsdr.org/papers/IJSDR1708010.pdf
Downloads:
000223215
Publication Details:
Published Paper ID: IJSDR1708010
Registration ID:170694
Published In: Volume 2 Issue 8, August-2017
DOI (Digital Object Identifier):
Page No: 54 - 61
Publisher: IJSDR | www.ijsdr.org
ISSN Number: 2455-2631