IJSDR | International Journal of Science & Engineering Development Research | ISSN: 2455-2631
International Peer-Reviewed & Refereed Open Access Journal
ISSN Approved Journal No: 2455-2631 | Impact Factor: 8.15 | Established: 2016

Published Paper Details
Paper Title: Extreme Gradient Boosting using Squared Logistics Loss function
Authors: Anju, Akaash Vishal Hazarika
Unique Id: IJSDR1708010
Published In: Volume 2 Issue 8, August-2017
Abstract: Tree boosting has empirically proven to be a highly effective approach to predictive modeling, showing remarkable results across a vast array of problems. More recently, a tree boosting method known as XGBoost has gained popularity by winning numerous machine learning competitions. In this manuscript, we investigate how XGBoost differs from more traditional ensemble techniques. We discuss the regularization techniques these methods offer and the effect they have on the resulting models. In addition, we attempt to answer the question of why XGBoost seems to win so many competitions by providing arguments for why tree boosting, and XGBoost in particular, is such a highly effective and versatile approach to predictive modeling. The core argument is that tree boosting can be seen to adaptively determine the local neighborhoods of the model, and thus to take the bias-variance tradeoff into consideration during model fitting. XGBoost introduces further improvements that allow it to handle the bias-variance tradeoff even more carefully. We also evaluated these techniques on data containing outliers. Additionally, we ran XGBoost with a loss function called squared logistics loss (SqLL) and measured the resulting loss, and we applied SqLL to other algorithms as well.
Keywords: XGBoost, AdaBoost, Random Forest, Big Data, Boosting, Loss, Logistics Loss Function
Cite Article: "Extreme Gradient Boosting using Squared Logistics Loss function", International Journal of Science & Engineering Development Research (www.ijsdr.org), ISSN: 2455-2631, Vol. 2, Issue 8, page no. 54-61, August 2017, Available: http://www.ijsdr.org/papers/IJSDR1708010.pdf
Downloads: 000223215
Publication Details: Published Paper ID: IJSDR1708010
Registration ID:170694
Published In: Volume 2 Issue 8, August-2017
DOI (Digital Object Identifier):
Page No: 54 - 61
Publisher: IJSDR | www.ijsdr.org
ISSN Number: 2455-2631
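The squared logistics loss used in the paper is not defined on this page. As a minimal sketch, assuming SqLL is simply the square of the standard logistic loss (an assumption, not the paper's stated definition), the gradient and Hessian that an XGBoost-style custom objective would need can be derived and checked numerically:

```python
import numpy as np

def sqll(y, margin):
    # Assumed SqLL form: square of the logistic loss log(1 + exp(-y * margin)),
    # with labels y in {-1, +1} and margin = raw model score.
    return np.log1p(np.exp(-y * margin)) ** 2

def sqll_grad_hess(y, margin):
    # First and second derivatives of SqLL w.r.t. the margin, the two
    # quantities an XGBoost custom objective must return per sample.
    z = -y * margin
    s = 1.0 / (1.0 + np.exp(-z))      # sigmoid(z)
    base = np.log1p(np.exp(z))        # plain logistic loss
    grad = 2.0 * base * (-y) * s
    # Uses y**2 == 1 for labels in {-1, +1}
    hess = 2.0 * s**2 + 2.0 * base * s * (1.0 - s)
    return grad, hess
```

In XGBoost's Python API, a callable returning `(grad, hess)` arrays like these can be passed as the custom objective to `xgboost.train`; the function and variable names above are illustrative, not taken from the paper.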



Open Access License Policy
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.