Building a Machine Learning model

For building the model, we use soft voting classification with AdaBoost \citep{freund1995desicion} and XGBoost \citep{chen2016xgboost} as the estimators, with a higher weight assigned to the XGBoost classifier. AdaBoost is a meta-estimator that fits the base algorithm to the dataset and then reweights incorrectly classified instances so that subsequent classifiers focus on them. XGBoost is an implementation of gradient-boosted trees optimized for speed and accuracy. The model is built with the scikit-learn \citep{pedregosa2011scikit} and XGBoost \citep{chen2016xgboost} Python packages.
The parameters for the AdaBoost model are as follows:
\begin{itemize}
\item \texttt{algorithm="SAMME"}: use the discrete SAMME boosting algorithm.
\item \texttt{learning\_rate=0.01}: scales the contribution of each classifier.
\item \texttt{n\_estimators=800}: the maximum number of boosting rounds before termination.
\item \texttt{base\_estimator=DecisionTreeClassifier(max\_depth=4)}: use a decision tree with maximum depth 4 as the base classifier.
\end{itemize}
The parameters for the XGBoost model are as follows: