| Repository | Stars | Issues | Last commit | License | Language | Description |
|---|---|---|---|---|---|---|
| udacity/sagemaker-deployment | 445 | 17 | over 3 years ago | MIT | Jupyter Notebook | Code and associated files for deploying ML models within AWS SageMaker |
| teezeit/tuning_xgboost | 87 | 1 | about 9 years ago | | Python | Using consecutive (greedy) grid search with cross-validation to tune XGBoost hyperparameters in Python |
| gmontamat/gentun | 80 | 12 | over 2 years ago | Apache-2.0 | Python | Hyperparameter tuning for machine learning models using a distributed genetic algorithm |
| pfnet-research/autogbt-alt | 73 | 0 | about 7 years ago | MIT | Python | An experimental Python package that reimplements AutoGBT using LightGBM and Optuna |
| YCG09/xgbspark-text-classification | 43 | 0 | almost 8 years ago | Apache-2.0 | Scala | XGBoost on Spark for Chinese text classification |
| rstudio/sparkxgb | 40 | 14 | over 4 years ago | Other | R | R interface for XGBoost on Spark |
| KSpiliop/Fraud_Detection | 30 | 0 | almost 9 years ago | | Jupyter Notebook | Tuning XGBoost hyper-parameters with simulated annealing |
| davidcamilo0710/QATAR_2022_Prediction | 29 | 0 | over 2 years ago | | Jupyter Notebook | Qatar 2022 World Cup prediction from international matches played since the 90s, the teams' qualification results in their last matches, and each team's potential |
| LeoPetrini/XGBoost-in-Insurance-2017 | 12 | 0 | almost 9 years ago | | R | Data and code to reproduce results from my talk at the R in Insurance 2017 conference in Paris |
| talperetz/awesome-gradient-boosting | 8 | 0 | about 7 years ago | MIT | | A curated list of gradient boosting resources for data scientists |
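The consecutive (greedy) grid search that tuning_xgboost describes can be sketched in a few lines: tune one hyperparameter at a time, fix the best value found, and move on to the next parameter instead of exhausting the full Cartesian grid. The sketch below is a minimal illustration using a hypothetical stand-in objective; in the real workflow `cv_score` would be the cross-validated score of an XGBoost model trained with the given parameters, and the parameter names are only examples.

```python
def cv_score(params):
    """Hypothetical stand-in for a cross-validated XGBoost score.
    This toy surface peaks at max_depth=5, eta=0.1."""
    return -((params["max_depth"] - 5) ** 2 + (params["eta"] - 0.1) ** 2)

def greedy_grid_search(grid, score):
    """Consecutive (greedy) grid search: tune each hyperparameter
    in turn, fixing the best value found before moving on.
    Cost is sum(len(v)) evaluations instead of prod(len(v))."""
    best = {name: values[0] for name, values in grid.items()}
    for name, values in grid.items():
        scores = {v: score({**best, name: v}) for v in values}
        best[name] = max(scores, key=scores.get)
    return best

grid = {
    "max_depth": [3, 5, 7, 9],
    "eta": [0.3, 0.1, 0.01],
}
print(greedy_grid_search(grid, cv_score))  # {'max_depth': 5, 'eta': 0.1}
```

The trade-off is the usual greedy one: far fewer model fits than a full grid search, at the risk of missing optima that depend on interactions between parameters tuned in different passes.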
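The simulated-annealing approach used by Fraud_Detection can likewise be sketched with only the standard library: random-walk over the hyperparameter space, always accept improvements, and accept worse moves with probability `exp(delta / T)` under a cooling temperature so the search can escape local optima. The objective and parameter names below are illustrative stand-ins, not the repository's actual code; in practice `score` would again be a cross-validated XGBoost score.

```python
import math
import random

def anneal(score, init, neighbors, steps=500, t0=1.0, cooling=0.99, seed=0):
    """Simulated annealing for maximizing `score` over hyperparameters.
    Worse moves are accepted with probability exp(delta / T); the
    temperature T decays geometrically each step."""
    rng = random.Random(seed)
    current, current_score = init, score(init)
    best, best_score = current, current_score
    t = t0
    for _ in range(steps):
        candidate = neighbors(current, rng)
        s = score(candidate)
        delta = s - current_score
        if delta >= 0 or rng.random() < math.exp(delta / t):
            current, current_score = candidate, s
            if s > best_score:
                best, best_score = candidate, s
        t *= cooling  # cool down: late moves are nearly greedy
    return best, best_score

def score(p):
    """Hypothetical stand-in objective, peaking at depth 6, weight 2."""
    return -abs(p["max_depth"] - 6) - abs(p["min_child_weight"] - 2)

def neighbors(p, rng):
    """Perturb one randomly chosen integer hyperparameter by +/-1."""
    q = dict(p)
    key = rng.choice(list(q))
    q[key] = max(1, q[key] + rng.choice([-1, 1]))
    return q

best, best_score = anneal(score, {"max_depth": 3, "min_child_weight": 5}, neighbors)
print(best)
```

Compared with the greedy grid search above, annealing needs no predefined grid and can recover from a bad early choice, at the cost of extra tuning knobs (`t0`, `cooling`, `steps`) of its own.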