ConstraintGBM is a machine learning project that focuses on fairness and Neyman-Pearson tasks. The project is built on two popular gradient boosting libraries, XGBoost and LightGBM. It customizes their objective functions to incorporate additional information, such as the previously predicted scores. This customization enables us to address fairness and Neyman-Pearson constraints in our predictions.
- The customized objective function is implemented in C++.
- The main algorithm logic is implemented in Python, so it only needs to call the training APIs of XGBoost and LightGBM.
- Only the Python interface is supported.
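To illustrate the mechanism the bullets above describe, here is a minimal NumPy sketch of a cost-sensitive logistic objective written in the `(grad, hess)` form that both XGBoost's and LightGBM's custom-objective hooks expect. The function name and the simple class-weighting scheme are illustrative assumptions, not ConstraintGBM's actual objective, which additionally uses the last predicted score.

```python
import numpy as np

def cost_sensitive_logloss(preds, labels, neg_weight=5.0):
    """Illustrative custom objective in the (grad, hess) form used by
    XGBoost/LightGBM. `neg_weight` up-weights errors on the negative
    class, a Neyman-Pearson-style asymmetry; ConstraintGBM's real
    objective differs and is implemented in C++.
    """
    p = 1.0 / (1.0 + np.exp(-preds))            # sigmoid of raw scores
    w = np.where(labels == 1, 1.0, neg_weight)  # heavier penalty on negatives
    grad = w * (p - labels)                     # first-order gradient
    hess = w * p * (1.0 - p)                    # diagonal second-order term
    return grad, hess
```

A function with this signature can be passed, for example, as the `obj` argument of `xgboost.train` (wrapped to read labels from the `DMatrix`), which is how the libraries let Python-side logic steer the boosting updates.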
All of our installed dependencies are consistent with xgboost-2.0.0-dev and lightgbm-3.3.5.
To install ConstraintGBM, simply clone this repository and run the following command:
bash install.sh
Some examples are provided in the LightGBM-3.3.5/np_fair_test and xgboost/np_fair_test folders.
All experimental results in the paper can be reproduced with the following commands:
python ./test_adult_fair/adult_fair_lgb_random_para.py
python ./test_adult_fair/adult_fair_xgb_random_para.py
python ./test_compas_fair/compas_fair_lgb_random_para.py
python ./test_compas_fair/compas_fair_xgb_random_para.py
python ./test_credit_np/credit_np_lgb_random_para.py
python ./test_credit_np/credit_np_xgb_random_para.py
python ./test_drybean_np/drybean_np_lgb_random_para.py
python ./test_drybean_np/drybean_np_xgb_random_para.py
- ConstraintGBM is research software and therefore should not be used in production.
- Please open an issue if you find any problems; the developers will try to fix them or suggest alternatives.
Many thanks to the very active XGBoost and LightGBM communities for their enthusiastic answers to our questions, and to ChatGPT for assisting us in writing the code.