While the AdaBoost functions use adaptive boosting, the XGBoost functions use gradient boosting, which generalizes boosting to any differentiable loss function and applies optimizations, such as regularized objectives and parallelized tree construction, for better scalability.
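To illustrate the pluggable-loss idea, here is a minimal sketch using the xgboost Python package's native API, which accepts a custom objective as a callable returning the per-row gradient and Hessian of the loss. The synthetic data and the `squared_error` helper are illustrative assumptions, not taken from the original text.

```python
import numpy as np
import xgboost as xgb

# Synthetic regression data (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=200)
dtrain = xgb.DMatrix(X, label=y)

def squared_error(preds, dtrain):
    """Custom loss: gradient and Hessian of 0.5 * (pred - label)^2.

    Gradient boosting only needs these two derivatives, which is
    why any differentiable loss can be plugged in.
    """
    labels = dtrain.get_label()
    grad = preds - labels        # first derivative w.r.t. the prediction
    hess = np.ones_like(preds)   # second derivative is constant 1
    return grad, hess

# Pass the custom objective via the `obj` argument of xgb.train
booster = xgb.train({"max_depth": 3, "eta": 0.1}, dtrain,
                    num_boost_round=50, obj=squared_error)
print(booster.predict(dtrain)[:5])
```

Swapping in a different loss (for example, a robust Huber-style loss) only requires changing the returned gradient and Hessian; the boosting machinery itself stays the same.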