Kaggler 0.8.0 is released. It adds `model.BaseAutoML` and `model.AutoLGB` for automatic feature selection and hyper-parameter tuning using `hyperopt`.
The implementation is based on the solution of the team AvengersEnsmbl at the KDD Cup 2019 AutoML track. Details and the winners' solutions are available on the competition website.
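To give a rough idea of the kind of search this automates, here is a minimal, self-contained sketch of hyper-parameter tuning for LightGBM with `hyperopt`. It only illustrates the general technique; the search space, model parameters, and evaluation below are assumptions made for this example, not AutoLGB's internals or defaults.

```python
# Illustrative only: a bare-bones hyperopt search over LightGBM parameters.
# The search space and settings are assumptions, not AutoLGB's defaults.
import numpy as np
import lightgbm as lgb
from hyperopt import fmin, hp, tpe, Trials
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, random_state=42)

# Search space for a few common LightGBM parameters.
space = {
    'num_leaves': hp.quniform('num_leaves', 15, 127, 1),
    'learning_rate': hp.loguniform('learning_rate', np.log(0.01), np.log(0.3)),
    'colsample_bytree': hp.uniform('colsample_bytree', 0.5, 1.0),
}

def objective(params):
    clf = lgb.LGBMClassifier(
        num_leaves=int(params['num_leaves']),
        learning_rate=params['learning_rate'],
        colsample_bytree=params['colsample_bytree'],
        n_estimators=100,
    )
    # Return negative AUC because fmin() minimizes the objective.
    return -cross_val_score(clf, X, y, cv=3, scoring='roc_auc').mean()

best = fmin(objective, space, algo=tpe.suggest, max_evals=20, trials=Trials())
print(best)
```

AutoLGB packages this kind of search, along with feature selection, behind a much simpler interface.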
`model.BaseAutoML` is the base class, from which you can inherit to implement your own AutoML class. `model.AutoLGB` is the AutoML class for LightGBM. It's simple to use as follows:
```python
from kaggler.model import AutoLGB

model = AutoLGB(objective='binary', metric='auc')
model.tune(X_trn, y_trn)
model.fit(X_trn, y_trn)
p = model.predict(X_tst)
```
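As a hypothetical end-to-end run, the same calls can be exercised on synthetic data. The data generation, the pandas wrapping, and the AUC evaluation below are assumptions added for illustration (including the assumption that `tune` and `fit` take a pandas DataFrame/Series); only the AutoLGB calls mirror the snippet above.

```python
# Hypothetical end-to-end example on synthetic data.
import pandas as pd
from kaggler.model import AutoLGB
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=10000, random_state=42)
X = pd.DataFrame(X, columns=['f{}'.format(i) for i in range(X.shape[1])])
y = pd.Series(y)
X_trn, X_tst, y_trn, y_tst = train_test_split(X, y, test_size=0.2, random_state=42)

model = AutoLGB(objective='binary', metric='auc')
model.tune(X_trn, y_trn)   # feature selection and hyper-parameter search
model.fit(X_trn, y_trn)
p = model.predict(X_tst)
print('AUC: {:.4f}'.format(roc_auc_score(y_tst, p)))
```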
Other updates include:
- Add `.travis.yml` for Travis CI
- Add tests with `pytest`
- Use `flake8` for linting
- Fix the macOS installation issue (#34)
For more details, please check out the documentation and repository.
Any comments or contributions will be appreciated.