Kaggler 0.3.7 Released

Changes:

  • Cython optimization for performance – the boundscheck(False), wraparound(False), and cdivision(True) compiler directives are used.
  • Adaptive learning rate – instead of $\frac{1}{\sqrt{n_i} + 1}$, $\frac{1}{\sqrt{\sum_t g_{i,t}^2} + 1}$ is used, where $g_{i,t}$ is the gradient of the associated weight $i$ at update $t$ (see the sketch after this list).
  • Type correction – the type of the index was changed from double to int.

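To make the new learning rate concrete, here is a minimal sketch of a single SGD weight update that uses it, with the three compiler directives applied as Cython pure-Python-mode decorators. This is not the actual Kaggler source; the names sgd_update, w, g2_sum, i, g, and alpha are made up for illustration.

import cython
import numpy as np

@cython.boundscheck(False)  # skip bounds checking on array indexing
@cython.wraparound(False)   # disallow negative indexing
@cython.cdivision(True)     # C division semantics (no zero-division check)
def sgd_update(w, g2_sum, i, g, alpha=0.01):
    # w      : weight vector (hypothetical)
    # g2_sum : running sum of squared gradients, one entry per weight (hypothetical)
    # g      : gradient of the loss w.r.t. w[i] for the current example
    g2_sum[i] += g * g
    # the effective learning rate alpha / (sqrt(sum of squared gradients) + 1)
    # shrinks as the accumulated squared gradient grows
    w[i] -= alpha * g / (np.sqrt(g2_sum[i]) + 1.0)

When the module is compiled with Cython, the decorators turn off the corresponding checks; under plain Python they are no-ops, so the sketch runs either way.
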
You can upgrade Kaggler either by using pip:

$ (sudo) pip install -U Kaggler

or from the source on GitHub:

$ git fetch origin
$ git rebase origin/master
$ python setup.py build_ext --inplace
$ (sudo) python setup.py install

I haven’t had a chance to use it with real competition data yet – after the Avazu competition, I deleted the whole build directory 🙁 – so I don’t have numbers on how much faster (or slower?!) it is after these changes.

I will jump into another competition soon, and let you know how it works. 🙂

Author: jeongyoonlee
