My new Python package libactivation is now on the Python Package Index (PyPI):

https://pypi.org/project/libactivation/

The package implements a set of activation functions, sigmoid and others, including the Rectified Linear Unit (ReLU), as well as their derivatives, for various machine learning purposes such as neural networks.
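To illustrate the kind of functions involved, here is a minimal sketch of two common activations and their derivatives. The function names below are illustrative only and are not necessarily the names libactivation exposes:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid: s(x) * (1 - s(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    """Rectified Linear Unit: max(0, x), element-wise."""
    return np.maximum(0.0, x)

def relu_derivative(x):
    """Derivative of ReLU: 1 where x > 0, else 0."""
    return (np.asarray(x) > 0).astype(float)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(0.0))  # 0.5
print(relu(x))       # [0. 0. 2.]
```

The derivatives are what make such a library useful for neural networks, since backpropagation needs the gradient of each activation.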

Development takes place on GitHub:

https://github.com/bquast/libactivation

Bugs can also be filed on GitHub:

https://github.com/bquast/libactivation/issues

Much of the inspiration came from my 2015 R package sigmoid, which was split out of my Recurrent Neural Network framework RNN:

https://en.m.wikipedia.org/wiki/Rnn_(software)
