My new Python package libactivation is now on the Python Package Index (PyPI):

The package implements a number of activation functions - sigmoidal ones and others, including the Rectified Linear Unit (ReLU) - as well as their derivatives, for use in machine learning applications such as neural networks.
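As an illustration of what such activation functions and their derivatives look like, here is a minimal NumPy sketch of the sigmoid and ReLU pairs; the function names and signatures are illustrative and may differ from libactivation's actual API:

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    # Rectified Linear Unit: max(0, x), applied element-wise
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Derivative of ReLU: 1 where x > 0, 0 elsewhere
    # (the derivative at x = 0 is conventionally taken as 0 here)
    return (np.asarray(x) > 0).astype(float)
```

The derivatives are what a neural network needs during backpropagation, which is why such packages typically ship both each function and its derivative.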

Development takes place on GitHub:

Bugs can also be filed on GitHub:

Much of the inspiration came from my 2015 R package sigmoid, which was split out of my recurrent neural network framework RNN: