## TensorFlow API – Activation Function

The activation ops provide different types of nonlinearities for use in neural networks. These include smooth nonlinearities (`sigmoid`, `tanh`, `elu`, `softplus`, and `softsign`), continuous but not everywhere differentiable functions (`relu`, `relu6`, and `relu_x`), and random regularization (`dropout`). All activation ops apply componentwise and produce a tensor of the same shape as the input.
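To make the componentwise behavior concrete, here is a minimal NumPy sketch of the math these ops compute (in TensorFlow itself you would call the corresponding `tf.nn.relu`, `tf.nn.relu6`, `tf.nn.sigmoid`, `tf.nn.softplus`, and `tf.nn.softsign` directly on a tensor; the NumPy versions below are illustrations, not the library implementations):

```python
import numpy as np

# Elementwise definitions mirroring the tf.nn activation ops.
def relu(x):
    return np.maximum(0.0, x)          # max(x, 0)

def relu6(x):
    return np.minimum(relu(x), 6.0)    # min(max(x, 0), 6)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))    # 1 / (1 + e^-x)

def softplus(x):
    return np.log1p(np.exp(x))         # log(1 + e^x), smooth relu

def softsign(x):
    return x / (1.0 + np.abs(x))       # x / (1 + |x|)

x = np.array([[-2.0, 0.0, 3.0],
              [ 7.0, -0.5, 1.0]])

# Each op is applied componentwise, so the output shape
# always matches the input shape.
for f in (relu, relu6, sigmoid, softplus, softsign):
    assert f(x).shape == x.shape
```

The shape check at the end demonstrates the property stated above: every activation op returns a tensor of the same shape as its input.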