TensorFlow API – Activation Functions

Activation Functions

The activation ops provide different types of nonlinearities for use in neural networks. These include:

  • smooth nonlinearities (sigmoid, tanh, elu, softplus, and softsign)
  • continuous but not everywhere differentiable functions (relu, relu6, and relu_x), and
  • random regularization (dropout)

All activation ops apply componentwise, and produce a tensor of the same shape as the input tensor.
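As a quick illustration of that element-wise, shape-preserving behavior, here is a minimal sketch (assuming the TensorFlow 1.x API):

import tensorflow as tf

x = tf.constant([[-1.0, 0.0, 2.0],
                 [3.0, -4.0, 5.0]])      # shape (2, 3)
y = tf.tanh(x)                           # applied element-wise
print(y.get_shape())                     # (2, 3): same shape as the input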

tf.nn.relu(features, name=None)

Computes rectified linear: max(features, 0).
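For example, negative entries are zeroed while non-negative entries pass through unchanged; a minimal TF 1.x sketch:

import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])
with tf.Session() as sess:
    print(sess.run(tf.nn.relu(x)))   # [0. 0. 0. 1. 3.]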

tf.nn.relu6(features, name=None)

Computes Rectified Linear 6: min(max(features, 0), 6).
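The extra cap at 6 only matters for large activations, as this small sketch (again assuming TF 1.x) shows:

import tensorflow as tf

x = tf.constant([-3.0, 2.0, 6.0, 8.0])
with tf.Session() as sess:
    print(sess.run(tf.nn.relu6(x)))   # [0. 2. 6. 6.]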

tf.nn.elu(features, name=None)

Computes exponential linear: exp(features) - 1 if features < 0, features otherwise.

See "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)".
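Unlike relu, elu saturates toward -1 for negative inputs instead of cutting off at 0; a minimal TF 1.x sketch:

import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0])
with tf.Session() as sess:
    # exp(x) - 1 for x < 0, identity for x >= 0
    print(sess.run(tf.nn.elu(x)))   # approx. [-0.8647 -0.6321  0.  1.]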


tf.nn.softplus(features, name=None)

Computes softplus: log(exp(features) + 1).
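Softplus is a smooth approximation of relu: it stays slightly above max(x, 0) everywhere and equals log(2) at 0. A minimal TF 1.x sketch:

import tensorflow as tf

x = tf.constant([-2.0, 0.0, 2.0])
with tf.Session() as sess:
    print(sess.run(tf.nn.softplus(x)))   # approx. [0.1269 0.6931 2.1269]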

tf.nn.softsign(features, name=None)

Computes softsign: features / (abs(features) + 1).
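Softsign squashes its input into (-1, 1), approaching the bounds only polynomially rather than exponentially like tanh. A minimal TF 1.x sketch:

import tensorflow as tf

x = tf.constant([-10.0, -1.0, 0.0, 1.0, 10.0])
with tf.Session() as sess:
    print(sess.run(tf.nn.softsign(x)))   # approx. [-0.9091 -0.5  0.  0.5  0.9091]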

tf.nn.dropout(x, keep_prob, noise_shape=None, seed=None, name=None)

Computes dropout.

With probability keep_prob, outputs the input element scaled up by 1 / keep_prob; otherwise outputs 0. The scaling is so that the expected sum is unchanged.

By default, each element is kept or dropped independently. If noise_shape is specified, it must be broadcastable to the shape of x, and only dimensions with noise_shape[i] == shape(x)[i] will make independent decisions.
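The sketch below (TF 1.x; the seed value is arbitrary) contrasts per-element dropout with per-row dropout via noise_shape:

import tensorflow as tf

x = tf.ones([4, 3])

# Each element kept with probability 0.5; survivors scaled by 1/0.5 = 2.0.
per_element = tf.nn.dropout(x, keep_prob=0.5, seed=1)

# noise_shape=[4, 1]: one keep/drop decision per row, broadcast across
# columns, so each row comes out either all 2.0 or all 0.0.
per_row = tf.nn.dropout(x, keep_prob=0.5, noise_shape=[4, 1], seed=1)

with tf.Session() as sess:
    print(sess.run(per_element))   # random mix of 0.0 and 2.0 entries
    print(sess.run(per_row))       # whole rows of 0.0 or 2.0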

tf.nn.bias_add(value, bias, data_format=None, name=None)

Adds bias to value. This is (mostly) a special case of tf.add where bias is restricted to 1-D. Broadcasting is supported, so value may have any number of dimensions.
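A minimal TF 1.x sketch: a length-3 bias is added to every row of a batch, regardless of the batch size:

import tensorflow as tf

value = tf.ones([2, 3])                 # batch of 2, 3 features each
bias = tf.constant([0.1, 0.2, 0.3])     # must be 1-D, length 3
with tf.Session() as sess:
    print(sess.run(tf.nn.bias_add(value, bias)))
    # [[1.1 1.2 1.3]
    #  [1.1 1.2 1.3]]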

tf.sigmoid(x, name=None)

Computes sigmoid of x element-wise. Specifically, y = 1 / (1 + exp(-x)).
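Sigmoid maps the real line into (0, 1), with sigmoid(0) = 0.5; a minimal TF 1.x sketch:

import tensorflow as tf

x = tf.constant([-2.0, 0.0, 2.0])
with tf.Session() as sess:
    print(sess.run(tf.sigmoid(x)))   # approx. [0.1192 0.5 0.8808]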

tf.tanh(x, name=None)

Computes hyperbolic tangent of x element-wise.
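Tanh is a zero-centered squashing function into (-1, 1); in fact tanh(x) = 2 * sigmoid(2x) - 1. A minimal TF 1.x sketch:

import tensorflow as tf

x = tf.constant([-2.0, 0.0, 2.0])
with tf.Session() as sess:
    print(sess.run(tf.tanh(x)))   # approx. [-0.964  0.  0.964]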
