Flux re-exports all of the functions exported by the NNlib package.
Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
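For instance, applying an activation elementwise with Julia's broadcast dot syntax (a quick sketch; the array shape here is arbitrary):

```julia
using NNlib  # or `using Flux`, which re-exports these

xs = randn(Float32, 3, 4)

ys = relu.(xs)  # broadcast relu over every element
zs = σ.(xs)     # broadcast the sigmoid; every value lies in (0, 1)
```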
Activation functions:
NNlib.celu
NNlib.elu
NNlib.gelu
NNlib.hardsigmoid
NNlib.hardtanh
NNlib.leakyrelu
NNlib.lisht
NNlib.logcosh
NNlib.logsigmoid
NNlib.mish
NNlib.relu
NNlib.relu6
NNlib.rrelu
NNlib.selu
NNlib.sigmoid
NNlib.softplus
NNlib.softshrink
NNlib.softsign
NNlib.swish
NNlib.tanhshrink
NNlib.trelu
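Several of these take an optional parameter after the input, for example the negative-input slope of leakyrelu. A small illustration (the numeric values shown are approximate):

```julia
using NNlib

x = -1.5f0

relu(x)              # 0.0: negative inputs are clamped to zero
leakyrelu(x, 0.1f0)  # -0.15: slope 0.1 for negative inputs
elu(x)               # ≈ -0.78: smooth exponential curve below zero
```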
Softmax:
NNlib.softmax
NNlib.logsoftmax
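Unlike the scalar activations above, softmax and logsoftmax act on whole arrays, normalising along the first dimension by default. A brief sketch:

```julia
using NNlib

x = randn(Float32, 5, 3)  # scores for 5 classes, batch of 3

p = softmax(x)    # probabilities; each column sums to 1
sum(p; dims = 1)  # ≈ [1 1 1]

logsoftmax(x)     # numerically safer than log.(softmax(x))
```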
Pooling:
NNlib.maxpool
NNlib.meanpool
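Both pooling functions expect the WHCN layout (width, height, channels, batch) and take the window size as a tuple. A minimal sketch with arbitrary sizes:

```julia
using NNlib

x = rand(Float32, 8, 8, 3, 1)  # 8×8 image, 3 channels, batch of 1

size(maxpool(x, (2, 2)))   # (4, 4, 3, 1): keeps each window's maximum
size(meanpool(x, (2, 2)))  # (4, 4, 3, 1): averages over each window
```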
Convolution:
NNlib.conv
NNlib.depthwiseconv
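conv takes the input in WHCN layout and a weight array of size (kernel width, kernel height, input channels, output channels); keywords such as stride and pad control the usual convolution hyperparameters. A sketch under those assumptions:

```julia
using NNlib

x = rand(Float32, 28, 28, 3, 1)  # one 28×28 image with 3 channels
w = rand(Float32, 5, 5, 3, 7)    # 5×5 kernel, 3 in-channels, 7 out

size(conv(x, w))           # (24, 24, 7, 1): no padding shrinks the image
size(conv(x, w; pad = 2))  # (28, 28, 7, 1): "same"-style padding
```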
Upsampling:
NNlib.upsample_nearest
NNlib.upsample_bilinear
NNlib.pixel_shuffle
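The upsampling functions take a scale per spatial dimension, while pixel_shuffle trades channels for spatial resolution, so the channel count must be divisible by the square of the upscale factor. For example:

```julia
using NNlib

x = rand(Float32, 4, 4, 3, 1)

size(upsample_nearest(x, (2, 2)))   # (8, 8, 3, 1): repeats pixels
size(upsample_bilinear(x, (2, 2)))  # (8, 8, 3, 1): interpolates

y = rand(Float32, 4, 4, 12, 1)      # 12 channels = 3 × 2²
size(pixel_shuffle(y, 2))           # (8, 8, 3, 1): channels → space
```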
Batched operations:
NNlib.batched_mul
NNlib.batched_mul!
NNlib.batched_adjoint
NNlib.batched_transpose
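batched_mul multiplies matching slices of two 3-dimensional arrays, with the batch running along the third dimension; batched_adjoint and batched_transpose lazily flip the first two dimensions. A short sketch:

```julia
using NNlib

A = rand(Float32, 2, 3, 10)  # ten 2×3 matrices
B = rand(Float32, 3, 4, 10)  # ten 3×4 matrices

C = batched_mul(A, B)                   # ten 2×4 products: (2, 4, 10)
G = batched_mul(batched_adjoint(B), B)  # ten 4×4 Gram matrices

batched_mul!(C, A, B)  # in-place variant, writing into C
```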
Gather and scatter:
NNlib.gather
NNlib.scatter
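gather copies slices of the source at the given indices, while scatter does the reverse, combining colliding slices with the supplied reduction operator. A small sketch (the values here are arbitrary):

```julia
using NNlib

src = Float32[1 2 3;
              4 5 6]

gather(src, [1, 3, 3, 1])  # columns 1, 3, 3, 1 of src: a 2×4 array

# Columns 1 and 3 both land in destination slot 1 and are summed:
scatter(+, src, [1, 2, 1])  # 2×2 result: [1+3 2; 4+6 5]
```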