athena.layers.functional
Utilities for common layers.
Module Contents

Functions

- make_positional_encoding(position, d_model): generate a positional encoding list
- collapse4d(x, name=None): reshape from [N T D C] -> [N T D*C]
- splice(x, context): splice a tensor along the last dimension with context
- gelu(x): Gaussian Error Linear Unit
- glu(x, axis=-1): Gated Linear Unit
- athena.layers.functional.make_positional_encoding(position, d_model)
Generate a positional encoding list.
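A minimal sketch of what this likely computes, assuming the standard sinusoidal encoding from the Transformer paper; the function name `make_positional_encoding_sketch` and the NumPy implementation are illustrative, not the library's actual code:

```python
import numpy as np

def make_positional_encoding_sketch(position, d_model):
    """Sinusoidal positional encoding with shape (1, position, d_model)."""
    pos = np.arange(position, dtype=np.float32)[:, np.newaxis]   # (position, 1)
    i = np.arange(d_model, dtype=np.float32)[np.newaxis, :]      # (1, d_model)
    # each pair of dimensions shares one frequency: 1 / 10000^(2k / d_model)
    angle_rates = 1.0 / np.power(10000.0, (2.0 * np.floor(i / 2.0)) / d_model)
    angles = pos * angle_rates                                   # (position, d_model)
    angles[:, 0::2] = np.sin(angles[:, 0::2])                    # sin on even indices
    angles[:, 1::2] = np.cos(angles[:, 1::2])                    # cos on odd indices
    return angles[np.newaxis, ...]                               # add batch dimension
```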
- athena.layers.functional.collapse4d(x, name=None)
Reshape from [N T D C] -> [N T D*C]. Uses tf.shape(x), which yields a dynamic shape tensor, rather than the static x.shape.
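A sketch of how such a dynamic reshape is typically written in TensorFlow (TF2 assumed; `collapse4d_sketch` is an illustrative name, not the library's code). Using tf.shape(x) keeps the op valid even when the static shape is partially unknown, e.g. a variable-length time axis:

```python
import tensorflow as tf

def collapse4d_sketch(x, name=None):
    """Reshape [N, T, D, C] -> [N, T, D*C] with dynamic shapes."""
    shape = tf.shape(x)   # dynamic shape tensor; works when x.shape has None dims
    n, t, d, c = shape[0], shape[1], shape[2], shape[3]
    return tf.reshape(x, [n, t, d * c], name=name)
```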
- athena.layers.functional.splice(x, context)
Splice a tensor along the last dimension with context.
Example:
>>> t = [[[1, 2, 3],
>>>       [4, 5, 6],
>>>       [7, 8, 9]]]
>>> splice_tensor(t, [0, 1]) =
>>>     [[[1, 2, 3, 4, 5, 6],
>>>       [4, 5, 6, 7, 8, 9],
>>>       [7, 8, 9, 7, 8, 9]]]
- Parameters
x – a tf.Tensor with shape (B, T, D), a.k.a. (N, H, W)
context – a list of context offsets
- Returns
spliced tensor with shape (B, T, D * len(context))
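A sketch that reproduces the example above, assuming frames are clamped at the sequence boundaries (as the repeated last frame in the example suggests); `splice_sketch` is an illustrative name, not the library's implementation:

```python
import tensorflow as tf

def splice_sketch(x, context):
    """Concatenate time-shifted copies of x (B, T, D) along the last axis."""
    t_len = tf.shape(x)[1]
    pieces = []
    for offset in context:
        # gather frames i + offset, clipped into [0, T-1] at the boundaries
        idx = tf.clip_by_value(tf.range(t_len) + offset, 0, t_len - 1)
        pieces.append(tf.gather(x, idx, axis=1))
    return tf.concat(pieces, axis=-1)   # (B, T, D * len(context))

t = tf.constant([[[1, 2, 3], [4, 5, 6], [7, 8, 9]]])
print(splice_sketch(t, [0, 1]))
# [[[1 2 3 4 5 6]
#   [4 5 6 7 8 9]
#   [7 8 9 7 8 9]]]
```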
- athena.layers.functional.gelu(x)
Gaussian Error Linear Unit. This is a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415
- Parameters
x – float Tensor to perform activation.
- Returns
x with the GELU activation applied.
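A sketch using the tanh approximation from the GELU paper; whether this function uses that approximation or the exact erf form is an assumption, and `gelu_sketch` is an illustrative name:

```python
import math
import tensorflow as tf

def gelu_sketch(x):
    """GELU via tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))."""
    cdf = 0.5 * (1.0 + tf.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * tf.pow(x, 3.0))))
    return x * cdf
```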
- athena.layers.functional.glu(x, axis=-1)
Gated Linear Unit.
Original paper: https://arxiv.org/abs/1612.08083
- Parameters
x – float Tensor to perform activation.
axis – dimension along which the input is split in half.
- Returns
x with the GLU activation applied.
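A sketch of the standard GLU from the paper: split the input in half along `axis` and gate one half with the sigmoid of the other (`glu_sketch` is an illustrative name, not the library's code). Note the size of `x` along `axis` must be even:

```python
import tensorflow as tf

def glu_sketch(x, axis=-1):
    """GLU(x) = a * sigmoid(b), where [a, b] = split(x, 2) along `axis`."""
    a, b = tf.split(x, num_or_size_splits=2, axis=axis)
    return a * tf.sigmoid(b)
```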