mlair.model_modules.flatten¶
Module Contents¶
Functions¶
- get_activation – Apply activation on a given input layer.
- flatten_tail – Flatten output of convolutional layers.
Attributes¶
- mlair.model_modules.flatten.__date__ = 2019-12-02¶
- mlair.model_modules.flatten.get_activation(input_to_activate: tensorflow.keras.layers, activation: Union[Callable, str], **kwargs)¶

  Apply activation on a given input layer.
This helper function is able to handle advanced keras activations as well as strings for standard activations.
- Parameters
  - input_to_activate – keras layer to apply the activation on
  - activation – activation to apply on `input_to_activate`. Can be a standard keras activation string or an activation layer.
  - kwargs – keyword arguments used inside the activation layer
- Returns
  the activated layer
```python
input_x = ...  # your input data
x_in = keras.layer(<without activation>)(input_x)

# get activation via string
x_act_string = get_activation(x_in, 'relu')
# or get activation via a layer callable
x_act_layer = get_activation(x_in, keras.layers.advanced_activations.ELU)
```
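The behaviour described above boils down to a string-or-callable dispatch. The framework-free sketch below illustrates that pattern only; `apply_activation` and `STANDARD_ACTIVATIONS` are hypothetical names for illustration and are not part of MLAir or keras.

```python
from typing import Callable, Union

# Hypothetical lookup table standing in for keras' built-in string
# activations (illustration only, not the MLAir implementation).
STANDARD_ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "linear": lambda x: x,
}


def apply_activation(value: float, activation: Union[Callable, str]) -> float:
    """Apply `activation` to `value`, accepting a string or a callable."""
    if isinstance(activation, str):
        # standard activation requested by name
        return STANDARD_ACTIVATIONS[activation](value)
    # advanced activations arrive as callables and are applied directly
    return activation(value)


print(apply_activation(-2.0, "relu"))            # string dispatch -> 0.0
print(apply_activation(-2.0, lambda x: 0.1 * x)) # callable dispatch
```

The same two call styles mirror the `'relu'` string and `ELU` layer-class arguments shown in the example above.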
- mlair.model_modules.flatten.flatten_tail(input_x: tensorflow.keras.layers, inner_neurons: int, activation: Union[Callable, str], output_neurons: int, output_activation: Union[Callable, str], reduction_filter: int = None, name: str = None, bound_weight: bool = False, dropout_rate: float = None, kernel_regularizer: tensorflow.keras.regularizers = None)¶

  Flatten output of convolutional layers.
- Parameters
  - input_x – multidimensional keras layer (ConvLayer)
  - output_neurons – number of neurons in the last layer (must fit the shape of the labels)
  - output_activation – final activation function
  - name – name of the flatten tail
  - bound_weight – use `tanh` as inner activation if set to True, otherwise `activation`
  - dropout_rate – dropout rate applied between trainable layers
  - activation – activation to apply after conv and dense layers
  - reduction_filter – number of filters used for information compression on `input_x` before flatten()
  - inner_neurons – number of neurons in the inner dense layer
  - kernel_regularizer – regularizer applied to conv and dense layers
- Returns
  flatten branch with size n=output_neurons
```python
input_x = ...  # your input data
conv_out = Conv2D(*args)(input_x)  # your convolution stack
out = flatten_tail(conv_out, inner_neurons=64,
                   activation=keras.layers.advanced_activations.ELU,
                   output_neurons=4,
                   output_activation='linear', reduction_filter=64,
                   name='Main', bound_weight=False, dropout_rate=.3,
                   kernel_regularizer=keras.regularizers.l2())
model = keras.Model(inputs=input_x, outputs=[out])
```
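To see why the returned branch has size n=output_neurons, it can help to trace how the tensor size shrinks through the tail. The sketch below is a back-of-envelope shape calculation under stated assumptions, not MLAir's code: `tail_sizes` is a hypothetical helper, and it assumes the reduction convolution preserves the spatial dimensions of `input_x` (e.g. a 1x1 kernel) before flatten() is applied.

```python
from typing import List, Tuple


def tail_sizes(conv_shape: Tuple[int, int, int],
               reduction_filter: int,
               inner_neurons: int,
               output_neurons: int) -> List[int]:
    """Per-layer vector sizes through a flatten tail (illustrative only).

    Assumes the reduction conv keeps height/width and only changes the
    channel count, so flatten() yields h * w * reduction_filter values.
    """
    h, w, _ = conv_shape  # original channel count is replaced by the reduction
    flattened = h * w * reduction_filter
    return [flattened, inner_neurons, output_neurons]


# e.g. a conv output of shape (8, 1, 128), matching the example arguments above:
print(tail_sizes((8, 1, 128), reduction_filter=64,
                 inner_neurons=64, output_neurons=4))
```

The last entry is always `output_neurons`, matching the documented return size of the flatten branch.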