:py:mod:`mlair.model_modules.flatten`
=====================================

.. py:module:: mlair.model_modules.flatten


Module Contents
---------------


Functions
~~~~~~~~~

.. autoapisummary::

   mlair.model_modules.flatten.get_activation
   mlair.model_modules.flatten.flatten_tail


Attributes
~~~~~~~~~~

.. autoapisummary::

   mlair.model_modules.flatten.__author__
   mlair.model_modules.flatten.__date__


.. py:data:: __author__
   :annotation: = Felix Kleinert, Lukas Leufen


.. py:data:: __date__
   :annotation: = 2019-12-02


.. py:function:: get_activation(input_to_activate: tensorflow.keras.layers, activation: Union[Callable, str], **kwargs)

   Apply activation on a given input layer.

   This helper function is able to handle advanced keras activations as well as strings for standard activations.

   :param input_to_activate: keras layer to apply activation on
   :param activation: activation to apply on `input_to_activate`. Can be a standard keras string or an activation layer
   :param kwargs: keyword arguments used inside activation layer
   :return: activation

   .. code-block:: python

       input_x = ...  # your input data
       x_in = keras.layer()(input_x)

       # get activation via string
       x_act_string = get_activation(x_in, 'relu')
       # or get activation via layer callable
       x_act_layer = get_activation(x_in, keras.layers.advanced_activations.ELU)


.. py:function:: flatten_tail(input_x: tensorflow.keras.layers, inner_neurons: int, activation: Union[Callable, str], output_neurons: int, output_activation: Union[Callable, str], reduction_filter: int = None, name: str = None, bound_weight: bool = False, dropout_rate: float = None, kernel_regularizer: tensorflow.keras.regularizers = None)

   Flatten output of convolutional layers.

   :param input_x: Multidimensional keras layer (ConvLayer)
   :param output_neurons: Number of neurons in the last layer (must fit the shape of labels)
   :param output_activation: final activation function
   :param name: Name of the flatten tail.
   :param bound_weight: Use `tanh` as inner activation if set to True, otherwise `activation`
   :param dropout_rate: Dropout rate applied between trainable layers
   :param activation: activation to apply after conv and dense layers
   :param reduction_filter: number of filters used for information compression on `input_x` before flatten()
   :param inner_neurons: Number of neurons in inner dense layer
   :param kernel_regularizer: regularizer to apply on conv and dense layers
   :return: flatten branch with size n=output_neurons

   .. code-block:: python

       input_x = ...  # your input data
       conv_out = Conv2D(*args)(input_x)  # your convolution stack
       out = flatten_tail(conv_out, inner_neurons=64, activation=keras.layers.advanced_activations.ELU,
                          output_neurons=4, output_activation='linear', reduction_filter=64,
                          name='Main', bound_weight=False, dropout_rate=.3,
                          kernel_regularizer=keras.regularizers.l2())
       model = keras.Model(inputs=input_x, outputs=[out])
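
The parameter list above only implies how the tail is assembled. As a keras-free, hypothetical outline (the actual layer order inside MLAir may differ), the arguments can be read as building roughly this sequence: optional reduction convolution, flatten, optional dropout, inner dense with its activation (`tanh` if `bound_weight` is set), optional dropout, and the output dense with `output_activation`:

```python
from typing import List, Optional, Tuple

def flatten_tail_plan(inner_neurons: int,
                      output_neurons: int,
                      reduction_filter: Optional[int] = None,
                      dropout_rate: Optional[float] = None,
                      bound_weight: bool = False) -> List[Tuple]:
    """Hypothetical layer plan mirroring the documented flatten_tail arguments."""
    # bound_weight=True swaps the inner activation for tanh (per the docstring).
    inner_activation = "tanh" if bound_weight else "activation"
    plan: List[Tuple] = []
    if reduction_filter is not None:
        # optional convolution compressing the filter dimension before flattening
        plan.append(("Conv2D", reduction_filter))
    plan.append(("Flatten",))
    if dropout_rate is not None:
        plan.append(("Dropout", dropout_rate))
    plan.append(("Dense", inner_neurons))
    plan.append(("Activation", inner_activation))
    if dropout_rate is not None:
        plan.append(("Dropout", dropout_rate))
    plan.append(("Dense", output_neurons))
    plan.append(("Activation", "output_activation"))
    return plan
```

This is a sketch of the documented contract, not the MLAir implementation; `flatten_tail_plan` is a name introduced here for illustration only.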