:py:mod:`mlair.model_modules.model_class`
=========================================

.. py:module:: mlair.model_modules.model_class

.. autoapi-nested-parse::

   Module for neural models to use during an experiment.

   To work properly, each customised model needs to inherit from AbstractModelClass and needs an implementation of the
   set_model method. In this module, you can find some exemplary model classes that have been built and run in an
   experiment.

   * `MyLittleModel`: small model implementation with a single 1x1 Conv, and 4 Dense layers (64, 32, 16, window_lead_time).
   * `MyBranchedModel`: a model with a single 1x1 Conv, and 4 Dense layers (64, 32, 16, window_lead_time); it has three
     output branches from different layers of the model.
   * `MyTowerModel`: a more complex model with inception blocks (called towers).
   * `MyPaperModel`: a model used for the publication:

   In addition, a short introduction on how to create your own model is given hereinafter.

   How to create a customised model?
   #################################

   * Create a new class:

     .. code-block:: python

         class MyCustomisedModel(AbstractModelClass):

             def __init__(self, input_shape: list, output_shape: list):

                 super().__init__(input_shape[0], output_shape[0])

                 # settings
                 self.dropout_rate = 0.1
                 self.activation = keras.layers.PReLU

                 # apply to model
                 self.set_model()
                 self.set_compile_options()
                 self.set_custom_objects(loss=self.compile_options['loss'])

   * Make sure to add the `super().__init__()` call and at least `set_model()` and `set_compile_options()` to your
     custom init method.
   * If your model contains custom objects that are not part of keras, you need to add them to the custom objects. To
     do this, call `set_custom_objects` with arbitrary kwargs. In the example shown, the loss has been added because it
     is not a standard keras loss. Apart from this, we always encourage you to add the loss as a custom object, to
     prevent potential errors when loading an already created model instead of training a new one. A short sketch of
     this is given below, after this list.
   * Build your model inside `set_model()`, e.g.

     .. code-block:: python

         class MyCustomisedModel(AbstractModelClass):

             def set_model(self):
                 x_input = keras.layers.Input(shape=self._input_shape)
                 x_in = keras.layers.Conv2D(32, (1, 1), padding='same', name='{}_Conv_1x1'.format("major"))(x_input)
                 x_in = self.activation(name='{}_conv_act'.format("major"))(x_in)
                 x_in = keras.layers.Flatten(name='{}'.format("major"))(x_in)
                 x_in = keras.layers.Dropout(self.dropout_rate, name='{}_Dropout_1'.format("major"))(x_in)
                 x_in = keras.layers.Dense(16, name='{}_Dense_16'.format("major"))(x_in)
                 x_in = self.activation()(x_in)
                 x_in = keras.layers.Dense(self._output_shape, name='{}_Dense'.format("major"))(x_in)
                 out_main = self.activation()(x_in)
                 self.model = keras.Model(inputs=x_input, outputs=[out_main])

   * You are free to design your model however you like. Just make sure to store it in the class attribute `model`.
   * Additionally, set your custom compile options including the loss.

     .. code-block:: python

         class MyCustomisedModel(AbstractModelClass):

             def set_compile_options(self):
                 self.initial_lr = 1e-2
                 self.optimizer = keras.optimizers.SGD(lr=self.initial_lr, momentum=0.9)
                 self.lr_decay = mlair.model_modules.keras_extensions.LearningRateDecay(base_lr=self.initial_lr,
                                                                                        drop=.94,
                                                                                        epochs_drop=10)
                 self.loss = keras.losses.mean_squared_error
                 self.compile_options = {"metrics": ["mse", "mae"]}

   * If you have a branched model with multiple outputs, you either need to set a single loss that is used for all
     branch outputs, or provide one loss function per output in the correct order, e.g.

     .. code-block:: python

         class MyCustomisedModel(AbstractModelClass):

             def set_model(self):
                 ...
                 self.model = keras.Model(inputs=x_input, outputs=[out_minor_1, out_minor_2, out_main])

             def set_compile_options(self):
                 self.loss = [keras.losses.mean_absolute_error,  # for out_minor_1
                              keras.losses.mean_squared_error,   # for out_minor_2
                              keras.losses.mean_squared_error]   # for out_main
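   As a sketch of the custom objects mechanism mentioned above, suppose the compile loss itself is a hand-written
   function. The name ``var_loss`` and its definition below are assumptions made for this example only; the remaining
   methods of ``MyCustomisedModel`` are the ones shown above.

   .. code-block:: python

       import keras
       import keras.backend as K

       from mlair.model_modules import AbstractModelClass


       def var_loss(y_true, y_pred):
           # hypothetical loss that penalises a mismatch in variance (illustration only)
           return K.mean(K.square(K.var(y_true) - K.var(y_pred)))


       class MyCustomisedModel(AbstractModelClass):

           # set_model() and set_compile_options() as shown above, with
           # self.loss = var_loss set inside set_compile_options()

           def __init__(self, input_shape: list, output_shape: list):
               super().__init__(input_shape[0], output_shape[0])
               self.dropout_rate = 0.1
               self.activation = keras.layers.PReLU
               self.set_model()
               self.set_compile_options()
               # register the non-keras loss; without this, loading a stored
               # copy of this model would fail because keras cannot resolve
               # the name "var_loss" on its own
               self.set_custom_objects(loss=self.compile_options['loss'], var_loss=var_loss)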
   How to access my customised model?
   ##################################

   Once the customised model has been created, you can easily access the model with

   >>> MyCustomisedModel().model

   The loss is accessible via

   >>> MyCustomisedModel().loss

   You can treat the instance of your model class both as the instance and as the model itself. If you call a method
   that belongs to the model rather than to the model class instance, you can apply the call directly on the instance
   instead of going through the `model` attribute.

   >>> MyCustomisedModel().model.compile(**kwargs) == MyCustomisedModel().compile(**kwargs)
   True
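   As a minimal sketch of this forwarding behaviour, assuming the ``MyCustomisedModel`` class built up in the previous
   section and purely illustrative input and output shapes:

   .. code-block:: python

       import numpy as np

       # shapes are assumptions for this sketch only: 7 time steps, 1 station
       # dimension and 9 variables as input, 4 forecast steps as output
       m = MyCustomisedModel(input_shape=[(7, 1, 9)], output_shape=[(4,)])

       # both calls are forwarded to the wrapped keras model, i.e. to m.model
       m.compile(**m.compile_options)
       prediction = m.predict(np.zeros((1, 7, 1, 9)))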
Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   mlair.model_modules.model_class.MyLittleModelHourly
   mlair.model_modules.model_class.MyBranchedModel
   mlair.model_modules.model_class.MyTowerModel
   mlair.model_modules.model_class.IntelliO3_ts_architecture


Attributes
~~~~~~~~~~

.. autoapisummary::

   mlair.model_modules.model_class.__author__
   mlair.model_modules.model_class.__date__


.. py:data:: __author__
   :annotation: = Lukas Leufen, Felix Kleinert


.. py:data:: __date__
   :annotation: = 2020-05-12


.. py:class:: MyLittleModelHourly(input_shape: list, output_shape: list)

   Bases: :py:obj:`mlair.model_modules.AbstractModelClass`

   A customised model with a 1x1 Conv and 4 Dense layers (64, 32, 16, window_lead_time), where the last layer is the
   output layer depending on the window_lead_time parameter. Dropout is used between the Convolution and the first
   Dense layer.

   .. py:method:: set_model(self)

      Build the model.

   .. py:method:: set_compile_options(self)

      This method only has to be defined in the child class when additional compile options (i.e. options other than
      optimizer and loss) should be used. They have to be set as a dictionary:
      {'optimizer': None, 'loss': None, 'metrics': None, 'loss_weights': None, 'sample_weight_mode': None,
      'weighted_metrics': None, 'target_tensors': None}

      :return:


.. py:class:: MyBranchedModel(input_shape: list, output_shape: list)

   Bases: :py:obj:`mlair.model_modules.AbstractModelClass`

   A customised model with a 1x1 Conv and 4 Dense layers (64, 32, 16, window_lead_time), where the last layer is the
   output layer depending on the window_lead_time parameter. Dropout is used between the Convolution and the first
   Dense layer.

   .. py:method:: set_model(self)

      Build the model.

   .. py:method:: set_compile_options(self)

      This method only has to be defined in the child class when additional compile options (i.e. options other than
      optimizer and loss) should be used. They have to be set as a dictionary:
      {'optimizer': None, 'loss': None, 'metrics': None, 'loss_weights': None, 'sample_weight_mode': None,
      'weighted_metrics': None, 'target_tensors': None}

      :return:


.. py:class:: MyTowerModel(input_shape: list, output_shape: list)

   Bases: :py:obj:`mlair.model_modules.AbstractModelClass`

   The AbstractModelClass provides a unified skeleton for any model provided to the machine learning workflow. The
   model can always be accessed by calling ModelClass.model or directly via a model method without going through the
   model attribute name (e.g. ModelClass.model.compile -> ModelClass.compile). Besides the model, this class provides
   the corresponding loss function.

   .. py:method:: set_model(self)

      Build the model.

   .. py:method:: set_compile_options(self)

      This method only has to be defined in the child class when additional compile options (i.e. options other than
      optimizer and loss) should be used. They have to be set as a dictionary:
      {'optimizer': None, 'loss': None, 'metrics': None, 'loss_weights': None, 'sample_weight_mode': None,
      'weighted_metrics': None, 'target_tensors': None}

      :return:


.. py:class:: IntelliO3_ts_architecture(input_shape: list, output_shape: list)

   Bases: :py:obj:`mlair.model_modules.AbstractModelClass`

   The AbstractModelClass provides a unified skeleton for any model provided to the machine learning workflow. The
   model can always be accessed by calling ModelClass.model or directly via a model method without going through the
   model attribute name (e.g. ModelClass.model.compile -> ModelClass.compile). Besides the model, this class provides
   the corresponding loss function.

   .. py:method:: set_model(self)

      Build the model.

      :param activation: activation function
      :param window_history_size: number of historical time steps included in the input data
      :param channels: number of variables used in input data
      :param dropout_rate: dropout rate used in the model [0, 1)
      :param window_lead_time: number of time steps to forecast in the output layer
      :return: built keras model

   .. py:method:: set_compile_options(self)

      This method only has to be defined in the child class when additional compile options (i.e. options other than
      optimizer and loss) should be used. They have to be set as a dictionary:
      {'optimizer': None, 'loss': None, 'metrics': None, 'loss_weights': None, 'sample_weight_mode': None,
      'weighted_metrics': None, 'target_tensors': None}

      :return:
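To make the structure of this compile options dictionary more tangible, a sketch of a fully populated version is given
below. The concrete optimizer, loss and metrics are assumptions chosen for illustration only and are not the defaults
of any class documented on this page.

.. code-block:: python

    import keras

    # keys that should keep their keras defaults can simply stay at None
    compile_options = {
        "optimizer": keras.optimizers.SGD(lr=1e-2, momentum=0.9),
        "loss": keras.losses.mean_squared_error,
        "metrics": ["mse", "mae"],
        "loss_weights": None,
        "sample_weight_mode": None,
        "weighted_metrics": None,
        "target_tensors": None,
    }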