report.html

Report generated on 18-Dec-2023 at 17:50:48 by pytest-html v3.1.1

Environment

CI true
CI_COMMIT_REF_NAME enxhi_issue460_remove_TOAR-I_access
CI_COMMIT_REF_SLUG enxhi-issue460-remove-toar-i-access
CI_COMMIT_SHA a4fd44042ad6685a4548c82920503e726c1d9acb
CI_JOB_ID 967176
CI_JOB_NAME tests
CI_JOB_STAGE test
CI_PIPELINE_ID 170224
CI_PROJECT_DIR /builds/esde/machine-learning/mlair
CI_PROJECT_ID 2411
CI_PROJECT_NAME mlair
CI_PROJECT_NAMESPACE esde/machine-learning
CI_PROJECT_PATH esde/machine-learning/mlair
CI_PROJECT_URL https://gitlab.jsc.fz-juelich.de/esde/machine-learning/mlair
CI_REGISTRY registry.jsc.fz-juelich.de
CI_REGISTRY_IMAGE registry.jsc.fz-juelich.de/esde/machine-learning/mlair
CI_REGISTRY_USER gitlab-ci-token
CI_RUNNER_DESCRIPTION gitlab-runner on zam347 with already installed requirements of MLAir using opensuse/leap
CI_RUNNER_ID 657
CI_RUNNER_TAGS ["leap", "opensuse", "zam347", "mlair", "machinelearningtools"]
CI_SERVER yes
CI_SERVER_NAME GitLab
CI_SERVER_REVISION d2d66de7163
CI_SERVER_VERSION 16.6.2
GITLAB_CI true
GITLAB_USER_EMAIL e.kreshpa@fz-juelich.de
GITLAB_USER_ID 1690
Packages {"pluggy": "0.13.1", "py": "1.11.0", "pytest": "6.2.2"}
Platform Linux-5.15.0-73-generic-x86_64-with-glibc2.31
Plugins {"cov": "2.11.1", "html": "3.1.1", "lazy-fixture": "0.6.3", "metadata": "2.0.1"}
Python 3.9.13

Summary

572 tests ran in 1044.17 seconds.

560 passed, 0 skipped, 12 failed, 6 errors, 0 expected failures, 0 unexpected passes

Results

Result Test Duration Links
Error test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_run::setup 2.87
self = <test_pre_processing.TestPreProcessing object at 0x7fb090283250>

    @pytest.fixture
    def obj_with_exp_setup(self):
        with RunEnvironment():
>           ExperimentSetup(stations=['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X'],
                            statistics_per_var={'o3': 'dma8eu', 'temp': 'maximum'}, station_type="background",
                            data_origin={'o3': 'UBA', 'temp': 'UBA'}, data_handler=DefaultDataHandler)

test/test_run_modules/test_pre_processing.py:32:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <mlair.run_modules.experiment_setup.ExperimentSetup object at 0x7fb090283100>
experiment_date = None, stations = ['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X']
variables = None, statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
start = None, end = None, window_history_size = None, target_var = 'o3'
target_dim = None, window_lead_time = None, window_dim = None, dimensions = None
time_dim = None, iter_dim = None, interpolation_method = None
interpolation_limit = None, train_start = None, train_end = None
val_start = None, val_end = None, test_start = None, test_end = None
use_all_stations_on_all_data_sets = None, train_model = True
fraction_of_train = None
experiment_path = '/builds/esde/machine-learning/mlair/TestExperiment_daily'
plot_path = None, forecast_path = None, overwrite_local_data = None
sampling = 'daily', create_new_model = None
bootstrap_path = '/home/root/mlair/data/bootstrap'
permute_data_on_training = None, transformation = None, train_min_length = None
val_min_length = None, test_min_length = None, extreme_values = None
extremes_on_right_tail_only = None, evaluate_feature_importance = None
plot_list = None, feature_importance_n_boots = None
feature_importance_create_new_bootstraps = True
feature_importance_bootstrap_method = None
feature_importance_bootstrap_type = None, data_path = '/home/root/mlair/data/'
batch_path = None, login_nodes = None, hpc_hosts = None, model = None
batch_size = None, epochs = None, early_stopping_epochs = None
restore_best_model_weights = None
data_handler = <class 'mlair.data_handler.default_data_handler.DefaultDataHandler'>
data_origin = {'o3': 'UBA', 'temp': 'UBA'}, competitors = None
competitor_path = None, use_multiprocessing = None
use_multiprocessing_on_debug = None, max_number_multiprocessing = None
start_script = None, overwrite_lazy_data = None
uncertainty_estimate_block_length = None
uncertainty_estimate_evaluate_competitors = None
uncertainty_estimate_n_boots = None, do_uncertainty_estimate = None
do_bias_free_evaluation = None, model_display_name = None
transformation_file = None, calculate_fresh_transformation = None
snapshot_load_path = None, create_snapshot = None, snapshot_path = None
model_path = None, kwargs = {'station_type': 'background'}, upsampling = False
permute_data = False, experiment_name = 'TestExperiment_daily'
debug_mode = False
default_plot_path = '/builds/esde/machine-learning/mlair/TestExperiment_daily/plots'
default_forecast_path = '/builds/esde/machine-learning/mlair/TestExperiment_daily/forecasts'
train_val_min_length = 180

    def __init__(self,
                 experiment_date=None,
                 stations: Union[str, List[str]] = None,
                 variables: Union[str, List[str]] = None,
                 statistics_per_var: Dict = None,
                 start: str = None,
                 end: str = None,
                 window_history_size: int = None,
                 target_var="o3",
                 target_dim=None,
                 window_lead_time: int = None,
                 window_dim=None,
                 dimensions=None,
                 time_dim=None,
                 iter_dim=None,
                 interpolation_method=None,
                 interpolation_limit=None, train_start=None, train_end=None, val_start=None, val_end=None,
                 test_start=None,
                 test_end=None, use_all_stations_on_all_data_sets=None, train_model: bool = None,
                 fraction_of_train: float = None,
                 experiment_path=None, plot_path: str = None, forecast_path: str = None, overwrite_local_data=None,
                 sampling: str = None,
                 create_new_model=None, bootstrap_path=None, permute_data_on_training=None, transformation=None,
                 train_min_length=None, val_min_length=None, test_min_length=None, extreme_values: list = None,
                 extremes_on_right_tail_only: bool = None, evaluate_feature_importance: bool = None, plot_list=None,
                 feature_importance_n_boots: int = None, feature_importance_create_new_bootstraps: bool = None,
                 feature_importance_bootstrap_method=None, feature_importance_bootstrap_type=None,
                 data_path: str = None, batch_path: str = None, login_nodes=None,
                 hpc_hosts=None, model=None, batch_size=None, epochs=None,
                 early_stopping_epochs: int = None, restore_best_model_weights: bool = None,
                 data_handler=None,
                 data_origin: Dict = None, competitors: list = None, competitor_path: str = None,
                 use_multiprocessing: bool = None, use_multiprocessing_on_debug: bool = None,
                 max_number_multiprocessing: int = None, start_script: Union[Callable, str] = None,
                 overwrite_lazy_data: bool = None, uncertainty_estimate_block_length: str = None,
                 uncertainty_estimate_evaluate_competitors: bool = None, uncertainty_estimate_n_boots: int = None,
                 do_uncertainty_estimate: bool = None, do_bias_free_evaluation: bool = None,
                 model_display_name: str = None, transformation_file: str = None,
                 calculate_fresh_transformation: bool = None, snapshot_load_path: str = None,
                 create_snapshot: bool = None, snapshot_path: str = None, model_path: str = None, **kwargs):

        # create run framework
        super().__init__()

        # experiment setup, hyperparameters
        self._set_param("data_path", path_config.prepare_host(data_path=data_path))
        self._set_param("hostname", path_config.get_host())
        self._set_param("hpc_hosts", hpc_hosts, default=DEFAULT_HPC_HOST_LIST + DEFAULT_HPC_LOGIN_LIST)
        self._set_param("login_nodes", login_nodes, default=DEFAULT_HPC_LOGIN_LIST)
        self._set_param("create_new_model", create_new_model, default=DEFAULT_CREATE_NEW_MODEL)
        if self.data_store.get("create_new_model"):
            train_model = True
        data_path = self.data_store.get("data_path")
        bootstrap_path = path_config.set_bootstrap_path(bootstrap_path, data_path)
        self._set_param("bootstrap_path", bootstrap_path)
        self._set_param("train_model", train_model, default=DEFAULT_TRAIN_MODEL)
        self._set_param("fraction_of_training", fraction_of_train, default=DEFAULT_FRACTION_OF_TRAINING)
        self._set_param("extreme_values", extreme_values, default=DEFAULT_EXTREME_VALUES, scope="train")
        self._set_param("extremes_on_right_tail_only", extremes_on_right_tail_only,
                        default=DEFAULT_EXTREMES_ON_RIGHT_TAIL_ONLY, scope="train")
        self._set_param("upsampling", extreme_values is not None, scope="train")
        upsampling = self.data_store.get("upsampling", "train")
        permute_data = DEFAULT_PERMUTE_DATA if permute_data_on_training is None else permute_data_on_training
        self._set_param("permute_data", permute_data or upsampling, scope="train")
        self._set_param("batch_size", batch_size, default=DEFAULT_BATCH_SIZE)
        self._set_param("epochs", epochs, default=DEFAULT_EPOCHS)
        self._set_param("early_stopping_epochs", early_stopping_epochs, default=DEFAULT_EARLY_STOPPING_EPOCHS)
        self._set_param("restore_best_model_weights", restore_best_model_weights,
                        default=DEFAULT_RESTORE_BEST_MODEL_WEIGHTS)

        # set experiment name
        sampling = self._set_param("sampling", sampling, default=DEFAULT_SAMPLING)  # always related to output sampling
        experiment_name = path_config.set_experiment_name(name=experiment_date, sampling=sampling)
        experiment_path = path_config.set_experiment_path(name=experiment_name, path=experiment_path)
        self._set_param("experiment_name", experiment_name)
        self._set_param("experiment_path", experiment_path)
        logging.info(f"Experiment path is: {experiment_path}")
        path_config.check_path_and_create(self.data_store.get("experiment_path"))

        # host system setup
        debug_mode = sys.gettrace() is not None
        self._set_param("debug_mode", debug_mode)
        if debug_mode is True:
            self._set_param("use_multiprocessing", use_multiprocessing_on_debug,
                            default=DEFAULT_USE_MULTIPROCESSING_ON_DEBUG)
        else:
            self._set_param("use_multiprocessing", use_multiprocessing, default=DEFAULT_USE_MULTIPROCESSING)
        self._set_param("max_number_multiprocessing", max_number_multiprocessing,
                        default=DEFAULT_MAX_NUMBER_MULTIPROCESSING)

        # batch path (temporary)
        self._set_param("batch_path", batch_path, default=os.path.join(experiment_path, "batch_data"))

        # set model path
        self._set_param("model_load_path", model_path)
        self._set_param("model_path", None, os.path.join(experiment_path, "model"))
        path_config.check_path_and_create(self.data_store.get("model_path"))

        # set plot path
        default_plot_path = os.path.join(experiment_path, "plots")
        self._set_param("plot_path", plot_path, default=default_plot_path)
        path_config.check_path_and_create(self.data_store.get("plot_path"))

        # set results path
        default_forecast_path = os.path.join(experiment_path, "forecasts")
        self._set_param("forecast_path", forecast_path, default_forecast_path)
        path_config.check_path_and_create(self.data_store.get("forecast_path"))

        # set logging path
        self._set_param("logging_path", None, os.path.join(experiment_path, "logging"))
        path_config.check_path_and_create(self.data_store.get("logging_path"))

        # set tmp path
        self._set_param("tmp_path", None, os.path.join(experiment_path, "tmp"))
        path_config.check_path_and_create(self.data_store.get("tmp_path"), remove_existing=True)

        # snapshot settings
        self._set_param("snapshot_path", snapshot_path, default=os.path.join(experiment_path, "snapshot"))
        path_config.check_path_and_create(self.data_store.get("snapshot_path"), remove_existing=False)
        self._set_param("create_snapshot", create_snapshot, default=DEFAULT_CREATE_SNAPSHOT)
        if snapshot_load_path is not None:
            self._set_param("snapshot_load_path", snapshot_load_path)

        # setup for data
        self._set_param("stations", stations, default=DEFAULT_STATIONS, apply=helpers.to_list)
        self._set_param("statistics_per_var", statistics_per_var, default=DEFAULT_VAR_ALL_DICT)
        self._set_param("variables", variables, default=list(self.data_store.get("statistics_per_var").keys()))
        self._set_param("data_origin", data_origin, default=DEFAULT_DATA_ORIGIN)
        self._set_param("start", start, default=DEFAULT_START)
        self._set_param("end", end, default=DEFAULT_END)
        self._set_param("window_history_size", window_history_size, default=DEFAULT_WINDOW_HISTORY_SIZE)
        self._set_param("overwrite_local_data", overwrite_local_data, default=DEFAULT_OVERWRITE_LOCAL_DATA,
                        scope="preprocessing")
        self._set_param("overwrite_lazy_data", overwrite_lazy_data, default=DEFAULT_OVERWRITE_LAZY_DATA,
                        scope="preprocessing")
        self._set_param("transformation", transformation, default={})
        self._set_param("transformation", None, scope="preprocessing")
        self._set_param("transformation_file", transformation_file, default=None)
        if calculate_fresh_transformation is not None:
            self._set_param("calculate_fresh_transformation", calculate_fresh_transformation)
        self._set_param("data_handler", data_handler, default=DefaultDataHandler)

        # iter and window dimension
        self._set_param("iter_dim", iter_dim, default=DEFAULT_ITER_DIM)
        self._set_param("window_dim", window_dim, default=DEFAULT_WINDOW_DIM)

        # target
        self._set_param("target_var", target_var, default=DEFAULT_TARGET_VAR)
        self._set_param("target_dim", target_dim, default=DEFAULT_TARGET_DIM)
        self._set_param("window_lead_time", window_lead_time, default=DEFAULT_WINDOW_LEAD_TIME)

        # interpolation
        self._set_param("dimensions", dimensions, default=DEFAULT_DIMENSIONS)
        self._set_param("time_dim", time_dim, default=DEFAULT_TIME_DIM)
        self._set_param("interpolation_method", interpolation_method, default=DEFAULT_INTERPOLATION_METHOD)
        self._set_param("interpolation_limit", interpolation_limit, default=DEFAULT_INTERPOLATION_LIMIT)

        # train set parameters
        self._set_param("start", train_start, default=DEFAULT_TRAIN_START, scope="train")
        self._set_param("end", train_end, default=DEFAULT_TRAIN_END, scope="train")
        self._set_param("min_length", train_min_length, default=DEFAULT_TRAIN_MIN_LENGTH, scope="train")

        # validation set parameters
        self._set_param("start", val_start, default=DEFAULT_VAL_START, scope="val")
        self._set_param("end", val_end, default=DEFAULT_VAL_END, scope="val")
        self._set_param("min_length", val_min_length, default=DEFAULT_VAL_MIN_LENGTH, scope="val")

        # test set parameters
        self._set_param("start", test_start, default=DEFAULT_TEST_START, scope="test")
        self._set_param("end", test_end, default=DEFAULT_TEST_END, scope="test")
        self._set_param("min_length", test_min_length, default=DEFAULT_TEST_MIN_LENGTH, scope="test")

        # train_val set parameters
        self._set_param("start", self.data_store.get("start", "train"), scope="train_val")
        self._set_param("end", self.data_store.get("end", "val"), scope="train_val")
        train_val_min_length = sum([self.data_store.get("min_length", s) for s in ["train", "val"]])
        self._set_param("min_length", train_val_min_length, default=DEFAULT_TRAIN_VAL_MIN_LENGTH, scope="train_val")

        # use all stations on all data sets (train, val, test)
        self._set_param("use_all_stations_on_all_data_sets", use_all_stations_on_all_data_sets,
                        default=DEFAULT_USE_ALL_STATIONS_ON_ALL_DATA_SETS)

        # set post-processing instructions
        self._set_param("do_uncertainty_estimate", do_uncertainty_estimate,
                        default=DEFAULT_DO_UNCERTAINTY_ESTIMATE, scope="general.postprocessing")
        self._set_param("block_length", uncertainty_estimate_block_length,
                        default=DEFAULT_UNCERTAINTY_ESTIMATE_BLOCK_LENGTH, scope="uncertainty_estimate")
        self._set_param("evaluate_competitors", uncertainty_estimate_evaluate_competitors,
                        default=DEFAULT_UNCERTAINTY_ESTIMATE_EVALUATE_COMPETITORS, scope="uncertainty_estimate")
        self._set_param("n_boots", uncertainty_estimate_n_boots,
                        default=DEFAULT_UNCERTAINTY_ESTIMATE_N_BOOTS, scope="uncertainty_estimate")
        self._set_param("do_bias_free_evaluation", do_bias_free_evaluation,
                        default=DEFAULT_DO_BIAS_FREE_EVALUATION, scope="general.postprocessing")

        self._set_param("evaluate_feature_importance", evaluate_feature_importance,
                        default=DEFAULT_EVALUATE_FEATURE_IMPORTANCE, scope="general.postprocessing")
        feature_importance_create_new_bootstraps = max([self.data_store.get("train_model", "general"),
                                                        feature_importance_create_new_bootstraps or
                                                        DEFAULT_FEATURE_IMPORTANCE_CREATE_NEW_BOOTSTRAPS])
        self._set_param("create_new_bootstraps", feature_importance_create_new_bootstraps, scope="feature_importance")
        self._set_param("n_boots", feature_importance_n_boots, default=DEFAULT_FEATURE_IMPORTANCE_N_BOOTS,
                        scope="feature_importance")
        self._set_param("bootstrap_method", feature_importance_bootstrap_method,
                        default=DEFAULT_FEATURE_IMPORTANCE_BOOTSTRAP_METHOD, scope="feature_importance")
        self._set_param("bootstrap_type", feature_importance_bootstrap_type,
                        default=DEFAULT_FEATURE_IMPORTANCE_BOOTSTRAP_TYPE, scope="feature_importance")

        self._set_param("plot_list", plot_list, default=DEFAULT_PLOT_LIST, scope="general.postprocessing")
        if model_display_name is not None:
            self._set_param("model_display_name", model_display_name)
        self._set_param("neighbors", ["DEBW030"])  # TODO: just for testing

        # set competitors
        if model_display_name is not None and competitors is not None and model_display_name in competitors:
            raise IndexError(f"Given model_display_name {model_display_name} is also present in the competitors "
                             f"variable {competitors}. To assure a proper workflow it is required to have unique names "
                             f"for each model and competitor. Please use a different model display name or competitor.")
        self._set_param("competitors", competitors, default=DEFAULT_COMPETITORS)
        competitor_path_default = os.path.join(self.data_store.get("data_path"), "competitors",
                                               "_".join(self.data_store.get("target_var")))
        self._set_param("competitor_path", competitor_path, default=competitor_path_default)

        # check variables, statistics and target variable
        self._check_target_var()
        self._compare_variables_and_statistics()

        # set model architecture class
        self._set_param("model_class", model, VanillaModel)

        # store starting script if provided
        if start_script is not None:
            self._store_start_script(start_script, experiment_path)

        # set remaining kwargs
        if len(kwargs) > 0:
            for k, v in kwargs.items():
                if len(self.data_store.search_name(k)) == 0:
                    self._set_param(k, v)
                else:
                    s = ", ".join([f"{k}({s})={self.data_store.get(k, scope=s)}"
                                   for s in self.data_store.search_name(k)])
>                   raise KeyError(f"Given argument {k} with value {v} cannot be set for this experiment due to a "
                                   f"conflict with an existing entry with same naming: {s}")
E                   KeyError: 'Given argument station_type with value background cannot be set for this experiment due to a conflict with an existing entry with same naming: station_type(general)=background'

mlair/run_modules/experiment_setup.py:449: KeyError
-----------------------------Captured stderr setup------------------------------
2023-12-18 17:38:17,929 - ERROR: 'Given argument station_type with value background cannot be set for this experiment due to a conflict with an existing entry with same naming: station_type(general)=background' [run_environment.py:__exit__:141]
Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/test/test_run_modules/test_pre_processing.py", line 32, in obj_with_exp_setup
    ExperimentSetup(stations=['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X'],
  File "/builds/esde/machine-learning/mlair/mlair/run_modules/experiment_setup.py", line 449, in __init__
    raise KeyError(f"Given argument {k} with value {v} cannot be set for this experiment due to a "
KeyError: 'Given argument station_type with value background cannot be set for this experiment due to a conflict with an existing entry with same naming: station_type(general)=background'
(the identical ERROR message and traceback appear four more times in the captured stderr)
-------------------------------Captured log setup-------------------------------
ERROR    root:run_environment.py:141 'Given argument station_type with value background cannot be set for this experiment due to a conflict with an existing entry with same naming: station_type(general)=background'
Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/test/test_run_modules/test_pre_processing.py", line 32, in obj_with_exp_setup
    ExperimentSetup(stations=['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X'],
  File "/builds/esde/machine-learning/mlair/mlair/run_modules/experiment_setup.py", line 449, in __init__
    raise KeyError(f"Given argument {k} with value {v} cannot be set for this experiment due to a "
KeyError: 'Given argument station_type with value background cannot be set for this experiment due to a conflict with an existing entry with same naming: station_type(general)=background'
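The guard that raises here is the kwargs conflict check at the end of ExperimentSetup.__init__ (shown above): any leftover keyword argument whose name already exists in the data store is rejected, so a station_type left over from an earlier setup in the same process collides with the one passed by the test. A minimal, self-contained sketch of that behavior; DataStore and set_remaining_kwargs are simplified stand-ins, not MLAir's actual classes:

    class DataStore:
        """Illustrative stand-in for MLAir's scoped data store."""

        def __init__(self):
            self._store = {}  # maps (name, scope) -> value

        def set(self, name, value, scope="general"):
            self._store[(name, scope)] = value

        def get(self, name, scope="general"):
            return self._store[(name, scope)]

        def search_name(self, name):
            # every scope that already defines `name`
            return [scope for (n, scope) in self._store if n == name]

    def set_remaining_kwargs(data_store, **kwargs):
        for k, v in kwargs.items():
            if len(data_store.search_name(k)) == 0:
                data_store.set(k, v)
            else:
                s = ", ".join(f"{k}({scope})={data_store.get(k, scope)}"
                              for scope in data_store.search_name(k))
                raise KeyError(f"Given argument {k} with value {v} cannot be set due to a "
                               f"conflict with an existing entry with same naming: {s}")

    store = DataStore()
    store.set("station_type", "background")                 # left over from an earlier setup
    set_remaining_kwargs(store, station_type="background")  # raises the KeyError seen above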
Error test/test_run_modules/test_training.py::TestTraining::test_init::setup 18.79
self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

    def _acquire_with_cache_info(self, needs_lock=True):
        """Acquire a file, returning the file and whether it was cached."""
        with self._optional_lock(needs_lock):
            try:
>               file = self._cache[self._key]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.lru_cache.LRUCache object at 0x7fb1f7064180>
key = [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

    def __getitem__(self, key: K) -> V:
        # record recent use of the key by moving it to the front of the list
        with self._lock:
>           value = self._cache[key]
E           KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py:53: KeyError
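This KeyError is the expected cache-miss path, not the real failure: xarray's CachingFileManager keeps open netCDF handles in an LRU cache and only opens the file on a miss, which is where the actual error (the missing file, further down) surfaces. A simplified sketch of the pattern, not xarray's actual implementation:

    class FileHandleCache:
        """Open-on-miss handle cache (illustrative stand-in for CachingFileManager)."""

        def __init__(self, opener):
            self._opener = opener
            self._cache = {}

        def acquire(self, path):
            try:
                return self._cache[path]       # fast path: handle already open
            except KeyError:                   # a miss is normal, not an error
                handle = self._opener(path)    # may raise FileNotFoundError
                self._cache[path] = handle
                return handle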

During handling of the above exception, another exception occurred:

self = StationPrep(station=['DEBW107'], data_path='/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/d...w_history_size=7, window_lead_time=2, interpolation_limit=0, interpolation_method='linear', overwrite_local_data=False)
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily'
station = ['DEBW107'], statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
sampling = 'daily', store_data_locally = True
data_origin = {'o3': 'UBA', 'temp': 'UBA'}, start = None, end = None

    def load_data(self, path, station, statistics_per_var, sampling, store_data_locally=False,
                  data_origin: Dict = None, start=None, end=None):
        """
        Load data and meta data either from local disk (preferred) or download new data using a custom download method.

        Data is downloaded if no local data is available or if the parameter overwrite_local_data is true. In both
        cases, downloaded data is only stored locally if store_data_locally is not disabled. If this parameter is not
        set, it is assumed that data should be saved locally.
        """
        check_path_and_create(path)
        file_name = self._set_file_name(path, station, statistics_per_var)
        meta_file = self._set_meta_file_name(path, station, statistics_per_var)
        if self.overwrite_local_data is True:
            logging.debug(f"{self.station[0]}: overwrite_local_data is true, therefore reload {file_name}")
            if os.path.exists(file_name):
                os.remove(file_name)
            if os.path.exists(meta_file):
                os.remove(meta_file)
            data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
                                                    store_data_locally=store_data_locally, data_origin=data_origin,
                                                    time_dim=self.time_dim, target_dim=self.target_dim,
                                                    iter_dim=self.iter_dim, window_dim=self.window_dim,
                                                    era5_data_path=self._era5_data_path,
                                                    era5_file_names=self._era5_file_names,
                                                    ifs_data_path=self._ifs_data_path,
                                                    ifs_file_names=self._ifs_file_names)
            logging.debug(f"{self.station[0]}: loaded new data")
        else:
            try:
                logging.debug(f"{self.station[0]}: try to load local data from: {file_name}")
>               data = xr.open_dataarray(file_name)

mlair/data_handler/data_handler_single_station.py:361:
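The control flow of load_data reduces to a prefer-local-else-download pattern. A condensed sketch of that flow; the download callable and file handling are simplified, this is not MLAir's exact code:

    import os
    import xarray as xr

    def load_or_download(file_name, download, overwrite=False):
        # Sketch only: prefer the locally cached NetCDF file, otherwise download.
        if overwrite and os.path.exists(file_name):
            os.remove(file_name)                 # force a fresh download
        try:
            return xr.open_dataarray(file_name)  # preferred: local cache
        except (FileNotFoundError, OSError):
            data = download()                    # fall back to the remote source
            data.to_netcdf(file_name)            # cache for the next run
            return data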
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = None, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True, engine = None
chunks = None, lock = None, cache = None, drop_variables = None
backend_kwargs = None, use_cftime = None, decode_timedelta = None

    def open_dataarray(
        filename_or_obj,
        group=None,
        decode_cf=True,
        mask_and_scale=None,
        decode_times=True,
        autoclose=None,
        concat_characters=True,
        decode_coords=True,
        engine=None,
        chunks=None,
        lock=None,
        cache=None,
        drop_variables=None,
        backend_kwargs=None,
        use_cftime=None,
        decode_timedelta=None,
    ):
        """Open a DataArray from a file or file-like object containing a single
        data variable.

        This is designed to read netCDF files with only one data variable. If
        multiple variables are present then a ValueError is raised.

        Parameters
        ----------
        filename_or_obj : str, Path, file-like or DataStore
            Strings and Paths are interpreted as a path to a netCDF file or an
            OpenDAP URL and opened with python-netCDF4, unless the filename ends
            with .gz, in which case the file is gunzipped and opened with
            scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
            objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
        group : str, optional
            Path to the netCDF4 group in the given file to open (only works for
            netCDF4 files).
        decode_cf : bool, optional
            Whether to decode these variables, assuming they were saved according
            to CF conventions.
        mask_and_scale : bool, optional
            If True, replace array values equal to `_FillValue` with NA and scale
            values according to the formula `original_values * scale_factor +
            add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
            taken from variable attributes (if they exist). If the `_FillValue` or
            `missing_value` attribute contains multiple values a warning will be
            issued and all array values matching one of the multiple values will
            be replaced by NA. mask_and_scale defaults to True except for the
            pseudonetcdf backend.
        decode_times : bool, optional
            If True, decode times encoded in the standard NetCDF datetime format
            into datetime objects. Otherwise, leave them encoded as numbers.
        concat_characters : bool, optional
            If True, concatenate along the last dimension of character arrays to
            form string arrays. Dimensions will only be concatenated over (and
            removed) if they have no corresponding variable and if they are only
            used as the last dimension of character arrays.
        decode_coords : bool, optional
            If True, decode the 'coordinates' attribute to identify coordinates in
            the resulting dataset.
        engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib"}, \
            optional
            Engine to use when reading files. If not provided, the default engine
            is chosen based on available dependencies, with a preference for
            "netcdf4".
        chunks : int or dict, optional
            If chunks is provided, it is used to load the new dataset into dask
            arrays.
        lock : False or lock-like, optional
            Resource lock to use when reading data from disk. Only relevant when
            using dask or another form of parallelism. By default, appropriate
            locks are chosen to safely read and write files with the currently
            active dask scheduler.
        cache : bool, optional
            If True, cache data loaded from the underlying datastore in memory as
            NumPy arrays when accessed to avoid reading from the underlying data-
            store multiple times. Defaults to True unless you specify the `chunks`
            argument to use dask, in which case it defaults to False. Does not
            change the behavior of coordinates corresponding to dimensions, which
            always load their data from disk into a ``pandas.Index``.
        drop_variables: str or iterable, optional
            A variable or list of variables to exclude from being parsed from the
            dataset. This may be useful to drop variables with problems or
            inconsistent values.
        backend_kwargs: dict, optional
            A dictionary of keyword arguments to pass on to the backend. This
            may be useful when backend options would improve performance or
            allow user control of dataset processing.
        use_cftime: bool, optional
            Only relevant if encoded dates come from a standard calendar
            (e.g. "gregorian", "proleptic_gregorian", "standard", or not
            specified). If None (default), attempt to decode times to
            ``np.datetime64[ns]`` objects; if this is not possible, decode times to
            ``cftime.datetime`` objects. If True, always decode times to
            ``cftime.datetime`` objects, regardless of whether or not they can be
            represented using ``np.datetime64[ns]`` objects. If False, always
            decode times to ``np.datetime64[ns]`` objects; if this is not possible
            raise an error.
        decode_timedelta : bool, optional
            If True, decode variables and coordinates with time units in
            {"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
            into timedelta objects. If False, leave them encoded as numbers.
            If None (default), assume the same value of decode_time.

        Notes
        -----
        This is designed to be fully compatible with `DataArray.to_netcdf`. Saving
        using `DataArray.to_netcdf` and then loading with this function will
        produce an identical result.

        All parameters are passed directly to `xarray.open_dataset`. See that
        documentation for further details.

        See also
        --------
        open_dataset
        """

>       dataset = open_dataset(
            filename_or_obj,
            group=group,
            decode_cf=decode_cf,
            mask_and_scale=mask_and_scale,
            decode_times=decode_times,
            autoclose=autoclose,
            concat_characters=concat_characters,
            decode_coords=decode_coords,
            engine=engine,
            chunks=chunks,
            lock=lock,
            cache=cache,
            drop_variables=drop_variables,
            backend_kwargs=backend_kwargs,
            use_cftime=use_cftime,
            decode_timedelta=decode_timedelta,
        )

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:701:
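As the Notes section above says, open_dataarray is the read counterpart of DataArray.to_netcdf. A minimal round-trip example (the file name is illustrative):

    import numpy as np
    import xarray as xr

    da = xr.DataArray(np.arange(3.0), dims=["datetime"], name="o3")
    da.to_netcdf("example.nc")                # write a single-variable file
    loaded = xr.open_dataarray("example.nc")  # read it back; a missing path raises
                                              # FileNotFoundError, as further down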
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = True, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True
engine = 'netcdf4', chunks = None, lock = None, cache = True
drop_variables = None, backend_kwargs = {}, use_cftime = None
decode_timedelta = None

    def open_dataset(
        filename_or_obj,
        group=None,
        decode_cf=True,
        mask_and_scale=None,
        decode_times=True,
        autoclose=None,
        concat_characters=True,
        decode_coords=True,
        engine=None,
        chunks=None,
        lock=None,
        cache=None,
        drop_variables=None,
        backend_kwargs=None,
        use_cftime=None,
        decode_timedelta=None,
    ):
        """Open and decode a dataset from a file or file-like object.

        Parameters
        ----------
        filename_or_obj : str, Path, file-like or DataStore
            Strings and Path objects are interpreted as a path to a netCDF file
            or an OpenDAP URL and opened with python-netCDF4, unless the filename
            ends with .gz, in which case the file is gunzipped and opened with
            scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
            objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
        group : str, optional
            Path to the netCDF4 group in the given file to open (only works for
            netCDF4 files).
        decode_cf : bool, optional
            Whether to decode these variables, assuming they were saved according
            to CF conventions.
        mask_and_scale : bool, optional
            If True, replace array values equal to `_FillValue` with NA and scale
            values according to the formula `original_values * scale_factor +
            add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
            taken from variable attributes (if they exist). If the `_FillValue` or
            `missing_value` attribute contains multiple values a warning will be
            issued and all array values matching one of the multiple values will
            be replaced by NA. mask_and_scale defaults to True except for the
            pseudonetcdf backend.
        decode_times : bool, optional
            If True, decode times encoded in the standard NetCDF datetime format
            into datetime objects. Otherwise, leave them encoded as numbers.
        autoclose : bool, optional
            If True, automatically close files to avoid OS Error of too many files
            being open. However, this option doesn't work with streams, e.g.,
            BytesIO.
        concat_characters : bool, optional
            If True, concatenate along the last dimension of character arrays to
            form string arrays. Dimensions will only be concatenated over (and
            removed) if they have no corresponding variable and if they are only
            used as the last dimension of character arrays.
        decode_coords : bool, optional
            If True, decode the 'coordinates' attribute to identify coordinates in
            the resulting dataset.
        engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
            "pseudonetcdf", "zarr"}, optional
            Engine to use when reading files. If not provided, the default engine
            is chosen based on available dependencies, with a preference for
            "netcdf4".
        chunks : int or dict, optional
            If chunks is provided, it is used to load the new dataset into dask
            arrays. ``chunks={}`` loads the dataset with dask using a single
            chunk for all arrays. When using ``engine="zarr"``, setting
            ``chunks='auto'`` will create dask chunks based on the variable's zarr
            chunks.
        lock : False or lock-like, optional
            Resource lock to use when reading data from disk. Only relevant when
            using dask or another form of parallelism. By default, appropriate
            locks are chosen to safely read and write files with the currently
            active dask scheduler.
        cache : bool, optional
            If True, cache data loaded from the underlying datastore in memory as
            NumPy arrays when accessed to avoid reading from the underlying data-
            store multiple times. Defaults to True unless you specify the `chunks`
            argument to use dask, in which case it defaults to False. Does not
            change the behavior of coordinates corresponding to dimensions, which
            always load their data from disk into a ``pandas.Index``.
        drop_variables: str or iterable, optional
            A variable or list of variables to exclude from being parsed from the
            dataset. This may be useful to drop variables with problems or
            inconsistent values.
        backend_kwargs: dict, optional
            A dictionary of keyword arguments to pass on to the backend. This
            may be useful when backend options would improve performance or
            allow user control of dataset processing.
        use_cftime: bool, optional
            Only relevant if encoded dates come from a standard calendar
            (e.g. "gregorian", "proleptic_gregorian", "standard", or not
            specified). If None (default), attempt to decode times to
            ``np.datetime64[ns]`` objects; if this is not possible, decode times to
            ``cftime.datetime`` objects. If True, always decode times to
            ``cftime.datetime`` objects, regardless of whether or not they can be
            represented using ``np.datetime64[ns]`` objects. If False, always
            decode times to ``np.datetime64[ns]`` objects; if this is not possible
            raise an error.
        decode_timedelta : bool, optional
            If True, decode variables and coordinates with time units in
            {"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
            into timedelta objects. If False, leave them encoded as numbers.
            If None (default), assume the same value of decode_time.

        Returns
        -------
        dataset : Dataset
            The newly created dataset.

        Notes
        -----
        ``open_dataset`` opens the file with read-only access. When you modify
        values of a Dataset, even one linked to files on disk, only the in-memory
        copy you are manipulating in xarray is modified: the original file on disk
        is never touched.

        See Also
        --------
        open_mfdataset
        """
        if os.environ.get("XARRAY_BACKEND_API", "v1") == "v2":
            kwargs = locals().copy()
            from . import apiv2, plugins

            if engine in plugins.ENGINES:
                return apiv2.open_dataset(**kwargs)

        if autoclose is not None:
            warnings.warn(
                "The autoclose argument is no longer used by "
                "xarray.open_dataset() and is now ignored; it will be removed in "
                "a future version of xarray. If necessary, you can control the "
                "maximum number of simultaneous open files with "
                "xarray.set_options(file_cache_maxsize=...).",
                FutureWarning,
                stacklevel=2,
            )

        if mask_and_scale is None:
            mask_and_scale = not engine == "pseudonetcdf"

        if not decode_cf:
            mask_and_scale = False
            decode_times = False
            concat_characters = False
            decode_coords = False
            decode_timedelta = False

        if cache is None:
            cache = chunks is None

        if backend_kwargs is None:
            backend_kwargs = {}

        def maybe_decode_store(store, chunks):
            ds = conventions.decode_cf(
                store,
                mask_and_scale=mask_and_scale,
                decode_times=decode_times,
                concat_characters=concat_characters,
                decode_coords=decode_coords,
                drop_variables=drop_variables,
                use_cftime=use_cftime,
                decode_timedelta=decode_timedelta,
            )

            _protect_dataset_variables_inplace(ds, cache)

            if chunks is not None and engine != "zarr":
                from dask.base import tokenize

                # if passed an actual file path, augment the token with
                # the file modification time
                if isinstance(filename_or_obj, str) and not is_remote_uri(filename_or_obj):
                    mtime = os.path.getmtime(filename_or_obj)
                else:
                    mtime = None
                token = tokenize(
                    filename_or_obj,
                    mtime,
                    group,
                    decode_cf,
                    mask_and_scale,
                    decode_times,
                    concat_characters,
                    decode_coords,
                    engine,
                    chunks,
                    drop_variables,
                    use_cftime,
                    decode_timedelta,
                )
                name_prefix = "open_dataset-%s" % token
                ds2 = ds.chunk(chunks, name_prefix=name_prefix, token=token)

            elif engine == "zarr":
                # adapted from Dataset.Chunk() and taken from open_zarr
                if not (isinstance(chunks, (int, dict)) or chunks is None):
                    if chunks != "auto":
                        raise ValueError(
                            "chunks must be an int, dict, 'auto', or None. "
                            "Instead found %s. " % chunks
                        )

                if chunks == "auto":
                    try:
                        import dask.array  # noqa
                    except ImportError:
                        chunks = None

                # auto chunking needs to be here and not in ZarrStore because
                # the variable chunks does not survive decode_cf
                # return trivial case
                if chunks is None:
                    return ds

                if isinstance(chunks, int):
                    chunks = dict.fromkeys(ds.dims, chunks)

                variables = {
                    k: _maybe_chunk(
                        k,
                        v,
                        store.get_chunk(k, v, chunks),
                        overwrite_encoded_chunks=overwrite_encoded_chunks,
                    )
                    for k, v in ds.variables.items()
                }
                ds2 = ds._replace(variables)

            else:
                ds2 = ds
            ds2._file_obj = ds._file_obj
            return ds2

        filename_or_obj = _normalize_path(filename_or_obj)

        if isinstance(filename_or_obj, AbstractDataStore):
            store = filename_or_obj
        else:
            if engine is None:
                engine = _autodetect_engine(filename_or_obj)

            extra_kwargs = {}
            if group is not None:
                extra_kwargs["group"] = group
            if lock is not None:
                extra_kwargs["lock"] = lock

            if engine == "zarr":
                backend_kwargs = backend_kwargs.copy()
                overwrite_encoded_chunks = backend_kwargs.pop(
                    "overwrite_encoded_chunks", None
                )

            opener = _get_backend_cls(engine)
>           store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = <class 'xarray.backends.netCDF4_.NetCDF4DataStore'>
filename = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
mode = 'r', format = 'NETCDF4', group = None, clobber = True, diskless = False
persist = False
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
lock_maker = None, autoclose = False

    @classmethod
    def open(
        cls,
        filename,
        mode="r",
        format="NETCDF4",
        group=None,
        clobber=True,
        diskless=False,
        persist=False,
        lock=None,
        lock_maker=None,
        autoclose=False,
    ):
        import netCDF4

        if not isinstance(filename, str):
            raise ValueError(
                "can only read bytes or file-like objects "
                "with engine='scipy' or 'h5netcdf'"
            )

        if format is None:
            format = "NETCDF4"

        if lock is None:
            if mode == "r":
                if is_remote_uri(filename):
                    lock = NETCDFC_LOCK
                else:
                    lock = NETCDF4_PYTHON_LOCK
            else:
                if format is None or format.startswith("NETCDF4"):
                    base_lock = NETCDF4_PYTHON_LOCK
                else:
                    base_lock = NETCDFC_LOCK
                lock = combine_locks([base_lock, get_write_lock(filename)])

        kwargs = dict(
            clobber=clobber, diskless=diskless, persist=persist, format=format
        )
        manager = CachingFileManager(
            netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
        )
>       return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:364:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dd7c0>
manager = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
group = None, mode = 'r'
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
autoclose = False

    def __init__(
        self, manager, group=None, mode=None, lock=NETCDF4_PYTHON_LOCK, autoclose=False
    ):
        import netCDF4

        if isinstance(manager, netCDF4.Dataset):
            if group is None:
                root, group = find_root_and_group(manager)
            else:
                if not type(manager) is netCDF4.Dataset:
                    raise ValueError(
                        "must supply a root netCDF4.Dataset if the group "
                        "argument is provided"
                    )
                root = manager
            manager = DummyFileManager(root)

        self._manager = manager
        self._group = group
        self._mode = mode
>       self.format = self.ds.data_model

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:314:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dd7c0>

    @property
    def ds(self):
>       return self._acquire()

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:373:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dd7c0>
needs_lock = True

    def _acquire(self, needs_lock=True):
>       with self._manager.acquire_context(needs_lock) as root:

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <contextlib._GeneratorContextManager object at 0x7fafeef8f250>

    def __enter__(self):
        # do not keep args and kwds alive unnecessarily
        # they are only needed for recreation, which is not possible anymore
        del self.args, self.kwds, self.func
        try:
>           return next(self.gen)

/usr/lib64/python3.9/contextlib.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

    @contextlib.contextmanager
    def acquire_context(self, needs_lock=True):
        """Context manager for acquiring a file."""
>       file, cached = self._acquire_with_cache_info(needs_lock)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

    def _acquire_with_cache_info(self, needs_lock=True):
        """Acquire a file, returning the file and whether it was cached."""
        with self._optional_lock(needs_lock):
            try:
                file = self._cache[self._key]
            except KeyError:
                kwargs = self._kwargs
                if self._mode is not _DEFAULT_MODE:
                    kwargs = kwargs.copy()
                    kwargs["mode"] = self._mode
>               file = self._opener(*self._args, **kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???

src/netCDF4/_netCDF4.pyx:2353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???
E FileNotFoundError: [Errno 2] No such file or directory: b'/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'

src/netCDF4/_netCDF4.pyx:1963: FileNotFoundError

During handling of the above exception, another exception occurred:

self = <test_training.TestTraining object at 0x7fafef8e7d60>
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment'
window_history_size = 7, window_lead_time = 2
statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
data_origin = {'o3': 'UBA', 'temp': 'UBA'}

    @pytest.fixture
    def data_collection(self, path, window_history_size, window_lead_time, statistics_per_var, data_origin):
>       data_prep = DefaultDataHandler.build('DEBW107', data_path=os.path.join(path, 'data'),
                                             experiment_path=os.path.join(path, 'exp_path'),
                                             statistics_per_var=statistics_per_var, station_type="background",
                                             sampling="daily", target_dim="variables",
                                             target_var="o3", time_dim="datetime", data_origin=data_origin,
                                             window_history_size=window_history_size,
                                             window_lead_time=window_lead_time, name_affix="train")

test/test_run_modules/test_training.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/data_handler/default_data_handler.py:72: in build
    sp = cls.data_handler(station, **sp_keys)
mlair/data_handler/data_handler_single_station.py:125: in __init__
    self.setup_samples()
mlair/helpers/time_tracking.py:40: in __call__
    return self.__wrapped__(*args, **kwargs)
mlair/data_handler/data_handler_single_station.py:271: in setup_samples
    self.make_input_target()
mlair/data_handler/data_handler_single_station.py:312: in make_input_target
    data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
mlair/data_handler/data_handler_single_station.py:368: in load_data
    data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
mlair/helpers/data_sources/data_loader.py:69: in download_data
    df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
mlair/helpers/data_sources/toar_data.py:19: in download_toar
    df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:403: in __init__
    coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

shape = (), coords = None, dims = ['datetime', 'variables']

    def _infer_coords_and_dims(
        shape, coords, dims
    ) -> "Tuple[Dict[Any, Variable], Tuple[Hashable, ...]]":
        """All the logic for creating a new DataArray"""

        if (
            coords is not None
            and not utils.is_dict_like(coords)
            and len(coords) != len(shape)
        ):
            raise ValueError(
                "coords is not dict-like, but it has %s items, "
                "which does not match the %s dimensions of the "
                "data" % (len(coords), len(shape))
            )

        if isinstance(dims, str):
            dims = (dims,)

        if dims is None:
            dims = ["dim_%s" % n for n in range(len(shape))]
            if coords is not None and len(coords) == len(shape):
                # try to infer dimensions from coords
                if utils.is_dict_like(coords):
                    # deprecated in GH993, removed in GH1539
                    raise ValueError(
                        "inferring DataArray dimensions from "
                        "dictionary like ``coords`` is no longer "
                        "supported. Use an explicit list of "
                        "``dims`` instead."
                    )
                for n, (dim, coord) in enumerate(zip(dims, coords)):
                    coord = as_variable(coord, name=dims[n]).to_index_variable()
                    dims[n] = coord.name
            dims = tuple(dims)
        elif len(dims) != len(shape):
>           raise ValueError(
                "different number of dimensions on data "
                "and dims: %s vs %s" % (len(shape), len(dims))
            )
E           ValueError: different number of dimensions on data and dims: 0 vs 2

/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:121: ValueError
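The shape () in the locals above means the object handed to xr.DataArray was scalar-like (e.g. None), which suggests the TOAR fallback download returned no table at all; even an empty DataFrame would satisfy the two requested dims. A minimal reproduction of the same ValueError (illustrative only):

    import numpy as np
    import pandas as pd
    import xarray as xr

    xr.DataArray(pd.DataFrame(), dims=["datetime", "variables"])   # ok: shape (0, 0)
    xr.DataArray(np.asarray(None), dims=["datetime", "variables"]) # shape () -> raises
    # ValueError: different number of dimensions on data and dims: 0 vs 2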
Error test/test_run_modules/test_training.py::TestTraining::test_no_training::setup 18.94
self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
> file = self._cache[self._key]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.lru_cache.LRUCache object at 0x7fb1f7064180>
key = [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

def __getitem__(self, key: K) -> V:
# record recent use of the key by moving it to the front of the list
with self._lock:
> value = self._cache[key]
E KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py:53: KeyError

During handling of the above exception, another exception occurred:

self = StationPrep(station=['DEBW107'], data_path='/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/d...w_history_size=7, window_lead_time=2, interpolation_limit=0, interpolation_method='linear', overwrite_local_data=False)
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily'
station = ['DEBW107'], statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
sampling = 'daily', store_data_locally = True
data_origin = {'o3': 'UBA', 'temp': 'UBA'}, start = None, end = None

def load_data(self, path, station, statistics_per_var, sampling, store_data_locally=False,
data_origin: Dict = None, start=None, end=None):
"""
Load data and meta data either from local disk (preferred) or download new data by using a custom download method.

Data is either downloaded, if no local data is available or parameter overwrite_local_data is true. In both
cases, downloaded data is only stored locally if store_data_locally is not disabled. If this parameter is not
set, it is assumed, that data should be saved locally.
"""
check_path_and_create(path)
file_name = self._set_file_name(path, station, statistics_per_var)
meta_file = self._set_meta_file_name(path, station, statistics_per_var)
if self.overwrite_local_data is True:
logging.debug(f"{self.station[0]}: overwrite_local_data is true, therefore reload {file_name}")
if os.path.exists(file_name):
os.remove(file_name)
if os.path.exists(meta_file):
os.remove(meta_file)
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
store_data_locally=store_data_locally, data_origin=data_origin,
time_dim=self.time_dim, target_dim=self.target_dim,
iter_dim=self.iter_dim, window_dim=self.window_dim,
era5_data_path=self._era5_data_path,
era5_file_names=self._era5_file_names,
ifs_data_path=self._ifs_data_path,
ifs_file_names=self._ifs_file_names)
logging.debug(f"{self.station[0]}: loaded new data")
else:
try:
logging.debug(f"{self.station[0]}: try to load local data from: {file_name}")
> data = xr.open_dataarray(file_name)

mlair/data_handler/data_handler_single_station.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = None, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True, engine = None
chunks = None, lock = None, cache = None, drop_variables = None
backend_kwargs = None, use_cftime = None, decode_timedelta = None

def open_dataarray(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
"""Open an DataArray from a file or file-like object containing a single
data variable.

This is designed to read netCDF files with only one data variable. If
multiple variables are present then a ValueError is raised.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Paths are interpreted as a path to a netCDF file or an
OpenDAP URL and opened with python-netCDF4, unless the filename ends
with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode these variables, assuming they were saved according
to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib"}, \
optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Notes
-----
This is designed to be fully compatible with `DataArray.to_netcdf`. Saving
using `DataArray.to_netcdf` and then loading with this function will
produce an identical result.

All parameters are passed directly to `xarray.open_dataset`. See that
documentation for further details.

See also
--------
open_dataset
"""

> dataset = open_dataset(
filename_or_obj,
group=group,
decode_cf=decode_cf,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
autoclose=autoclose,
concat_characters=concat_characters,
decode_coords=decode_coords,
engine=engine,
chunks=chunks,
lock=lock,
cache=cache,
drop_variables=drop_variables,
backend_kwargs=backend_kwargs,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:701:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = True, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True
engine = 'netcdf4', chunks = None, lock = None, cache = True
drop_variables = None, backend_kwargs = {}, use_cftime = None
decode_timedelta = None

def open_dataset(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
"""Open and decode a dataset from a file or file-like object.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Path objects are interpreted as a path to a netCDF file
or an OpenDAP URL and opened with python-netCDF4, unless the filename
ends with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode these variables, assuming they were saved according
to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
autoclose : bool, optional
If True, automatically close files to avoid an OSError caused by too
many open files. However, this option doesn't work with streams, e.g.,
BytesIO.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
"pseudonetcdf", "zarr"}, optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays. ``chunks={}`` loads the dataset with dask using a single
chunk for all arrays. When using ``engine="zarr"``, setting
``chunks='auto'`` will create dask chunks based on the variable's zarr
chunks.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Returns
-------
dataset : Dataset
The newly created dataset.

Notes
-----
``open_dataset`` opens the file with read-only access. When you modify
values of a Dataset, even one linked to files on disk, only the in-memory
copy you are manipulating in xarray is modified: the original file on disk
is never touched.

See Also
--------
open_mfdataset
"""
if os.environ.get("XARRAY_BACKEND_API", "v1") == "v2":
kwargs = locals().copy()
from . import apiv2, plugins

if engine in plugins.ENGINES:
return apiv2.open_dataset(**kwargs)

if autoclose is not None:
warnings.warn(
"The autoclose argument is no longer used by "
"xarray.open_dataset() and is now ignored; it will be removed in "
"a future version of xarray. If necessary, you can control the "
"maximum number of simultaneous open files with "
"xarray.set_options(file_cache_maxsize=...).",
FutureWarning,
stacklevel=2,
)

if mask_and_scale is None:
mask_and_scale = not engine == "pseudonetcdf"

if not decode_cf:
mask_and_scale = False
decode_times = False
concat_characters = False
decode_coords = False
decode_timedelta = False

if cache is None:
cache = chunks is None

if backend_kwargs is None:
backend_kwargs = {}

def maybe_decode_store(store, chunks):
ds = conventions.decode_cf(
store,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
concat_characters=concat_characters,
decode_coords=decode_coords,
drop_variables=drop_variables,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

_protect_dataset_variables_inplace(ds, cache)

if chunks is not None and engine != "zarr":
from dask.base import tokenize

# if passed an actual file path, augment the token with
# the file modification time
if isinstance(filename_or_obj, str) and not is_remote_uri(filename_or_obj):
mtime = os.path.getmtime(filename_or_obj)
else:
mtime = None
token = tokenize(
filename_or_obj,
mtime,
group,
decode_cf,
mask_and_scale,
decode_times,
concat_characters,
decode_coords,
engine,
chunks,
drop_variables,
use_cftime,
decode_timedelta,
)
name_prefix = "open_dataset-%s" % token
ds2 = ds.chunk(chunks, name_prefix=name_prefix, token=token)

elif engine == "zarr":
# adapted from Dataset.Chunk() and taken from open_zarr
if not (isinstance(chunks, (int, dict)) or chunks is None):
if chunks != "auto":
raise ValueError(
"chunks must be an int, dict, 'auto', or None. "
"Instead found %s. " % chunks
)

if chunks == "auto":
try:
import dask.array # noqa
except ImportError:
chunks = None

# auto chunking needs to be here and not in ZarrStore because
# the variable chunks does not survive decode_cf
# return trivial case
if chunks is None:
return ds

if isinstance(chunks, int):
chunks = dict.fromkeys(ds.dims, chunks)

variables = {
k: _maybe_chunk(
k,
v,
store.get_chunk(k, v, chunks),
overwrite_encoded_chunks=overwrite_encoded_chunks,
)
for k, v in ds.variables.items()
}
ds2 = ds._replace(variables)

else:
ds2 = ds
ds2._file_obj = ds._file_obj
return ds2

filename_or_obj = _normalize_path(filename_or_obj)

if isinstance(filename_or_obj, AbstractDataStore):
store = filename_or_obj
else:
if engine is None:
engine = _autodetect_engine(filename_or_obj)

extra_kwargs = {}
if group is not None:
extra_kwargs["group"] = group
if lock is not None:
extra_kwargs["lock"] = lock

if engine == "zarr":
backend_kwargs = backend_kwargs.copy()
overwrite_encoded_chunks = backend_kwargs.pop(
"overwrite_encoded_chunks", None
)

opener = _get_backend_cls(engine)
> store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = <class 'xarray.backends.netCDF4_.NetCDF4DataStore'>
filename = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
mode = 'r', format = 'NETCDF4', group = None, clobber = True, diskless = False
persist = False
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
lock_maker = None, autoclose = False

@classmethod
def open(
cls,
filename,
mode="r",
format="NETCDF4",
group=None,
clobber=True,
diskless=False,
persist=False,
lock=None,
lock_maker=None,
autoclose=False,
):
import netCDF4

if not isinstance(filename, str):
raise ValueError(
"can only read bytes or file-like objects "
"with engine='scipy' or 'h5netcdf'"
)

if format is None:
format = "NETCDF4"

if lock is None:
if mode == "r":
if is_remote_uri(filename):
lock = NETCDFC_LOCK
else:
lock = NETCDF4_PYTHON_LOCK
else:
if format is None or format.startswith("NETCDF4"):
base_lock = NETCDF4_PYTHON_LOCK
else:
base_lock = NETCDFC_LOCK
lock = combine_locks([base_lock, get_write_lock(filename)])

kwargs = dict(
clobber=clobber, diskless=diskless, persist=persist, format=format
)
manager = CachingFileManager(
netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
)
> return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:364:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dddc0>
manager = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
group = None, mode = 'r'
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
autoclose = False

def __init__(
self, manager, group=None, mode=None, lock=NETCDF4_PYTHON_LOCK, autoclose=False
):
import netCDF4

if isinstance(manager, netCDF4.Dataset):
if group is None:
root, group = find_root_and_group(manager)
else:
if not type(manager) is netCDF4.Dataset:
raise ValueError(
"must supply a root netCDF4.Dataset if the group "
"argument is provided"
)
root = manager
manager = DummyFileManager(root)

self._manager = manager
self._group = group
self._mode = mode
> self.format = self.ds.data_model

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:314:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dddc0>

@property
def ds(self):
> return self._acquire()

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:373:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dddc0>
needs_lock = True

def _acquire(self, needs_lock=True):
> with self._manager.acquire_context(needs_lock) as root:

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <contextlib._GeneratorContextManager object at 0x7fafee750d30>

def __enter__(self):
# do not keep args and kwds alive unnecessarily
# they are only needed for recreation, which is not possible anymore
del self.args, self.kwds, self.func
try:
> return next(self.gen)

/usr/lib64/python3.9/contextlib.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

@contextlib.contextmanager
def acquire_context(self, needs_lock=True):
"""Context manager for acquiring a file."""
> file, cached = self._acquire_with_cache_info(needs_lock)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
file = self._cache[self._key]
except KeyError:
kwargs = self._kwargs
if self._mode is not _DEFAULT_MODE:
kwargs = kwargs.copy()
kwargs["mode"] = self._mode
> file = self._opener(*self._args, **kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???

src/netCDF4/_netCDF4.pyx:2353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???
E FileNotFoundError: [Errno 2] No such file or directory: b'/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'

src/netCDF4/_netCDF4.pyx:1963: FileNotFoundError

During handling of the above exception, another exception occurred:

self = <test_training.TestTraining object at 0x7fafee7504f0>
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment'
window_history_size = 7, window_lead_time = 2
statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
data_origin = {'o3': 'UBA', 'temp': 'UBA'}

@pytest.fixture
def data_collection(self, path, window_history_size, window_lead_time, statistics_per_var, data_origin):
> data_prep = DefaultDataHandler.build('DEBW107', data_path=os.path.join(path, 'data'),
experiment_path=os.path.join(path, 'exp_path'),
statistics_per_var=statistics_per_var, station_type="background",
sampling="daily", target_dim="variables",
target_var="o3", time_dim="datetime", data_origin=data_origin,
window_history_size=window_history_size,
window_lead_time=window_lead_time, name_affix="train")

test/test_run_modules/test_training.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/data_handler/default_data_handler.py:72: in build
sp = cls.data_handler(station, **sp_keys)
mlair/data_handler/data_handler_single_station.py:125: in __init__
self.setup_samples()
mlair/helpers/time_tracking.py:40: in __call__
return self.__wrapped__(*args, **kwargs)
mlair/data_handler/data_handler_single_station.py:271: in setup_samples
self.make_input_target()
mlair/data_handler/data_handler_single_station.py:312: in make_input_target
data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
mlair/data_handler/data_handler_single_station.py:368: in load_data
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
mlair/helpers/data_sources/data_loader.py:69: in download_data
df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
mlair/helpers/data_sources/toar_data.py:19: in download_toar
df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:403: in __init__
coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

shape = (), coords = None, dims = ['datetime', 'variables']

def _infer_coords_and_dims(
shape, coords, dims
) -> "Tuple[Dict[Any, Variable], Tuple[Hashable, ...]]":
"""All the logic for creating a new DataArray"""

if (
coords is not None
and not utils.is_dict_like(coords)
and len(coords) != len(shape)
):
raise ValueError(
"coords is not dict-like, but it has %s items, "
"which does not match the %s dimensions of the "
"data" % (len(coords), len(shape))
)

if isinstance(dims, str):
dims = (dims,)

if dims is None:
dims = ["dim_%s" % n for n in range(len(shape))]
if coords is not None and len(coords) == len(shape):
# try to infer dimensions from coords
if utils.is_dict_like(coords):
# deprecated in GH993, removed in GH1539
raise ValueError(
"inferring DataArray dimensions from "
"dictionary like ``coords`` is no longer "
"supported. Use an explicit list of "
"``dims`` instead."
)
for n, (dim, coord) in enumerate(zip(dims, coords)):
coord = as_variable(coord, name=dims[n]).to_index_variable()
dims[n] = coord.name
dims = tuple(dims)
elif len(dims) != len(shape):
> raise ValueError(
"different number of dimensions on data "
"and dims: %s vs %s" % (len(shape), len(dims))
)
E ValueError: different number of dimensions on data and dims: 0 vs 2

/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:121: ValueError
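
Note on this failure chain: the local file DEBW107_o3_temp.nc is missing, the TOAR fallback download returns no table-shaped data, and xr.DataArray is then called with 0-dimensional data but two dimension names. A minimal sketch of that last step (assuming only xarray; the dimension names mirror the traceback):

    import xarray as xr

    # 0-d data (shape ()) with two dim names raises exactly the error above:
    # ValueError: different number of dimensions on data and dims: 0 vs 2
    xr.DataArray(0.0, dims=["datetime", "variables"])
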
Error test/test_run_modules/test_training.py::TestTraining::test_run::setup 18.83
self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
> file = self._cache[self._key]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.lru_cache.LRUCache object at 0x7fb1f7064180>
key = [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

def __getitem__(self, key: K) -> V:
# record recent use of the key by moving it to the front of the list
with self._lock:
> value = self._cache[key]
E KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py:53: KeyError

During handling of the above exception, another exception occurred:

self = StationPrep(station=['DEBW107'], data_path='/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/d...w_history_size=7, window_lead_time=2, interpolation_limit=0, interpolation_method='linear', overwrite_local_data=False)
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily'
station = ['DEBW107'], statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
sampling = 'daily', store_data_locally = True
data_origin = {'o3': 'UBA', 'temp': 'UBA'}, start = None, end = None

def load_data(self, path, station, statistics_per_var, sampling, store_data_locally=False,
data_origin: Dict = None, start=None, end=None):
"""
Load data and metadata either from local disk (preferred) or by downloading new data with a custom download method.

Data is downloaded if no local data is available or if the parameter overwrite_local_data is true. In both
cases, downloaded data is only stored locally if store_data_locally is not disabled. If this parameter is not
set, it is assumed that data should be saved locally.
"""
check_path_and_create(path)
file_name = self._set_file_name(path, station, statistics_per_var)
meta_file = self._set_meta_file_name(path, station, statistics_per_var)
if self.overwrite_local_data is True:
logging.debug(f"{self.station[0]}: overwrite_local_data is true, therefore reload {file_name}")
if os.path.exists(file_name):
os.remove(file_name)
if os.path.exists(meta_file):
os.remove(meta_file)
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
store_data_locally=store_data_locally, data_origin=data_origin,
time_dim=self.time_dim, target_dim=self.target_dim,
iter_dim=self.iter_dim, window_dim=self.window_dim,
era5_data_path=self._era5_data_path,
era5_file_names=self._era5_file_names,
ifs_data_path=self._ifs_data_path,
ifs_file_names=self._ifs_file_names)
logging.debug(f"{self.station[0]}: loaded new data")
else:
try:
logging.debug(f"{self.station[0]}: try to load local data from: {file_name}")
> data = xr.open_dataarray(file_name)

mlair/data_handler/data_handler_single_station.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = None, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True, engine = None
chunks = None, lock = None, cache = None, drop_variables = None
backend_kwargs = None, use_cftime = None, decode_timedelta = None

def open_dataarray(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
"""Open an DataArray from a file or file-like object containing a single
data variable.

This is designed to read netCDF files with only one data variable. If
multiple variables are present then a ValueError is raised.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Paths are interpreted as a path to a netCDF file or an
OpenDAP URL and opened with python-netCDF4, unless the filename ends
with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode these variables, assuming they were saved according
to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib"}, \
optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Notes
-----
This is designed to be fully compatible with `DataArray.to_netcdf`. Saving
using `DataArray.to_netcdf` and then loading with this function will
produce an identical result.

All parameters are passed directly to `xarray.open_dataset`. See that
documentation for further details.

See also
--------
open_dataset
"""

> dataset = open_dataset(
filename_or_obj,
group=group,
decode_cf=decode_cf,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
autoclose=autoclose,
concat_characters=concat_characters,
decode_coords=decode_coords,
engine=engine,
chunks=chunks,
lock=lock,
cache=cache,
drop_variables=drop_variables,
backend_kwargs=backend_kwargs,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:701:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = True, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True
engine = 'netcdf4', chunks = None, lock = None, cache = True
drop_variables = None, backend_kwargs = {}, use_cftime = None
decode_timedelta = None

def open_dataset(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
"""Open and decode a dataset from a file or file-like object.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Path objects are interpreted as a path to a netCDF file
or an OpenDAP URL and opened with python-netCDF4, unless the filename
ends with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode these variables, assuming they were saved according
to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
autoclose : bool, optional
If True, automatically close files to avoid an OSError caused by too
many open files. However, this option doesn't work with streams, e.g.,
BytesIO.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
"pseudonetcdf", "zarr"}, optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays. ``chunks={}`` loads the dataset with dask using a single
chunk for all arrays. When using ``engine="zarr"``, setting
``chunks='auto'`` will create dask chunks based on the variable's zarr
chunks.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Returns
-------
dataset : Dataset
The newly created dataset.

Notes
-----
``open_dataset`` opens the file with read-only access. When you modify
values of a Dataset, even one linked to files on disk, only the in-memory
copy you are manipulating in xarray is modified: the original file on disk
is never touched.

See Also
--------
open_mfdataset
"""
if os.environ.get("XARRAY_BACKEND_API", "v1") == "v2":
kwargs = locals().copy()
from . import apiv2, plugins

if engine in plugins.ENGINES:
return apiv2.open_dataset(**kwargs)

if autoclose is not None:
warnings.warn(
"The autoclose argument is no longer used by "
"xarray.open_dataset() and is now ignored; it will be removed in "
"a future version of xarray. If necessary, you can control the "
"maximum number of simultaneous open files with "
"xarray.set_options(file_cache_maxsize=...).",
FutureWarning,
stacklevel=2,
)

if mask_and_scale is None:
mask_and_scale = not engine == "pseudonetcdf"

if not decode_cf:
mask_and_scale = False
decode_times = False
concat_characters = False
decode_coords = False
decode_timedelta = False

if cache is None:
cache = chunks is None

if backend_kwargs is None:
backend_kwargs = {}

def maybe_decode_store(store, chunks):
ds = conventions.decode_cf(
store,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
concat_characters=concat_characters,
decode_coords=decode_coords,
drop_variables=drop_variables,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

_protect_dataset_variables_inplace(ds, cache)

if chunks is not None and engine != "zarr":
from dask.base import tokenize

# if passed an actual file path, augment the token with
# the file modification time
if isinstance(filename_or_obj, str) and not is_remote_uri(filename_or_obj):
mtime = os.path.getmtime(filename_or_obj)
else:
mtime = None
token = tokenize(
filename_or_obj,
mtime,
group,
decode_cf,
mask_and_scale,
decode_times,
concat_characters,
decode_coords,
engine,
chunks,
drop_variables,
use_cftime,
decode_timedelta,
)
name_prefix = "open_dataset-%s" % token
ds2 = ds.chunk(chunks, name_prefix=name_prefix, token=token)

elif engine == "zarr":
# adapted from Dataset.Chunk() and taken from open_zarr
if not (isinstance(chunks, (int, dict)) or chunks is None):
if chunks != "auto":
raise ValueError(
"chunks must be an int, dict, 'auto', or None. "
"Instead found %s. " % chunks
)

if chunks == "auto":
try:
import dask.array # noqa
except ImportError:
chunks = None

# auto chunking needs to be here and not in ZarrStore because
# the variable chunks does not survive decode_cf
# return trivial case
if chunks is None:
return ds

if isinstance(chunks, int):
chunks = dict.fromkeys(ds.dims, chunks)

variables = {
k: _maybe_chunk(
k,
v,
store.get_chunk(k, v, chunks),
overwrite_encoded_chunks=overwrite_encoded_chunks,
)
for k, v in ds.variables.items()
}
ds2 = ds._replace(variables)

else:
ds2 = ds
ds2._file_obj = ds._file_obj
return ds2

filename_or_obj = _normalize_path(filename_or_obj)

if isinstance(filename_or_obj, AbstractDataStore):
store = filename_or_obj
else:
if engine is None:
engine = _autodetect_engine(filename_or_obj)

extra_kwargs = {}
if group is not None:
extra_kwargs["group"] = group
if lock is not None:
extra_kwargs["lock"] = lock

if engine == "zarr":
backend_kwargs = backend_kwargs.copy()
overwrite_encoded_chunks = backend_kwargs.pop(
"overwrite_encoded_chunks", None
)

opener = _get_backend_cls(engine)
> store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = <class 'xarray.backends.netCDF4_.NetCDF4DataStore'>
filename = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
mode = 'r', format = 'NETCDF4', group = None, clobber = True, diskless = False
persist = False
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
lock_maker = None, autoclose = False

@classmethod
def open(
cls,
filename,
mode="r",
format="NETCDF4",
group=None,
clobber=True,
diskless=False,
persist=False,
lock=None,
lock_maker=None,
autoclose=False,
):
import netCDF4

if not isinstance(filename, str):
raise ValueError(
"can only read bytes or file-like objects "
"with engine='scipy' or 'h5netcdf'"
)

if format is None:
format = "NETCDF4"

if lock is None:
if mode == "r":
if is_remote_uri(filename):
lock = NETCDFC_LOCK
else:
lock = NETCDF4_PYTHON_LOCK
else:
if format is None or format.startswith("NETCDF4"):
base_lock = NETCDF4_PYTHON_LOCK
else:
base_lock = NETCDFC_LOCK
lock = combine_locks([base_lock, get_write_lock(filename)])

kwargs = dict(
clobber=clobber, diskless=diskless, persist=persist, format=format
)
manager = CachingFileManager(
netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
)
> return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:364:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fafeec2d4c0>
manager = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
group = None, mode = 'r'
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
autoclose = False

def __init__(
self, manager, group=None, mode=None, lock=NETCDF4_PYTHON_LOCK, autoclose=False
):
import netCDF4

if isinstance(manager, netCDF4.Dataset):
if group is None:
root, group = find_root_and_group(manager)
else:
if not type(manager) is netCDF4.Dataset:
raise ValueError(
"must supply a root netCDF4.Dataset if the group "
"argument is provided"
)
root = manager
manager = DummyFileManager(root)

self._manager = manager
self._group = group
self._mode = mode
> self.format = self.ds.data_model

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:314:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fafeec2d4c0>

@property
def ds(self):
> return self._acquire()

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:373:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fafeec2d4c0>
needs_lock = True

def _acquire(self, needs_lock=True):
> with self._manager.acquire_context(needs_lock) as root:

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <contextlib._GeneratorContextManager object at 0x7fafeea926a0>

def __enter__(self):
# do not keep args and kwds alive unnecessarily
# they are only needed for recreation, which is not possible anymore
del self.args, self.kwds, self.func
try:
> return next(self.gen)

/usr/lib64/python3.9/contextlib.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

@contextlib.contextmanager
def acquire_context(self, needs_lock=True):
"""Context manager for acquiring a file."""
> file, cached = self._acquire_with_cache_info(needs_lock)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
file = self._cache[self._key]
except KeyError:
kwargs = self._kwargs
if self._mode is not _DEFAULT_MODE:
kwargs = kwargs.copy()
kwargs["mode"] = self._mode
> file = self._opener(*self._args, **kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???

src/netCDF4/_netCDF4.pyx:2353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???
E FileNotFoundError: [Errno 2] No such file or directory: b'/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'

src/netCDF4/_netCDF4.pyx:1963: FileNotFoundError

During handling of the above exception, another exception occurred:

self = <test_training.TestTraining object at 0x7fafeea92d90>
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment'
window_history_size = 7, window_lead_time = 2
statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
data_origin = {'o3': 'UBA', 'temp': 'UBA'}

@pytest.fixture
def data_collection(self, path, window_history_size, window_lead_time, statistics_per_var, data_origin):
> data_prep = DefaultDataHandler.build('DEBW107', data_path=os.path.join(path, 'data'),
experiment_path=os.path.join(path, 'exp_path'),
statistics_per_var=statistics_per_var, station_type="background",
sampling="daily", target_dim="variables",
target_var="o3", time_dim="datetime", data_origin=data_origin,
window_history_size=window_history_size,
window_lead_time=window_lead_time, name_affix="train")

test/test_run_modules/test_training.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/data_handler/default_data_handler.py:72: in build
sp = cls.data_handler(station, **sp_keys)
mlair/data_handler/data_handler_single_station.py:125: in __init__
self.setup_samples()
mlair/helpers/time_tracking.py:40: in __call__
return self.__wrapped__(*args, **kwargs)
mlair/data_handler/data_handler_single_station.py:271: in setup_samples
self.make_input_target()
mlair/data_handler/data_handler_single_station.py:312: in make_input_target
data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
mlair/data_handler/data_handler_single_station.py:368: in load_data
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
mlair/helpers/data_sources/data_loader.py:69: in download_data
df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
mlair/helpers/data_sources/toar_data.py:19: in download_toar
df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:403: in __init__
coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

shape = (), coords = None, dims = ['datetime', 'variables']

def _infer_coords_and_dims(
shape, coords, dims
) -> "Tuple[Dict[Any, Variable], Tuple[Hashable, ...]]":
"""All the logic for creating a new DataArray"""

if (
coords is not None
and not utils.is_dict_like(coords)
and len(coords) != len(shape)
):
raise ValueError(
"coords is not dict-like, but it has %s items, "
"which does not match the %s dimensions of the "
"data" % (len(coords), len(shape))
)

if isinstance(dims, str):
dims = (dims,)

if dims is None:
dims = ["dim_%s" % n for n in range(len(shape))]
if coords is not None and len(coords) == len(shape):
# try to infer dimensions from coords
if utils.is_dict_like(coords):
# deprecated in GH993, removed in GH1539
raise ValueError(
"inferring DataArray dimensions from "
"dictionary like ``coords`` is no longer "
"supported. Use an explicit list of "
"``dims`` instead."
)
for n, (dim, coord) in enumerate(zip(dims, coords)):
coord = as_variable(coord, name=dims[n]).to_index_variable()
dims[n] = coord.name
dims = tuple(dims)
elif len(dims) != len(shape):
> raise ValueError(
"different number of dimensions on data "
"and dims: %s vs %s" % (len(shape), len(dims))
)
E ValueError: different number of dimensions on data and dims: 0 vs 2

/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:121: ValueError
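
The load_data method shown in these tracebacks implements a cache-or-fetch pattern: prefer the local netCDF file and fall back to the download path only when it is absent or must be overwritten. A minimal sketch of that control flow (hypothetical names; fetch stands in for data_sources.download_data, and only xarray is assumed):

    import xarray as xr

    def load_or_download(file_name, fetch):
        """Return data from a local netCDF cache, downloading on a miss."""
        try:
            return xr.open_dataarray(file_name)   # preferred: local copy
        except (FileNotFoundError, OSError):
            data = fetch()                        # fall back to download
            data.to_netcdf(file_name)             # refresh the local cache
            return data
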
Error test/test_run_modules/test_training.py::TestTraining::test_train::setup 19.87
self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
> file = self._cache[self._key]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.lru_cache.LRUCache object at 0x7fb1f7064180>
key = [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

def __getitem__(self, key: K) -> V:
# record recent use of the key by moving it to the front of the list
with self._lock:
> value = self._cache[key]
E KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py:53: KeyError

During handling of the above exception, another exception occurred:

self = StationPrep(station=['DEBW107'], data_path='/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/d...w_history_size=7, window_lead_time=2, interpolation_limit=0, interpolation_method='linear', overwrite_local_data=False)
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily'
station = ['DEBW107'], statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
sampling = 'daily', store_data_locally = True
data_origin = {'o3': 'UBA', 'temp': 'UBA'}, start = None, end = None

def load_data(self, path, station, statistics_per_var, sampling, store_data_locally=False,
data_origin: Dict = None, start=None, end=None):
"""
Load data and metadata either from local disk (preferred) or by downloading new data with a custom download method.

Data is downloaded if no local data is available or if the parameter overwrite_local_data is true. In both
cases, downloaded data is only stored locally if store_data_locally is not disabled. If this parameter is not
set, it is assumed that data should be saved locally.
"""
check_path_and_create(path)
file_name = self._set_file_name(path, station, statistics_per_var)
meta_file = self._set_meta_file_name(path, station, statistics_per_var)
if self.overwrite_local_data is True:
logging.debug(f"{self.station[0]}: overwrite_local_data is true, therefore reload {file_name}")
if os.path.exists(file_name):
os.remove(file_name)
if os.path.exists(meta_file):
os.remove(meta_file)
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
store_data_locally=store_data_locally, data_origin=data_origin,
time_dim=self.time_dim, target_dim=self.target_dim,
iter_dim=self.iter_dim, window_dim=self.window_dim,
era5_data_path=self._era5_data_path,
era5_file_names=self._era5_file_names,
ifs_data_path=self._ifs_data_path,
ifs_file_names=self._ifs_file_names)
logging.debug(f"{self.station[0]}: loaded new data")
else:
try:
logging.debug(f"{self.station[0]}: try to load local data from: {file_name}")
> data = xr.open_dataarray(file_name)

mlair/data_handler/data_handler_single_station.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = None, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True, engine = None
chunks = None, lock = None, cache = None, drop_variables = None
backend_kwargs = None, use_cftime = None, decode_timedelta = None

def open_dataarray(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
"""Open an DataArray from a file or file-like object containing a single
data variable.

This is designed to read netCDF files with only one data variable. If
multiple variables are present then a ValueError is raised.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Paths are interpreted as a path to a netCDF file or an
OpenDAP URL and opened with python-netCDF4, unless the filename ends
with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode these variables, assuming they were saved according
to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib"}, \
optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Notes
-----
This is designed to be fully compatible with `DataArray.to_netcdf`. Saving
using `DataArray.to_netcdf` and then loading with this function will
produce an identical result.

All parameters are passed directly to `xarray.open_dataset`. See that
documentation for further details.

See also
--------
open_dataset
"""

> dataset = open_dataset(
filename_or_obj,
group=group,
decode_cf=decode_cf,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
autoclose=autoclose,
concat_characters=concat_characters,
decode_coords=decode_coords,
engine=engine,
chunks=chunks,
lock=lock,
cache=cache,
drop_variables=drop_variables,
backend_kwargs=backend_kwargs,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:701:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = True, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True
engine = 'netcdf4', chunks = None, lock = None, cache = True
drop_variables = None, backend_kwargs = {}, use_cftime = None
decode_timedelta = None

def open_dataset(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
"""Open and decode a dataset from a file or file-like object.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Path objects are interpreted as a path to a netCDF file
or an OpenDAP URL and opened with python-netCDF4, unless the filename
ends with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode the variables in the file, assuming they were saved
according to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
autoclose : bool, optional
If True, automatically close files to avoid an OS error from too many
open files. However, this option doesn't work with streams, e.g.,
BytesIO.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
"pseudonetcdf", "zarr"}, optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays. ``chunks={}`` loads the dataset with dask using a single
chunk for all arrays. When using ``engine="zarr"``, setting
``chunks='auto'`` will create dask chunks based on the variable's zarr
chunks.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Returns
-------
dataset : Dataset
The newly created dataset.

Notes
-----
``open_dataset`` opens the file with read-only access. When you modify
values of a Dataset, even one linked to files on disk, only the in-memory
copy you are manipulating in xarray is modified: the original file on disk
is never touched.

See Also
--------
open_mfdataset
"""
if os.environ.get("XARRAY_BACKEND_API", "v1") == "v2":
kwargs = locals().copy()
from . import apiv2, plugins

if engine in plugins.ENGINES:
return apiv2.open_dataset(**kwargs)

if autoclose is not None:
warnings.warn(
"The autoclose argument is no longer used by "
"xarray.open_dataset() and is now ignored; it will be removed in "
"a future version of xarray. If necessary, you can control the "
"maximum number of simultaneous open files with "
"xarray.set_options(file_cache_maxsize=...).",
FutureWarning,
stacklevel=2,
)

if mask_and_scale is None:
mask_and_scale = not engine == "pseudonetcdf"

if not decode_cf:
mask_and_scale = False
decode_times = False
concat_characters = False
decode_coords = False
decode_timedelta = False

if cache is None:
cache = chunks is None

if backend_kwargs is None:
backend_kwargs = {}

def maybe_decode_store(store, chunks):
ds = conventions.decode_cf(
store,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
concat_characters=concat_characters,
decode_coords=decode_coords,
drop_variables=drop_variables,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

_protect_dataset_variables_inplace(ds, cache)

if chunks is not None and engine != "zarr":
from dask.base import tokenize

# if passed an actual file path, augment the token with
# the file modification time
if isinstance(filename_or_obj, str) and not is_remote_uri(filename_or_obj):
mtime = os.path.getmtime(filename_or_obj)
else:
mtime = None
token = tokenize(
filename_or_obj,
mtime,
group,
decode_cf,
mask_and_scale,
decode_times,
concat_characters,
decode_coords,
engine,
chunks,
drop_variables,
use_cftime,
decode_timedelta,
)
name_prefix = "open_dataset-%s" % token
ds2 = ds.chunk(chunks, name_prefix=name_prefix, token=token)

elif engine == "zarr":
# adapted from Dataset.chunk() and taken from open_zarr
if not (isinstance(chunks, (int, dict)) or chunks is None):
if chunks != "auto":
raise ValueError(
"chunks must be an int, dict, 'auto', or None. "
"Instead found %s. " % chunks
)

if chunks == "auto":
try:
import dask.array # noqa
except ImportError:
chunks = None

# auto chunking needs to be here and not in ZarrStore because
# the variable chunks does not survive decode_cf
# return trivial case
if chunks is None:
return ds

if isinstance(chunks, int):
chunks = dict.fromkeys(ds.dims, chunks)

variables = {
k: _maybe_chunk(
k,
v,
store.get_chunk(k, v, chunks),
overwrite_encoded_chunks=overwrite_encoded_chunks,
)
for k, v in ds.variables.items()
}
ds2 = ds._replace(variables)

else:
ds2 = ds
ds2._file_obj = ds._file_obj
return ds2

filename_or_obj = _normalize_path(filename_or_obj)

if isinstance(filename_or_obj, AbstractDataStore):
store = filename_or_obj
else:
if engine is None:
engine = _autodetect_engine(filename_or_obj)

extra_kwargs = {}
if group is not None:
extra_kwargs["group"] = group
if lock is not None:
extra_kwargs["lock"] = lock

if engine == "zarr":
backend_kwargs = backend_kwargs.copy()
overwrite_encoded_chunks = backend_kwargs.pop(
"overwrite_encoded_chunks", None
)

opener = _get_backend_cls(engine)
> store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = <class 'xarray.backends.netCDF4_.NetCDF4DataStore'>
filename = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
mode = 'r', format = 'NETCDF4', group = None, clobber = True, diskless = False
persist = False
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
lock_maker = None, autoclose = False

@classmethod
def open(
cls,
filename,
mode="r",
format="NETCDF4",
group=None,
clobber=True,
diskless=False,
persist=False,
lock=None,
lock_maker=None,
autoclose=False,
):
import netCDF4

if not isinstance(filename, str):
raise ValueError(
"can only read bytes or file-like objects "
"with engine='scipy' or 'h5netcdf'"
)

if format is None:
format = "NETCDF4"

if lock is None:
if mode == "r":
if is_remote_uri(filename):
lock = NETCDFC_LOCK
else:
lock = NETCDF4_PYTHON_LOCK
else:
if format is None or format.startswith("NETCDF4"):
base_lock = NETCDF4_PYTHON_LOCK
else:
base_lock = NETCDFC_LOCK
lock = combine_locks([base_lock, get_write_lock(filename)])

kwargs = dict(
clobber=clobber, diskless=diskless, persist=persist, format=format
)
manager = CachingFileManager(
netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
)
> return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:364:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dd3a0>
manager = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
group = None, mode = 'r'
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
autoclose = False

def __init__(
self, manager, group=None, mode=None, lock=NETCDF4_PYTHON_LOCK, autoclose=False
):
import netCDF4

if isinstance(manager, netCDF4.Dataset):
if group is None:
root, group = find_root_and_group(manager)
else:
if not type(manager) is netCDF4.Dataset:
raise ValueError(
"must supply a root netCDF4.Dataset if the group "
"argument is provided"
)
root = manager
manager = DummyFileManager(root)

self._manager = manager
self._group = group
self._mode = mode
> self.format = self.ds.data_model

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:314:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dd3a0>

@property
def ds(self):
> return self._acquire()

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:373:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fb02c6dd3a0>
needs_lock = True

def _acquire(self, needs_lock=True):
> with self._manager.acquire_context(needs_lock) as root:

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <contextlib._GeneratorContextManager object at 0x7fafee9d8520>

def __enter__(self):
# do not keep args and kwds alive unnecessarily
# they are only needed for recreation, which is not possible anymore
del self.args, self.kwds, self.func
try:
> return next(self.gen)

/usr/lib64/python3.9/contextlib.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

@contextlib.contextmanager
def acquire_context(self, needs_lock=True):
"""Context manager for acquiring a file."""
> file, cached = self._acquire_with_cache_info(needs_lock)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
file = self._cache[self._key]
except KeyError:
kwargs = self._kwargs
if self._mode is not _DEFAULT_MODE:
kwargs = kwargs.copy()
kwargs["mode"] = self._mode
> file = self._opener(*self._args, **kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???

src/netCDF4/_netCDF4.pyx:2353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???
E FileNotFoundError: [Errno 2] No such file or directory: b'/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'

src/netCDF4/_netCDF4.pyx:1963: FileNotFoundError

During handling of the above exception, another exception occurred:

self = <test_training.TestTraining object at 0x7fafeec50b80>
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment'
window_history_size = 7, window_lead_time = 2
statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
data_origin = {'o3': 'UBA', 'temp': 'UBA'}

@pytest.fixture
def data_collection(self, path, window_history_size, window_lead_time, statistics_per_var, data_origin):
> data_prep = DefaultDataHandler.build('DEBW107', data_path=os.path.join(path, 'data'),
experiment_path=os.path.join(path, 'exp_path'),
statistics_per_var=statistics_per_var, station_type="background",
sampling="daily", target_dim="variables",
target_var="o3", time_dim="datetime", data_origin=data_origin,
window_history_size=window_history_size,
window_lead_time=window_lead_time, name_affix="train")

test/test_run_modules/test_training.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/data_handler/default_data_handler.py:72: in build
sp = cls.data_handler(station, **sp_keys)
mlair/data_handler/data_handler_single_station.py:125: in __init__
self.setup_samples()
mlair/helpers/time_tracking.py:40: in __call__
return self.__wrapped__(*args, **kwargs)
mlair/data_handler/data_handler_single_station.py:271: in setup_samples
self.make_input_target()
mlair/data_handler/data_handler_single_station.py:312: in make_input_target
data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
mlair/data_handler/data_handler_single_station.py:368: in load_data
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
mlair/helpers/data_sources/data_loader.py:69: in download_data
df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
mlair/helpers/data_sources/toar_data.py:19: in download_toar
df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:403: in __init__
coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

shape = (), coords = None, dims = ['datetime', 'variables']

def _infer_coords_and_dims(
shape, coords, dims
) -> "Tuple[Dict[Any, Variable], Tuple[Hashable, ...]]":
"""All the logic for creating a new DataArray"""

if (
coords is not None
and not utils.is_dict_like(coords)
and len(coords) != len(shape)
):
raise ValueError(
"coords is not dict-like, but it has %s items, "
"which does not match the %s dimensions of the "
"data" % (len(coords), len(shape))
)

if isinstance(dims, str):
dims = (dims,)

if dims is None:
dims = ["dim_%s" % n for n in range(len(shape))]
if coords is not None and len(coords) == len(shape):
# try to infer dimensions from coords
if utils.is_dict_like(coords):
# deprecated in GH993, removed in GH1539
raise ValueError(
"inferring DataArray dimensions from "
"dictionary like ``coords`` is no longer "
"supported. Use an explicit list of "
"``dims`` instead."
)
for n, (dim, coord) in enumerate(zip(dims, coords)):
coord = as_variable(coord, name=dims[n]).to_index_variable()
dims[n] = coord.name
dims = tuple(dims)
elif len(dims) != len(shape):
> raise ValueError(
"different number of dimensions on data "
"and dims: %s vs %s" % (len(shape), len(dims))
)
E ValueError: different number of dimensions on data and dims: 0 vs 2

/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:121: ValueError
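The ValueError above is a secondary failure: the cached file DEBW107_o3_temp.nc is missing (the FileNotFoundError), load_data falls back to download_toar, the TOAR request evidently returns no table, and xr.DataArray is then handed a zero-dimensional object together with two dimension names. A minimal sketch of that final step, using a 0-d NumPy array as a stand-in for the empty download result:

import numpy as np
import xarray as xr

empty_result = np.array(None)  # 0-d object array, shape == ()
try:
    xr.DataArray(empty_result, dims=["datetime", "variables"])
except ValueError as err:
    print(err)  # different number of dimensions on data and dims: 0 vs 2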
Error test/test_run_modules/test_training.py::TestTraining::test_resume_training1::setup 18.60
self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
> file = self._cache[self._key]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.lru_cache.LRUCache object at 0x7fb1f7064180>
key = [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

def __getitem__(self, key: K) -> V:
# record recent use of the key by moving it to the front of the list
with self._lock:
> value = self._cache[key]
E KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py:53: KeyError

During handling of the above exception, another exception occurred:

self = StationPrep(station=['DEBW107'], data_path='/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/d...w_history_size=7, window_lead_time=2, interpolation_limit=0, interpolation_method='linear', overwrite_local_data=False)
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily'
station = ['DEBW107'], statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
sampling = 'daily', store_data_locally = True
data_origin = {'o3': 'UBA', 'temp': 'UBA'}, start = None, end = None

def load_data(self, path, station, statistics_per_var, sampling, store_data_locally=False,
data_origin: Dict = None, start=None, end=None):
"""
Load data and meta data either from local disk (preferred) or download new data using a custom download method.

Data is downloaded if no local data is available or if the parameter overwrite_local_data is true. In both
cases, downloaded data is only stored locally if store_data_locally is not disabled. If this parameter is not
set, it is assumed that data should be saved locally.
"""
check_path_and_create(path)
file_name = self._set_file_name(path, station, statistics_per_var)
meta_file = self._set_meta_file_name(path, station, statistics_per_var)
if self.overwrite_local_data is True:
logging.debug(f"{self.station[0]}: overwrite_local_data is true, therefore reload {file_name}")
if os.path.exists(file_name):
os.remove(file_name)
if os.path.exists(meta_file):
os.remove(meta_file)
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
store_data_locally=store_data_locally, data_origin=data_origin,
time_dim=self.time_dim, target_dim=self.target_dim,
iter_dim=self.iter_dim, window_dim=self.window_dim,
era5_data_path=self._era5_data_path,
era5_file_names=self._era5_file_names,
ifs_data_path=self._ifs_data_path,
ifs_file_names=self._ifs_file_names)
logging.debug(f"{self.station[0]}: loaded new data")
else:
try:
logging.debug(f"{self.station[0]}: try to load local data from: {file_name}")
> data = xr.open_dataarray(file_name)

mlair/data_handler/data_handler_single_station.py:361:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = None, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True, engine = None
chunks = None, lock = None, cache = None, drop_variables = None
backend_kwargs = None, use_cftime = None, decode_timedelta = None

def open_dataarray(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
Open a DataArray from a file or file-like object containing a single
data variable.

This is designed to read netCDF files with only one data variable. If
multiple variables are present then a ValueError is raised.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Paths are interpreted as a path to a netCDF file or an
OpenDAP URL and opened with python-netCDF4, unless the filename ends
with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode the variables in the file, assuming they were saved
according to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib"}, \
optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Notes
-----
This is designed to be fully compatible with `DataArray.to_netcdf`. Saving
using `DataArray.to_netcdf` and then loading with this function will
produce an identical result.

All parameters are passed directly to `xarray.open_dataset`. See that
documentation for further details.

See also
--------
open_dataset
"""

> dataset = open_dataset(
filename_or_obj,
group=group,
decode_cf=decode_cf,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
autoclose=autoclose,
concat_characters=concat_characters,
decode_coords=decode_coords,
engine=engine,
chunks=chunks,
lock=lock,
cache=cache,
drop_variables=drop_variables,
backend_kwargs=backend_kwargs,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:701:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filename_or_obj = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
group = None, decode_cf = True, mask_and_scale = True, decode_times = True
autoclose = None, concat_characters = True, decode_coords = True
engine = 'netcdf4', chunks = None, lock = None, cache = True
drop_variables = None, backend_kwargs = {}, use_cftime = None
decode_timedelta = None

def open_dataset(
filename_or_obj,
group=None,
decode_cf=True,
mask_and_scale=None,
decode_times=True,
autoclose=None,
concat_characters=True,
decode_coords=True,
engine=None,
chunks=None,
lock=None,
cache=None,
drop_variables=None,
backend_kwargs=None,
use_cftime=None,
decode_timedelta=None,
):
"""Open and decode a dataset from a file or file-like object.

Parameters
----------
filename_or_obj : str, Path, file-like or DataStore
Strings and Path objects are interpreted as a path to a netCDF file
or an OpenDAP URL and opened with python-netCDF4, unless the filename
ends with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
group : str, optional
Path to the netCDF4 group in the given file to open (only works for
netCDF4 files).
decode_cf : bool, optional
Whether to decode the variables in the file, assuming they were saved
according to CF conventions.
mask_and_scale : bool, optional
If True, replace array values equal to `_FillValue` with NA and scale
values according to the formula `original_values * scale_factor +
add_offset`, where `_FillValue`, `scale_factor` and `add_offset` are
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
autoclose : bool, optional
If True, automatically close files to avoid an OS error from too many
open files. However, this option doesn't work with streams, e.g.,
BytesIO.
concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to
form string arrays. Dimensions will only be concatenated over (and
removed) if they have no corresponding variable and if they are only
used as the last dimension of character arrays.
decode_coords : bool, optional
If True, decode the 'coordinates' attribute to identify coordinates in
the resulting dataset.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
"pseudonetcdf", "zarr"}, optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
"netcdf4".
chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask
arrays. ``chunks={}`` loads the dataset with dask using a single
chunk for all arrays. When using ``engine="zarr"``, setting
``chunks='auto'`` will create dask chunks based on the variable's zarr
chunks.
lock : False or lock-like, optional
Resource lock to use when reading data from disk. Only relevant when
using dask or another form of parallelism. By default, appropriate
locks are chosen to safely read and write files with the currently
active dask scheduler.
cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as
NumPy arrays when accessed to avoid reading from the underlying data-
store multiple times. Defaults to True unless you specify the `chunks`
argument to use dask, in which case it defaults to False. Does not
change the behavior of coordinates corresponding to dimensions, which
always load their data from disk into a ``pandas.Index``.
drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the
dataset. This may be useful to drop variables with problems or
inconsistent values.
backend_kwargs: dict, optional
A dictionary of keyword arguments to pass on to the backend. This
may be useful when backend options would improve performance or
allow user control of dataset processing.
use_cftime: bool, optional
Only relevant if encoded dates come from a standard calendar
(e.g. "gregorian", "proleptic_gregorian", "standard", or not
specified). If None (default), attempt to decode times to
``np.datetime64[ns]`` objects; if this is not possible, decode times to
``cftime.datetime`` objects. If True, always decode times to
``cftime.datetime`` objects, regardless of whether or not they can be
represented using ``np.datetime64[ns]`` objects. If False, always
decode times to ``np.datetime64[ns]`` objects; if this is not possible
raise an error.
decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in
{"days", "hours", "minutes", "seconds", "milliseconds", "microseconds"}
into timedelta objects. If False, leave them encoded as numbers.
If None (default), assume the same value as decode_times.

Returns
-------
dataset : Dataset
The newly created dataset.

Notes
-----
``open_dataset`` opens the file with read-only access. When you modify
values of a Dataset, even one linked to files on disk, only the in-memory
copy you are manipulating in xarray is modified: the original file on disk
is never touched.

See Also
--------
open_mfdataset
"""
if os.environ.get("XARRAY_BACKEND_API", "v1") == "v2":
kwargs = locals().copy()
from . import apiv2, plugins

if engine in plugins.ENGINES:
return apiv2.open_dataset(**kwargs)

if autoclose is not None:
warnings.warn(
"The autoclose argument is no longer used by "
"xarray.open_dataset() and is now ignored; it will be removed in "
"a future version of xarray. If necessary, you can control the "
"maximum number of simultaneous open files with "
"xarray.set_options(file_cache_maxsize=...).",
FutureWarning,
stacklevel=2,
)

if mask_and_scale is None:
mask_and_scale = not engine == "pseudonetcdf"

if not decode_cf:
mask_and_scale = False
decode_times = False
concat_characters = False
decode_coords = False
decode_timedelta = False

if cache is None:
cache = chunks is None

if backend_kwargs is None:
backend_kwargs = {}

def maybe_decode_store(store, chunks):
ds = conventions.decode_cf(
store,
mask_and_scale=mask_and_scale,
decode_times=decode_times,
concat_characters=concat_characters,
decode_coords=decode_coords,
drop_variables=drop_variables,
use_cftime=use_cftime,
decode_timedelta=decode_timedelta,
)

_protect_dataset_variables_inplace(ds, cache)

if chunks is not None and engine != "zarr":
from dask.base import tokenize

# if passed an actual file path, augment the token with
# the file modification time
if isinstance(filename_or_obj, str) and not is_remote_uri(filename_or_obj):
mtime = os.path.getmtime(filename_or_obj)
else:
mtime = None
token = tokenize(
filename_or_obj,
mtime,
group,
decode_cf,
mask_and_scale,
decode_times,
concat_characters,
decode_coords,
engine,
chunks,
drop_variables,
use_cftime,
decode_timedelta,
)
name_prefix = "open_dataset-%s" % token
ds2 = ds.chunk(chunks, name_prefix=name_prefix, token=token)

elif engine == "zarr":
# adapted from Dataset.chunk() and taken from open_zarr
if not (isinstance(chunks, (int, dict)) or chunks is None):
if chunks != "auto":
raise ValueError(
"chunks must be an int, dict, 'auto', or None. "
"Instead found %s. " % chunks
)

if chunks == "auto":
try:
import dask.array # noqa
except ImportError:
chunks = None

# auto chunking needs to be here and not in ZarrStore because
# the variable chunks does not survive decode_cf
# return trivial case
if chunks is None:
return ds

if isinstance(chunks, int):
chunks = dict.fromkeys(ds.dims, chunks)

variables = {
k: _maybe_chunk(
k,
v,
store.get_chunk(k, v, chunks),
overwrite_encoded_chunks=overwrite_encoded_chunks,
)
for k, v in ds.variables.items()
}
ds2 = ds._replace(variables)

else:
ds2 = ds
ds2._file_obj = ds._file_obj
return ds2

filename_or_obj = _normalize_path(filename_or_obj)

if isinstance(filename_or_obj, AbstractDataStore):
store = filename_or_obj
else:
if engine is None:
engine = _autodetect_engine(filename_or_obj)

extra_kwargs = {}
if group is not None:
extra_kwargs["group"] = group
if lock is not None:
extra_kwargs["lock"] = lock

if engine == "zarr":
backend_kwargs = backend_kwargs.copy()
overwrite_encoded_chunks = backend_kwargs.pop(
"overwrite_encoded_chunks", None
)

opener = _get_backend_cls(engine)
> store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = <class 'xarray.backends.netCDF4_.NetCDF4DataStore'>
filename = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'
mode = 'r', format = 'NETCDF4', group = None, clobber = True, diskless = False
persist = False
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
lock_maker = None, autoclose = False

@classmethod
def open(
cls,
filename,
mode="r",
format="NETCDF4",
group=None,
clobber=True,
diskless=False,
persist=False,
lock=None,
lock_maker=None,
autoclose=False,
):
import netCDF4

if not isinstance(filename, str):
raise ValueError(
"can only read bytes or file-like objects "
"with engine='scipy' or 'h5netcdf'"
)

if format is None:
format = "NETCDF4"

if lock is None:
if mode == "r":
if is_remote_uri(filename):
lock = NETCDFC_LOCK
else:
lock = NETCDF4_PYTHON_LOCK
else:
if format is None or format.startswith("NETCDF4"):
base_lock = NETCDF4_PYTHON_LOCK
else:
base_lock = NETCDFC_LOCK
lock = combine_locks([base_lock, get_write_lock(filename)])

kwargs = dict(
clobber=clobber, diskless=diskless, persist=persist, format=format
)
manager = CachingFileManager(
netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
)
> return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:364:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fafec2abb80>
manager = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
group = None, mode = 'r'
lock = CombinedLock([<SerializableLock: 699a300e-8f5a-4345-91e9-203cb9fa349d>, <SerializableLock: eed4651e-796e-44f0-b068-cb59cf4773fa>])
autoclose = False

def __init__(
self, manager, group=None, mode=None, lock=NETCDF4_PYTHON_LOCK, autoclose=False
):
import netCDF4

if isinstance(manager, netCDF4.Dataset):
if group is None:
root, group = find_root_and_group(manager)
else:
if not type(manager) is netCDF4.Dataset:
raise ValueError(
"must supply a root netCDF4.Dataset if the group "
"argument is provided"
)
root = manager
manager = DummyFileManager(root)

self._manager = manager
self._group = group
self._mode = mode
> self.format = self.ds.data_model

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:314:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fafec2abb80>

@property
def ds(self):
> return self._acquire()

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:373:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.backends.netCDF4_.NetCDF4DataStore object at 0x7fafec2abb80>
needs_lock = True

def _acquire(self, needs_lock=True):
> with self._manager.acquire_context(needs_lock) as root:

/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <contextlib._GeneratorContextManager object at 0x7fafec328700>

def __enter__(self):
# do not keep args and kwds alive unnecessarily
# they are only needed for recreation, which is not possible anymore
del self.args, self.kwds, self.func
try:
> return next(self.gen)

/usr/lib64/python3.9/contextlib.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

@contextlib.contextmanager
def acquire_context(self, needs_lock=True):
"""Context manager for acquiring a file."""
> file, cached = self._acquire_with_cache_info(needs_lock)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = CachingFileManager(<class 'netCDF4._netCDF4.Dataset'>, '/builds/esde/machine-learning/mlair/test/test_run_modules/Test...aily/DEBW107_o3_temp.nc', mode='r', kwargs={'clobber': True, 'diskless': False, 'persist': False, 'format': 'NETCDF4'})
needs_lock = True

def _acquire_with_cache_info(self, needs_lock=True):
"""Acquire a file, returning the file and whether it was cached."""
with self._optional_lock(needs_lock):
try:
file = self._cache[self._key]
except KeyError:
kwargs = self._kwargs
if self._mode is not _DEFAULT_MODE:
kwargs = kwargs.copy()
kwargs["mode"] = self._mode
> file = self._opener(*self._args, **kwargs)

/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???

src/netCDF4/_netCDF4.pyx:2353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???
E FileNotFoundError: [Errno 2] No such file or directory: b'/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/data/daily/DEBW107_o3_temp.nc'

src/netCDF4/_netCDF4.pyx:1963: FileNotFoundError

During handling of the above exception, another exception occurred:

self = <test_training.TestTraining object at 0x7fafec22b160>
path = '/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment'
window_history_size = 7, window_lead_time = 2
statistics_per_var = {'o3': 'dma8eu', 'temp': 'maximum'}
data_origin = {'o3': 'UBA', 'temp': 'UBA'}

@pytest.fixture
def data_collection(self, path, window_history_size, window_lead_time, statistics_per_var, data_origin):
> data_prep = DefaultDataHandler.build('DEBW107', data_path=os.path.join(path, 'data'),
experiment_path=os.path.join(path, 'exp_path'),
statistics_per_var=statistics_per_var, station_type="background",
sampling="daily", target_dim="variables",
target_var="o3", time_dim="datetime", data_origin=data_origin,
window_history_size=window_history_size,
window_lead_time=window_lead_time, name_affix="train")

test/test_run_modules/test_training.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/data_handler/default_data_handler.py:72: in build
sp = cls.data_handler(station, **sp_keys)
mlair/data_handler/data_handler_single_station.py:125: in __init__
self.setup_samples()
mlair/helpers/time_tracking.py:40: in __call__
return self.__wrapped__(*args, **kwargs)
mlair/data_handler/data_handler_single_station.py:271: in setup_samples
self.make_input_target()
mlair/data_handler/data_handler_single_station.py:312: in make_input_target
data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
mlair/data_handler/data_handler_single_station.py:368: in load_data
data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
mlair/helpers/data_sources/data_loader.py:69: in download_data
df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
mlair/helpers/data_sources/toar_data.py:19: in download_toar
df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:403: in __init__
coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

shape = (), coords = None, dims = ['datetime', 'variables']

def _infer_coords_and_dims(
shape, coords, dims
) -> "Tuple[Dict[Any, Variable], Tuple[Hashable, ...]]":
"""All the logic for creating a new DataArray"""

if (
coords is not None
and not utils.is_dict_like(coords)
and len(coords) != len(shape)
):
raise ValueError(
"coords is not dict-like, but it has %s items, "
"which does not match the %s dimensions of the "
"data" % (len(coords), len(shape))
)

if isinstance(dims, str):
dims = (dims,)

if dims is None:
dims = ["dim_%s" % n for n in range(len(shape))]
if coords is not None and len(coords) == len(shape):
# try to infer dimensions from coords
if utils.is_dict_like(coords):
# deprecated in GH993, removed in GH1539
raise ValueError(
"inferring DataArray dimensions from "
"dictionary like ``coords`` is no longer "
"supported. Use an explicit list of "
"``dims`` instead."
)
for n, (dim, coord) in enumerate(zip(dims, coords)):
coord = as_variable(coord, name=dims[n]).to_index_variable()
dims[n] = coord.name
dims = tuple(dims)
elif len(dims) != len(shape):
> raise ValueError(
"different number of dimensions on data "
"and dims: %s vs %s" % (len(shape), len(dims))
)
E ValueError: different number of dimensions on data and dims: 0 vs 2

/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py:121: ValueError
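test_resume_training1 fails through the identical chain, so the defect sits in the load-or-download path rather than in the training code. A defensive sketch of that path (load_or_download is a hypothetical helper; the dimension names and the download_toar-style callable are taken from the traceback) that would surface the empty download as a clear error before xarray is involved:

import os
import xarray as xr

def load_or_download(file_name, downloader, time_dim="datetime", target_dim="variables"):
    # Prefer the locally cached netCDF file.
    if os.path.exists(file_name):
        return xr.open_dataarray(file_name)
    # Fall back to downloading, but validate the result before wrapping it.
    df = downloader()  # e.g. a functools.partial around download_toar
    if df is None or getattr(df, "ndim", 0) != 2:
        raise ValueError(f"download for {file_name} returned no 2-d data: {df!r}")
    return xr.DataArray(df, dims=[time_dim, target_dim])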
Failed test/test_helpers/test_data_sources/test_join.py::TestDownloadJoin::test_download_single_var 17.15
self = <test_join.TestDownloadJoin object at 0x7fb3010966d0>

def test_download_single_var(self):
> data, meta = download_join("DEBW107", {"o3": "dma8eu"})

test/test_helpers/test_data_sources/test_join.py:16:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/helpers/data_sources/join.py:44: in download_join
vars_dict, data_origin = load_series_information(station_name, station_type, network_name, join_url_base, headers,
mlair/helpers/data_sources/join.py:217: in load_series_information
station_vars = data_loader.get_data(opts, headers)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

opts = {'as_dict': 'true', 'base': 'https://join.fz-juelich.de/services/rest/surfacedata/', 'columns': 'id,network_name,station_id,parameter_name,parameter_label,parameter_attribute', 'network_name': None, ...}
headers = {}, as_json = True, max_retries = 5, timeout_base = 60

def get_data(opts: Dict, headers: Dict, as_json: bool = True, max_retries=5, timeout_base=60) -> Union[Dict, List, str]:
"""
Download JOIN data using the requests framework.

Data is returned as a JSON-like structure. Depending on the response structure, this can be a list or a dictionary.

:param opts: options to create the request url
:param headers: additional headers information like authorization, can be empty
:param as_json: extract response as json if true (default True)

:return: requested data (either as list or dictionary)
"""
url = create_url(**opts)
response_error = None
for retry in range(max_retries + 1):
time.sleep(random.random())
try:
timeout = timeout_base * (2 ** retry)
logging.info(f"connect (retry={retry}, timeout={timeout}) {url}")
with TimeTracking(name=url):
session = retries_session(max_retries=0)
response = session.get(url, headers=headers, timeout=(5, timeout)) # timeout=(open, read)
if response.status_code == 200:
return response.json() if as_json is True else response.text
else:
logging.debug(f"There was an error (STATUS {response.status_code}) for request {url}")
response_error = f"STATUS {response.status_code}"
except Exception as e:
time.sleep(2 * (2 ** retry))
logging.debug(f"There was an error for request {url}: {e}")
response_error = e
if retry + 1 >= max_retries:
> raise EmptyQueryResult(f"There was a RetryError for request {url}: {response_error}")
E mlair.helpers.data_sources.data_loader.EmptyQueryResult: There was a RetryError for request https://join.fz-juelich.de/services/rest/surfacedata/search/?station_id=DEBW107&as_dict=true&parameter_name=o3&columns=id,network_name,station_id,parameter_name,parameter_label,parameter_attribute: STATUS 400

mlair/helpers/data_sources/data_loader.py:166: EmptyQueryResult
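get_data retries with an exponentially growing read timeout (timeout_base * 2 ** retry) and an exponentially growing sleep between failed attempts. Note that although the loop is written as range(max_retries + 1), the guard retry + 1 >= max_retries appears to fire one pass early: with max_retries=5 it raises at retry == 4, so the retry == 5 iteration is unreachable. The schedule in isolation:

timeout_base, max_retries = 60, 5
for retry in range(max_retries + 1):
    timeout = timeout_base * (2 ** retry)  # read timeout: 60, 120, 240, 480, 960
    print(f"attempt {retry}: timeout={timeout}s, error backoff={2 * 2 ** retry}s")
    if retry + 1 >= max_retries:  # true at retry == 4, so retry == 5 never runs
        break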
Failed test/test_helpers/test_data_sources/test_join.py::TestDownloadJoin::test_download_empty 15.28
self = <test_join.TestDownloadJoin object at 0x7fb300c9eac0>

def test_download_empty(self):
with pytest.raises(EmptyQueryResult) as e:
download_join("DEBW107", {"o3": "dma8eu"}, "traffic")
> assert e.value.args[-1] == "No data found for variables {'o3'} and options station=['DEBW107'], type=traffic," \
" network=None, origin={} in JOIN."
E AssertionError: assert 'There was a ...e: STATUS 400' == 'No data foun...n={} in JOIN.'
E - No data found for variables {'o3'} and options station=['DEBW107'], type=traffic, network=None, origin={} in JOIN.
E + There was a RetryError for request https://join.fz-juelich.de/services/rest/surfacedata/search/?station_id=DEBW107&station_type=traffic&as_dict=true&parameter_name=o3&columns=id,network_name,station_id,parameter_name,parameter_label,parameter_attribute: STATUS 400

test/test_helpers/test_data_sources/test_join.py:23: AssertionError
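This assertion failure, like the one that follows, is secondary: the test expects the "No data found ..." message that download_join raises after a successful but empty query, but the JOIN service rejected the request outright with STATUS 400, so get_data raised first. Pinning the expectation with pytest.raises(match=...) makes such a mismatch read more directly; a sketch using the import paths shown in the tracebacks (it would still fail while the service answers 400):

import pytest
from mlair.helpers.data_sources.data_loader import EmptyQueryResult
from mlair.helpers.data_sources.join import download_join

def test_download_empty():
    with pytest.raises(EmptyQueryResult, match=r"No data found for variables \{'o3'\}"):
        download_join("DEBW107", {"o3": "dma8eu"}, "traffic")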
Failed test/test_helpers/test_data_sources/test_join.py::TestDownloadJoin::test_download_incomplete 15.32
self = <test_join.TestDownloadJoin object at 0x7fb301a8d190>

def test_download_incomplete(self):
with pytest.raises(EmptyQueryResult) as e:
download_join("DEBW107", {"o3": "dma8eu", "o10": "maximum"}, "background")
> assert e.value.args[-1] == "No data found for variables {'o10'} and options station=['DEBW107'], " \
"type=background, network=None, origin={} in JOIN."
E AssertionError: assert 'There was a ...e: STATUS 400' == 'No data foun...n={} in JOIN.'
E - No data found for variables {'o10'} and options station=['DEBW107'], type=background, network=None, origin={} in JOIN.
E + There was a RetryError for request https://join.fz-juelich.de/services/rest/surfacedata/search/?station_id=DEBW107&station_type=background&as_dict=true&parameter_name=o3,o10&columns=id,network_name,station_id,parameter_name,parameter_label,parameter_attribute: STATUS 400

test/test_helpers/test_data_sources/test_join.py:29: AssertionError
Failed test/test_helpers/test_data_sources/test_join.py::TestLoadSeriesInformation::test_standard_query 15.31
self = <test_join.TestLoadSeriesInformation object at 0x7fb300c9e1c0>

def test_standard_query(self):
expected_subset = {'o3': 17057, 'no2': 17058, 'temp': 85587, 'wspeed': 17060}
> res, orig = load_series_information(['DEBW107'], None, None, join_settings()[0], {})

test/test_helpers/test_data_sources/test_join.py:54:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/helpers/data_sources/join.py:217: in load_series_information
station_vars = data_loader.get_data(opts, headers)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

opts = {'as_dict': 'true', 'base': 'https://join.fz-juelich.de/services/rest/surfacedata/', 'columns': 'id,network_name,station_id,parameter_name,parameter_label,parameter_attribute', 'network_name': None, ...}
headers = {}, as_json = True, max_retries = 5, timeout_base = 60

def get_data(opts: Dict, headers: Dict, as_json: bool = True, max_retries=5, timeout_base=60) -> Union[Dict, List, str]:
"""
Download JOIN data using the requests framework.

Data is returned as a JSON-like structure. Depending on the response structure, this can be a list or a dictionary.

:param opts: options to create the request url
:param headers: additional headers information like authorization, can be empty
:param as_json: extract response as json if true (default True)

:return: requested data (either as list or dictionary)
"""
url = create_url(**opts)
response_error = None
for retry in range(max_retries + 1):
time.sleep(random.random())
try:
timeout = timeout_base * (2 ** retry)
logging.info(f"connect (retry={retry}, timeout={timeout}) {url}")
with TimeTracking(name=url):
session = retries_session(max_retries=0)
response = session.get(url, headers=headers, timeout=(5, timeout)) # timeout=(open, read)
if response.status_code == 200:
return response.json() if as_json is True else response.text
else:
logging.debug(f"There was an error (STATUS {response.status_code}) for request {url}")
response_error = f"STATUS {response.status_code}"
except Exception as e:
time.sleep(2 * (2 ** retry))
logging.debug(f"There was an error for request {url}: {e}")
response_error = e
if retry + 1 >= max_retries:
> raise EmptyQueryResult(f"There was a RetryError for request {url}: {response_error}")
E mlair.helpers.data_sources.data_loader.EmptyQueryResult: There was a RetryError for request https://join.fz-juelich.de/services/rest/surfacedata/search/?station_id=DEBW107&as_dict=true&columns=id,network_name,station_id,parameter_name,parameter_label,parameter_attribute: STATUS 400

mlair/helpers/data_sources/data_loader.py:166: EmptyQueryResult
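Every one of these JOIN failures walks the full retry schedule even though HTTP 400 is a client error that a longer timeout cannot fix. A minimal sketch that fails fast on 4xx and retries only transient problems; fail_fast_get is a hypothetical stand-alone helper, not the project's get_data:

import requests

def fail_fast_get(url, headers=None, max_retries=5, timeout_base=60):
    last_error = None
    for retry in range(max_retries):
        try:
            response = requests.get(url, headers=headers or {},
                                    timeout=(5, timeout_base * 2 ** retry))
            if response.status_code == 200:
                return response.json()
            if 400 <= response.status_code < 500:
                # The request itself is malformed or rejected; retrying cannot help.
                raise RuntimeError(f"client error {response.status_code} for {url}")
            last_error = f"STATUS {response.status_code}"  # 5xx: worth retrying
        except requests.RequestException as exc:  # network-level, transient
            last_error = exc
    raise RuntimeError(f"giving up on {url}: {last_error}")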
Failed test/test_helpers/test_data_sources/test_join.py::TestLoadSeriesInformation::test_empty_result 15.28
self = <test_join.TestLoadSeriesInformation object at 0x7fb300df4b50>

def test_empty_result(self):
> res, orig = load_series_information(['DEBW107'], "traffic", None, join_settings()[0], {})

test/test_helpers/test_data_sources/test_join.py:58:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/helpers/data_sources/join.py:217: in load_series_information
station_vars = data_loader.get_data(opts, headers)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

opts = {'as_dict': 'true', 'base': 'https://join.fz-juelich.de/services/rest/surfacedata/', 'columns': 'id,network_name,station_id,parameter_name,parameter_label,parameter_attribute', 'network_name': None, ...}
headers = {}, as_json = True, max_retries = 5, timeout_base = 60

def get_data(opts: Dict, headers: Dict, as_json: bool = True, max_retries=5, timeout_base=60) -> Union[Dict, List, str]:
"""
Download JOIN data using the requests framework.

Data is returned as a JSON-like structure. Depending on the response structure, this can be a list or a dictionary.

:param opts: options to create the request url
:param headers: additional headers information like authorization, can be empty
:param as_json: extract response as json if true (default True)

:return: requested data (either as list or dictionary)
"""
url = create_url(**opts)
response_error = None
for retry in range(max_retries + 1):
time.sleep(random.random())
try:
timeout = timeout_base * (2 ** retry)
logging.info(f"connect (retry={retry}, timeout={timeout}) {url}")
with TimeTracking(name=url):
session = retries_session(max_retries=0)
response = session.get(url, headers=headers, timeout=(5, timeout)) # timeout=(open, read)
if response.status_code == 200:
return response.json() if as_json is True else response.text
else:
logging.debug(f"There was an error (STATUS {response.status_code}) for request {url}")
response_error = f"STATUS {response.status_code}"
except Exception as e:
time.sleep(2 * (2 ** retry))
logging.debug(f"There was an error for request {url}: {e}")
response_error = e
if retry + 1 >= max_retries:
> raise EmptyQueryResult(f"There was a RetryError for request {url}: {response_error}")
E mlair.helpers.data_sources.data_loader.EmptyQueryResult: There was a RetryError for request https://join.fz-juelich.de/services/rest/surfacedata/search/?station_id=DEBW107&station_type=traffic&as_dict=true&columns=id,network_name,station_id,parameter_name,parameter_label,parameter_attribute: STATUS 400

mlair/helpers/data_sources/data_loader.py:166: EmptyQueryResult
Failed test/test_helpers/test_data_sources/test_toar_data.py::TestGetData::test 15.23
self = <test_toar_data.TestGetData object at 0x7fb1ac5edeb0>

def test(self):
opts = {"base": join_settings()[0], "service": "series", "station_id": 'DEBW107', "network_name": "UBA",
"parameter_name": "o3,no2"}
> assert get_data(opts, headers={}) == [[17057, 'UBA', 'DEBW107', 'O3'], [17058, 'UBA', 'DEBW107', 'NO2']]

test/test_helpers/test_data_sources/test_toar_data.py:10:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

opts = {'base': 'https://join.fz-juelich.de/services/rest/surfacedata/', 'network_name': 'UBA', 'parameter_name': 'o3,no2', 'service': 'series', ...}
headers = {}, as_json = True, max_retries = 5, timeout_base = 60

def get_data(opts: Dict, headers: Dict, as_json: bool = True, max_retries=5, timeout_base=60) -> Union[Dict, List, str]:
"""
Download join data using requests framework.

Data is returned as json like structure. Depending on the response structure, this can lead to a list or dictionary.

:param opts: options to create the request url
:param headers: additional headers information like authorization, can be empty
:param as_json: extract response as json if true (default True)

:return: requested data (either as list or dictionary)
"""
url = create_url(**opts)
response_error = None
for retry in range(max_retries + 1):
time.sleep(random.random())
try:
timeout = timeout_base * (2 ** retry)
logging.info(f"connect (retry={retry}, timeout={timeout}) {url}")
with TimeTracking(name=url):
session = retries_session(max_retries=0)
                response = session.get(url, headers=headers, timeout=(5, timeout))  # timeout=(connect, read)
if response.status_code == 200:
return response.json() if as_json is True else response.text
else:
logging.debug(f"There was an error (STATUS {response.status_code}) for request {url}")
response_error = f"STATUS {response.status_code}"
except Exception as e:
time.sleep(2 * (2 ** retry))
logging.debug(f"There was an error for request {url}: {e}")
response_error = e
if retry + 1 >= max_retries:
>           raise EmptyQueryResult(f"There was a RetryError for request {url}: {response_error}")
E           mlair.helpers.data_sources.data_loader.EmptyQueryResult: There was a RetryError for request https://join.fz-juelich.de/services/rest/surfacedata/series/?station_id=DEBW107&network_name=UBA&parameter_name=o3,no2: STATUS 400

mlair/helpers/data_sources/data_loader.py:166: EmptyQueryResult
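All three failures above are live HTTP 400 responses from join.fz-juelich.de, i.e. these tests depend on the remote service being reachable and on its current API. A hedged sketch of how the HTTP layer could be stubbed with the standard library so the assertion in TestGetData::test runs against a canned payload (FakeResponse and the patch target are assumptions based on the traceback, not existing MLAir test helpers):

import unittest.mock as mock

class FakeResponse:
    # Minimal stand-in for requests.Response: just enough for get_data.
    def __init__(self, status_code, payload=None):
        self.status_code = status_code
        self._payload = payload

    def json(self):
        return self._payload

# Patch the session factory that get_data calls so the test never reaches the
# live service; the patch target follows the data_loader.py path shown in the
# traceback and is otherwise an assumption about the module layout.
with mock.patch("mlair.helpers.data_sources.data_loader.retries_session") as factory:
    factory.return_value.get.return_value = FakeResponse(
        200, [[17057, "UBA", "DEBW107", "O3"], [17058, "UBA", "DEBW107", "NO2"]])
    # get_data(opts, headers={}) now returns the canned payload deterministically.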
Failed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_init 120.07
self = <test_pre_processing.TestPreProcessing object at 0x7fb0741fa730>
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fb07421b430>

def test_init(self, caplog):
ExperimentSetup(stations=['DEBW107', 'DEBW013', 'DEBW087'],
statistics_per_var={'o3': 'dma8eu', 'temp': 'maximum'},
data_origin={'o3': 'UBA', 'temp': 'UBA'})
caplog.clear()
caplog.set_level(logging.INFO)
> with PreProcessing():

test/test_run_modules/test_pre_processing.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
mlair/run_modules/pre_processing.py:60: in __init__
self._run()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <mlair.run_modules.pre_processing.PreProcessing object at 0x7fb0741fad60>

def _run(self):
snapshot_load_path = self.data_store.get_default("snapshot_load_path", default=None)
if snapshot_load_path is None:
stations = self.data_store.get("stations")
data_handler = self.data_store.get("data_handler")
self._load_apriori()
_, valid_stations = self.validate_station(data_handler, stations,
"preprocessing") # , store_processed_data=False)
if len(valid_stations) == 0:
> raise ValueError("Couldn't find any valid data according to given parameters. Abort experiment run.")
E ValueError: Couldn't find any valid data according to given parameters. Abort experiment run.

mlair/run_modules/pre_processing.py:71: ValueError
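The ValueError raised here is only the summary; the captured stderr below shows the underlying cause. Each station is removed during validate_station because building an xarray DataArray fails with "different number of dimensions on data and dims: 0 vs 2", i.e. a scalar (0-d) payload, presumably an empty download, is combined with two dimension names. The xarray part can be reproduced in isolation (a minimal sketch; the dimension names are placeholders, not the ones MLAir actually uses):

import numpy as np
import xarray as xr

# Passing a 0-dimensional array together with two dimension names triggers the
# same check in xarray.core.dataarray._infer_coords_and_dims.
try:
    xr.DataArray(np.array(0.0), dims=["dim_0", "dim_1"])
except ValueError as e:
    print(e)  # -> different number of dimensions on data and dims: 0 vs 2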
------------------------------Captured stderr call------------------------------
2023-12-18 17:36:17,764 - INFO: PreProcessing started [run_environment.py:__init__:103]
2023-12-18 17:36:17,765 - INFO: check valid stations started (preprocessing) [pre_processing.py:validate_station:262]
2023-12-18 17:36:17,769 - INFO: use serial validate station approach [pre_processing.py:validate_station:296]
2023-12-18 17:36:18,099 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:36:18,136 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:19,066 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:36:19,134 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:19,313 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 [data_loader.py:get_data:152]
2023-12-18 17:36:19,402 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:20,281 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:36:20,406 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:20,479 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:36:20,619 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:20,621 - INFO: load data for DEBW107 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:36:21,061 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:36:26,104 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:06 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:26,188 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:36:31,232 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:06 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:32,337 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:36:37,383 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:06 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:36:39,720 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:36:59,561 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:20 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:00,483 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:00,512 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:00,849 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:00,901 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:02,874 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:02,919 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:05,706 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:05,751 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:09,062 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:09,109 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:13,115 - INFO: setup_samples finished after 0:00:56 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:13,120 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:37:13,122 - INFO: ...finished: DEBW107 (33%) [pre_processing.py:validate_station:303]
2023-12-18 17:37:13,968 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:37:14,013 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:14,087 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:37:14,132 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:14,258 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152]
2023-12-18 17:37:14,317 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:14,977 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:37:15,126 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:16,062 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:37:16,272 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:16,275 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:37:16,464 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:29,168 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:13 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:30,005 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:30,033 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:30,448 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:30,492 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:32,005 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:32,054 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:34,313 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:34,361 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:37,921 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:37:37,967 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:41,974 - INFO: setup_samples finished after 0:00:29 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:41,976 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:37:41,978 - INFO: ...finished: DEBW013 (66%) [pre_processing.py:validate_station:303]
2023-12-18 17:37:42,648 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:37:42,699 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:43,322 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:37:43,370 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:44,272 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152]
2023-12-18 17:37:44,333 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:37:44,696 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152]
2023-12-18
17:37:44,844 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:44,844 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:44,844 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:44,844 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:44,844 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:44,914 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:37:44,914 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:37:44,914 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:37:44,914 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:37:44,914 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:37:45,822 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:45,822 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:45,822 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:45,822 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:45,822 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:37:45,824 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:37:45,824 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:37:45,824 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:37:45,824 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:37:45,824 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:37:46,721 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:37:46,721 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:37:46,721 - INFO: connect (retry=0, 
timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:37:46,721 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:37:46,721 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:05,509 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,509 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,509 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,509 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,509 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,554 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:05,554 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:05,554 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:05,554 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:05,554 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:05,581 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,581 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,581 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,581 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:05,581 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 
0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:06,404 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:06,404 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:06,404 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:06,404 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:06,404 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:06,437 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:06,437 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:06,437 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:06,437 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:06,437 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:07,718 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:07,718 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:07,718 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:07,718 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:07,718 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:07,765 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:07,765 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:07,765 - 
INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:07,765 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:07,765 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:10,126 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:10,126 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:10,126 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:10,126 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:10,126 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:10,171 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:10,171 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:10,171 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:10,171 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:10,171 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:13,770 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:13,770 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:13,770 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:13,770 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:13,770 - INFO: connect (retry=4, timeout=960) 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:38:13,819 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:13,819 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:13,819 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:13,819 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:13,819 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:17,827 - INFO: setup_samples finished after 0:00:36 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:17,827 - INFO: setup_samples finished after 0:00:36 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:17,827 - INFO: setup_samples finished after 0:00:36 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:17,827 - INFO: setup_samples finished after 0:00:36 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:17,827 - INFO: setup_samples finished after 0:00:36 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:38:17,830 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:38:17,830 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:38:17,830 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:38:17,830 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:38:17,830 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 
2023-12-18 17:38:17,832 - INFO: ...finished: DEBW087 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:38:17,832 - INFO: ...finished: DEBW087 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:38:17,832 - INFO: ...finished: DEBW087 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:38:17,832 - INFO: ...finished: DEBW087 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:38:17,832 - INFO: ...finished: DEBW087 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:38:17,832 - INFO: run for 0:02:01 (hh:mm:ss) to check 3 station(s). Found 0/3 valid stations (preprocessing). [pre_processing.py:validate_station:305] 2023-12-18 17:38:17,832 - INFO: run for 0:02:01 (hh:mm:ss) to check 3 station(s). Found 0/3 valid stations (preprocessing). [pre_processing.py:validate_station:305] 2023-12-18 17:38:17,832 - INFO: run for 0:02:01 (hh:mm:ss) to check 3 station(s). Found 0/3 valid stations (preprocessing). [pre_processing.py:validate_station:305] 2023-12-18 17:38:17,832 - INFO: run for 0:02:01 (hh:mm:ss) to check 3 station(s). Found 0/3 valid stations (preprocessing). [pre_processing.py:validate_station:305] 2023-12-18 17:38:17,832 - INFO: run for 0:02:01 (hh:mm:ss) to check 3 station(s). Found 0/3 valid stations (preprocessing). [pre_processing.py:validate_station:305]
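The connect entries above walk through retry=0 to retry=4 while the request timeout doubles on each attempt (60 s, 120 s, 240 s, 480 s, 960 s). A minimal sketch of such a doubling-timeout retry loop, assuming a plain requests-based fetch; the helper name get_with_backoff and the sleep policy are illustrative and not MLAir's actual data_loader implementation:

    import time
    import requests

    def get_with_backoff(url, max_retries=5, base_timeout=60):
        """Hypothetical fetch mirroring the retry/timeout progression logged above."""
        for retry in range(max_retries):
            timeout = base_timeout * 2 ** retry  # 60, 120, 240, 480, 960 seconds
            try:
                response = requests.get(url, timeout=timeout)
                response.raise_for_status()
                return response.json()
            except requests.RequestException:
                time.sleep(2 ** retry)  # brief pause before the next attempt
        raise RuntimeError(f"no usable response from {url} after {max_retries} attempts")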
-------------------------------Captured log call--------------------------------
INFO root:run_environment.py:103 PreProcessing started
INFO root:pre_processing.py:262 check valid stations started (preprocessing)
INFO root:pre_processing.py:296 use serial validate station approach
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW107 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:06 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:06 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:06 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:20 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:56 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW107 (33%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW013 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:13 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:29 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW013 (66%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW087 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:19 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:36 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW087 (100%)
INFO root:pre_processing.py:305 run for 0:02:01 (hh:mm:ss) to check 3 station(s). Found 0/3 valid stations (preprocessing).
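Each station is dropped with the same xarray error, "different number of dimensions on data and dims: 0 vs 2", which xarray raises whenever 0-dimensional data is paired with two dimension names. A minimal reproduction, assuming only numpy and xarray; the dimension names are illustrative:

    import numpy as np
    import xarray as xr

    scalar = np.array(1.0)  # 0-dimensional data (ndim == 0)
    try:
        # two dimension names against 0-d data triggers _infer_coords_and_dims
        xr.DataArray(scalar, dims=["datetime", "variables"])
    except ValueError as err:
        print(err)  # different number of dimensions on data and dims: 0 vs 2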
Failed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_create_set_split_not_all_stations 41.85
self = <test_pre_processing.TestPreProcessing object at 0x7fb026c51d30>
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fb026c51e80>
obj_with_exp_setup = <mlair.run_modules.pre_processing.PreProcessing object at 0x7fb026c51fd0>

def test_create_set_split_not_all_stations(self, caplog, obj_with_exp_setup):
caplog.set_level(logging.DEBUG)
obj_with_exp_setup.data_store.set("use_all_stations_on_all_data_sets", False, "general")
obj_with_exp_setup.create_set_split(slice(0, 2), "awesome")
assert ('root', 10, "Awesome stations (len=2): ['DEBW107', 'DEBW013']") in caplog.record_tuples
data_store = obj_with_exp_setup.data_store
assert isinstance(data_store.get("data_collection", "general.awesome"), DataCollection)
with pytest.raises(NameNotFoundInScope):
data_store.get("data_collection", "general")
> assert data_store.get("stations", "general.awesome") == ["DEBW107", "DEBW013"]
E AssertionError: assert [] == ['DEBW107', 'DEBW013']
E Right contains 2 more items, first extra item: 'DEBW107'
E Use -v to get the full diff

test/test_run_modules/test_pre_processing.py:87: AssertionError
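The empty list is consistent with the captured logs below: all three candidate stations are removed during validation ("Found 0/3 valid stations"), so nothing is left to assign to the "awesome" subset. For reference, caplog.record_tuples, which the first assertion of this test checks, is a list of (logger_name, level, message) tuples; a minimal self-contained sketch:

    import logging

    def test_record_tuples_shape(caplog):
        caplog.set_level(logging.DEBUG)
        logging.getLogger().debug("Awesome stations (len=2): ['DEBW107', 'DEBW013']")
        # level 10 equals logging.DEBUG, matching the tuple asserted in the test above
        assert ("root", logging.DEBUG,
                "Awesome stations (len=2): ['DEBW107', 'DEBW013']") in caplog.record_tuples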
------------------------------Captured stderr call------------------------------
2023-12-18 17:43:28,401 - INFO: check valid stations started (awesome) [pre_processing.py:validate_station:262]
2023-12-18 17:43:28,407 - INFO: use serial validate station approach [pre_processing.py:validate_station:296]
2023-12-18 17:43:29,341 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:43:29,365 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:30,307 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:43:30,342 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:30,524 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 [data_loader.py:get_data:152]
2023-12-18 17:43:30,581 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:30,616 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:43:30,761 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:31,349 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:43:31,494 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:31,496 - INFO: load data for DEBW107 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:43:31,714 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:43:34,139 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:34,866 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:43:34,894 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:35,487 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:43:35,532 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:36,985 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:43:37,007 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:39,719 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:43:39,766 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:43,102 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:43:43,148 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:43:43,148 -
INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:43,148 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:43,148 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,154 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,154 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,154 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,154 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,154 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,156 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:43:47,156 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:43:47,156 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:43:47,156 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:43:47,156 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:43:47,158 - INFO: ...finished: DEBW107 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:43:47,158 - INFO: ...finished: DEBW107 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:43:47,158 - INFO: ...finished: DEBW107 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:43:47,158 - INFO: ...finished: DEBW107 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:43:47,158 - INFO: ...finished: DEBW107 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:43:47,778 - INFO: connect (retry=0, timeout=60) 
https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:43:47,778 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:43:47,778 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:43:47,778 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:43:47,778 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:43:47,824 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,824 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,824 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,824 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:47,824 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,384 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:43:48,384 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:43:48,384 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:43:48,384 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:43:48,384 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:43:48,435 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,435 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,435 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,435 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,435 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,747 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152] 2023-12-18 17:43:48,747 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152] 2023-12-18 17:43:48,747 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152] 2023-12-18 17:43:48,747 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152] 2023-12-18 17:43:48,747 - INFO: connect (retry=0, 
timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152] 2023-12-18 17:43:48,807 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,807 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,807 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,807 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:48,807 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:49,303 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:43:49,303 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:43:49,303 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:43:49,303 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:43:49,303 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:43:49,429 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:49,429 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:49,429 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:49,429 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:49,429 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:50,121 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:43:50,121 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:43:50,121 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:43:50,121 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:43:50,121 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:43:50,272 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 
0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:50,272 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:50,272 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:50,272 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:50,272 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:50,275 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:43:50,275 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:43:50,275 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:43:50,275 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:43:50,275 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:43:50,792 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:50,792 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:50,792 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:50,792 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:50,792 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:53,207 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:53,207 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:53,207 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:53,207 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:53,207 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,154 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,154 - INFO: connect (retry=0, 
timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,154 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,154 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,154 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,180 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,180 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,180 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,180 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,180 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,655 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,655 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,655 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,655 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,655 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:54,701 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,701 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,701 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,701 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily 
finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:54,701 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:55,868 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:55,868 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:55,868 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:55,868 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:55,868 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:55,912 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:55,912 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:55,912 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:55,912 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:55,912 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:58,124 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:58,124 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:58,124 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:58,124 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:58,124 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:43:58,169 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 
17:43:58,169 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:58,169 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:58,169 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:43:58,169 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:01,601 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:01,601 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:01,601 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:01,601 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:01,601 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:01,654 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:01,654 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:01,654 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:01,654 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:01,654 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:05,661 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:05,661 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:05,661 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:05,661 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:05,661 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:05,664 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = 
_infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:44:05,664 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:44:05,664 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:44:05,664 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:44:05,664 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:44:05,666 - INFO: ...finished: DEBW013 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:44:05,666 - INFO: ...finished: DEBW013 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:44:05,666 - INFO: ...finished: DEBW013 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:44:05,666 - INFO: ...finished: DEBW013 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:44:05,666 - INFO: ...finished: DEBW013 (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:44:05,667 - INFO: run for 0:00:38 (hh:mm:ss) to check 2 station(s). Found 0/2 valid stations (awesome). [pre_processing.py:validate_station:305] 2023-12-18 17:44:05,667 - INFO: run for 0:00:38 (hh:mm:ss) to check 2 station(s). Found 0/2 valid stations (awesome). [pre_processing.py:validate_station:305] 2023-12-18 17:44:05,667 - INFO: run for 0:00:38 (hh:mm:ss) to check 2 station(s). Found 0/2 valid stations (awesome). [pre_processing.py:validate_station:305] 2023-12-18 17:44:05,667 - INFO: run for 0:00:38 (hh:mm:ss) to check 2 station(s). Found 0/2 valid stations (awesome). [pre_processing.py:validate_station:305] 2023-12-18 17:44:05,667 - INFO: run for 0:00:38 (hh:mm:ss) to check 2 station(s). Found 0/2 valid stations (awesome). [pre_processing.py:validate_station:305]
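
Note on the connect lines above: the loader attempts each statistics request up to five times (retry=0 through retry=4), doubling the per-request timeout from 60 s to 960 s on each attempt. A minimal sketch of that backoff pattern follows; fetch_with_backoff and its error handling are illustrative assumptions, not MLAir's actual data_loader API.

import requests

MAX_RETRIES = 5    # retry=0 .. retry=4, as in the log above
BASE_TIMEOUT = 60  # seconds; doubled on every attempt: 60, 120, 240, 480, 960

def fetch_with_backoff(url: str) -> dict:
    """Illustrative retry loop mirroring the connect/finished pairs above."""
    last_error = None
    for retry in range(MAX_RETRIES):
        timeout = BASE_TIMEOUT * 2 ** retry  # double the timeout per attempt
        print(f"connect (retry={retry}, timeout={timeout}) {url}")
        try:
            response = requests.get(url, timeout=timeout)
            response.raise_for_status()  # treat 4xx/5xx replies as failures
            return response.json()
        except requests.RequestException as err:
            last_error = err  # remember the failure and retry with more patience
    raise RuntimeError(f"no usable response from {url}") from last_error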
-------------------------------Captured log call--------------------------------
DEBUG root:datastore.py:118 set: use_all_stations_on_all_data_sets(general)=False DEBUG root:datastore.py:118 get: stations(general.awesome)=['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X'] DEBUG root:datastore.py:118 get: use_all_stations_on_all_data_sets(general)=False DEBUG root:pre_processing.py:244 Awesome stations (len=2): ['DEBW107', 'DEBW013'] DEBUG root:datastore.py:118 get: data_handler(general)=<class 'mlair.data_handler.default_data_handler.DefaultDataHandler'> INFO root:pre_processing.py:262 check valid stations started (awesome) DEBUG root:datastore.py:120 get: ifs_file_names(general.awesome)=None DEBUG root:datastore.py:120 get: ifs_data_path(general.awesome)=None DEBUG root:datastore.py:120 get: min_length(general.awesome)=None DEBUG root:datastore.py:120 get: overwrite_local_data(general.awesome)=None DEBUG root:datastore.py:118 get: time_dim(general.awesome)=datetime DEBUG root:datastore.py:120 get: name_affix(general.awesome)=None DEBUG root:datastore.py:118 get: use_multiprocessing(general.awesome)=True DEBUG root:datastore.py:120 get: lazy_preprocessing(general.awesome)=None DEBUG root:datastore.py:118 get: target_var(general.awesome)=o3 DEBUG root:datastore.py:120 get: window_history_offset(general.awesome)=None DEBUG root:datastore.py:120 get: store_processed_data(general.awesome)=None DEBUG root:datastore.py:118 get: target_dim(general.awesome)=variables DEBUG root:datastore.py:118 get: interpolation_method(general.awesome)=linear DEBUG root:datastore.py:118 get: window_dim(general.awesome)=window DEBUG root:datastore.py:118 get: variables(general.awesome)=['o3', 'temp'] DEBUG root:datastore.py:120 get: extreme_values(general.awesome)=None DEBUG root:datastore.py:120 get: extremes_on_right_tail_only(general.awesome)=None DEBUG root:datastore.py:118 get: start(general.awesome)=1997-01-01 DEBUG root:datastore.py:118 get: data_path(general.awesome)=/home/root/mlair/data/ DEBUG root:datastore.py:118 get: interpolation_limit(general.awesome)=1 DEBUG root:datastore.py:120 get: window_history_end(general.awesome)=None DEBUG root:datastore.py:118 get: transformation(general.awesome)={} DEBUG root:datastore.py:118 get: window_history_size(general.awesome)=13 DEBUG root:datastore.py:118 get: iter_dim(general.awesome)=Stations DEBUG root:datastore.py:118 get: data_origin(general.awesome)={'o3': 'UBA', 'temp': 'UBA'} DEBUG root:datastore.py:118 get: window_lead_time(general.awesome)=3 DEBUG root:datastore.py:120 get: overwrite_lazy_data(general.awesome)=None DEBUG root:datastore.py:120 get: store_data_locally(general.awesome)=None DEBUG root:datastore.py:118 get: experiment_path(general.awesome)=/builds/esde/machine-learning/mlair/TestExperiment_daily DEBUG root:datastore.py:118 get: statistics_per_var(general.awesome)={'o3': 'dma8eu', 'temp': 'maximum'} DEBUG root:datastore.py:118 get: max_number_multiprocessing(general.awesome)=16 DEBUG root:datastore.py:118 get: end(general.awesome)=2017-12-31 DEBUG root:datastore.py:120 get: era5_file_names(general.awesome)=None DEBUG root:datastore.py:118 get: sampling(general.awesome)=daily DEBUG root:datastore.py:120 get: era5_data_path(general.awesome)=None DEBUG root:datastore.py:118 get: use_multiprocessing(general)=True DEBUG root:datastore.py:118 get: tmp_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/tmp DEBUG root:datastore.py:118 get: max_number_multiprocessing(general)=16 INFO root:pre_processing.py:296 use serial validate station approach DEBUG root:path_config.py:132 Path already exists: 
/home/root/mlair/data/daily DEBUG root:data_handler_single_station.py:360 DEBW107: try to load local data from: /home/root/mlair/data/daily/DEBW107_o3_temp.nc DEBUG root:data_handler_single_station.py:366 DEBW107: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW107_o3_temp.nc' DEBUG root:data_handler_single_station.py:367 DEBW107: load new data INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/o3 HTTP/1.1" 200 156 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/temp HTTP/1.1" 200 161 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW107 HTTP/1.1" 200 2769 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=88&variable_id=5 HTTP/1.1" 200 8780 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=88&variable_id=21 HTTP/1.1" 200 8518 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss) INFO root:toar_data_v2.py:64 load data for DEBW107 from TOAR-DATA DEBUG root:toar_data_v2.py:70 load o3 INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily HTTP/1.1" 202 153 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection 
(1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/a1a1f038-89da-49f1-9ed1-b69192cabb6b HTTP/1.1" 200 153 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/a1a1f038-89da-49f1-9ed1-b69192cabb6b HTTP/1.1" 302 0 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/result/a1a1f038-89da-49f1-9ed1-b69192cabb6b HTTP/1.1" 200 76677 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) DEBUG root:toar_data_v2.py:70 load temp INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status' INFO root:time_tracking.py:137 setup_samples finished after 0:00:19 (hh:mm:ss) INFO root:pre_processing.py:489 remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( DEBUG root:pre_processing.py:491 detailed information for removal of station DEBW107: Traceback (most recent call last): File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info file = self._cache[self._key] File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__ value = self._cache[key] KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/root/mlair/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))] During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 361, in load_data data = xr.open_dataarray(file_name) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 701, in open_dataarray dataset = open_dataset( File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 572, in open_dataset store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 364, in open return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 314, in __init__ self.format = self.ds.data_model File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 373, in ds return self._acquire() File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 367, in _acquire with self._manager.acquire_context(needs_lock) as root: File "/usr/lib64/python3.9/contextlib.py", line 119, in __enter__ return next(self.gen) File 
"/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context file, cached = self._acquire_with_cache_info(needs_lock) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info file = self._opener(*self._args, **kwargs) File "src/netCDF4/_netCDF4.pyx", line 2353, in netCDF4._netCDF4.Dataset.__init__ File "src/netCDF4/_netCDF4.pyx", line 1963, in netCDF4._netCDF4._ensure_nc_success FileNotFoundError: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW107_o3_temp.nc' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builds/esde/machine-learning/mlair/mlair/run_modules/pre_processing.py", line 486, in f_proc res = data_handler.build(station, name_affix=name_affix, store_processed_data=store, **kwargs) File "/builds/esde/machine-learning/mlair/mlair/data_handler/default_data_handler.py", line 72, in build sp = cls.data_handler(station, **sp_keys) File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 125, in __init__ self.setup_samples() File "/builds/esde/machine-learning/mlair/mlair/helpers/time_tracking.py", line 40, in __call__ return self.__wrapped__(*args, **kwargs) File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 271, in setup_samples self.make_input_target() File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 312, in make_input_target data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling, File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 368, in load_data data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling, File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/data_loader.py", line 69, in download_data df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats, File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/toar_data.py", line 19, in download_toar df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim]) File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 403, in __init__ coords, dims = _infer_coords_and_dims(data.shape, coords, dims) File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims raise ValueError( ValueError: different number of dimensions on data and dims: 0 vs 2 INFO root:pre_processing.py:303 ...finished: DEBW107 (50%) DEBUG root:path_config.py:132 Path already exists: /home/root/mlair/data/daily DEBUG root:data_handler_single_station.py:360 DEBW013: try to load local data from: /home/root/mlair/data/daily/DEBW013_o3_temp.nc DEBUG root:data_handler_single_station.py:366 DEBW013: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW013_o3_temp.nc' DEBUG root:data_handler_single_station.py:367 DEBW013: load new data INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/o3 HTTP/1.1" 200 156 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished 
after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/temp HTTP/1.1" 200 161 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW013 HTTP/1.1" 200 2725 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=84&variable_id=5 HTTP/1.1" 200 8690 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=84&variable_id=21 HTTP/1.1" 200 8428 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) INFO root:toar_data_v2.py:64 load data for DEBW013 from TOAR-DATA DEBUG root:toar_data_v2.py:70 load o3 INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily HTTP/1.1" 202 153 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/c508fb93-a900-468f-a6b5-712cf42611fa HTTP/1.1" 200 153 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/c508fb93-a900-468f-a6b5-712cf42611fa HTTP/1.1" 302 0 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/result/c508fb93-a900-468f-a6b5-712cf42611fa HTTP/1.1" 200 122474 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 
(hh:mm:ss) DEBUG root:toar_data_v2.py:70 load temp INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG 
INFO root:time_tracking.py:137 setup_samples finished after 0:00:19 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
DEBUG root:pre_processing.py:491 detailed information for removal of station DEBW013:
Traceback (most recent call last):
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/root/mlair/data/daily/DEBW013_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 361, in load_data
    data = xr.open_dataarray(file_name)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 701, in open_dataarray
    dataset = open_dataset(
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 572, in open_dataset
    store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 364, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 314, in __init__
    self.format = self.ds.data_model
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 373, in ds
    return self._acquire()
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 367, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "/usr/lib64/python3.9/contextlib.py", line 119, in __enter__
    return next(self.gen)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "src/netCDF4/_netCDF4.pyx", line 2353, in netCDF4._netCDF4.Dataset.__init__
  File "src/netCDF4/_netCDF4.pyx", line 1963, in netCDF4._netCDF4._ensure_nc_success
FileNotFoundError: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW013_o3_temp.nc'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/mlair/run_modules/pre_processing.py", line 486, in f_proc
    res = data_handler.build(station, name_affix=name_affix, store_processed_data=store, **kwargs)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/default_data_handler.py", line 72, in build
    sp = cls.data_handler(station, **sp_keys)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 125, in __init__
    self.setup_samples()
  File "/builds/esde/machine-learning/mlair/mlair/helpers/time_tracking.py", line 40, in __call__
    return self.__wrapped__(*args, **kwargs)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 271, in setup_samples
    self.make_input_target()
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 312, in make_input_target
    data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 368, in load_data
    data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
  File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/data_loader.py", line 69, in download_data
    df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
  File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/toar_data.py", line 19, in download_toar
    df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
  File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 403, in __init__
    coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims
    raise ValueError(
ValueError: different number of dimensions on data and dims: 0 vs 2
INFO root:pre_processing.py:303 ...finished: DEBW013 (100%)
INFO root:pre_processing.py:305 run for 0:00:38 (hh:mm:ss) to check 2 station(s). Found 0/2 valid stations (awesome).
DEBUG root:datastore.py:118 set: stations(general.awesome)=[]
DEBUG root:datastore.py:118 set: data_collection(general.awesome)=<mlair.data_handler.iterator.DataCollection object at 0x7fb026c5c370>
DEBUG root:datastore.py:118 get: data_collection(general.awesome)=<mlair.data_handler.iterator.DataCollection object at 0x7fb026c5c370>
DEBUG root:datastore.py:120 get: data_collection(general)=None
DEBUG root:datastore.py:118 get: stations(general.awesome)=[]
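The final ValueError is easy to reproduce in isolation: the xarray DataArray constructor compares len(dims) against data.ndim, and the failed TOAR download left df_toar as a 0-dimensional value that toar_data.py line 19 then pairs with two dimension labels. A minimal sketch with placeholder data and dimension names (the actual time_dim and target_dim values are not shown in this report):

    import numpy as np
    import xarray as xr

    # 0-d stand-in for the empty download result: shape (), ndim 0
    df_toar = np.array(np.nan)
    xr.DataArray(df_toar, dims=["datetime", "variables"])
    # ValueError: different number of dimensions on data and dims: 0 vs 2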
----------------------------Captured stderr teardown----------------------------
2023-12-18 17:44:05,704 - INFO: RunEnvironment finished after 0:00:38 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:44:10,247 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_002.json [run_environment.py:__save_tracking:159]
2023-12-18 17:44:10,271 - INFO: Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log [run_environment.py:__move_log_file:147]
-----------------------------Captured log teardown------------------------------
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:38 (hh:mm:ss)
DEBUG root:datastore.py:118 get: logging_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/logging
DEBUG root:datastore.py:118 get: logging_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/logging
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_002.json
DEBUG root:datastore.py:118 get: logging_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/logging
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log
Failed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_create_set_split_all_stations 69.56
self = <test_pre_processing.TestPreProcessing object at 0x7fb026c717f0>
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fb026c713d0>
obj_with_exp_setup = <mlair.run_modules.pre_processing.PreProcessing object at 0x7fb026c71460>

def test_create_set_split_all_stations(self, caplog, obj_with_exp_setup):
caplog.set_level(logging.DEBUG)
obj_with_exp_setup.create_set_split(slice(0, 2), "awesome")
message = "Awesome stations (len=4): ['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X']"
assert ('root', 10, message) in caplog.record_tuples
data_store = obj_with_exp_setup.data_store
assert isinstance(data_store.get("data_collection", "general.awesome"), DataCollection)
with pytest.raises(NameNotFoundInScope):
data_store.get("data_collection", "general")
> assert data_store.get("stations", "general.awesome") == ['DEBW107', 'DEBW013', 'DEBW087']
E AssertionError: assert [] == ['DEBW107', '...3', 'DEBW087']
E Right contains 3 more items, first extra item: 'DEBW107'
E Use -v to get the full diff

test/test_run_modules/test_pre_processing.py:98: AssertionError
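Both datastore checks above follow from the scope hierarchy: a value set under "general.awesome" is visible from that scope but not from the parent scope "general" (hence the expected NameNotFoundInScope), and the stored station list compares as empty because validation removed every station. A toy sketch of that scoped lookup, as a hypothetical stand-in for MLAir's datastore (only the exception name comes from the test):

    class NameNotFoundInScope(Exception):
        pass

    class DataStore:
        def __init__(self):
            self._store = {}  # {(name, scope): value}

        def set(self, name, value, scope):
            self._store[(name, scope)] = value

        def get(self, name, scope):
            # Walk from the given scope up to its parents:
            # "general.awesome" -> "general" -> not found.
            while True:
                if (name, scope) in self._store:
                    return self._store[(name, scope)]
                if "." not in scope:
                    raise NameNotFoundInScope(f"{name} not found in scope {scope}")
                scope = scope.rsplit(".", 1)[0]

    store = DataStore()
    store.set("stations", [], "general.awesome")
    store.get("stations", "general.awesome")  # -> [] (what the failing assert saw)
    store.get("stations", "general")          # -> raises NameNotFoundInScope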
------------------------------Captured stderr call------------------------------
2023-12-18 17:44:10,282 - INFO: check valid stations started (awesome) [pre_processing.py:validate_station:262]
2023-12-18 17:44:10,289 - INFO: use serial validate station approach [pre_processing.py:validate_station:296]
2023-12-18 17:44:11,259 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:44:11,281 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:12,235 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:44:12,268 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:13,164 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 [data_loader.py:get_data:152]
2023-12-18 17:44:13,222 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:14,172 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:44:14,324 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:14,924 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:44:15,073 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:15,076 - INFO: load data for DEBW107 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:44:16,003 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:18,421 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:19,364 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:19,390 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:20,319 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:20,365 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:22,260 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:22,306 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:24,428 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:24,473 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:27,904 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:27,950 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:31,956 - INFO: setup_samples finished after 0:00:22 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:31,959 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:44:31,961 - INFO: ...finished: DEBW107 (25%) [pre_processing.py:validate_station:303]
2023-12-18 17:44:32,375 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:44:32,421 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:33,214 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:44:33,260 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:33,996 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152]
2023-12-18 17:44:34,058 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:34,570 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:44:34,716 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:34,912 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:44:35,057 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:35,059 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:44:35,067 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:37,511 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:38,000 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:38,025 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:38,814 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:38,864 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:40,761 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:40,800 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:43,432 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:43,478 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:46,854 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:46,904 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:50,912 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:50,915 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:44:50,917 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303]
2023-12-18 17:44:51,365 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:44:51,411 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:51,421 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:44:51,467 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:51,927 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152]
2023-12-18 17:44:51,988 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:52,406 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:44:52,555 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:53,482 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:44:53,629 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:53,632 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:44:54,435 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:56,860 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:44:57,305 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:44:57,330 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:57,330 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:57,330 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:58,052 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:58,052 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:58,052 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:58,052 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:58,052 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:58,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:58,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:58,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:58,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:58,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:59,851 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:59,851 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:59,851 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:59,851 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:59,851 - INFO: connect (retry=2, timeout=240) 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:44:59,896 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:59,896 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:59,896 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:59,896 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:44:59,896 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:02,593 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:02,593 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:02,593 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:02,593 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:02,593 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:02,643 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:02,643 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:02,643 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:02,643 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:02,643 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:06,042 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:06,042 - INFO: connect (retry=4, timeout=960) 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:06,042 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:06,042 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:06,042 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:06,087 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:06,087 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:06,087 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:06,087 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:06,087 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:10,094 - INFO: setup_samples finished after 0:00:20 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:10,094 - INFO: setup_samples finished after 0:00:20 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:10,094 - INFO: setup_samples finished after 0:00:20 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:10,094 - INFO: setup_samples finished after 0:00:20 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:10,094 - INFO: setup_samples finished after 0:00:20 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:10,096 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:10,096 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:10,096 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:10,096 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on 
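Note on the retry pattern visible above: data_loader doubles its timeout on every attempt, from 60 s at retry 0 up to 960 s at retry 4. A minimal sketch of that backoff loop, with illustrative names (get_with_backoff, base_timeout) rather than MLAir's actual signatures:

    import time
    import requests

    def get_with_backoff(url, base_timeout=60, max_retries=5):
        """Fetch url, doubling the timeout on each retry: 60, 120, 240, 480, 960 s."""
        for retry in range(max_retries):
            timeout = base_timeout * 2 ** retry
            print(f"connect (retry={retry}, timeout={timeout}) {url}")
            try:
                resp = requests.get(url, timeout=timeout)
                resp.raise_for_status()
                return resp.json()
            except requests.RequestException:
                time.sleep(1)  # brief pause, then retry with a doubled timeout
        raise RuntimeError(f"no valid response from {url} after {max_retries} retries")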
2023-12-18 17:45:10,096 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:45:10,098 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303]
2023-12-18 17:45:10,389 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:45:10,496 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:11,496 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:45:11,542 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:11,918 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:45:11,976 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:12,973 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:45:13,026 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:13,428 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:45:13,480 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:14,320 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:45:14,372 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:15,291 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:45:15,347 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:15,354 - INFO: setup_samples finished after 0:00:06 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:15,356 - INFO: remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:45:15,358 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:303]
2023-12-18 17:45:15,359 - INFO: run for 0:01:06 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (awesome). [pre_processing.py:validate_station:305]
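All four stations are rejected with the same xarray error, "different number of dimensions on data and dims: 0 vs 2". It is raised inside the DataArray constructor when the downloaded data collapses to a zero-dimensional object while two dimension names are passed. A minimal reproduction, independent of MLAir (the 0-d array stands in for an empty download result):

    import numpy as np
    import xarray as xr

    df_toar = np.array(None)  # 0-dimensional, like an empty download result
    xr.DataArray(df_toar, dims=["datetime", "variables"])
    # ValueError: different number of dimensions on data and dims: 0 vs 2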
-------------------------------Captured log call--------------------------------
DEBUG root:datastore.py:118 get: stations(general.awesome)=['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X']
DEBUG root:datastore.py:118 get: use_all_stations_on_all_data_sets(general)=True
DEBUG root:pre_processing.py:244 Awesome stations (len=4): ['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X']
DEBUG root:datastore.py:118 get: data_handler(general)=<class 'mlair.data_handler.default_data_handler.DefaultDataHandler'>
INFO root:pre_processing.py:262 check valid stations started (awesome)
DEBUG root:datastore.py:120 get: ifs_file_names(general.awesome)=None
DEBUG root:datastore.py:120 get: ifs_data_path(general.awesome)=None
DEBUG root:datastore.py:120 get: min_length(general.awesome)=None
DEBUG root:datastore.py:120 get: overwrite_local_data(general.awesome)=None
DEBUG root:datastore.py:118 get: time_dim(general.awesome)=datetime
DEBUG root:datastore.py:120 get: name_affix(general.awesome)=None
DEBUG root:datastore.py:118 get: use_multiprocessing(general.awesome)=True
DEBUG root:datastore.py:120 get: lazy_preprocessing(general.awesome)=None
DEBUG root:datastore.py:118 get: target_var(general.awesome)=o3
DEBUG root:datastore.py:120 get: window_history_offset(general.awesome)=None
DEBUG root:datastore.py:120 get: store_processed_data(general.awesome)=None
DEBUG root:datastore.py:118 get: target_dim(general.awesome)=variables
DEBUG root:datastore.py:118 get: interpolation_method(general.awesome)=linear
DEBUG root:datastore.py:118 get: window_dim(general.awesome)=window
DEBUG root:datastore.py:118 get: variables(general.awesome)=['o3', 'temp']
DEBUG root:datastore.py:120 get: extreme_values(general.awesome)=None
DEBUG root:datastore.py:120 get: extremes_on_right_tail_only(general.awesome)=None
DEBUG root:datastore.py:118 get: start(general.awesome)=1997-01-01
DEBUG root:datastore.py:118 get: data_path(general.awesome)=/home/root/mlair/data/
DEBUG root:datastore.py:118 get: interpolation_limit(general.awesome)=1
DEBUG root:datastore.py:120 get: window_history_end(general.awesome)=None
DEBUG root:datastore.py:118 get: transformation(general.awesome)={}
DEBUG root:datastore.py:118 get: window_history_size(general.awesome)=13
DEBUG root:datastore.py:118 get: iter_dim(general.awesome)=Stations
DEBUG root:datastore.py:118 get: data_origin(general.awesome)={'o3': 'UBA', 'temp': 'UBA'}
DEBUG root:datastore.py:118 get: window_lead_time(general.awesome)=3
DEBUG root:datastore.py:120 get: overwrite_lazy_data(general.awesome)=None
DEBUG root:datastore.py:120 get: store_data_locally(general.awesome)=None
DEBUG root:datastore.py:118 get: experiment_path(general.awesome)=/builds/esde/machine-learning/mlair/TestExperiment_daily
DEBUG root:datastore.py:118 get: statistics_per_var(general.awesome)={'o3': 'dma8eu', 'temp': 'maximum'}
DEBUG root:datastore.py:118 get: max_number_multiprocessing(general.awesome)=16
DEBUG root:datastore.py:118 get: end(general.awesome)=2017-12-31
DEBUG root:datastore.py:120 get: era5_file_names(general.awesome)=None
DEBUG root:datastore.py:118 get: sampling(general.awesome)=daily
DEBUG root:datastore.py:120 get: era5_data_path(general.awesome)=None
DEBUG root:datastore.py:118 get: use_multiprocessing(general)=True
DEBUG root:datastore.py:118 get: tmp_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/tmp
DEBUG root:datastore.py:118 get: max_number_multiprocessing(general)=16
INFO root:pre_processing.py:296 use serial validate station approach
DEBUG root:path_config.py:132 Path already exists: /home/root/mlair/data/daily
DEBUG root:data_handler_single_station.py:360 DEBW107: try to load local data from: /home/root/mlair/data/daily/DEBW107_o3_temp.nc
DEBUG root:data_handler_single_station.py:366 DEBW107: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW107_o3_temp.nc'
DEBUG root:data_handler_single_station.py:367 DEBW107: load new data
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/o3 HTTP/1.1" 200 156
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/temp HTTP/1.1" 200 161
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW107 HTTP/1.1" 200 2769
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=88&variable_id=5 HTTP/1.1" 200 8780
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=88&variable_id=21 HTTP/1.1" 200 8518
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW107 from TOAR-DATA
DEBUG root:toar_data_v2.py:70 load o3
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily HTTP/1.1" 202 153
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/9df6c63e-833e-48fb-b27f-ddde912e6947 HTTP/1.1" 200 153
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/9df6c63e-833e-48fb-b27f-ddde912e6947 HTTP/1.1" 302 0
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/result/9df6c63e-833e-48fb-b27f-ddde912e6947 HTTP/1.1" 200 76690
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss)
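The successful dma8eu request above shows the asynchronous pattern of the TOAR analysis service: the statistics query answers 202 (accepted), a status URL is polled until it redirects (302) to /api/v2/analysis/result/..., and the result is fetched from there. A sketch of such a client; reading the polling URL from a 'status' field is an assumption inferred from the KeyError('status') that data_loader logs for the 422 responses below, not a documented TOAR field:

    import time
    import requests

    def fetch_toar_statistics(query_url, poll_interval=5, timeout=60):
        # Submit the analysis query; 202 means accepted but not finished yet.
        resp = requests.get(query_url, timeout=timeout)
        if resp.status_code == 202:
            status_url = resp.json()["status"]  # assumed field holding the polling URL
            while True:
                resp = requests.get(status_url, timeout=timeout)
                if resp.history:  # a 302 redirected us to /analysis/result/<id>
                    break
                time.sleep(poll_interval)  # job still running, poll again
        resp.raise_for_status()
        return resp.json()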
DEBUG root:toar_data_v2.py:70 load temp
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily: 'status'
INFO root:time_tracking.py:137 setup_samples finished after 0:00:22 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
DEBUG root:pre_processing.py:491 detailed information for removal of station DEBW107:
Traceback (most recent call last):
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/root/mlair/data/daily/DEBW107_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 361, in load_data
    data = xr.open_dataarray(file_name)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 701, in open_dataarray
    dataset = open_dataset(
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 572, in open_dataset
    store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 364, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 314, in __init__
    self.format = self.ds.data_model
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 373, in ds
    return self._acquire()
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 367, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "/usr/lib64/python3.9/contextlib.py", line 119, in __enter__
    return next(self.gen)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "src/netCDF4/_netCDF4.pyx", line 2353, in netCDF4._netCDF4.Dataset.__init__
  File "src/netCDF4/_netCDF4.pyx", line 1963, in netCDF4._netCDF4._ensure_nc_success
FileNotFoundError: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW107_o3_temp.nc'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/mlair/run_modules/pre_processing.py", line 486, in f_proc
    res = data_handler.build(station, name_affix=name_affix, store_processed_data=store, **kwargs)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/default_data_handler.py", line 72, in build
    sp = cls.data_handler(station, **sp_keys)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 125, in __init__
    self.setup_samples()
  File "/builds/esde/machine-learning/mlair/mlair/helpers/time_tracking.py", line 40, in __call__
    return self.__wrapped__(*args, **kwargs)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 271, in setup_samples
    self.make_input_target()
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 312, in make_input_target
    data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 368, in load_data
    data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
  File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/data_loader.py", line 69, in download_data
    df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
  File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/toar_data.py", line 19, in download_toar
    df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
  File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 403, in __init__
    coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims
    raise ValueError(
ValueError: different number of dimensions on data and dims: 0 vs 2
INFO root:pre_processing.py:303 ...finished: DEBW107 (25%)
DEBUG root:path_config.py:132 Path already exists: /home/root/mlair/data/daily
DEBUG root:data_handler_single_station.py:360 DEBW013: try to load local data from: /home/root/mlair/data/daily/DEBW013_o3_temp.nc
DEBUG root:data_handler_single_station.py:366 DEBW013: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW013_o3_temp.nc'
DEBUG root:data_handler_single_station.py:367 DEBW013: load new data
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/o3 HTTP/1.1" 200 156
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/temp HTTP/1.1" 200 161
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW013 HTTP/1.1" 200 2725
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=84&variable_id=5 HTTP/1.1" 200 8690
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=84&variable_id=21 HTTP/1.1" 200 8428
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW013 from TOAR-DATA
DEBUG root:toar_data_v2.py:70 load o3
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily HTTP/1.1" 202 153
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/7275c873-c625-45e4-b061-cc326ebbb898 HTTP/1.1" 200 153
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/7275c873-c625-45e4-b061-cc326ebbb898 HTTP/1.1" 302 0
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/result/7275c873-c625-45e4-b061-cc326ebbb898 HTTP/1.1" 200 122474
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss)
DEBUG root:toar_data_v2.py:70 load temp
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status'
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily HTTP/1.1" 422 634
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily: 'status'
INFO root:time_tracking.py:137 setup_samples finished after 0:00:19 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
DEBUG root:pre_processing.py:491 detailed information for removal of station DEBW013:
Traceback (most recent call last):
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/root/mlair/data/daily/DEBW013_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 361, in load_data
    data = xr.open_dataarray(file_name)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 701, in open_dataarray
    dataset = open_dataset(
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 572, in open_dataset
    store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 364, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 314, in __init__
    self.format = self.ds.data_model
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 373, in ds
    return self._acquire()
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 367, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "/usr/lib64/python3.9/contextlib.py", line 119, in __enter__
    return next(self.gen)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "src/netCDF4/_netCDF4.pyx", line 2353, in netCDF4._netCDF4.Dataset.__init__
  File "src/netCDF4/_netCDF4.pyx", line 1963, in netCDF4._netCDF4._ensure_nc_success
FileNotFoundError: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW013_o3_temp.nc'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/builds/esde/machine-learning/mlair/mlair/run_modules/pre_processing.py", line 486, in f_proc
    res = data_handler.build(station, name_affix=name_affix, store_processed_data=store, **kwargs)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/default_data_handler.py", line 72, in build
    sp = cls.data_handler(station, **sp_keys)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 125, in __init__
    self.setup_samples()
  File "/builds/esde/machine-learning/mlair/mlair/helpers/time_tracking.py", line 40, in __call__
    return self.__wrapped__(*args, **kwargs)
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 271, in setup_samples
    self.make_input_target()
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 312, in make_input_target
    data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling,
  File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 368, in load_data
    data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling,
  File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/data_loader.py", line 69, in download_data
    df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats,
  File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/toar_data.py", line 19, in download_toar
    df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim])
  File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 403, in __init__
    coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
  File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims
    raise ValueError(
ValueError: different number of dimensions on data and dims: 0 vs 2
INFO root:pre_processing.py:303 ...finished: DEBW013 (50%)
DEBUG root:path_config.py:132 Path already exists: /home/root/mlair/data/daily
DEBUG root:data_handler_single_station.py:360 DEBW087: try to load local data from: /home/root/mlair/data/daily/DEBW087_o3_temp.nc
DEBUG root:data_handler_single_station.py:366 DEBW087: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW087_o3_temp.nc'
DEBUG root:data_handler_single_station.py:367 DEBW087: load new data
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/o3 HTTP/1.1" 200 156
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/temp HTTP/1.1" 200 161
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087
DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443
DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 
"GET /api/v2/stationmeta/DEBW087 HTTP/1.1" 200 2761 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=83&variable_id=5 HTTP/1.1" 200 8764 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/search/?station_id=83&variable_id=21 HTTP/1.1" 200 8500 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) INFO root:toar_data_v2.py:64 load data for DEBW087 from TOAR-DATA DEBUG root:toar_data_v2.py:70 load o3 INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily HTTP/1.1" 202 153 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/f6081dde-ec47-4719-a6aa-1f1b7384556d HTTP/1.1" 200 153 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/status/f6081dde-ec47-4719-a6aa-1f1b7384556d HTTP/1.1" 302 0 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/result/f6081dde-ec47-4719-a6aa-1f1b7384556d HTTP/1.1" 200 101975 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) DEBUG root:toar_data_v2.py:70 load temp INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily: 'status' INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily HTTP/1.1" 422 634 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) DEBUG root:data_loader.py:128 There was an error for request https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily: 'status' INFO root:time_tracking.py:137 setup_samples finished after 0:00:20 (hh:mm:ss) INFO root:pre_processing.py:489 remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims 
= _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( DEBUG root:pre_processing.py:491 detailed information for removal of station DEBW087: Traceback (most recent call last): File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info file = self._cache[self._key] File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__ value = self._cache[key] KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/root/mlair/data/daily/DEBW087_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))] During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 361, in load_data data = xr.open_dataarray(file_name) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 701, in open_dataarray dataset = open_dataset( File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 572, in open_dataset store = opener(filename_or_obj, **extra_kwargs, **backend_kwargs) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 364, in open return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 314, in __init__ self.format = self.ds.data_model File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 373, in ds return self._acquire() File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 367, in _acquire with self._manager.acquire_context(needs_lock) as root: File "/usr/lib64/python3.9/contextlib.py", line 119, in __enter__ return next(self.gen) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context file, cached = self._acquire_with_cache_info(needs_lock) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info file = self._opener(*self._args, **kwargs) File "src/netCDF4/_netCDF4.pyx", line 2353, in netCDF4._netCDF4.Dataset.__init__ File "src/netCDF4/_netCDF4.pyx", line 1963, in netCDF4._netCDF4._ensure_nc_success FileNotFoundError: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW087_o3_temp.nc' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builds/esde/machine-learning/mlair/mlair/run_modules/pre_processing.py", line 486, in f_proc res = data_handler.build(station, name_affix=name_affix, store_processed_data=store, **kwargs) File "/builds/esde/machine-learning/mlair/mlair/data_handler/default_data_handler.py", line 72, in build sp = cls.data_handler(station, **sp_keys) File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 125, in __init__ self.setup_samples() File "/builds/esde/machine-learning/mlair/mlair/helpers/time_tracking.py", line 40, in __call__ return self.__wrapped__(*args, **kwargs) File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 271, in setup_samples self.make_input_target() File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 312, in make_input_target 
data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling, File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 368, in load_data data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling, File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/data_loader.py", line 69, in download_data df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats, File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/toar_data.py", line 19, in download_toar df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim]) File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 403, in __init__ coords, dims = _infer_coords_and_dims(data.shape, coords, dims) File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims raise ValueError( ValueError: different number of dimensions on data and dims: 0 vs 2 INFO root:pre_processing.py:303 ...finished: DEBW087 (75%) DEBUG root:path_config.py:132 Path already exists: /home/root/mlair/data/daily DEBUG root:data_handler_single_station.py:360 DEBW99X: try to load local data from: /home/root/mlair/data/daily/DEBW99X_o3_temp.nc DEBUG root:data_handler_single_station.py:366 DEBW99X: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW99X_o3_temp.nc' DEBUG root:data_handler_single_station.py:367 DEBW99X: load new data INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/o3 HTTP/1.1" 200 156 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/variables/temp HTTP/1.1" 200 161 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW99X HTTP/1.1" 404 54 DEBUG root:data_loader.py:159 There was an error (STATUS 404) for request https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW99X HTTP/1.1" 404 54 DEBUG root:data_loader.py:159 There was an error (STATUS 404) for request 
https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW99X HTTP/1.1" 404 54 DEBUG root:data_loader.py:159 There was an error (STATUS 404) for request https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW99X HTTP/1.1" 404 54 DEBUG root:data_loader.py:159 There was an error (STATUS 404) for request https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X DEBUG urllib3.connectionpool:connectionpool.py:1003 Starting new HTTPS connection (1): toar-data.fz-juelich.de:443 DEBUG urllib3.connectionpool:connectionpool.py:456 https://toar-data.fz-juelich.de:443 "GET /api/v2/stationmeta/DEBW99X HTTP/1.1" 404 54 DEBUG root:data_loader.py:159 There was an error (STATUS 404) for request https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) INFO root:time_tracking.py:137 setup_samples finished after 0:00:06 (hh:mm:ss) INFO root:pre_processing.py:489 remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( DEBUG root:pre_processing.py:491 detailed information for removal of station DEBW99X: Traceback (most recent call last): File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info file = self._cache[self._key] File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__ value = self._cache[key] KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/root/mlair/data/daily/DEBW99X_o3_temp.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))] During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 361, in load_data data = xr.open_dataarray(file_name) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 701, in open_dataarray dataset = open_dataset( File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/api.py", line 572, in open_dataset store = 
opener(filename_or_obj, **extra_kwargs, **backend_kwargs) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 364, in open return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 314, in __init__ self.format = self.ds.data_model File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 373, in ds return self._acquire() File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/netCDF4_.py", line 367, in _acquire with self._manager.acquire_context(needs_lock) as root: File "/usr/lib64/python3.9/contextlib.py", line 119, in __enter__ return next(self.gen) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context file, cached = self._acquire_with_cache_info(needs_lock) File "/opt/venv/lib64/python3.9/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info file = self._opener(*self._args, **kwargs) File "src/netCDF4/_netCDF4.pyx", line 2353, in netCDF4._netCDF4.Dataset.__init__ File "src/netCDF4/_netCDF4.pyx", line 1963, in netCDF4._netCDF4._ensure_nc_success FileNotFoundError: [Errno 2] No such file or directory: b'/home/root/mlair/data/daily/DEBW99X_o3_temp.nc' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/builds/esde/machine-learning/mlair/mlair/run_modules/pre_processing.py", line 486, in f_proc res = data_handler.build(station, name_affix=name_affix, store_processed_data=store, **kwargs) File "/builds/esde/machine-learning/mlair/mlair/data_handler/default_data_handler.py", line 72, in build sp = cls.data_handler(station, **sp_keys) File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 125, in __init__ self.setup_samples() File "/builds/esde/machine-learning/mlair/mlair/helpers/time_tracking.py", line 40, in __call__ return self.__wrapped__(*args, **kwargs) File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 271, in setup_samples self.make_input_target() File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 312, in make_input_target data, self.meta = self.load_data(self.path, self.station, stats_per_var, self.sampling, File "/builds/esde/machine-learning/mlair/mlair/data_handler/data_handler_single_station.py", line 368, in load_data data, meta = data_sources.download_data(file_name, meta_file, station, statistics_per_var, sampling, File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/data_loader.py", line 69, in download_data df_toar, meta_toar = data_sources.toar_data.download_toar(station=station, toar_stats=toar_stats, File "/builds/esde/machine-learning/mlair/mlair/helpers/data_sources/toar_data.py", line 19, in download_toar df_toar_xr = xr.DataArray(df_toar, dims=[time_dim, target_dim]) File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 403, in __init__ coords, dims = _infer_coords_and_dims(data.shape, coords, dims) File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims raise ValueError( ValueError: different number of dimensions on data and dims: 0 vs 2 INFO root:pre_processing.py:303 ...finished: DEBW99X (100%) INFO root:pre_processing.py:305 run for 0:01:06 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (awesome). 
DEBUG root:datastore.py:118 set: stations(general.awesome)=[] DEBUG root:datastore.py:118 set: data_collection(general.awesome)=<mlair.data_handler.iterator.DataCollection object at 0x7fb026c710d0> DEBUG root:datastore.py:118 get: data_collection(general.awesome)=<mlair.data_handler.iterator.DataCollection object at 0x7fb026c710d0> DEBUG root:datastore.py:120 get: data_collection(general)=None DEBUG root:datastore.py:118 get: stations(general.awesome)=[]
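Note on the repeated 'status' failures above: the analysis endpoint is asynchronous. For the dma8eu requests that succeed, the log shows the full pattern: the statistics URL answers 202 Accepted, the client polls /api/v2/analysis/status/<uuid> until it returns a 302 redirect, and the redirect target /api/v2/analysis/result/<uuid> delivers the data. The 422 responses for the maximum statistic carry an error body instead, so looking up the status link raises the KeyError logged as "... : 'status'", and each failed attempt is retried with a doubled timeout (60 s up to 960 s over retry=0..4), which is why one bad variable costs roughly 20 s of setup_samples time per station. A minimal sketch of the polling flow, assuming standard requests semantics; the endpoint layout is taken from the logged URLs, while the query parameter names and the 'status' key in the 202 body are inferred here, not checked against the MLAir source:

    import time
    import requests

    BASE = "https://toar-data.fz-juelich.de/api/v2/analysis"

    def fetch_statistics(series_id, statistic, sampling="daily", poll_every=1.0):
        # 202 Accepted -> poll the status URL until a 302 redirect hands over the result.
        resp = requests.get(f"{BASE}/statistics/",
                            params={"id": series_id, "statistics": statistic,
                                    "sampling": sampling})
        status_url = resp.json()["status"]  # missing in a 422 error body -> KeyError: 'status'
        while True:
            poll = requests.get(status_url)  # requests follows the final 302 automatically
            if poll.history:                 # a redirect occurred => result payload arrived
                return poll.json()
            time.sleep(poll_every)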
----------------------------Captured stderr teardown----------------------------
2023-12-18 17:45:15,390 - INFO: RunEnvironment finished after 0:01:06 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:45:19,832 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_003.json [run_environment.py:__save_tracking:159]
2023-12-18 17:45:19,856 - INFO: Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log [run_environment.py:__move_log_file:147]
-----------------------------Captured log teardown------------------------------
INFO root:run_environment.py:120 RunEnvironment finished after 0:01:06 (hh:mm:ss)
DEBUG root:datastore.py:118 get: logging_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/logging
DEBUG root:datastore.py:118 get: logging_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/logging
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_003.json
DEBUG root:datastore.py:118 get: logging_path(general)=/builds/esde/machine-learning/mlair/TestExperiment_daily/logging
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log
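All three stations are removed for the same reason: once the statistics requests fail, download_toar apparently ends up with no tabular payload, and xr.DataArray(df_toar, dims=[time_dim, target_dim]) at toar_data.py:19 receives an effectively scalar (0-d) object while two dimension names are requested. The ValueError from the tracebacks above can be reproduced in isolation; in this minimal sketch, None stands in for the empty download result (an assumption about its actual shape) and the dimension names are illustrative:

    import xarray as xr

    # A 0-d stand-in for the failed TOAR download; two dims are requested,
    # so xarray rejects the mismatch with exactly the logged message.
    try:
        xr.DataArray(None, dims=["datetime", "variables"])
    except ValueError as err:
        print(err)  # different number of dimensions on data and dims: 0 vs 2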
Failed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_validate_station_serial[None] 63.44
self = <test_pre_processing.TestPreProcessing object at 0x7fb026abed90>
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fb02669e0d0>
obj_with_exp_setup = <mlair.run_modules.pre_processing.PreProcessing object at 0x7fb02669e3d0>
name = None

@pytest.mark.parametrize("name", (None, "tester"))
def test_validate_station_serial(self, caplog, obj_with_exp_setup, name):
pre = obj_with_exp_setup
caplog.set_level(logging.INFO)
stations = pre.data_store.get("stations", "general")
data_preparation = pre.data_store.get("data_handler")
collection, valid_stations = pre.validate_station(data_preparation, stations, set_name=name)
assert isinstance(collection, DataCollection)
assert len(valid_stations) < len(stations)
> assert valid_stations == stations[:-1]
E AssertionError: assert [] == ['DEBW107', '...3', 'DEBW087']
E Right contains 3 more items, first extra item: 'DEBW107'
E Use -v to get the full diff

test/test_run_modules/test_pre_processing.py:109: AssertionError
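The assertion expects only the deliberately invalid station DEBW99X to be dropped (valid_stations == stations[:-1]), but because every maximum-statistic request returned 422, all four stations were removed and valid_stations is empty. A guard at the point where the download is converted would surface the root cause instead of the opaque dimension mismatch; the following is a hypothetical sketch, not MLAir's actual code, and the dimension-name defaults are illustrative:

    import pandas as pd
    import xarray as xr

    def to_dataarray(df_toar, time_dim="datetime", target_dim="variables"):
        # Hypothetical guard: fail loudly on an empty TOAR download instead of
        # letting xr.DataArray raise "different number of dimensions ... 0 vs 2".
        if not isinstance(df_toar, pd.DataFrame) or df_toar.empty:
            raise ValueError("TOAR download returned no data; cannot build a "
                             f"[{time_dim}, {target_dim}] DataArray")
        return xr.DataArray(df_toar, dims=[time_dim, target_dim])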
------------------------------Captured stderr call------------------------------
2023-12-18 17:45:19,867 - INFO: check valid stations started (all) [pre_processing.py:validate_station:262]
2023-12-18 17:45:19,871 - INFO: use serial validate station approach [pre_processing.py:validate_station:296]
2023-12-18 17:45:20,461 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:45:20,483 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:20,704 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:45:20,743 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:21,517 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 [data_loader.py:get_data:152]
2023-12-18 17:45:21,575 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:22,521 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:45:22,677 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:23,489 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:45:23,635 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:23,638 - INFO: load data for DEBW107 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:45:24,563 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:45:26,953 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:27,524 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:45:27,555 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:28,410 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:45:28,454 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:30,042 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:45:30,087 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:32,106 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:45:32,150 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:35,489 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:45:35,534 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:39,540 - INFO: setup_samples finished after 0:00:20 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:39,543 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:45:39,544 - INFO: ...finished: DEBW107 (25%) [pre_processing.py:validate_station:303]
2023-12-18 17:45:39,793 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:45:39,840 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:40,497 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:45:40,543 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:40,765 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152]
2023-12-18 17:45:40,828 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:41,368 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:45:41,525 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:45:41,608 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:45:41,770 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after
0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:41,770 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:41,770 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:41,770 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:41,770 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:41,774 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:41,774 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:41,774 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:41,774 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:41,774 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:42,520 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:42,520 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:42,520 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:42,520 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:42,520 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:44,855 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:44,855 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:44,855 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:44,855 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:44,855 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:45,334 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:45,334 - INFO: connect (retry=0, 
timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:45,334 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:45,334 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:45,334 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:45,359 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:45,359 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:45,359 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:45,359 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:45,359 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:46,053 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:46,053 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:46,053 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:46,053 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:46,053 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:46,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:46,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:46,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:46,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily 
finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:46,098 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:47,546 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:47,546 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:47,546 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:47,546 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:47,546 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:47,591 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:47,591 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:47,591 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:47,591 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:47,591 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:50,122 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:50,122 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:50,122 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:50,122 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:50,122 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:50,166 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 
17:45:50,166 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:50,166 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:50,166 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:50,166 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:53,689 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:53,689 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:53,689 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:53,689 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:53,689 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:53,733 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:53,733 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:53,733 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:53,733 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:53,733 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,739 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,739 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,739 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,739 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,739 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,742 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = 
_infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:57,742 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:57,742 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:57,742 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:57,742 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:45:57,743 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:45:57,743 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:45:57,743 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:45:57,743 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:45:57,743 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:45:57,777 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:45:57,777 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:45:57,777 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:45:57,777 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:45:57,777 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:45:57,825 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,825 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,825 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,825 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:57,825 - INFO: 
https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,363 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:45:58,363 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:45:58,363 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:45:58,363 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:45:58,363 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:45:58,409 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,409 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,409 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,409 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,409 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,549 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:45:58,549 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:45:58,549 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:45:58,549 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:45:58,549 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:45:58,614 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,614 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,614 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,614 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,614 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:58,891 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:45:58,891 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:45:58,891 - INFO: connect (retry=0, timeout=60) 
https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:45:58,891 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:45:58,891 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:45:59,034 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,034 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,034 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,034 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,034 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,080 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:45:59,080 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:45:59,080 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:45:59,080 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:45:59,080 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:45:59,228 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,228 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,228 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,228 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,228 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:45:59,234 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:59,234 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:59,234 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:59,234 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:45:59,234 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 
17:45:59,450 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:59,450 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:59,450 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:59,450 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:45:59,450 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:01,965 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:01,965 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:01,965 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:01,965 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:01,965 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:02,975 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:02,975 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:02,975 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:02,975 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:02,975 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:03,001 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,001 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,001 - INFO: 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,001 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,001 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,650 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:03,650 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:03,650 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:03,650 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:03,650 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:03,695 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,695 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,695 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,695 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:03,695 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:05,525 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:05,525 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:05,525 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:05,525 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:05,525 - INFO: connect (retry=2, timeout=240) 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:05,570 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:05,570 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:05,570 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:05,570 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:05,570 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:08,469 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:08,469 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:08,469 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:08,469 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:08,469 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:08,518 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:08,518 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:08,518 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:08,518 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:08,518 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:11,645 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:11,645 - INFO: connect (retry=4, timeout=960) 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:11,645 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:11,645 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:11,645 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:11,690 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:11,690 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:11,690 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:11,690 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:11,690 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:15,696 - INFO: setup_samples finished after 0:00:18 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:15,696 - INFO: setup_samples finished after 0:00:18 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:15,696 - INFO: setup_samples finished after 0:00:18 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:15,696 - INFO: setup_samples finished after 0:00:18 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:15,696 - INFO: setup_samples finished after 0:00:18 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:15,698 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:46:15,698 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:46:15,698 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:46:15,698 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on 
data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:46:15,698 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:46:15,700 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:46:15,700 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:46:15,700 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:46:15,700 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:46:15,700 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:46:15,952 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:46:15,952 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:46:15,952 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:46:15,952 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:46:15,952 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:46:16,005 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,005 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,005 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,005 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,005 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,076 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:46:16,076 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:46:16,076 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:46:16,076 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:46:16,076 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:46:16,120 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,120 - INFO: 
https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,120 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,120 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,120 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,491 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,491 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,491 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,491 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,491 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,539 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,539 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,539 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,539 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,539 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,897 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,897 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,897 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,897 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,897 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:46:16,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:16,947 - INFO: 
https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:17,112 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:46:17,168 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:17,688 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:46:17,738 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:18,272 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152]
2023-12-18 17:46:18,324 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:18,331 - INFO: setup_samples finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:18,333 - INFO: remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:46:18,335 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:303]
2023-12-18 17:46:18,336 - INFO: run for 0:00:59 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (None). [pre_processing.py:validate_station:305]
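The ValueError quoted in the "remove station" entries above is raised by xarray's constructor check in _infer_coords_and_dims: the downloaded payload evidently collapses to a 0-d array while two dimension names are passed along with it. A minimal sketch (not MLAir code; the dimension names are invented for illustration) that reproduces the same message:

import numpy as np
import xarray as xr

scalar = np.array(np.nan)  # 0-d array, e.g. an empty download reduced to a single value
try:
    # passing two dims for 0-d data trips the shape check in _infer_coords_and_dims
    xr.DataArray(scalar, dims=("datetime", "variables"))  # hypothetical dim names
except ValueError as err:
    print(err)  # -> different number of dimensions on data and dims: 0 vs 2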
-------------------------------Captured log call--------------------------------
INFO root:pre_processing.py:262 check valid stations started (all)
INFO root:pre_processing.py:296 use serial validate station approach
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW107 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:20 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW107 (25%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW013 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:19 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW013 (50%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW087 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:18 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW087 (75%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:03 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW99X (100%)
INFO root:pre_processing.py:305 run for 0:00:59 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (None).
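The retry ladder visible throughout this capture (retry=0 through retry=4 with timeout=60, 120, 240, 480, 960) is a plain exponential backoff: the timeout doubles on every attempt, and after five failures the station is given up. Note that each attempt "finished" within seconds, so the retries were evidently triggered by unusable responses rather than by the timeouts themselves. A minimal sketch of the pattern (assumed for illustration; this is not MLAir's actual data_loader.get_data):

import requests

def get_data(url, max_retries=5, base_timeout=60):
    """Fetch url, doubling the timeout on each retry: 60, 120, 240, 480, 960 s."""
    for retry in range(max_retries):
        timeout = base_timeout * 2 ** retry
        print(f"connect (retry={retry}, timeout={timeout}) {url}")
        try:
            response = requests.get(url, timeout=timeout)
            response.raise_for_status()
            return response.json()
        except (requests.RequestException, ValueError):
            continue  # unusable response or timeout: try again with a doubled timeout
    raise ConnectionError(f"giving up on {url} after {max_retries} attempts")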
----------------------------Captured stderr teardown----------------------------
2023-12-18 17:46:18,362 - INFO: RunEnvironment finished after 0:00:59 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:46:19,806 - INFO: PreProcessing finished after 0:02:52 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:46:19,807 - INFO: PreProcessing finished after 0:02:10 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:46:23,299 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_004.json [run_environment.py:__save_tracking:159]
2023-12-18 17:46:23,324 - INFO: Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log [run_environment.py:__move_log_file:147]
-----------------------------Captured log teardown------------------------------
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:59 (hh:mm:ss)
INFO root:run_environment.py:120 PreProcessing finished after 0:02:52 (hh:mm:ss)
INFO root:run_environment.py:120 PreProcessing finished after 0:02:10 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_004.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log
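The "finished after h:mm:ss" entries above are emitted when a timing scope closes: the per-request lines come from a context manager's __exit__ (time_tracking.py:137), the stage totals from RunEnvironment's __del__ (run_environment.py:120). A minimal sketch of the context-manager half, with an invented class body (MLAir's actual TimeTracking implementation is not shown in this report):

import datetime as dt
import logging
import time

class TimeTracking:
    """Log '<name> finished after h:mm:ss (hh:mm:ss)' when the scope closes."""
    def __init__(self, name):
        self.name = name
    def __enter__(self):
        self._start = time.time()
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        # report at least one second, matching the 0:00:01 entries above
        elapsed = dt.timedelta(seconds=max(1, round(time.time() - self._start)))
        logging.getLogger().info(f"{self.name} finished after {elapsed} (hh:mm:ss)")

with TimeTracking("setup_samples"):
    time.sleep(0.2)  # placeholder for the actual work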
Failed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_validate_station_serial[tester] 65.86
self = <test_pre_processing.TestPreProcessing object at 0x7fb025dd7f70>
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fb02669e040>
obj_with_exp_setup = <mlair.run_modules.pre_processing.PreProcessing object at 0x7fb025ec5f70>
name = 'tester'

@pytest.mark.parametrize("name", (None, "tester"))
def test_validate_station_serial(self, caplog, obj_with_exp_setup, name):
pre = obj_with_exp_setup
caplog.set_level(logging.INFO)
stations = pre.data_store.get("stations", "general")
data_preparation = pre.data_store.get("data_handler")
collection, valid_stations = pre.validate_station(data_preparation, stations, set_name=name)
assert isinstance(collection, DataCollection)
assert len(valid_stations) < len(stations)
> assert valid_stations == stations[:-1]
E AssertionError: assert [] == ['DEBW107', '...3', 'DEBW087']
E Right contains 3 more items, first extra item: 'DEBW107'
E Use -v to get the full diff

test/test_run_modules/test_pre_processing.py:109: AssertionError
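The assertion itself is straightforward: the fixture deliberately includes the dummy station DEBW99X, so the test expects exactly that one station to be dropped and the first three to survive. In this run every station was removed upstream (see the "Found 0/4 valid stations" line above), so valid_stations came back empty. Restated in isolation with the values from this report:

stations = ['DEBW107', 'DEBW013', 'DEBW087', 'DEBW99X']
expected = stations[:-1]     # only the dummy station DEBW99X should be dropped
observed = []                # observed result: all four stations were removed
assert expected == ['DEBW107', 'DEBW013', 'DEBW087']
assert observed != expected  # hence the AssertionError reported above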
------------------------------Captured stderr call------------------------------
2023-12-18 17:46:23,335 - INFO: check valid stations started (tester) [pre_processing.py:validate_station:262]
2023-12-18 17:46:23,339 - INFO: use serial validate station approach [pre_processing.py:validate_station:296]
2023-12-18 17:46:24,105 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:46:24,126 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:24,748 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:46:24,786 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:25,593 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 [data_loader.py:get_data:152]
2023-12-18 17:46:25,653 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:26,120 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:46:26,269 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:27,235 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:46:27,383 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:27,386 - INFO: load data for DEBW107 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:46:28,184 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:30,605 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:30,668 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:30,693 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:31,419 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:31,455 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:33,325 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:33,370 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:35,969 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:36,014 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:39,507 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:39,552 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:43,558 - INFO: setup_samples finished after 0:00:21 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:43,560 - INFO: remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489]
2023-12-18 17:46:43,562 - INFO: ...finished: DEBW107 (25%) [pre_processing.py:validate_station:303]
2023-12-18 17:46:43,754 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152]
2023-12-18 17:46:43,801 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:44,302 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152]
2023-12-18 17:46:44,348 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:44,725 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 [data_loader.py:get_data:152]
2023-12-18 17:46:44,796 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:45,514 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 [data_loader.py:get_data:152]
2023-12-18 17:46:45,673 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:46,297 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 [data_loader.py:get_data:152]
2023-12-18 17:46:46,449 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:46,452 - INFO: load data for DEBW013 from TOAR-DATA [toar_data_v2.py:download_toar:64]
2023-12-18 17:46:47,195 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:49,556 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:50,376 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:50,405 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:51,261 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:51,306 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:52,352 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:52,411 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137]
2023-12-18 17:46:54,671 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115]
2023-12-18 17:46:54,723 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18
17:46:54,723 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:54,723 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:54,723 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:54,723 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:58,031 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:58,031 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:58,031 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:58,031 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:58,031 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:46:58,078 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:58,078 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:58,078 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:58,078 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:46:58,078 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,085 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,085 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,085 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,085 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,085 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,088 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = 
_infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:02,088 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:02,088 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:02,088 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:02,088 - INFO: remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:02,089 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:02,089 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:02,089 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:02,089 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:02,089 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:02,285 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:02,285 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:02,285 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:02,285 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:02,285 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:02,335 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,335 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,335 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,335 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,335 - INFO: 
https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,637 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:02,637 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:02,637 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:02,637 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:02,637 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:02,688 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,688 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,688 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,688 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,688 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,904 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:47:02,904 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:47:02,904 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:47:02,904 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:47:02,904 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 [data_loader.py:get_data:152] 2023-12-18 17:47:02,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:02,947 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:03,552 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:47:03,552 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:47:03,552 - INFO: connect (retry=0, timeout=60) 
https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:47:03,552 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:47:03,552 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 [data_loader.py:get_data:152] 2023-12-18 17:47:03,698 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:03,698 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:03,698 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:03,698 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:03,698 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:04,229 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:47:04,229 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:47:04,229 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:47:04,229 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:47:04,229 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 [data_loader.py:get_data:152] 2023-12-18 17:47:04,375 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:04,375 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:04,375 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:04,375 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:04,375 - INFO: https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:04,378 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:47:04,378 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:47:04,378 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:47:04,378 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 17:47:04,378 - INFO: load data for DEBW087 from TOAR-DATA [toar_data_v2.py:download_toar:64] 2023-12-18 
17:47:04,905 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:04,905 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:04,905 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:04,905 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:04,905 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:07,239 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,239 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,239 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,239 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,239 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,376 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:07,376 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:07,376 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:07,376 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:07,376 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:07,402 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,402 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,402 - INFO: 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,402 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:07,402 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:08,369 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:08,369 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:08,369 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:08,369 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:08,369 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:08,413 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:08,413 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:08,413 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:08,413 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:08,413 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:10,320 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:10,320 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:10,320 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:10,320 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:10,320 - INFO: connect (retry=2, timeout=240) 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:10,366 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:10,366 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:10,366 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:10,366 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:10,366 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:12,715 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:12,715 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:12,715 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:12,715 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:12,715 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:12,769 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:12,769 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:12,769 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:12,769 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:12,769 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:16,373 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:16,373 - INFO: connect (retry=4, timeout=960) 
https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:16,373 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:16,373 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:16,373 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily [data_loader.py:get_data_with_query:115] 2023-12-18 17:47:16,419 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:16,419 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:16,419 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:16,419 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:16,419 - INFO: https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,425 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,425 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,425 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,425 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,425 - INFO: setup_samples finished after 0:00:19 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,428 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:20,428 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:20,428 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:20,428 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on 
data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:20,428 - INFO: remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:20,429 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:20,429 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:20,429 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:20,429 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:20,429 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:20,794 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:20,794 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:20,794 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:20,794 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:20,794 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 [data_loader.py:get_data:152] 2023-12-18 17:47:20,855 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,855 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,855 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,855 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:20,855 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,086 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:21,086 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:21,086 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:21,086 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:21,086 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp [data_loader.py:get_data:152] 2023-12-18 17:47:21,131 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,131 - INFO: 
https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,131 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,131 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,131 - INFO: https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,451 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:21,451 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:21,451 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:21,451 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:21,451 - INFO: connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:21,500 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,500 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,500 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,500 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:21,500 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,278 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,278 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,278 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,278 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,278 - INFO: connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,330 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,330 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,330 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,330 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,330 - INFO: 
https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,942 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,942 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,942 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,942 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,942 - INFO: connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:22,992 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,992 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,992 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,992 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:22,992 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:23,683 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:23,683 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:23,683 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:23,683 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:23,683 - INFO: connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:23,735 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:23,735 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:23,735 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:23,735 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:23,735 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,007 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:24,007 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:24,007 - INFO: 
connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:24,007 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:24,007 - INFO: connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X [data_loader.py:get_data:152] 2023-12-18 17:47:24,059 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,059 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,059 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,059 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,059 - INFO: https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,065 - INFO: setup_samples finished after 0:00:04 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,065 - INFO: setup_samples finished after 0:00:04 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,065 - INFO: setup_samples finished after 0:00:04 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,065 - INFO: setup_samples finished after 0:00:04 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,065 - INFO: setup_samples finished after 0:00:04 (hh:mm:ss) [time_tracking.py:__exit__:137] 2023-12-18 17:47:24,067 - INFO: remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:24,067 - INFO: remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:24,067 - INFO: remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:24,067 - INFO: remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:24,067 - INFO: remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in 
_infer_coords_and_dims | raise ValueError( [pre_processing.py:f_proc:489] 2023-12-18 17:47:24,069 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:24,069 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:24,069 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:24,069 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:24,069 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:303] 2023-12-18 17:47:24,070 - INFO: run for 0:01:01 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (tester). [pre_processing.py:validate_station:305] 2023-12-18 17:47:24,070 - INFO: run for 0:01:01 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (tester). [pre_processing.py:validate_station:305] 2023-12-18 17:47:24,070 - INFO: run for 0:01:01 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (tester). [pre_processing.py:validate_station:305] 2023-12-18 17:47:24,070 - INFO: run for 0:01:01 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (tester). [pre_processing.py:validate_station:305] 2023-12-18 17:47:24,070 - INFO: run for 0:01:01 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (tester). [pre_processing.py:validate_station:305]
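The connect lines above trace the loader's escalating retry ladder: five attempts (retry=0 through retry=4) against the same statistics endpoint, with the timeout doubling from 60 s up to 960 s, re-issued whenever the completed request did not yield a usable payload. The following is a minimal sketch of that pattern only; the function name, the use of requests, and the empty-payload check are assumptions for illustration, not MLAir's actual data_loader implementation.

    import time
    import requests  # assumption: an HTTP client comparable to what data_loader uses

    def fetch_with_retries(url, max_retries=5, base_timeout=60):
        """Sketch of the retry ladder in the log: timeouts of 60, 120, 240, 480, 960 s."""
        for retry in range(max_retries):
            timeout = base_timeout * 2 ** retry  # doubles on every attempt
            print(f"connect (retry={retry}, timeout={timeout}) {url}")
            try:
                response = requests.get(url, timeout=timeout)
                response.raise_for_status()
                payload = response.json()
                if payload:  # a completed but empty answer still triggers a retry
                    return payload
            except requests.RequestException:
                pass  # fall through to the next, more patient attempt
            time.sleep(1)  # brief pause before retrying, as the log timestamps suggest
        raise RuntimeError(f"no usable response from {url} after {max_retries} attempts")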
-------------------------------Captured log call--------------------------------
INFO root:pre_processing.py:262 check valid stations started (tester) INFO root:pre_processing.py:296 use serial validate station approach INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW107 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=5 finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=88&variable_id=21 finished after 0:00:01 (hh:mm:ss) INFO root:toar_data_v2.py:64 load data for DEBW107 from TOAR-DATA INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18650&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss) INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=23329&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss) INFO root:time_tracking.py:137 setup_samples finished after 0:00:21 (hh:mm:ss) INFO root:pre_processing.py:489 
remove station DEBW107 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW107 (25%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW013 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=84&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW013 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18629&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22604&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:19 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW013 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW013 (50%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW087 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=5 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/search/?station_id=83&variable_id=21 finished after 0:00:01 (hh:mm:ss)
INFO root:toar_data_v2.py:64 load data for DEBW087 from TOAR-DATA
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=18649&statistics=dma8eu&sampling=daily finished after 0:00:03 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:115 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/analysis/statistics/?id=22643&statistics=maximum&sampling=daily finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:19 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW087 because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW087 (75%)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/o3
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/o3 finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/variables/temp
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/variables/temp finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=0, timeout=60) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=1, timeout=120) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=2, timeout=240) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=3, timeout=480) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:data_loader.py:152 connect (retry=4, timeout=960) https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X
INFO root:time_tracking.py:137 https://toar-data.fz-juelich.de/api/v2/stationmeta/DEBW99X finished after 0:00:01 (hh:mm:ss)
INFO root:time_tracking.py:137 setup_samples finished after 0:00:04 (hh:mm:ss)
INFO root:pre_processing.py:489 remove station DEBW99X because it raised an error: different number of dimensions on data and dims: 0 vs 2 -> coords, dims = _infer_coords_and_dims(data.shape, coords, dims) | File "/opt/venv/lib64/python3.9/site-packages/xarray/core/dataarray.py", line 121, in _infer_coords_and_dims | raise ValueError(
INFO root:pre_processing.py:303 ...finished: DEBW99X (100%)
INFO root:pre_processing.py:305 run for 0:01:01 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (tester).
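Note on the connection pattern above: each failed request is retried with a doubled timeout (60, 120, 240, 480, 960 seconds over retries 0-4). A minimal sketch of such a backoff loop, assuming a plain requests-based helper (hypothetical code, not MLAir's actual data_loader):

    import requests

    def connect_with_backoff(url, retries=5, base_timeout=60):
        # Double the timeout on every attempt: 60, 120, 240, 480, 960 s,
        # mirroring the retry/timeout pairs in the log above.
        for retry in range(retries):
            timeout = base_timeout * 2 ** retry
            print(f"connect (retry={retry}, timeout={timeout}) {url}")
            try:
                response = requests.get(url, timeout=timeout)
                response.raise_for_status()
                return response.json()
            except requests.RequestException:
                continue  # try again with a longer timeout
        raise ConnectionError(f"no response from {url} after {retries} retries")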
----------------------------Captured stderr teardown----------------------------
2023-12-18 17:47:24,096 - INFO: RunEnvironment finished after 0:01:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:47:29,188 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_005.json [run_environment.py:__save_tracking:159]
2023-12-18 17:47:29,215 - INFO: Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log [run_environment.py:__move_log_file:147]
-----------------------------Captured log teardown------------------------------
INFO root:run_environment.py:120 RunEnvironment finished after 0:01:01 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_005.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log
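The error that removes all four stations, "different number of dimensions on data and dims: 0 vs 2", is raised by xarray's DataArray constructor whenever a 0-dimensional value is combined with two dimension names. A minimal reproduction, with hypothetical dimension names (not taken from MLAir code):

    import numpy as np
    import xarray as xr

    scalar = np.float64(0.0)  # 0-dimensional data, e.g. an empty API response
    # Raises ValueError: different number of dimensions on data and dims: 0 vs 2
    xr.DataArray(scalar, dims=["datetime", "variables"])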
Failed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_validate_station_parallel 25.95
self = <test_pre_processing.TestPreProcessing object at 0x7fb025dab760>
mock_pool = <MagicMock name='Pool' id='140394526280720'>
mock_cpu = <MagicMock name='cpu_count' id='140394508818800'>
caplog = <_pytest.logging.LogCaptureFixture object at 0x7fb02669e6a0>
obj_with_exp_setup = <mlair.run_modules.pre_processing.PreProcessing object at 0x7fb026b706d0>

@mock.patch("psutil.cpu_count", return_value=3)
@mock.patch("multiprocessing.Pool", return_value=multiprocessing.Pool(3))
def test_validate_station_parallel(self, mock_pool, mock_cpu, caplog, obj_with_exp_setup):
pre = obj_with_exp_setup
caplog.clear()
caplog.set_level(logging.INFO)
stations = pre.data_store.get("stations", "general")
data_preparation = pre.data_store.get("data_handler")
collection, valid_stations = pre.validate_station(data_preparation, stations, set_name=None)
assert isinstance(collection, DataCollection)
assert len(valid_stations) < len(stations)
> assert valid_stations == stations[:-1]
E AssertionError: assert [] == ['DEBW107', '...3', 'DEBW087']
E Right contains 3 more items, first extra item: 'DEBW107'
E Use -v to get the full diff

test/test_run_modules/test_pre_processing.py:127: AssertionError
------------------------------Captured stderr call------------------------------
2023-12-18 17:47:29,227 - INFO: check valid stations started (all) [pre_processing.py:validate_station:262]
2023-12-18 17:47:29,231 - INFO: use parallel validate station approach [pre_processing.py:validate_station:277]
2023-12-18 17:47:29,231 - INFO: running 3 processes in parallel [pre_processing.py:validate_station:279]
2023-12-18 17:47:48,687 - INFO: ...finished: DEBW107 (25%) [pre_processing.py:validate_station:286]
2023-12-18 17:47:48,687 - INFO: ...finished: DEBW013 (50%) [pre_processing.py:validate_station:286]
2023-12-18 17:47:48,688 - INFO: ...finished: DEBW087 (75%) [pre_processing.py:validate_station:286]
2023-12-18 17:47:50,436 - INFO: ...finished: DEBW99X (100%) [pre_processing.py:validate_station:286]
2023-12-18 17:47:50,488 - INFO: run for 0:00:22 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (None). [pre_processing.py:validate_station:305]
-------------------------------Captured log call--------------------------------
INFO root:pre_processing.py:262 check valid stations started (all)
INFO root:pre_processing.py:277 use parallel validate station approach
INFO root:pre_processing.py:279 running 3 processes in parallel
INFO root:pre_processing.py:286 ...finished: DEBW107 (25%)
INFO root:pre_processing.py:286 ...finished: DEBW013 (50%)
INFO root:pre_processing.py:286 ...finished: DEBW087 (75%)
INFO root:pre_processing.py:286 ...finished: DEBW99X (100%)
INFO root:pre_processing.py:305 run for 0:00:22 (hh:mm:ss) to check 4 station(s). Found 0/4 valid stations (None).
----------------------------Captured stderr teardown----------------------------
2023-12-18 17:47:50,497 - INFO: RunEnvironment finished after 0:00:22 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:47:55,148 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_006.json [run_environment.py:__save_tracking:159]
2023-12-18 17:47:55,176 - INFO: Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log [run_environment.py:__move_log_file:147]
-----------------------------Captured log teardown------------------------------
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:22 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_006.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log
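For context, this test patches psutil.cpu_count and multiprocessing.Pool to force the parallel branch of validate_station; the assertion fails because, as in the setup error above, no station survives validation. The log lines suggest a worker-pool pattern along these lines (a rough sketch with hypothetical names, not the actual MLAir method):

    import multiprocessing
    import psutil

    def validate_stations_parallel(stations, check_station):
        # Cap the pool size by CPU count and number of stations; with the
        # mocked cpu_count of 3 this yields "running 3 processes in parallel".
        n_process = min(psutil.cpu_count(), len(stations))
        with multiprocessing.Pool(n_process) as pool:
            results = pool.map(check_station, stations)
        return [s for s, ok in zip(stations, results) if ok]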
Passed test/test_configuration/test_defaults.py::TestGetDefaults::test_get_defaults 0.00
No log output captured.
Passed test/test_configuration/test_defaults.py::TestAllDefaults::test_training_parameters 0.00
No log output captured.
Passed test/test_configuration/test_defaults.py::TestAllDefaults::test_data_handler_parameters 0.00
No log output captured.
Passed test/test_configuration/test_defaults.py::TestAllDefaults::test_subset_parameters 0.00
No log output captured.
Passed test/test_configuration/test_defaults.py::TestAllDefaults::test_hpc_parameters 0.00
No log output captured.
Passed test/test_configuration/test_defaults.py::TestAllDefaults::test_postprocessing_parameters 0.00
No log output captured.
Passed test/test_configuration/test_join_settings.py::TestJoinSettings::test_no_args 0.00
No log output captured.
Passed test/test_configuration/test_join_settings.py::TestJoinSettings::test_daily 0.00
No log output captured.
Passed test/test_configuration/test_join_settings.py::TestJoinSettings::test_hourly 0.00
No log output captured.
Passed test/test_configuration/test_join_settings.py::TestJoinSettings::test_unknown_sampling 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestPrepareHost::test_prepare_host 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestPrepareHost::test_prepare_host_unknown 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestPrepareHost::test_prepare_host_given_path 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestPrepareHost::test_error_handling 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestPrepareHost::test_os_path_exists 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestSetExperimentName::test_set_experiment_name 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestSetExperimentName::test_set_experiment_name_sampling 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestSetExperimentName::test_set_experiment_name_tuple_sampling 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestSetExperimentName::test_set_experiment_path 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestSetExperimentName::test_set_experiment_path_given_path 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestSetBootstrapPath::test_bootstrap_path_is_none 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestSetBootstrapPath::test_bootstap_path_is_given 0.00
No log output captured.
Passed test/test_configuration/test_path_config.py::TestCheckPath::test_check_path_and_create 0.00
-------------------------------Captured log call--------------------------------
DEBUG root:path_config.py:125 Created path: data/test
DEBUG root:path_config.py:132 Path already exists: data/test
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_required_attributes 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_init 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_build 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_own_args 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_skip_args 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_transformation 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_get_X 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_get_Y 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_get_data 0.00
No log output captured.
Passed test/test_data_handler/test_abstract_data_handler.py::TestAbstractDataHandler::test_get_coordinates 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSampling::test_data_handler 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSampling::test_data_handler_transformation 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSampling::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_init 0.18
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_update_kwargs_single_to_tuple 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_update_kwargs_tuple 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_update_kwargs_default 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_update_kwargs_assert_failure 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_setup_samples 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_load_and_interpolate 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_set_inputs_and_targets 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingSingleStation::test_setup_data_path 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingWithFilterSingleStation::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingWithFirFilter::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingWithFirFilterSingleStation::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingWithClimateFirFilter::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_mixed_sampling.py::TestDataHandlerMixedSamplingWithClimateFirFilterSingleStation::test_requirements 0.01
No log output captured.
Passed test/test_data_handler/test_data_handler_with_filter.py::TestDataHandlerFilter::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_with_filter.py::TestDataHandlerFilterSingleStation::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_with_filter.py::TestDataHandlerFirFilter::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_with_filter.py::TestDataHandlerFirFilterSingleStation::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_with_filter.py::TestDataHandlerClimateFirFilter::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_data_handler_with_filter.py::TestDataHandlerClimateFirFilterSingleStation::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_default_data_handler.py::TestDefaultDataHandler::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_default_data_handler_single_station.py::TestDataHandlerSingleStation::test_requirements 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestStandardIterator::test_blank 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestStandardIterator::test_init 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestStandardIterator::test_next 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestDataCollection::test_init 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestDataCollection::test_iter 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestDataCollection::test_add 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestDataCollection::test_name 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestDataCollection::test_set_mapping 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestDataCollection::test_getitem 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestDataCollection::test_keys 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_init 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_cleanup_path 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_len 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_concatenate 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_prepare_batches 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_prepare_batches_upsampling 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_prepare_batches_no_remaining 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_data_generation 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_getitem 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_on_epoch_end 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_get_model_rank_no_model 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_get_model_rank_single_output_branch 0.07
------------------------------Captured stdout call------------------------------
Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 14, 1, 2)]        0
_________________________________________________________________
flatten (Flatten)            (None, 28)                0
_________________________________________________________________
dense (Dense)                (None, 64)                1856
_________________________________________________________________
prelu_1 (PReLU)              (None, 64)                64
_________________________________________________________________
dense_1 (Dense)              (None, 32)                2080
_________________________________________________________________
prelu_2 (PReLU)              (None, 32)                32
_________________________________________________________________
dense_2 (Dense)              (None, 16)                528
_________________________________________________________________
prelu_3 (PReLU)              (None, 16)                16
_________________________________________________________________
dense_3 (Dense)              (None, 3)                 51
_________________________________________________________________
linear_output (Activation)   (None, 3)                 0
=================================================================
Total params: 4,627
Trainable params: 4,627
Non-trainable params: 0
_________________________________________________________________
None
------------------------------Captured stderr call------------------------------
2023-12-18 17:33:31.805945: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory
2023-12-18 17:33:31.805996: W tensorflow/stream_executor/cuda/cuda_driver.cc:269] failed call to cuInit: UNKNOWN ERROR (303)
2023-12-18 17:33:31.806038: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:156] kernel driver does not appear to be running on this host (runner-r3wxqhgr-project-2411-concurrent-0): /proc/driver/nvidia/version does not exist
2023-12-18 17:33:31.806298: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 AVX512F FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
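For reference, the summary printed above corresponds to a small fully connected network. The following sketch rebuilds the same layer stack with matching parameter counts; the layer order is read directly from the summary, everything else (API style, layer names) is assumed:

    from tensorflow import keras
    from tensorflow.keras import layers

    inputs = keras.Input(shape=(14, 1, 2))                 # input_1
    x = layers.Flatten()(inputs)                           # (None, 28)
    x = layers.Dense(64)(x)                                # 28*64 + 64 = 1856
    x = layers.PReLU(name="prelu_1")(x)                    # 64
    x = layers.Dense(32)(x)                                # 64*32 + 32 = 2080
    x = layers.PReLU(name="prelu_2")(x)                    # 32
    x = layers.Dense(16)(x)                                # 32*16 + 16 = 528
    x = layers.PReLU(name="prelu_3")(x)                    # 16
    x = layers.Dense(3)(x)                                 # 16*3 + 3 = 51
    outputs = layers.Activation("linear", name="linear_output")(x)
    model = keras.Model(inputs, outputs)                   # 4,627 params total
    model.summary()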
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_get_model_rank_multiple_output_branch 0.08
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestKerasIterator::test_get_model_rank_error 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestGetNumberOfMiniBatches::test_get_number_of_mini_batches 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestGetBatch::test_get_batch 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestSaveToPickle::test_save_to_pickle 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestPermuteData::test_permute_data 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestFProc::test_f_proc 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestFProc::test_f_proc_no_remaining 0.00
No log output captured.
Passed test/test_data_handler/test_iterator.py::TestFProc::test_f_proc_X_Y 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestAbstractDataStore::test_init 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestAbstractDataStore::test_clear_data_store 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_put 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_get 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_get_with_sub_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_get_with_not_existing_sub_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_raise_not_in_data_store 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_get_default 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_search 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_raise_not_in_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_list_all_scopes 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_search_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_search_empty_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_list_all_names 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_search_scope_and_all_superiors 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_search_scope_return_all 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_search_scope_and_all_superiors_return_all 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_create_args_dict_default_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_create_args_dict_given_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_create_args_dict_missing_entry 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_set_args_from_dict 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByVariable::test_no_scope_given 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_put_with_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_get 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_get_with_sub_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_get_with_not_existing_sub_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_raise_not_in_data_store 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_get_default 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_search 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_raise_not_in_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_list_all_scopes 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_search_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_search_empty_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_list_all_names 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_search_scope_and_all_superiors 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_search_scope_return_all 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_search_scope_and_all_superiors_return_all 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_create_args_dict_default_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_create_args_dict_given_scope 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_create_args_dict_missing_entry 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_set_args_from_dict 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestDataStoreByScope::test_no_scope_given 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestCorrectScope::test_init 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestTracking::test_init 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestTracking::test_track_first_entry 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestTracking::test_track_second_entry 0.00
No log output captured.
Passed test/test_helpers/test_datastore.py::TestTracking::test_decrypt_args 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_combine_observation_and_apriori_no_new_dim 0.03
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_combine_observation_and_apriori_with_new_dim 0.02
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_shift_data 0.01
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_index_array 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_tmp_dimension 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_tmp_dimension_iter_limit 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_next_order 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_next_order_with_kzf 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_calculate_filter_coefficients 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_monthly_mean 0.06
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_monthly_mean_sampling 0.10
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_monthly_mean_sel_opts 0.03
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_compute_hourly_mean_per_month 0.14
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_compute_hourly_mean_per_month_no_anomaly 0.14
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_seasonal_cycle_of_hourly_mean 0.23
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_seasonal_hourly_mean 1.06
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_seasonal_hourly_mean_sel_opts 0.53
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_unity_array 0.02
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_extend_apriori_at_end 0.02
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_extend_apriori_at_start 0.02
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_get_year_interval 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_time_range_extend 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_properties 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_trim_data_to_minimum_length 0.03
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_create_full_filter_result_array 0.01
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_clim_filter 11.83
No log output captured.
Passed test/test_helpers/test_filter.py::TestClimateFIRFilter::test_clim_filter_kwargs 29.52
No log output captured.
Passed test/test_helpers/test_filter.py::TestFirFilterConvolve::test_fir_filter_convolve 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestFirwinKzf::test_firwin_kzf 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestFilterWidthKzf::test_filter_width_kzf 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestOmegaNullKzf::test_omega_null_kzf 0.00
No log output captured.
Passed test/test_helpers/test_filter.py::TestOmegaNullKzf::test_omega_null_kzf_alpha 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestDeg2RadAllPoints::test_deg2rad_all_points_scalar_inputs[value0] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestDeg2RadAllPoints::test_deg2rad_all_points_scalar_inputs[value1] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestDeg2RadAllPoints::test_deg2rad_all_points_scalar_inputs[custom_np_arrays] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestDeg2RadAllPoints::test_deg2rad_all_points_xr_arr_inputs 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_scalars[90.0-0.0--90.0-0.0-True-3.141592653589793] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_scalars[1.5707963267948966-0.0--1.5707963267948966-0.0-False-3.141592653589793] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_scalars[0.0-0.0-0.0-180.0-True-3.141592653589793] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_scalars[0.0-0.0-0.0-3.141592653589793-False-3.141592653589793] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_scalars[0.0-0.0--45.0-0-True-0.7853981633974483] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_scalars[0.0-0.0--0.7853981633974483-0-False-0.7853981633974483] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_fields_and_scalars[lat10-lon10--90.0-0.0-True-expected_dist0] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_fields_and_scalars[lat11-lon11-lat21-lon21-True-expected_dist1] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_missmatch_dimensions[lat10-0.0-0.0-0.0-True] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_missmatch_dimensions[0.0-lon11-0.0-0.0-True] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_missmatch_dimensions[0.0-0.0-lat22-0.0-True] 0.00
No log output captured.
Passed test/test_helpers/test_geofunctions.py::TestHaversineDist::test_haversine_dist_on_unit_sphere_missmatch_dimensions[0.0-0.0-0.0-lon23-True] 0.00
No log output captured.
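The parametrized haversine cases above encode great-circle distances on the unit sphere: antipodal points such as (90°, 0°) and (-90°, 0°) are π apart, (0°, 0°) to (-45°, 0°) gives π/4, and the boolean flag switches between degree and radian input. A minimal haversine sketch consistent with those expectations (hypothetical helper, not the module under test):

    import numpy as np

    def haversine_dist(lat1, lon1, lat2, lon2, deg=True, radius=1.0):
        # With radius=1 the result is the central angle in radians.
        if deg:
            lat1, lon1, lat2, lon2 = map(np.deg2rad, (lat1, lon1, lat2, lon2))
        a = (np.sin((lat2 - lat1) / 2) ** 2
             + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
        return 2 * radius * np.arcsin(np.sqrt(a))

    assert np.isclose(haversine_dist(90.0, 0.0, -90.0, 0.0), np.pi)
    assert np.isclose(haversine_dist(0.0, 0.0, -45.0, 0.0), np.pi / 4)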
Passed test/test_helpers/test_helpers.py::TestToList::test_to_list 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test_init 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test__start 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test__end 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test__duration 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test_repr 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test_run 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test_stop 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test_duration 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test_enter_exit 0.00
-------------------------------Captured log call--------------------------------
INFO root:time_tracking.py:137 undefined job finished after 0:00:01 (hh:mm:ss)
Passed test/test_helpers/test_helpers.py::TestTimeTracking::test_name_enter_exit 0.00
-------------------------------Captured log call--------------------------------
INFO root:time_tracking.py:137 my job finished after 0:00:01 (hh:mm:ss)
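The two log lines above are produced by TimeTracking used as a context manager, once with the default job name and once with name="my job". A simplified sketch of that pattern (the real class in MLAir's time_tracking module has more features):

    import logging
    import time
    from datetime import timedelta

    class TimeTracking:
        def __init__(self, name="undefined job"):
            self.name = name

        def __enter__(self):
            self._start = time.time()
            return self

        def __exit__(self, exc_type, exc_val, exc_tb):
            duration = timedelta(seconds=round(time.time() - self._start))
            logging.info(f"{self.name} finished after {duration} (hh:mm:ss)")

    with TimeTracking(name="my job"):
        time.sleep(1)  # logs "my job finished after 0:00:01 (hh:mm:ss)"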
Passed test/test_helpers/test_helpers.py::TestPytestRegex::test_pytest_regex_init 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestPytestRegex::test_pytest_regex_eq 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestPytestRegex::test_pytest_regex_repr 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestDictToXarray::test_dict_to_xarray 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestDictToXarray::test_dict_to_xarray_single_entry 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestFloatRound::test_float_round_ceil 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestFloatRound::test_float_round_decimals 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestFloatRound::test_float_round_type 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestFloatRound::test_float_round_negative 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_big_numbers 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_float_numbers 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_small_numbers 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_zero 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_negative_numbers 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_wrong_significance 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_floor 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_floor_neg_numbers 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_ceil 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_ceil_neg_numbers 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRelativeRound::test_relative_round_ceil_floor 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestGetOrder::test_get_order 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestGetOrder::test_get_order_neg_orders 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestGetOrder::test_get_order_neg_numbers 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestSelectFromDict::test_select 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestSelectFromDict::test_select_no_dict_given 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestSelectFromDict::test_select_remove_none 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestSelectFromDict::test_select_condition 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRemoveItems::test_dict_remove_single 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRemoveItems::test_dict_remove_multiple 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRemoveItems::test_list_remove_single 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRemoveItems::test_list_remove_multiple 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRemoveItems::test_remove_missing_argument 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestRemoveItems::test_remove_not_supported_type 0.02
No log output captured.
Passed test/test_helpers/test_helpers.py::TestLogger::test_init_default 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestLogger::test_setup_logging_path_none 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestLogger::test_setup_logging_path_given 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestLogger::test_logger_console_level0 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestLogger::test_logger_console_level1 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestLogger::test_logger_console_level_wrong_type 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestExtractValue::test_extract 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestExtractValue::test_extract_multiple_elements 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestIsXarray::test_is_xarray_xr_input 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestIsXarray::test_is_xarray_other_input 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_xrdata_in 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_npdata_in_nokwargs 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_npdata_in_nokwargs_default_true 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_npdata_in_kwargs[False] 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_npdata_in_kwargs[True] 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_int_float_in_nokwargs_default_true[1] 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_int_float_in_nokwargs_default_true[2.0] 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_wrong_type_in_default_true_nokwargs[wrong_input0] 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_wrong_type_in_default_true_nokwargs[wrong_input1] 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_wrong_type_in_default_true_nokwargs[abc] 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestConvert2xrDa::test_convert2xrda_dask_in_default_true_nokwargs 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestSortLike::test_sort_like 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestSortLike::test_sort_like_not_unique 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestSortLike::test_sort_like_missing_element 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestFilterDictByValue::test_filter_dict_by_value 0.00
No log output captured.
Passed test/test_helpers/test_helpers.py::TestFilterDictByValue::test_filter_dict_by_value_not_avail 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_standardise[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_standardise[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_standardise_inverse[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_standardise_inverse[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_apply_standardise_inverse[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_apply_standardise_inverse[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_standardise_apply[pandas-pd_mean-pd_std-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestStandardise::test_standardise_apply[xarray-xr_mean-xr_std-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_centre[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_centre[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_centre_inverse[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_centre_inverse[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_apply_centre_inverse[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_apply_centre_inverse[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_centre_apply[pandas-pd_mean-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCentre::test_centre_apply[xarray-xr_mean-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_min_max[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_min_max[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_min_max_inverse[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_min_max_inverse[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_apply_min_max_inverse[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_apply_min_max_inverse[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_min_max_apply[pandas-pd_min-pd_max-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMinMax::test_min_max_apply[xarray-xr_min-xr_max-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_standardise[pandas_gamma-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_standardise[xarray_gamma-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_standardise_inverse[pandas_gamma-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_standardise_inverse[xarray_gamma-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_apply_standardise_inverse[pandas_gamma-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_apply_standardise_inverse[xarray_gamma-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_standardise_apply[pandas-0] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestLog::test_standardise_apply[xarray-index] 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCreateBootstrapRealizations::test_create_single_bootstrap_realization 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCreateBootstrapRealizations::test_calculate_average 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCreateBootstrapRealizations::test_create_n_bootstrap_realizations 1.49
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMeanSquaredError::test_mean_squared_error 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMeanSquaredError::test_mean_squared_error_xarray 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMeanAbsoluteError::test_mean_absolute_error 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMeanAbsoluteError::test_mean_absolute_error_xarray 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestIndexOfAgreement::test_index_of_agreement 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestIndexOfAgreement::test_index_of_agreement_xarray 0.01
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMNMB::test_modified_normalized_mean_bias 0.00
No log output captured.
Passed test/test_helpers/test_statistics.py::TestMNMB::test_modified_normalized_mean_bias_xarray 0.01
No log output captured.
Passed test/test_helpers/test_statistics.py::TestCalculateErrorMetrics::test_calculate_error_metrics 0.25
No log output captured.
Passed test/test_helpers/test_tables.py::TestTables::test_create_column_format_for_tex 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestPyTestRegex::test_init 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestPyTestRegex::test_eq 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestPyTestRegex::test_repr 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestPyTestAllEqual::test_numpy 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestPyTestAllEqual::test_xarray 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestPyTestAllEqual::test_other 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestPyTestAllEqual::test_encapsulated 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestNestedEquality::test_nested_equality_single_entries 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestNestedEquality::test_nested_equality_xarray 0.02
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestNestedEquality::test_nested_equality_numpy 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestNestedEquality::test_nested_equality_list_tuple 0.00
No log output captured.
Passed test/test_helpers/test_testing_helpers.py::TestNestedEquality::test_nested_equality_dict 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestCorrectDataFormat::test_correct_data_format 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctDataOrigin::test_no_origin_given 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctDataOrigin::test_different_origins 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctNetwork::test_no_network_given 0.00
------------------------------Captured stderr call------------------------------
2023-12-18 17:35:36,927 - INFO: Could not find a valid match for variable no2 and networks []! Therefore, use first answer from JOIN: {'id': 16686, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'no2', 'parameter_label': 'NO2', 'parameter_attribute': ''} [join.py:_select_distinct_network:288]
2023-12-18 17:35:36,927 - INFO: Could not find a valid match for variable o3 and networks []! Therefore, use first answer from JOIN: {'id': 16687, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'o3', 'parameter_label': 'O3', 'parameter_attribute': ''} [join.py:_select_distinct_network:288]
2023-12-18 17:35:36,927 - INFO: Could not find a valid match for variable cloudcover and networks []! Therefore, use first answer from JOIN: {'id': 54036, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'cloudcover', 'parameter_label': 'CLOUDCOVER', 'parameter_attribute': 'REA'} [join.py:_select_distinct_network:288]
2023-12-18 17:35:36,928 - INFO: Could not find a valid match for variable temp and networks []! Therefore, use first answer from JOIN: {'id': 88491, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'temp', 'parameter_label': 'TEMP-REA-MIUB', 'parameter_attribute': 'REA'} [join.py:_select_distinct_network:288]
2023-12-18 17:35:36,928 - INFO: Could not find a valid match for variable press and networks []! Therefore, use first answer from JOIN: {'id': 102660, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'press', 'parameter_label': 'PRESS-REA-MIUB', 'parameter_attribute': 'REA'} [join.py:_select_distinct_network:288]
-------------------------------Captured log call--------------------------------
INFO root:join.py:288 Could not find a valid match for variable no2 and networks []! Therefore, use first answer from JOIN: {'id': 16686, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'no2', 'parameter_label': 'NO2', 'parameter_attribute': ''}
INFO root:join.py:288 Could not find a valid match for variable o3 and networks []! Therefore, use first answer from JOIN: {'id': 16687, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'o3', 'parameter_label': 'O3', 'parameter_attribute': ''}
INFO root:join.py:288 Could not find a valid match for variable cloudcover and networks []! Therefore, use first answer from JOIN: {'id': 54036, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'cloudcover', 'parameter_label': 'CLOUDCOVER', 'parameter_attribute': 'REA'}
INFO root:join.py:288 Could not find a valid match for variable temp and networks []! Therefore, use first answer from JOIN: {'id': 88491, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'temp', 'parameter_label': 'TEMP-REA-MIUB', 'parameter_attribute': 'REA'}
INFO root:join.py:288 Could not find a valid match for variable press and networks []! Therefore, use first answer from JOIN: {'id': 102660, 'network_name': 'UBA', 'station_id': 'DENW053', 'parameter_name': 'press', 'parameter_label': 'PRESS-REA-MIUB', 'parameter_attribute': 'REA'}
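The records above document the fallback in join.py's network selection: when none of the requested networks matches a variable, the first answer returned by JOIN is taken and logged. A minimal sketch of that behaviour, with an illustrative helper name and simplified answer handling (the actual _select_distinct_network in mlair differs in detail):

import logging

def select_distinct_network(results, networks, variable):
    # results: list of JOIN answers (dicts with 'network_name', 'id', ...) for one variable
    # networks: ordered list of preferred network names (may be empty, as in the test above)
    for network in networks:
        for answer in results:
            if answer["network_name"] == network:
                return answer
    # Fallback exercised above: no preference matched, so take the first JOIN answer.
    first = results[0]
    logging.info(f"Could not find a valid match for variable {variable} and networks "
                 f"{networks}! Therefore, use first answer from JOIN: {first}")
    return first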
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctNetwork::test_single_network_given 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctNetwork::test_single_network_given_no_match 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctNetwork::test_multiple_networks_given 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctNetwork::test_multiple_networks_given_by_dict 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctSeries::test_no_origin_given 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctSeries::test_different_origins 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctSeries::test_different_networks 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctSeries::test_network_not_available 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSelectDistinctSeries::test_different_network_and_origin 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSaveToPandas::test_empty_df 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSaveToPandas::test_not_empty_df 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestSaveToPandas::test_alternative_date_format 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_join.py::TestLowerList::test_string_lowering 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCreateUrl::test_minimal_args_given 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCreateUrl::test_given_kwargs 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCreateUrl::test_single_kwargs 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCreateUrl::test_none_kwargs 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCreateUrl::test_param_id 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCreateUrl::test_param_id_kwargs 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCorrectStatName::test_nothing_to_do 0.00
No log output captured.
Passed test/test_helpers/test_data_sources/test_toar_data.py::TestCorrectStatName::test_correct_string 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_init 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_model_property 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_setter_all_empty 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_setter_as_attr 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_setter_as_mix_attr_dict_no_duplicates 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_setter_as_mix_attr_dict_valid_duplicates_none_optimizer 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_property_type_error 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_setter_as_mix_attr_dict_invalid_duplicates_other_optimizer 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_setter_as_mix_attr_dict_invalid_duplicates_same_optimizer_other_args 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_compile_options_setter_as_dict_invalid_keys 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_getattr 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_get_settings 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_custom_objects 0.00
No log output captured.
Passed test/test_model_modules/test_abstract_model_class.py::TestAbstractModelClass::test_set_custom_objects 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_get_padding_for_same_negative_kernel_size 0.00
------------------------------Captured stdout call------------------------------
In test_get_padding_for_same_negative_kernel_size
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_get_padding_for_same_strides_greater_one 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_get_padding_for_same_non_int_kernel 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_get_padding_for_same_stride_3d 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_get_padding_for_same_even_pad 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_negative_pads 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_len_of_pad_tuple 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_tuple_of_none_integer 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_tuple_of_tuple_none_integer_first 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_tuple_of_tuple_none_integer_second 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_valid_mix_of_int_and_tuple 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_invalid_mixed_tuple_and_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadUtils::test_check_padding_format_invalid_mixed_int_and_tuple 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_tuple_of_valid_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_tuple_of_negative_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_tuple_of_invalid_format_float 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_tuple_of_invalid_format_string 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_tuple_of_tuple_of_valid_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_tuple_of_tuple_of_invalid_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_init_tuple_of_tuple_of_invalid_format 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestReflectionPadding2D::test_call 0.39
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_tuple_of_valid_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_tuple_of_negative_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_tuple_of_invalid_format_float 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_tuple_of_invalid_format_string 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_tuple_of_tuple_of_valid_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_tuple_of_tuple_of_invalid_int 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_init_tuple_of_tuple_of_invalid_format 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestSymmerticPadding2D::test_call 0.05
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_init 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_check_and_get_padding_zero_padding 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_check_and_get_padding_sym_padding 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_check_and_get_padding_ref_padding 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_check_and_get_padding_raises 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[SymPad2D] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[SymmetricPadding2D0] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[SymmetricPadding2D1] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[RefPad2D] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[ReflectionPadding2D0] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[ReflectionPadding2D1] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[ZeroPad2D] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[ZeroPadding2D0] 0.00
No log output captured.
Passed test/test_model_modules/test_advanced_paddings.py::TestPadding2D::test_call[ZeroPadding2D1] 0.00
No log output captured.
Passed test/test_model_modules/test_flatten_tail.py::TestGetActivation::test_string_act 0.00
No log output captured.
Passed test/test_model_modules/test_flatten_tail.py::TestGetActivation::test_sting_act_unknown 0.00
No log output captured.
Passed test/test_model_modules/test_flatten_tail.py::TestGetActivation::test_layer_act 0.00
No log output captured.
Passed test/test_model_modules/test_flatten_tail.py::TestGetActivation::test_layer_act_invalid 0.00
No log output captured.
Passed test/test_model_modules/test_flatten_tail.py::TestFlattenTail::test_flatten_tail_no_bound_no_regul_no_drop 0.01
No log output captured.
Passed test/test_model_modules/test_flatten_tail.py::TestFlattenTail::test_flatten_tail_all_settings 0.03
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_init 0.00
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_block_part_name 0.00
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_create_conv_tower_3x3 0.02
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_create_conv_tower_3x3_batch_norm 0.03
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_create_conv_tower_3x3_activation 0.04
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_create_conv_tower_1x1 0.00
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_create_conv_towers 0.03
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_create_pool_tower 0.02
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_inception_block 0.13
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_inception_block_invalid_batchnorm 0.04
No log output captured.
Passed test/test_model_modules/test_inception_model.py::TestInceptionModelBase::test_batch_normalisation 0.01
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestHistoryAdvanced::test_init 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestHistoryAdvanced::test_on_train_begin 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestLearningRateDecay::test_init 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestLearningRateDecay::test_check_param 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestLearningRateDecay::test_on_epoch_begin 0.41
------------------------------Captured stdout call------------------------------
Epoch 1/5
1/1 [==============================] - 0s 305ms/step - loss: 0.5234
Epoch 2/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4940
Epoch 3/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4680
Epoch 4/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4464
Epoch 5/5
1/1 [==============================] - 0s 2ms/step - loss: 0.4277
------------------------------Captured stderr call------------------------------
2023-12-18 17:35:53.382653: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:185] None of the MLIR Optimization Passes are enabled (registered 2)
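The decreasing losses above come from a short fit with a learning-rate decay callback. A minimal sketch of such a callback, assuming a multiplicative drop applied in on_epoch_begin (class layout and defaults are illustrative, not MLAir's exact implementation):

import tensorflow as tf

class LearningRateDecay(tf.keras.callbacks.Callback):
    # Multiply the base learning rate by `drop` every `epochs_drop` epochs.
    def __init__(self, base_lr=1e-2, drop=0.96, epochs_drop=1):
        super().__init__()
        self.base_lr = base_lr
        self.drop = drop
        self.epochs_drop = epochs_drop
        self.lr = {"lr": []}  # history of applied learning rates

    def on_epoch_begin(self, epoch, logs=None):
        current_lr = self.base_lr * self.drop ** (epoch // self.epochs_drop)
        tf.keras.backend.set_value(self.model.optimizer.lr, current_lr)
        self.lr["lr"].append(current_lr)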
Passed test/test_model_modules/test_keras_extensions.py::TestModelCheckpointAdvanced::test_init 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestModelCheckpointAdvanced::test_update_best 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestModelCheckpointAdvanced::test_update_callbacks 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestModelCheckpointAdvanced::test_on_epoch_end 0.00
------------------------------Captured stdout call------------------------------
Epoch 00001: val_loss did not improve from 6.00000
Epoch 00010: val_loss did not improve from 6.00000
Epoch 00011: val_loss improved from 6.00000 to 4.00000, saving model to ckpt.test
Epoch 00011: save to /builds/esde/machine-learning/mlair/test/test_model_modules/callback_lr
Epoch 00011: save to /builds/esde/machine-learning/mlair/test/test_model_modules/callback_hist
Epoch 00011: saving model to ckpt.test
Epoch 00011: save to /builds/esde/machine-learning/mlair/test/test_model_modules/callback_lr
Epoch 00011: save to /builds/esde/machine-learning/mlair/test/test_model_modules/callback_hist
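The output above shows a checkpoint that, on each val_loss improvement, also persists additional callbacks (here callback_lr and callback_hist) next to the model. A sketch of such an extended checkpoint, assuming a list of {"callback", "path"} entries (names and layout are illustrative; the real ModelCheckpointAdvanced may differ):

import pickle
import tensorflow as tf

class ModelCheckpointAdvanced(tf.keras.callbacks.ModelCheckpoint):
    def __init__(self, *args, callbacks=None, **kwargs):
        self.callbacks = callbacks or []  # assumed: [{"callback": obj, "path": str}, ...]
        super().__init__(*args, **kwargs)

    def on_epoch_end(self, epoch, logs=None):
        # Check improvement before the parent updates self.best and saves the model.
        current = (logs or {}).get(self.monitor)
        improved = current is not None and self.monitor_op(current, self.best)
        super().on_epoch_end(epoch, logs)
        if improved:
            for entry in self.callbacks:
                print(f"\nEpoch {epoch + 1:05d}: save to {entry['path']}")
                with open(entry["path"], "wb") as f:
                    pickle.dump(entry["callback"], f)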
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_init 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_callbacks_set 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_callbacks_get 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_update_callback 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_add_callback 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_add_callback_raise 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_get_callbacks_as_dict 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_get_callbacks_no_dict 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_get_callback_by_name 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test__get_callbacks 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_get_checkpoint 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_create_model_checkpoint 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_load_callbacks 0.00
No log output captured.
Passed test/test_model_modules/test_keras_extensions.py::TestCallbackHandler::test_update_checkpoint 0.00
No log output captured.
Passed test/test_model_modules/test_linear_model.py::TestOrdinaryLeastSquareModel::test_constant_input_variable 0.00
No log output captured.
Passed test/test_model_modules/test_loss.py::TestLPLoss::test_l_p_loss 0.25
------------------------------Captured stdout call------------------------------
1/1 [==============================] - 0s 76ms/step - loss: 1.2500
1/1 [==============================] - 0s 82ms/step - loss: 2.2500
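The two losses above come from fitting the same data with an L_p loss for two different exponents. A sketch of such a loss factory, assuming mean(|y_pred - y_true|**p) as the definition (illustrative; see mlair.model_modules.loss for the version under test):

import tensorflow.keras.backend as K

def l_p_loss(power):
    # Return a Keras-compatible loss computing the mean p-th power of the absolute error.
    def loss(y_true, y_pred):
        return K.mean(K.pow(K.abs(y_pred - y_true), power), axis=-1)
    return loss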
Passed test/test_model_modules/test_loss.py::TestVarLoss::test_var_loss 0.18
------------------------------Captured stdout call------------------------------
1/1 [==============================] - 0s 139ms/step - loss: 0.1406
Passed test/test_model_modules/test_loss.py::TestCustomLoss::test_custom_loss_no_weights 0.20
------------------------------Captured stdout call------------------------------
1/1 [==============================] - 0s 150ms/step - loss: 0.6953
------------------------------Captured stderr call------------------------------
2023-12-18 17:35:54,413 - WARNING: 5 out of the last 9 calls to <function Model.make_train_function.<locals>.train_function at 0x7fb1ac140670> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details. [def_function.py:called_with_tracing:147]
-------------------------------Captured log call--------------------------------
WARNING tensorflow:def_function.py:147 5 out of the last 9 calls to <function Model.make_train_function.<locals>.train_function at 0x7fb1ac140670> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.
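The warning above is TensorFlow's standard retracing diagnostic; the test suite compiles several small models in a row, so the repeated tracing is expected here. Remedy (1) from the warning text, defining the tf.function once instead of re-creating it per iteration, looks like this (illustrative):

import tensorflow as tf

@tf.function  # traced once; later calls with the same signature reuse the trace
def train_step(x):
    return x * 2.0

# Anti-pattern that triggers the warning: tf.function(lambda x: x * 2.0)
# re-created inside the loop would be retraced on every iteration.
for _ in range(10):
    train_step(tf.constant(1.0))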
Passed test/test_model_modules/test_loss.py::TestCustomLoss::test_custom_loss_with_weights[weights0] 0.15
------------------------------Captured stdout call------------------------------
1/1 [==============================] - 0s 106ms/step - loss: 0.4734
------------------------------Captured stderr call------------------------------
2023-12-18 17:35:54,567 - WARNING: 6 out of the last 10 calls to <function Model.make_train_function.<locals>.train_function at 0x7fb1ac10d160> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details. [def_function.py:called_with_tracing:147]
-------------------------------Captured log call--------------------------------
WARNING tensorflow:def_function.py:147 6 out of the last 10 calls to <function Model.make_train_function.<locals>.train_function at 0x7fb1ac10d160> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.
Passed test/test_model_modules/test_loss.py::TestCustomLoss::test_custom_loss_with_weights[weights1] 0.16
------------------------------Captured stdout call------------------------------
1/1 [==============================] - 0s 118ms/step - loss: 0.6953
Passed test/test_model_modules/test_loss.py::TestCustomLoss::test_custom_loss_with_weights[weights2] 0.15
------------------------------Captured stdout call------------------------------
1/1 [==============================] - 0s 105ms/step - loss: 0.6953
Passed test/test_model_modules/test_loss.py::TestCustomLoss::test_custom_loss_with_weights[weights3] 0.16
------------------------------Captured stdout call------------------------------
1/1 [==============================] - 0s 113ms/step - loss: 1.0281
Passed test/test_model_modules/test_loss.py::TestCustomLoss::test_custom_loss_invalid_weights 0.00
No log output captured.
Passed test/test_model_modules/test_model_class.py::TestIntelliO3_ts_architecture::test_init 0.28
No log output captured.
Passed test/test_model_modules/test_model_class.py::TestIntelliO3_ts_architecture::test_set_model 0.48
No log output captured.
Passed test/test_model_modules/test_model_class.py::TestIntelliO3_ts_architecture::test_set_compile_options 0.27
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackObject::test_init 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackObject::test_repr 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackObject::test_x_property 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackObject::test_y_property 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackObject::test_add_precursor 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackObject::test_add_successor 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_init 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_get_all_scopes 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_get_unique_scopes 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_get_unique_scopes_no_general 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_get_all_dims 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_create_track_chain 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_control_dict 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test__create_track_chain 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_precursor 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_track_object_same_stage 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_track_object_different_stage 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_update_control 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_set_object 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_set_object_no_control_obj 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_get_object_no_new_track_obj 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_get_object_no_control_obj 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_add_get_object_skip_update 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_recursive_decent_avail_in_1_up 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_recursive_decent_avail_in_2_up 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_recursive_decent_avail_from_chain 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_recursive_decent_avail_from_chain_get 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_recursive_decent_avail_from_chain_multiple_get 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackChain::test_clean_control 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_init 0.31
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_plot 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_line 0.01
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_step 0.02
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_rect 0.01
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_set_ypos_anchor 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_plot_track_chain 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_add_variable_names 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_add_stages 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_create_track_chain_plot_run_env 0.00
No log output captured.
Passed test/test_plotting/test_tracker_plot.py::TestTrackPlot::test_set_lims 0.01
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelHistory::test_get_plot_metric 0.00
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelHistory::test_get_plot_metric_short_metric 0.00
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelHistory::test_get_plot_metric_main_branch 0.00
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelHistory::test_filter_columns 0.00
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelHistory::test_plot_from_hist_obj 0.37
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelHistory::test_plot_from_hist_dict 0.24
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelHistory::test_plot_additional_loss 0.48
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelLearningRate::test_plot_from_lr_obj 0.15
No log output captured.
Passed test/test_plotting/test_training_monitoring.py::TestPlotModelLearningRate::test_plot_from_lr_dict 0.15
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceDataHandler::test_init 0.00
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceDataHandler::test_make_reference_available_locally 0.00
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceDataHandler::test_is_reference_available_locally 0.00
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceb2share::test_inheritance 0.00
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceb2share::test_init 0.00
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceb2share::test_b2share_url 0.00
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceb2share::test_bar_custom 0.00
No log output captured.
Passed test/test_reference_models/test_abstract_reference_model.py::TestAbstractReferenceb2share::test_download_from_b2share 0.00
No log output captured.
Passed test/test_reference_models/test_reference_model_intellio3_v1.py::TestIntelliO3Reference::test_init_none_path 0.00
No log output captured.
Passed test/test_reference_models/test_reference_model_intellio3_v1.py::TestIntelliO3Reference::test_init_extra_path 0.00
No log output captured.
Passed test/test_reference_models/test_reference_model_intellio3_v1.py::TestIntelliO3Reference::test_inheritance 0.00
No log output captured.
Passed test/test_reference_models/test_reference_model_intellio3_v1.py::TestIntelliO3Reference::test_untar_forecasts 0.00
No log output captured.
Passed test/test_reference_models/test_reference_model_intellio3_v1.py::TestIntelliO3Reference::test_file_list 0.00
No log output captured.
Passed test/test_reference_models/test_reference_model_intellio3_v1.py::TestIntelliO3Reference::test_read_and_drop 0.00
No log output captured.
Passed test/test_reference_models/test_reference_model_intellio3_v1.py::TestIntelliO3Reference::test_make_reference_available_locally 0.00
No log output captured.
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_set_param_by_value 0.00
-------------------------------Captured log call--------------------------------
DEBUG root:datastore.py:118 set: 23tester(general)=23
DEBUG root:experiment_setup.py:461 set experiment attribute: 23tester(general)=23
DEBUG root:datastore.py:118 get: 23tester(general)=23
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_set_param_by_value_and_scope 0.00
-------------------------------Captured log call--------------------------------
DEBUG root:datastore.py:118 set: 109tester(general)=109
DEBUG root:experiment_setup.py:461 set experiment attribute: 109tester(general)=109
DEBUG root:datastore.py:118 get: 109tester(general.tester)=109
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_set_param_with_default 0.00
-------------------------------Captured log call--------------------------------
DEBUG root:datastore.py:118 set: NoneTester(general.testing)=notNone
DEBUG root:experiment_setup.py:461 set experiment attribute: NoneTester(general.testing)=notNone
DEBUG root:datastore.py:118 get: NoneTester(general.testing)=notNone
DEBUG root:datastore.py:118 set: AnotherNoneTester(general)=None
DEBUG root:experiment_setup.py:461 set experiment attribute: AnotherNoneTester(general)=None
DEBUG root:datastore.py:118 get: AnotherNoneTester(general)=None
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_set_param_with_apply 0.00
-------------------------------Captured log call--------------------------------
DEBUG root:datastore.py:118 set: NoneTester(general)=notNone
DEBUG root:experiment_setup.py:461 set experiment attribute: NoneTester(general)=notNone
DEBUG root:datastore.py:118 get: NoneTester(general)=notNone
DEBUG root:datastore.py:118 set: NoneTester(general)=['notNone']
DEBUG root:experiment_setup.py:461 set experiment attribute: NoneTester(general)=['notNone']
DEBUG root:datastore.py:118 get: NoneTester(general)=['notNone']
DEBUG root:datastore.py:118 set: NoneTester(general)=[None]
DEBUG root:experiment_setup.py:461 set experiment attribute: NoneTester(general)=[None]
DEBUG root:datastore.py:118 get: NoneTester(general)=[None]
DEBUG root:datastore.py:118 set: NoneTester(general)=2
DEBUG root:experiment_setup.py:461 set experiment attribute: NoneTester(general)=2
DEBUG root:datastore.py:118 get: NoneTester(general)=2
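The records above exercise the scoped datastore: values are set under a scope such as general, and a get from a sub-scope such as general.tester walks up the scope hierarchy until a match is found. A minimal sketch of that lookup rule (illustrative; mlair.helpers.datastore implements the real store):

class DataStore:
    def __init__(self):
        self._store = {}  # maps (name, scope) -> value

    def set(self, name, value, scope="general"):
        self._store[(name, scope)] = value

    def get(self, name, scope="general"):
        # Resolve e.g. "general.tester" -> "general" until the name is found.
        while True:
            if (name, scope) in self._store:
                return self._store[(name, scope)]
            if "." not in scope:
                raise KeyError(f"{name} not found in scope {scope} or any parent scope")
            scope = scope.rsplit(".", 1)[0]

With this sketch, set("109tester", 109) followed by get("109tester", scope="general.tester") resolves to the value stored under general, matching the second test above.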
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_init_default 0.00
No log output captured.
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_init_no_default 0.00
No log output captured.
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_init_train_model_behaviour 0.02
No log output captured.
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_compare_variables_and_statistics 0.01
No log output captured.
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_multiprocessing_no_debug 0.01
No log output captured.
Passed test/test_run_modules/test_experiment_setup.py::TestExperimentSetup::test_multiprocessing_debug 0.01
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_set_callbacks 0.45
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_set_callbacks_no_lr_decay 1.41
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_get_model_settings 2.66
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_build_model 3.68
------------------------------Captured stdout call------------------------------
Model: "model_7" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= input_32 (InputLayer) [(None, 14, 1, 5)] 0 _________________________________________________________________ flatten_1 (Flatten) (None, 70) 0 _________________________________________________________________ dense_5 (Dense) (None, 64) 4544 _________________________________________________________________ prelu_1 (PReLU) (None, 64) 64 _________________________________________________________________ dense_6 (Dense) (None, 32) 2080 _________________________________________________________________ prelu_2 (PReLU) (None, 32) 32 _________________________________________________________________ dense_7 (Dense) (None, 16) 528 _________________________________________________________________ prelu_3 (PReLU) (None, 16) 16 _________________________________________________________________ dense_8 (Dense) (None, 5) 85 _________________________________________________________________ linear_output (Activation) (None, 5) 0 ================================================================= Total params: 7,349 Trainable params: 7,349 Non-trainable params: 0 _________________________________________________________________ None
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_set_shapes 3.74
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_load_weights 0.00
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_compile_model 0.00
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_run 0.00
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_init 0.00
No log output captured.
Passed test/test_run_modules/test_model_setup.py::TestModelSetup::test_clean_name 1.74
No log output captured.
Passed test/test_run_modules/test_partition_check.py::TestPartitionCheck::test_init 1.84
------------------------------Captured stderr call------------------------------
2023-12-18 17:36:11,838 - INFO: PartitionCheck started [run_environment.py:__init__:103]
2023-12-18 17:36:11,839 - INFO: PartitionCheck finished after 0:00:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:36:11,839 - INFO: RunEnvironment started [run_environment.py:__init__:103]
2023-12-18 17:36:11,839 - INFO: RunEnvironment finished after 0:00:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:36:13,663 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_000.json [run_environment.py:__save_tracking:159]
2023-12-18 17:36:13,676 - INFO: Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log [run_environment.py:__move_log_file:147]
-------------------------------Captured log call--------------------------------
INFO root:run_environment.py:103 PartitionCheck started
INFO root:run_environment.py:120 PartitionCheck finished after 0:00:01 (hh:mm:ss)
INFO root:run_environment.py:103 RunEnvironment started
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:01 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/tracking_000.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/TestExperiment_daily/logging/logging_000.log
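The records above trace the RunEnvironment lifecycle: each run module logs its start, logs the elapsed time on teardown, and the outermost environment finally copies the tracker file and moves the log file into the experiment's logging folder. A minimal context-manager sketch of that start/finish pattern (illustrative; the real class does its bookkeeping in __init__/__del__ and additionally handles tracking and log files):

import logging
import time

class RunEnvironment:
    def __enter__(self):
        self._start = time.time()
        logging.info(f"{self.__class__.__name__} started")
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        elapsed = time.gmtime(time.time() - self._start)
        logging.info(f"{self.__class__.__name__} finished after "
                     f"{time.strftime('%H:%M:%S', elapsed)} (hh:mm:ss)")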
Passed test/test_run_modules/test_partition_check.py::TestPartitionCheck::test_run_login 1.84
No log output captured.
Passed test/test_run_modules/test_partition_check.py::TestPartitionCheck::test_run_compute 2.23
No log output captured.
Passed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_split_train_val_test 307.56
No log output captured.
Passed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_split_set_indices 5.10
No log output captured.
Passed test/test_run_modules/test_pre_processing.py::TestPreProcessing::test_transformation 0.00
No log output captured.
Passed test/test_run_modules/test_run_environment.py::TestRunEnvironment::test_enter 4.39
------------------------------Captured stderr call------------------------------
2023-12-18 17:48:00,292 - INFO: RunEnvironment started [run_environment.py:__init__:103]
2023-12-18 17:48:00,292 - INFO: RunEnvironment finished after 0:00:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:48:04,651 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/tracking_000.json [run_environment.py:__save_tracking:159]
2023-12-18 17:48:04,679 - INFO: Move log file to /builds/esde/machine-learning/mlair/logging_000.log [run_environment.py:__move_log_file:147]
-------------------------------Captured log call--------------------------------
INFO root:run_environment.py:103 RunEnvironment started
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:01 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/tracking_000.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/logging_000.log
Passed test/test_run_modules/test_run_environment.py::TestRunEnvironment::test_exit 4.45
------------------------------Captured stderr call------------------------------
2023-12-18 17:48:04,684 - INFO: RunEnvironment started [run_environment.py:__init__:103]
2023-12-18 17:48:04,785 - INFO: RunEnvironment finished after 0:00:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:48:09,107 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/tracking_001.json [run_environment.py:__save_tracking:159]
2023-12-18 17:48:09,132 - INFO: Move log file to /builds/esde/machine-learning/mlair/logging_000.log [run_environment.py:__move_log_file:147]
-------------------------------Captured log call--------------------------------
INFO root:run_environment.py:103 RunEnvironment started
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:01 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/tracking_001.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/logging_000.log
Passed test/test_run_modules/test_run_environment.py::TestRunEnvironment::test_init 4.46
------------------------------Captured stderr call------------------------------
2023-12-18 17:48:09,135 - INFO: RunEnvironment started [run_environment.py:__init__:103]
2023-12-18 17:48:09,135 - INFO: RunEnvironment finished after 0:00:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:48:13,567 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/tracking_002.json [run_environment.py:__save_tracking:159]
2023-12-18 17:48:13,595 - INFO: Move log file to /builds/esde/machine-learning/mlair/logging_000.log [run_environment.py:__move_log_file:147]
-------------------------------Captured log call--------------------------------
INFO root:run_environment.py:103 RunEnvironment started
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:01 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/tracking_002.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/logging_000.log
Passed test/test_run_modules/test_run_environment.py::TestRunEnvironment::test_del 5.20
------------------------------Captured stderr call------------------------------
2023-12-18 17:48:13,599 - INFO: RunEnvironment started [run_environment.py:__init__:103]
2023-12-18 17:48:13,800 - INFO: RunEnvironment finished after 0:00:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:48:18,781 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/tracking_003.json [run_environment.py:__save_tracking:159]
2023-12-18 17:48:18,806 - INFO: Move log file to /builds/esde/machine-learning/mlair/logging_000.log [run_environment.py:__move_log_file:147]
-------------------------------Captured log call--------------------------------
INFO root:run_environment.py:103 RunEnvironment started
INFO root:run_environment.py:120 RunEnvironment finished after 0:00:01 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/tracking_003.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/logging_000.log
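The four RunEnvironment tests above all exercise the lifecycle visible in their captured logs: "started" is emitted from __init__, "finished after ... (hh:mm:ss)" from __del__, and the tracker and log files are persisted on teardown. A minimal sketch of that pattern in generic Python; the class below is illustrative, not MLAir's actual RunEnvironment:

    import logging
    import time

    logging.basicConfig(level=logging.INFO)

    class RunEnvironmentSketch:
        """Illustrative lifecycle only; not MLAir's implementation."""

        def __init__(self):
            self._start = time.time()
            self._closed = False
            logging.info("RunEnvironmentSketch started")

        def __enter__(self):
            return self

        def __exit__(self, exc_type, exc_val, exc_tb):
            self._close()

        def __del__(self):
            self._close()

        def _close(self):
            if self._closed:
                return
            self._closed = True
            elapsed = time.strftime("%H:%M:%S", time.gmtime(time.time() - self._start))
            logging.info("RunEnvironmentSketch finished after %s (hh:mm:ss)", elapsed)
            # The real teardown additionally copies tracking_XXX.json and moves
            # logging_XXX.log into the experiment directory, as logged above.

    with RunEnvironmentSketch():
        pass  # run modules would execute inside the environment

Guarding _close() with a flag keeps the teardown idempotent even though both __exit__ and __del__ can reach it, which matches the single "finished" message per test in the logs.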
Passed test/test_run_modules/test_training.py::TestTraining::test_make_predict_function 4.94
-----------------------------Captured stdout setup------------------------------
Model: "model_8"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_33 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_2 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_9 (Dense)              (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_10 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
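All of the TestTraining entries in this block print this same small fully connected network (only the model and layer indices differ). As a cross-check of the parameter counts (16*10 + 10 = 170 and 10*2 + 2 = 22, i.e. 192 in total), here is a minimal Keras sketch that rebuilds the same layer stack; the builder function is illustrative, while the tests themselves construct the model through MLAir's FCN class (see the datastore log under test_save_model):

    import tensorflow as tf
    from tensorflow.keras import layers

    def build_test_model() -> tf.keras.Model:
        # Input (None, 8, 1, 2) flattens to 16 features.
        inputs = tf.keras.Input(shape=(8, 1, 2))
        x = layers.Flatten()(inputs)
        x = layers.Dense(10)(x)                    # 16 * 10 + 10 = 170 params
        x = layers.ReLU(name="relu_1")(x)
        x = layers.Dense(2)(x)                     # 10 * 2 + 2 = 22 params
        outputs = layers.Activation("linear", name="linear_output")(x)
        return tf.keras.Model(inputs, outputs)

    build_test_model().summary()  # 192 trainable params, as in the table above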
Passed test/test_run_modules/test_training.py::TestTraining::test_set_gen 5.48
-----------------------------Captured stdout setup------------------------------
Model: "model_9"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_34 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_3 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_11 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_12 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
Passed test/test_run_modules/test_training.py::TestTraining::test_set_generators 4.84
-----------------------------Captured stdout setup------------------------------
Model: "model_10"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_35 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_4 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_13 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_14 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
Passed test/test_run_modules/test_training.py::TestTraining::test_save_model 5.15
-----------------------------Captured stdout setup------------------------------
Model: "model_11"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_36 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_5 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_15 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_16 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
------------------------------Captured stderr call------------------------------
2023-12-18 17:49:52,274 - WARNING: Compiled the loaded model, but the compiled metrics have yet to be built. `model.compile_metrics` will be empty until you train or evaluate the model. [saving_utils.py:try_build_compiled_arguments:313]
2023-12-18 17:49:52,294 - WARNING: Compiled the loaded model, but the compiled metrics have yet to be built. `model.compile_metrics` will be empty until you train or evaluate the model. [saving_utils.py:try_build_compiled_arguments:313]
-------------------------------Captured log call--------------------------------
DEBUG root:datastore.py:118 get: model_name(general.model)=/builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/model/test_model.h5
DEBUG root:training.py:188 save model to /builds/esde/machine-learning/mlair/test/test_run_modules/TestExperiment/model/test_model.h5
WARNING tensorflow:saving_utils.py:313 Compiled the loaded model, but the compiled metrics have yet to be built. `model.compile_metrics` will be empty until you train or evaluate the model.
DEBUG h5py._conv:attrs.py:203 Creating converter from 5 to 3
WARNING tensorflow:saving_utils.py:313 Compiled the loaded model, but the compiled metrics have yet to be built. `model.compile_metrics` will be empty until you train or evaluate the model.
DEBUG root:datastore.py:118 set: model(general)=<mlair.model_modules.fully_connected_networks.FCN object at 0x7fafed6f55b0>
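The tensorflow warning captured above is standard Keras behaviour rather than an MLAir issue: loading a compiled model from an HDF5 file before it has ever been trained or evaluated leaves the compiled metrics unbuilt. A minimal reproduction, independent of MLAir (the model and file name are arbitrary):

    import tensorflow as tf

    model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(16,))])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])

    # Saving and reloading before any fit()/evaluate() call emits
    # "Compiled the loaded model, but the compiled metrics have yet to be built."
    model.save("test_model.h5")
    reloaded = tf.keras.models.load_model("test_model.h5")

Judging from the captured log, the warning fires during the save/load round trip of test_save_model and does not indicate a failure; the test passes.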
Passed test/test_run_modules/test_training.py::TestTraining::test_save_callbacks_history_created 4.73
-----------------------------Captured stdout setup------------------------------
Model: "model_12"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_37 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_6 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_17 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_18 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
Passed test/test_run_modules/test_training.py::TestTraining::test_save_callbacks_lr_created 5.59
-----------------------------Captured stdout setup------------------------------
Model: "model_13"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_38 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_7 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_19 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_20 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
Passed test/test_run_modules/test_training.py::TestTraining::test_save_callbacks_inspect_history 4.50
-----------------------------Captured stdout setup------------------------------
Model: "model_14"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_39 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_8 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_21 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_22 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
Passed test/test_run_modules/test_training.py::TestTraining::test_save_callbacks_inspect_lr 4.49
-----------------------------Captured stdout setup------------------------------
Model: "model_15"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_40 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_9 (Flatten)          (None, 16)                0
_________________________________________________________________
dense_23 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_24 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
Passed test/test_run_modules/test_training.py::TestTraining::test_create_monitoring_plots 4.94
-----------------------------Captured stdout setup------------------------------
Model: "model_16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_41 (InputLayer)        [(None, 8, 1, 2)]         0
_________________________________________________________________
flatten_10 (Flatten)         (None, 16)                0
_________________________________________________________________
dense_25 (Dense)             (None, 10)                170
_________________________________________________________________
relu_1 (ReLU)                (None, 10)                0
_________________________________________________________________
dense_26 (Dense)             (None, 2)                 22
_________________________________________________________________
linear_output (Activation)   (None, 2)                 0
=================================================================
Total params: 192
Trainable params: 192
Non-trainable params: 0
_________________________________________________________________
None
Passed test/test_workflows/test_abstract_workflow.py::TestWorkflow::test_init 0.00
No log output captured.
Passed test/test_workflows/test_abstract_workflow.py::TestWorkflow::test_add 0.00
No log output captured.
Passed test/test_workflows/test_abstract_workflow.py::TestWorkflow::test_run 5.14
------------------------------Captured stderr call------------------------------
2023-12-18 17:50:40,637 - INFO: Workflow started [run_environment.py:__init__:103]
2023-12-18 17:50:40,638 - INFO: 6 [test_abstract_workflow.py:__init__:37]
2023-12-18 17:50:40,638 - INFO: 2 [test_abstract_workflow.py:__init__:42]
2023-12-18 17:50:40,638 - INFO: 3 [test_abstract_workflow.py:__init__:37]
2023-12-18 17:50:40,638 - INFO: Workflow finished after 0:00:01 (hh:mm:ss) [run_environment.py:__del__:120]
2023-12-18 17:50:45,752 - INFO: Copy tracker file to /builds/esde/machine-learning/mlair/tracking_000.json [run_environment.py:__save_tracking:159]
2023-12-18 17:50:45,781 - INFO: Move log file to /builds/esde/machine-learning/mlair/logging_000.log [run_environment.py:__move_log_file:147]
-------------------------------Captured log call--------------------------------
INFO root:run_environment.py:103 Workflow started
INFO root:test_abstract_workflow.py:37 6
INFO root:test_abstract_workflow.py:42 2
INFO root:test_abstract_workflow.py:37 3
INFO root:run_environment.py:120 Workflow finished after 0:00:01 (hh:mm:ss)
INFO root:run_environment.py:159 Copy tracker file to /builds/esde/machine-learning/mlair/tracking_000.json
INFO root:run_environment.py:147 Move log file to /builds/esde/machine-learning/mlair/logging_000.log
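The captured log for test_run shows the pattern these workflow tests exercise: stages run in the order they were added (the test stages log 6, 2 and 3 from their __init__ methods), bracketed by the "Workflow started"/"Workflow finished" messages. A generic sketch of that add-then-run pattern; class and method names are illustrative and MLAir's actual Workflow API may differ:

    import logging

    logging.basicConfig(level=logging.INFO)

    class WorkflowSketch:
        """Collects stage classes plus arguments; runs them in insertion order."""

        def __init__(self):
            self._stages = []

        def add(self, stage_cls, *args, **kwargs):
            self._stages.append((stage_cls, args, kwargs))

        def run(self):
            logging.info("Workflow started")
            for stage_cls, args, kwargs in self._stages:
                stage_cls(*args, **kwargs)  # each stage does its work in __init__
            logging.info("Workflow finished")

    class LoggingStage:
        def __init__(self, value):
            logging.info(value)

    wf = WorkflowSketch()
    for v in (6, 2, 3):
        wf.add(LoggingStage, v)
    wf.run()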
Passed test/test_workflows/test_default_workflow.py::TestDefaultWorkflow::test_init_no_args 0.00
No log output captured.
Passed test/test_workflows/test_default_workflow.py::TestDefaultWorkflow::test_init_with_args 0.00
No log output captured.
Passed test/test_workflows/test_default_workflow.py::TestDefaultWorkflow::test_init_with_kwargs 0.00
No log output captured.
Passed test/test_workflows/test_default_workflow.py::TestDefaultWorkflow::test_setup 0.00
No log output captured.
Passed test/test_workflows/test_default_workflow.py::TestDefaultWorkflowHPC::test_setup 0.01
No log output captured.