Fixed a memory-release problem in run-experiments.py; added quick-run results
[2022-07-29 23:32:45,290][INFO ][lmi.indexes.BaseInde] Training model M.0 (root) on dataset(1000000, 282) with {'epochs': 1, 'hidden_layers': {'dense': [{'activation': 'relu', 'dropout': None, 'units': 282}, {'activation': 'relu', 'dropout': None, 'units': 128}]}, 'learning_rate': 0.0001, 'loss': 'sparse_categorical_crossentropy', 'model': 'NN', 'optimizer': 'adam'}.
[2022-07-29 23:36:10,154][INFO ][lmi.indexes.BaseInde] Training level 1 with {'epochs': 1, 'hidden_layers': {'dense': [{'activation': 'relu', 'dropout': None, 'units': 282}, {'activation': 'relu', 'dropout': None, 'units': 128}]}, 'learning_rate': 0.0001, 'loss': 'sparse_categorical_crossentropy', 'model': 'NN', 'optimizer': 'adam'}.
[2022-07-29 23:41:10,145][INFO ][lmi.indexes.BaseInde] Training level 2 with {'epochs': 5, 'hidden_layers': {'dense': [{'activation': 'relu', 'dropout': None, 'units': 100, 'regularizer': True}]}, 'learning_rate': 0.001, 'loss': 'sparse_categorical_crossentropy', 'model': 'NN', 'optimizer': 'adam'}.
[2022-07-30 00:22:43,569][INFO ][lmi.indexes.BaseInde] Training level 3 with {'epochs': 5, 'hidden_layers': {'dense': [{'activation': 'relu', 'dropout': None, 'units': 100, 'regularizer': True}]}, 'learning_rate': 0.001, 'loss': 'sparse_categorical_crossentropy', 'model': 'NN', 'optimizer': 'adam'}.
[2022-07-30 00:51:43,926][INFO ][lmi.indexes.BaseInde] Training level 4 with {'epochs': 5, 'hidden_layers': {'dense': [{'activation': 'relu', 'dropout': None, 'units': 100, 'regularizer': True}]}, 'learning_rate': 0.001, 'loss': 'sparse_categorical_crossentropy', 'model': 'NN', 'optimizer': 'adam'}.
[2022-07-30 00:55:33,591][INFO ][lmi.indexes.BaseInde] Training level 5 with {'epochs': 5, 'hidden_layers': {'dense': [{'activation': 'relu', 'dropout': None, 'units': 100, 'regularizer': True}]}, 'learning_rate': 0.001, 'loss': 'sparse_categorical_crossentropy', 'model': 'NN', 'optimizer': 'adam'}.
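Each log line above records the hyperparameter dictionary used to train one level of the index. As an illustration only, the sketch below parses such a dictionary into a readable architecture summary; the `describe_config` helper is hypothetical and not part of the LMI codebase, and the config dict is copied verbatim from the root-model log line.

```python
# Hypothetical helper: summarize one of the NN config dicts logged above.
def describe_config(cfg):
    """Return a short human-readable summary of one training config."""
    layers = [
        f"Dense({d['units']}, activation={d['activation']}"
        + (", regularized" if d.get("regularizer") else "")
        + ")"
        for d in cfg["hidden_layers"]["dense"]
    ]
    return (
        f"{cfg['model']}: {' -> '.join(layers)}, "
        f"optimizer={cfg['optimizer']}, lr={cfg['learning_rate']}, "
        f"epochs={cfg['epochs']}"
    )

# Config logged for the root model M.0 (copied from the first log line).
root_cfg = {
    "epochs": 1,
    "hidden_layers": {"dense": [
        {"activation": "relu", "dropout": None, "units": 282},
        {"activation": "relu", "dropout": None, "units": 128},
    ]},
    "learning_rate": 0.0001,
    "loss": "sparse_categorical_crossentropy",
    "model": "NN",
    "optimizer": "adam",
}

print(describe_config(root_cfg))
# NN: Dense(282, activation=relu) -> Dense(128, activation=relu), optimizer=adam, lr=0.0001, epochs=1
```

Note that levels 2 through 5 use a different configuration (a single regularized 100-unit layer, 5 epochs, learning rate 0.001), which the same helper would summarize accordingly.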