Avoiding reloading weights/datasets in ML edit-compile-run loop


In machine learning, the edit-compile-run loop is pretty slow because the script has to reload large models and datasets on every run.

In the past, I've avoided this by loading only a tiny subset of the data and skipping the pre-initialized weights while setting up the training code.
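A minimal sketch of that pattern, assuming a hypothetical project with `build_dataset` and `build_model` factory functions and PyTorch-style data handling (all names here are placeholders, not part of the question):

```python
import argparse

from torch.utils.data import Subset

# Hypothetical helpers; substitute your own dataset/model constructors.
from my_project.data import build_dataset    # assumption: your dataset factory
from my_project.model import build_model     # assumption: your model factory

parser = argparse.ArgumentParser()
parser.add_argument("--smoke-test", action="store_true",
                    help="Tiny data, random weights: just check the code runs.")
args = parser.parse_args()

dataset = build_dataset()
if args.smoke_test:
    # Keep only a handful of examples so one pass finishes in seconds.
    dataset = Subset(dataset, range(32))

# Skip loading the large pretrained checkpoint during smoke tests.
model = build_model(pretrained=not args.smoke_test)
```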

CodePudding user response:

Use a Jupyter notebook or Google Colab.

You can edit and re-run one cell at a time, and the dataset and model weights loaded in another cell stay in memory between runs.
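For example, the heavy loading can live in its own cell while the cell you iterate on imports your training code with autoreload. This is only a sketch: `my_project.train` and `train_step` are placeholders for your own module, and the Hugging Face loaders are just one way to get a large model and dataset.

```python
# %% Cell 1: run once -- the heavy objects stay in memory while you iterate below.
%load_ext autoreload
%autoreload 2                                # re-import edited modules automatically

from datasets import load_dataset            # assumption: Hugging Face `datasets`
from transformers import AutoModel           # assumption: Hugging Face `transformers`

dataset = load_dataset("imdb", split="train")
model = AutoModel.from_pretrained("bert-base-uncased")

# %% Cell 2: edit your training code, then re-run just this cell.
# `dataset` and `model` from Cell 1 are still in memory, so nothing reloads.
from my_project.train import train_step      # placeholder for your own module

batch = dataset[:8]                          # a small slice is enough while iterating
loss = train_step(model, batch)
print(loss)
```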

Somehow this didn't click until just now.
