Should you normalize a dataset per label, or across the entire dataset at once?

Time: 12-22

So I'm looking to train a CNN model in Keras. The labels (Y) in my dataset have shape (1080, 1920, 2). The values themselves, however, are quite large: floating-point numbers ranging up to 80,000.

To smooth out the training process I want to normalize each label (array) using the following code, where y is the array in question:

y/np.linalg.norm(y)

To denormalize my array I would simply do the opposite, multiplying the normalized array by the stored norm of the original:

y * np.linalg.norm(y)
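A quick round-trip sketch of this norm-based scheme (the array here is a small random stand-in for one (1080, 1920, 2) label, shrunk so it runs fast):

```python
import numpy as np

# Hypothetical stand-in for one label; real labels are (1080, 1920, 2)
y = np.random.rand(4, 4, 2) * 80_000

norm = np.linalg.norm(y)          # Frobenius norm of the whole array
y_normalized = y / norm           # normalized array has unit norm
y_restored = y_normalized * norm  # denormalize with the SAME stored factor

assert np.isclose(np.linalg.norm(y_normalized), 1.0)
assert np.allclose(y, y_restored)
```

Note that denormalization only works if you keep the factor computed from the original array; the norm of the model's output is not the same number.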

Should I normalize each label individually, or should I normalize across the entire Y dataset at once? I ask because, when it comes to denormalizing my model's output, I won't know which normalization factor (the np.linalg.norm(y) output) to use if I normalize each label individually.

Am I thinking of this the right way?

I did read this post here: Denormalization of output from neural network

It appears to address denormalization per label, but I don't understand how that would denormalize a model's output correctly if each label has its own range and the model was trained on all labels.

CodePudding user response:

If you follow the per-label approach, you have to store and apply a separate normalization factor for each label individually. Alternatively, you can normalize the entire Y dataset at once by calculating the mean and standard deviation of Y and using those two values to normalize and denormalize the data. With this approach you only need to track a single mean and standard deviation, which are the same for all labels. That is simpler, and it avoids the ambiguity of not knowing which per-label factor to apply when denormalizing the model's output.
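A minimal sketch of the dataset-wide mean/std approach, using a small random array as a hypothetical stand-in for the full Y dataset (N labels stacked along the first axis):

```python
import numpy as np

# Hypothetical stand-in for Y: 10 labels of shape (4, 4, 2),
# shrunk from (1080, 1920, 2) so the sketch runs fast
Y = np.random.rand(10, 4, 4, 2) * 80_000

# One mean/std pair for the whole dataset -- the only stats to keep
mu, sigma = Y.mean(), Y.std()

Y_norm = (Y - mu) / sigma     # normalize every label with the same stats
Y_back = Y_norm * sigma + mu  # denormalize any output with the same stats

assert np.allclose(Y, Y_back)
```

Because mu and sigma are fixed for the whole dataset, the same two numbers denormalize any prediction the model makes, regardless of which label it corresponds to.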
