Is there a shuffle mode for training a PyTorch neural network model?


I'm using the code below to train a simple neural network in PyTorch to learn a harmonic wave, but I want to turn shuffling on to improve the model. Is there any syntax for this?

import torch

model = FCN(1, 1, 50, 4)  # FCN is my custom fully connected network
optimizer = torch.optim.Adam(model.parameters(), lr=15e-3, weight_decay=15e-3/4000)

for i in range(4000):
    optimizer.zero_grad()
    yhh = model(x_data)                    # forward pass on the whole training set
    loss = torch.mean((yhh - y_data)**2)   # mean squared error
    loss.backward()
    optimizer.step()

I also tried the code below to reorder the training set randomly at each step, but the result was awful.

import numpy as np

yhh = model(x_data[np.random.choice(len(x_data), len(x_data), replace=False)])
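Presumably the problem here is that only x_data gets permuted while y_data stays in its original order, so the loss compares each prediction against the wrong target. A version that applies one shared permutation to both, keeping the rest of the loop unchanged, would look like:

perm = np.random.permutation(len(x_data))
yhh = model(x_data[perm])
loss = torch.mean((yhh - y_data[perm])**2)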

CodePudding user response:

You're probably looking for torch.utils.data.SubsetRandomSampler. For example:

import numpy as np
from torch.utils.data import SubsetRandomSampler

validation_split = 0.2
shuffle_dataset = True
random_seed = 42

dataset_size = len(dataset)
indices = list(range(dataset_size))
split = int(np.floor(validation_split * dataset_size))

if shuffle_dataset:
    np.random.seed(random_seed)   # fix the seed for a reproducible split
    np.random.shuffle(indices)
train_indices, val_indices = indices[split:], indices[:split]

train_sampler = SubsetRandomSampler(train_indices)
valid_sampler = SubsetRandomSampler(val_indices)

Note: dataset here is an instance of a custom dataset class that implements:

class MyDataset(torch.utils.data.Dataset):
    def __init__(self):
        pass

    def __len__(self):
        pass

    def __getitem__(self, idx):
        pass
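These samplers are then passed to a DataLoader in place of shuffle=True (a sampler cannot be combined with shuffle=True). A minimal sketch, with an arbitrary batch size of 64:

from torch.utils.data import DataLoader

train_loader = DataLoader(dataset, batch_size=64, sampler=train_sampler)
valid_loader = DataLoader(dataset, batch_size=64, sampler=valid_sampler)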

CodePudding user response:

Assuming your x_data is a plain torch tensor of size, say, [100], you can use torch.utils.data.DataLoader with shuffle=True to reshuffle x_data at every epoch:

bs = 20  # placeholder; specify your required batch size
dataset = torch.utils.data.TensorDataset(x_data)  # first create a dataset wrapping your tensor
dataloader = torch.utils.data.DataLoader(dataset, batch_size=bs, shuffle=True)

The dataloader object is now iterable and can be used as:

for data in dataloader:
    model(data[0])  # data[0] is a tensor of size (bs); the last batch may be smaller
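Putting this together with the training loop from the question, a sketch that shuffles inputs and targets consistently (wrapping y_data alongside x_data in the TensorDataset so the pairs stay matched) could look like:

dataset = torch.utils.data.TensorDataset(x_data, y_data)
dataloader = torch.utils.data.DataLoader(dataset, batch_size=bs, shuffle=True)

for epoch in range(4000):
    for xb, yb in dataloader:          # each iteration yields one shuffled mini-batch
        optimizer.zero_grad()
        loss = torch.mean((model(xb) - yb)**2)
        loss.backward()
        optimizer.step()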