Assigning custom weights to embedding layer in PyTorch


Does PyTorch's nn.Embedding support manually setting the embedding weights for only specific values?

I know I could set the weights of the entire embedding layer like this -

emb_layer = nn.Embedding(num_embeddings, embedding_dim)
# the parameter is named `weight`; weight_matrix is a NumPy array of shape (num_embeddings, embedding_dim)
emb_layer.weight = torch.nn.Parameter(torch.from_numpy(weight_matrix))
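
A roughly equivalent one-liner for this whole-matrix case, assuming weight_matrix is a NumPy array of shape (num_embeddings, embedding_dim), is the from_pretrained classmethod:

emb_layer = nn.Embedding.from_pretrained(torch.from_numpy(weight_matrix), freeze=False)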

But does PyTorch provide any succinct/efficient method to set the embedding weights for only one particular value?

Something like emb_layer.set_weight(5) = torch.tensor([...]) to manually set the embedding only for the value "5"?

CodePudding user response:

Yes. You can check emb_layer.weight.shape to see the shape of the weight matrix, and then access and overwrite a single entry, for example:

with torch.no_grad():
    emb_layer.weight[idx_1, idx_2] = some_value

I use two indices here since the embedding weight is two-dimensional (num_embeddings × embedding_dim): idx_1 picks the row for a token id and idx_2 a position inside that vector. To set the whole embedding vector for one id, as the question asks, a single row index is enough, as in the sketch below. One-dimensional parameters, such as a bias vector, would only require one index.
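
A minimal runnable sketch of the single-id case, with made-up sizes (10 ids, 4-dimensional vectors) chosen purely for illustration:

import torch
import torch.nn as nn

emb_layer = nn.Embedding(10, 4)  # 10 ids, 4-dimensional embedding vectors

# Overwrite only the embedding vector for id 5; torch.no_grad() keeps the
# in-place assignment out of autograd's graph
with torch.no_grad():
    emb_layer.weight[5] = torch.tensor([0.1, 0.2, 0.3, 0.4])

print(emb_layer.weight[5])  # the row now holds the custom vector

The layer stays trainable afterwards, so optimizer steps can still change the row you set; re-apply the assignment after each step if it needs to stay fixed.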
