Assigning custom weights to an embedding layer in PyTorch
Does PyTorch's nn.Embedding support manually setting the embedding weights for only specific values?

I know I could set the weights of the entire embedding layer like this:
emb_layer = nn.Embedding(num_embeddings, embedding_dim)
emb_layer.weight = torch.nn.Parameter(torch.from_numpy(weight_matrix))
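
For reference, a self-contained version of that whole-matrix approach (the sizes and array contents here are made up for illustration; nn.Embedding.from_pretrained does the same thing in one call):

import numpy as np
import torch
import torch.nn as nn

num_embeddings, embedding_dim = 10, 4  # made-up sizes for illustration
weight_matrix = np.random.rand(num_embeddings, embedding_dim).astype(np.float32)

emb_layer = nn.Embedding(num_embeddings, embedding_dim)
emb_layer.weight = nn.Parameter(torch.from_numpy(weight_matrix))

# Equivalent one-liner using the built-in constructor
# (freeze=False keeps the weights trainable):
emb_layer = nn.Embedding.from_pretrained(torch.from_numpy(weight_matrix), freeze=False)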
But does PyTorch provide any succinct/efficient method to set the embedding weights for only one particular value? Something like

emb_layer.set_weight(5) = torch.tensor([...])

to manually set the embedding only for the value "5"?
1 Answer
Yes. You can run

emb_layer.weight.shape

to see the shape of the weights, and then you can access and change a single weight, as in the sketch below. Two indices are used there because the embedding layer's weight is two-dimensional; a one-dimensional parameter, such as a Linear layer's bias, would need only one index.
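
A minimal sketch of that direct indexing (the sizes, the index 5, and the replacement values are made up for illustration; the torch.no_grad() context is used so the manual write is not tracked by autograd):

import torch
import torch.nn as nn

num_embeddings, embedding_dim = 10, 4  # made-up sizes for illustration
emb_layer = nn.Embedding(num_embeddings, embedding_dim)

# Change one scalar weight: row 5, column 0.
with torch.no_grad():
    emb_layer.weight[5, 0] = 1.0

# Or overwrite the entire embedding vector for the value 5 in one shot:
with torch.no_grad():
    emb_layer.weight[5] = torch.tensor([0.1, 0.2, 0.3, 0.4])

Note that without torch.no_grad(), an in-place assignment to a leaf parameter that requires grad raises a RuntimeError, so the wrapper is not optional here.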