PyTorch: compute the norm of batched tensors
I have a tensor t with shape (Batch_Size x Dims) and another tensor v with shape (Vocab_Size x Dims). I'd like to produce a tensor d with shape (Batch_Size x Vocab_Size), such that d[i,j] = norm(t[i] - v[j]).
Doing this for a single tensor (no batches) is trivial: d = torch.norm(v - t, dim=1), since t would be broadcast. How can I do this when the tensors have batches?
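
For concreteness, a minimal runnable sketch of the single-tensor case described in the question (the dimension sizes are made-up placeholders):

import torch

vocab_size, dims = 10, 8
t = torch.randn(dims)              # a single vector, no batch dimension
v = torch.randn(vocab_size, dims)

# t is broadcast against every row of v; dim=1 takes the norm of each row difference
d = torch.norm(v - t, dim=1)       # shape: (vocab_size,)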
Comments (1)
Insert unitary dimensions into v and t to make them (1 x Vocab_Size x Dims) and (Batch_Size x 1 x Dims) respectively. Next, take the broadcasted difference to get a tensor of shape (Batch_Size x Vocab_Size x Dims). Pass that to torch.norm along with the optional dim=2 argument so that the norm is taken along the last dimension. This will result in the desired (Batch_Size x Vocab_Size) tensor of norms.

Edit: As pointed out by @KonstantinosKokos in the comments, due to the broadcasting rules used by numpy and pytorch, the leading unitary dimension on v does not need to be explicit, i.e. you can use the simpler form shown in the sketch below.
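
A minimal runnable sketch of both variants described in this answer (the dimension sizes are made-up placeholders):

import torch

batch_size, vocab_size, dims = 4, 10, 8
t = torch.randn(batch_size, dims)
v = torch.randn(vocab_size, dims)

# Explicit version: v becomes (1 x vocab_size x dims) and t becomes (batch_size x 1 x dims);
# the difference broadcasts to (batch_size x vocab_size x dims), and dim=2 reduces the last axis.
d = torch.norm(v[None, :, :] - t[:, None, :], dim=2)

# Simpler version from the edit: the leading unitary dimension on v is added implicitly.
d = torch.norm(v - t[:, None, :], dim=2)

assert d.shape == (batch_size, vocab_size)  # one norm per (batch, vocab) pair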