Word embeddings with Google's T5?
Is it possible to generate word embeddings with Google's T5?
I'm assuming that this is possible. However, I cannot find the code I would need to generate word embeddings on the relevant GitHub (https://github.com/google-research/text-to-text-transfer-transformer) or Hugging Face (https://huggingface.co/docs/transformers/model_doc/t5) pages.
Comments (1)
Yes, that is possible. Just feed the ids of the words to the word embedding layer:
The output tensor has shape (batch_size, num_tokens, embedding_dim): one vector per token.
The 19 vectors are the representations of the individual tokens. Depending on your task, you can map them back to the individual words with the tokenizer's word_ids():