Using an embedding layer with one-hot encoded input



Let's assume that I have one-hot encoded input data. I will give an example from that Stack Exchange link.
Suppose that we have just 2 sentences:

Hope to see you soon
Nice to see you again

Therefore, we have 7 different words, let's say their unique integer indexes (tokenized representations) are:

[0, 1, 2, 3, 4]

[5, 1, 2, 3, 6]
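For concreteness, here is a minimal sketch of how such indices could be produced with a hand-built vocabulary; the word-to-index mapping below is just an assumption, chosen so that it reproduces the indices above:

# Build a word-to-index vocabulary by hand; the ordering is arbitrary,
# picked only so that it matches the indices shown above.
sentences = ["Hope to see you soon", "Nice to see you again"]
vocab = {"Hope": 0, "to": 1, "see": 2, "you": 3, "soon": 4, "Nice": 5, "again": 6}
tokenized = [[vocab[word] for word in sentence.split()] for sentence in sentences]
print(tokenized)  # [[0, 1, 2, 3, 4], [5, 1, 2, 3, 6]]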

Thus the one-hot representation of the first sentence, Hope to see you soon, is:

[[1,0,0,0,0,0,0],
 [0,1,0,0,0,0,0],
 [0,0,1,0,0,0,0],
 [0,0,0,1,0,0,0],
 [0,0,0,0,1,0,0]]
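As a sanity check, this matrix can be reproduced from the integer indices, for example with tf.one_hot (a small sketch assuming the tokenized indices above):

import tensorflow as tf

indices = [0, 1, 2, 3, 4]               # "Hope to see you soon"
one_hot = tf.one_hot(indices, depth=7)  # shape (5, 7), one row per word
print(one_hot.numpy().astype(int))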

Now, I know the working principle of the Keras Embedding layer. It works like a lookup table, which is why all the examples I can find use tokenized (integer-index) inputs. The same link also explains how the embedding layer works.
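That lookup behaviour can be seen by calling an Embedding layer directly on the integer indices; a small sketch, where output_dim=3 is an arbitrary choice for illustration:

import tensorflow as tf

embedding = tf.keras.layers.Embedding(input_dim=7, output_dim=3)
tokens = tf.constant([[0, 1, 2, 3, 4],
                      [5, 1, 2, 3, 6]])  # shape (batch=2, seq_len=5)
vectors = embedding(tokens)              # shape (2, 5, 3): one 3-d vector per token
print(vectors.shape)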

However, I wonder: if I give one-hot encoded inputs to the Keras Embedding layer, does it work properly? I understand that it works with either tokenized inputs or one-hot encoded inputs. Also, does it create a performance difference?
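For reference, the relationship behind the question can be checked numerically: looking up an index in the embedding table gives the same vector as multiplying that index's one-hot row with the embedding weight matrix. A minimal sketch of that equivalence (not taken from the linked thread, just an illustration with the same toy vocabulary):

import tensorflow as tf

embedding = tf.keras.layers.Embedding(input_dim=7, output_dim=3)

tokens = tf.constant([0, 1, 2, 3, 4])    # integer route: direct lookup
looked_up = embedding(tokens)            # shape (5, 3); also builds the layer
weights = embedding.get_weights()[0]     # the (7, 3) lookup table as a NumPy array

one_hot = tf.one_hot(tokens, depth=7)    # one-hot route: (5, 7) matrix
multiplied = tf.matmul(one_hot, weights) # (5, 7) @ (7, 3) -> (5, 3)

# Both routes yield the same vectors; the one-hot route just performs a larger matmul.
print(tf.reduce_all(tf.abs(looked_up - multiplied) < 1e-6).numpy())  # True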
