What is the purpose of the threshold in a perceptron?

Posted 2024-11-17 22:24:10


I'm having trouble seeing what the threshold actually does in a single-layer perceptron. The data is usually separated no matter what the value of the threshold is. It seems a lower threshold divides the data more equally; is this what it is used for?


Comments (3)

故人的歌 2024-11-24 22:24:11


Actually, you only set a threshold when you aren't using a bias. Otherwise, the threshold is 0.

Remember that a single neuron divides your input space with a hyperplane. OK?

Now imagine a neuron with 2 inputs X = [x1, x2], 2 weights W = [w1, w2], and a threshold TH. This equation shows how the neuron works:

x1·w1 + x2·w2 = TH

which is equivalent to:

x1·w1 + x2·w2 - 1·TH = 0

That is, this is the hyperplane equation that divides the input space.

Notice that this neuron only works if you set the threshold manually. The solution is to turn TH into another weight, so:

x1·w1 + x2·w2 - 1·w0 = 0

where the term 1·w0 is your BIAS. Now you can still draw a plane in your input space without manually setting a threshold (i.e., the threshold is always 0). But if you do set the threshold to some other value, the weights will simply adapt themselves to adjust the equation, i.e., the weights (INCLUDING the bias) absorb the threshold's effect.
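The equivalence described above can be sketched in a few lines of Python. This is an illustrative example, not code from the answer; the function names are made up. It shows the same decision boundary written two ways: once with a manually set threshold, and once with the threshold folded into a bias weight so that the comparison is always against 0.

```python
def predict_with_threshold(x, w, threshold):
    # Fires when the weighted sum reaches the manually chosen threshold:
    # x1*w1 + x2*w2 >= TH
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s >= threshold else 0

def predict_with_bias(x, w, w0):
    # Same boundary with the threshold turned into a bias weight:
    # x1*w1 + x2*w2 - 1*w0 >= 0, so the threshold is always 0.
    s = sum(wi * xi for wi, xi in zip(w, x)) - w0
    return 1 if s >= 0 else 0

x = [0.5, 0.8]
w = [1.0, 2.0]
# Setting w0 equal to the old threshold reproduces its decision exactly.
print(predict_with_threshold(x, w, 1.5) == predict_with_bias(x, w, 1.5))  # True
```

During training, w0 is learned like any other weight, which is exactly why the bias can "absorb" whatever threshold you would otherwise have to pick by hand.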

初与友歌 2024-11-24 22:24:11


The sum of the products of the weights and the inputs is calculated in each node, and if the value is above some threshold (typically 0) the neuron fires and takes the activated value (typically 1); otherwise it takes the deactivated value (typically -1). Neurons with this kind of activation function are also called artificial neurons or linear threshold units.
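The linear threshold unit described above can be sketched directly; this is a hypothetical illustration (the function name and sample values are not from the answer), using the typical values mentioned: threshold 0, activated value 1, deactivated value -1.

```python
def threshold_unit(inputs, weights, threshold=0.0):
    # Sum of the products of the weights and the inputs.
    s = sum(w * x for w, x in zip(weights, inputs))
    # Fire (activated value +1) if above the threshold,
    # otherwise take the deactivated value -1.
    return 1 if s > threshold else -1

print(threshold_unit([1.0, 1.0], [0.5, 0.25]))   # sum = 0.75 > 0, prints 1
print(threshold_unit([1.0, -2.0], [0.5, 0.25]))  # sum = 0.0, not above 0, prints -1
```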

開玄 2024-11-24 22:24:11


I think I understand now, with help from Daok. I just wanted to add information for other people to find.

The equation for the separator of a single-layer perceptron is

Σ wjxj + bias = threshold

This means that if the input is above the threshold, i.e.

Σ wjxj + bias > threshold, it gets classified into one category, and if

Σ wjxj + bias < threshold, it gets classified into the other.

The bias and the threshold really serve the same purpose: to translate the line (see Role of Bias in Neural Networks). Being on opposite sides of the equation, though, they are "negatively proportional".

For example, if the bias were 0 and the threshold 0.5, this would be equivalent to a bias of -0.5 and a threshold of 0.
