A simple explanation of artificial neural networks?

Posted on 2024-08-28 06:41:06 · 1536 characters · 4 views · 0 comments


Comments (2)

也只是曾经 2024-09-04 06:41:06

Summation is just adding up a bunch of things. So,

Summation(1,2,3,4,5) = 1+2+3+4+5 = 15

(note: it's always adding: if you want to subtract, do a summation with negative numbers)

That was easy, right? ;)
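In code, summation is just adding a sequence of numbers together. A quick sketch in Python (the answer itself names no language; Python is used here purely for illustration):

```python
# Summation just adds up a bunch of numbers.
values = [1, 2, 3, 4, 5]
total = sum(values)
print(total)  # prints 15

# There is no separate "subtraction": sum with negative numbers instead.
print(sum([10, -3, -2]))  # prints 5
```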

A vector is an ordered tuple, which really just means it's a bunch of numbers in a specific order. Most often seen in physics to describe position, force, velocity, etc... it's really nothing special, just some ordered numbers, where the ordering is significant:

v = <1,2,3>

If we are talking about geometry, then this vector represents a point in 3-dimensional space where the x coordinate is 1, the y coordinate is 2, and the z coordinate is 3 (See that was easy too, right)?
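The same vector can be sketched as a tuple, where the position of each number encodes which coordinate it is (again a hypothetical Python illustration):

```python
# An ordered tuple: index 0 is x, index 1 is y, index 2 is z.
v = (1, 2, 3)
x, y, z = v
print(x, y, z)  # prints 1 2 3

# Order matters: the same numbers in a different order name a different point.
print(v == (3, 2, 1))  # prints False
```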

In neural nets, the vector is usually the vector of inputs to a neuron, so it's really just a list of numeric values. The summation of the vector would be nothing more than adding up all of the values in the vector and getting a single number as a result (which may be referred to as a "scalar" value).
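Collapsing a neuron's input vector to one scalar looks like this (the input values are made up for illustration; note that real neurons typically multiply each input by a weight before summing, which this simplified sketch omits, matching the answer's simplification):

```python
# Hypothetical input vector for one neuron.
inputs = [2, -1, 3, 1]

# Summation collapses the vector to a single scalar value.
scalar = sum(inputs)
print(scalar)  # prints 5
```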

(this was rushed and simplified - I'm sure someone else will help me refine it ;) )

PS. Kudos to you for diving into this stuff at the middle school level! :)

西瓜 2024-09-04 06:41:06


I've had the same problem for a while. I'm a high school student, so you're a little ahead of me. I got a vacation and I used it to learn all I could on backpropagation, and I've found almost no resources that really help too much unless you want to read so much calculus that you want to die. My advice is to first write a perceptron, which is a network with only input layers and output layers. This inspired me to write a post, so hopefully within half an hour of my posting here there should be a tutorial on http://certioraomnia.blogspot.com/. It may be a little late as this question was asked three years ago, but it may help others later.
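The perceptron this answer recommends writing first can be sketched in a few lines. This is a hypothetical minimal version (not the linked tutorial's code), trained here on the logical AND function:

```python
# Minimal single-layer perceptron: weighted sum of inputs, step activation,
# and the classic perceptron learning rule. Integer weights keep it exact.

def train_perceptron(samples, epochs=10, lr=1):
    w = [0, 0]  # one weight per input
    b = 0       # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # weighted sum, then a hard threshold
            s = w[0] * x1 + w[1] * x2 + b
            output = 1 if s > 0 else 0
            error = target - output
            # nudge each weight toward the target
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Truth table for logical AND: output 1 only when both inputs are 1.
and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
for (x1, x2), target in and_samples:
    print((x1, x2), predict(w, b, x1, x2))
```

AND is linearly separable, so the perceptron is guaranteed to converge on it; something like XOR would not work, which is a good next experiment once the single-layer version runs.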
