Can filterKeys cause a stack overflow?

Posted 2025-01-01 05:25:45


As I understand it, the filterKeys method of MapLike creates a wrapper over the original map. So if I execute the code below, m will be a chain of 10K wrappers over the original map.

var m = Map(1 -> "one", 2 -> "two")
for(1 <- 0 until 10000) {m = m.filterKeys(_%2 == 0)}

I expected an invocation of m.contains to cause a stack overflow, but it does not happen. Could you explain what is going on in this case?


Comments (3)

瘫痪情歌 2025-01-08 05:25:45


I could not reproduce the stack overflow, but here's what happens:

override def filterKeys(p: A => Boolean): Map[A, B] = new DefaultMap[A, B] {
  override def foreach[C](f: ((A, B)) => C): Unit = for (kv <- self) if (p(kv._1)) f(kv)
  def iterator = self.iterator.filter(kv => p(kv._1))
  override def contains(key: A) = self.contains(key) && p(key)
  def get(key: A) = if (!p(key)) None else self.get(key)
}

Note that there's no copying of values: the new class simply adds rules to four methods that change how they work to reflect the filter you added. Since you repeatedly apply filterKeys (10000 times), that means you have 10000 wrapper instances, each pointing to the previous one, and the first pointing to your original map.

Calling one of the above methods (directly or indirectly) on the final class will, therefore, call 10000 nested methods, which can certainly produce a stack overflow if your stack is small enough.
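The growing call depth can be demonstrated without the library internals. Below is a minimal sketch (not the actual MapLike code; the object and class names are mine) that mimics how each filterKeys layer delegates `contains` to the layer beneath it:

```scala
object FilterKeysDepth {
  // One layer per filterKeys call: `underlying` is the previous layer.
  final class Wrapped(underlying: Int => Boolean, p: Int => Boolean)
      extends (Int => Boolean) {
    // Like the wrapper's `contains`: delegate down, then apply this layer's predicate.
    def apply(key: Int): Boolean = underlying(key) && p(key)
  }

  // Build `layers` wrappers over a two-element base, as in the question.
  def chain(layers: Int): Int => Boolean = {
    val base: Int => Boolean = Set(1, 2)
    (1 to layers).foldLeft(base)((m, _) => new Wrapped(m, (k: Int) => k % 2 == 0))
  }

  def main(args: Array[String]): Unit = {
    val m = chain(10000)
    // Every lookup descends through all 10000 layers before reaching the base;
    // with a small enough stack this is where the StackOverflowError would appear.
    println(m(2)) // true
    println(m(1)) // false
  }
}
```

Each `apply` consumes one stack frame, so lookup cost and stack depth grow linearly with the number of `filterKeys` calls.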

诗化ㄋ丶相逢 2025-01-08 05:25:45

I was able to make it throw a StackOverflowError like this (Scala 2.9.1):

scala> var m = Map(1 -> "one", 2 -> "two")
scala> for (_ <- 0 until 1000000) { m = m.filterKeys(_ % 2 == 0) }
scala> m.contains(1)
huge stack trace

You can, of course, avoid the stack trace by forcing filterKeys to actually do its work at each step:

scala> var m = Map(1 -> "one", 2 -> "two")
scala> for (_ <- 0 until 1000000) { m = Map() ++ m.filterKeys(_ % 2 == 0) }
scala> m.contains(1)
res1: Boolean = false
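On newer Scala versions (2.13+), `filterKeys` on `Map` is deprecated in favor of the view-based form, but the same "force it at each step" idea applies. A sketch, assuming Scala 2.13 (the object and method names are mine):

```scala
object ForcedFilter {
  // Apply the filter strictly at each step: `.toMap` copies the surviving
  // entries, so no wrapper chain builds up and lookup depth stays constant.
  def repeatedlyFiltered(times: Int): Map[Int, String] = {
    var m: Map[Int, String] = Map(1 -> "one", 2 -> "two")
    for (_ <- 0 until times)
      m = m.view.filterKeys(_ % 2 == 0).toMap
    m
  }

  def main(args: Array[String]): Unit = {
    val m = repeatedlyFiltered(1000000)
    println(m.contains(1)) // false: key 1 is dropped on the first pass
    println(m)             // Map(2 -> two)
  }
}
```

Because each iteration materializes a strict map, `m.contains` after the loop is a single ordinary lookup regardless of how many times the filter was applied.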

动听の歌 2025-01-08 05:25:45


If I copy it verbatim, your loop body executes only once. Because of this you are creating only a single wrapper, so what was intended to be a chain of 10000 wrappers is just a chain of 1. It is probably a typo, but the loop

for(1 <- 0 until 10000) {m = m.filterKeys(_%2 == 0)}

should probably be

for(i <- 0 until 10000) {m = m.filterKeys(_%2 == 0)}

Aside from that, Daniel is right; filterKeys does produce what is essentially a wrapper. It took me more than 10k iterations, but I did manage to create a StackOverflowError.
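Why does the `1 <-` version run only once? In Scala 2, a literal on the left of `<-` is a pattern, and a for-generator silently filters out elements that don't match it. A minimal sketch assuming Scala 2 semantics (the object name is mine):

```scala
object PatternGenerator {
  // The literal 1 on the left of `<-` is a *pattern*: the generator keeps
  // only the elements of the range that are equal to 1.
  def matchingCount(n: Int): Int = {
    var count = 0
    for (1 <- 0 until n) count += 1 // body runs only for the element 1
    count
  }

  def main(args: Array[String]): Unit =
    println(matchingCount(10000)) // 1
}
```

So the body of the question's loop executes exactly once, for the element 1, and only one wrapper is created. (Scala 3 tightens this up and asks for an explicit `case` on refutable patterns in generators.)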
