Why doesn't Ada have a garbage collector?
I know GC wasn't popular in the days when Ada was developed, and for the main use case of embedded programming it is still not a good choice.
But considering that Ada is a general-purpose programming language, why wasn't a partial and optional garbage collector (one that traces only explicitly tagged memory objects) introduced in later revisions of the language and the compiler implementations?
I simply can't imagine developing a normal desktop application without a garbage collector anymore.
Comments (7)
Ada was designed with military applications in mind. One of the big priorities in its design was determinism, i.e. one wanted an Ada program to consistently perform exactly the same way every time, in any environment, under all operating systems... that kinda thing.
A garbage collector turns one application into two, working against one another. Java programs develop hiccups at random intervals when the GC decides to go to work, and if it's too slow about it there's a chance that an application will run out of heap sometimes and not others.
Simplified: A garbage collector introduces some variability into a program that the designers didn't want. You make a mess - you clean it up! Same code, same behavior every time.
Not that Ada became a raging worldwide success, mind you.
Because Ada was designed for use in defense systems which control weapons in real time, and garbage collection interferes with the timing of your application. This is dangerous, which is why, for many years, Java came with a warning that it was not to be used for healthcare and military control systems.
I believe the reason there is no longer such a disclaimer with Java is that the underlying hardware has become much faster, and that Java now has better GC algorithms and better control over GC.
Remember that Ada was developed in the 1970s and 1980s, at a time when computers were far less powerful than they are today, and in control applications timing issues were paramount.
First off, there is nothing in the language that really prohibits garbage collection.
Secondly, some implementations do perform garbage collection. In particular, all the implementations that target the JVM garbage collect.
Thirdly, there is a way to get some amount of garbage collection with all compilers. You see, when an access type goes out of scope, if you specifically told the language to set aside a certain amount of space for storage of its objects, then that space will be destroyed at that point. I've used this in the past to get some modicum of garbage collection. The declaration voodoo you use is:
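(A minimal sketch of that declaration; Blah here stands for whatever object type the access type designates.)

    type Foo is access Blah;
    for Foo'Storage_Size use 100_000;  --  reserve roughly 100K for objects allocated via Foo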
If you do this, then all (100K of) memory allocated to Blah objects pointed to by Foo pointers will be cleaned up when the Foo type goes out of scope. Since Ada allows you to nest subroutines inside other subroutines, this is particularly powerful.
To see more about what storage_size and storage pools can do for you, see LRM 13.11.
Fourthly, well-written Ada programs don't tend to rely on dynamic memory allocation nearly as much as C programs do. C had a number of design holes that practitioners learned to use pointers to paint over. A lot of those idioms aren't necessary in Ada.
The answer is more complicated: Ada does not require a garbage collector, because of real-time constraints and such. However, the language has been cleverly designed so as to allow the implementation of a garbage collector.
Although many (almost all) compilers do not include a garbage collector, there are some notable implementations that do.
There are plenty of other sources about garbage collection in Ada around the web. The subject has been discussed at length, mainly because of the fierce competition with Java in the mid-'90s (have a look at this page:
"Ada 95 is what the Java language should have been"
), when Java was "The Next Big Thing", before Microsoft developed C#.
I thought I'd share a really simple example of how to implement a Free() procedure (which would be used in a way familiar to all C programmers)...
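A minimal, self-contained sketch of such a program (the unit and variable names here are illustrative):

    with Ada.Text_IO;
    with Ada.Unchecked_Deallocation;

    procedure Free_Demo is
       type Int_Access is access Integer;

       --  Instantiate the generic Ada.Unchecked_Deallocation as a Free
       --  procedure for this particular access type.
       procedure Free is new Ada.Unchecked_Deallocation
         (Object => Integer, Name => Int_Access);

       P : Int_Access := new Integer'(42);
    begin
       Ada.Text_IO.Put_Line (Integer'Image (P.all));
       Free (P);  --  returns the Integer to the storage pool and sets P to null
    end Free_Demo;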
Calling Free at the end of the program will return the allocated Integer to the Storage Pool ("heap" in C parlance). You can use valgrind to demonstrate that this does in fact prevent 4 bytes of memory being leaked.
Ada.Unchecked_Deallocation (a generically defined procedure) can be used on (I think) any type that may be allocated using the "new" keyword. The Ada Reference Manual ("13.11.2 Unchecked Storage Deallocation") has more details.
First off, I'd like to know who's using Ada these days. I actually like the language, and there's even a GUI library for Linux/Ada, but I haven't heard anything about active Ada development for years. Thanks to its military connections, I'm really not sure if it's ancient history or so wildly successful that all mention of its use is classified.
I think there are a couple of reasons for no GC in Ada. First, and foremost, it dates back to an era when most compiled languages used primarily stack or static memory, or in a few cases, explicit heap allocate/free. GC as a general philosophy really only took off around 1990 or so, when OOP, improved memory management algorithms and processors powerful enough to spare the cycles to run it all came into their own. What merely compiling Ada could do to an IBM 4331 mainframe in 1989 was simply merciless. Now I have a cell phone that can outperform that machine's CPU.
Another good reason is that there are people who think that rigorous program design includes precise control over memory resources, and that there shouldn't be any tolerance for letting dynamically-acquired objects float. Sadly, far too many people ended up leaking memory as dynamic memory became more and more the rule. Plus, like the "efficiency" of assembly language over high-level languages, and the "efficiency" of raw JDBC over ORM systems, the "efficiency" of manual memory management tends to invert as it scales up (I've seen ORM benchmarks where the JDBC equivalent was only half as efficient). Counter-intuitive, I know, but these days systems are much better at globally optimizing large applications, plus they're able to make radical re-optimizations in response to superficially minor changes, including dynamically re-balancing algorithms on the fly based on detected load.
I'm afraid I'm going to have to differ with those who say that real-time systems can't afford GC memory. GC is no longer something that freezes the whole system every couple of minutes. We have much more intelligent ways to reclaim memory these days.
Your question is incorrect. It does. See the package ada.finalization, which handles GC for you.
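For a rough idea of what that package offers, here is a minimal sketch (illustrative names; needs Ada 2005 or later because the controlled type is declared inside a subprogram) of a controlled type whose Finalize is run automatically when its objects go out of scope:

    with Ada.Finalization;
    with Ada.Text_IO;

    procedure Finalization_Demo is

       --  A controlled type: Finalize is invoked automatically whenever an
       --  object of this type ceases to exist.
       type Handle is new Ada.Finalization.Controlled with null record;

       overriding procedure Finalize (Object : in out Handle) is
       begin
          Ada.Text_IO.Put_Line ("Handle finalized");
       end Finalize;

    begin
       declare
          H : Handle;
       begin
          Ada.Text_IO.Put_Line ("Inner block running");
       end;  --  Finalize (H) runs here, as the inner block is left
       Ada.Text_IO.Put_Line ("Back in the outer block");
    end Finalization_Demo;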