Handling large .plist files with CFPropertyList
I'm using CFPropertyList from https://github.com/rodneyrehm/CFPropertyList to handle content I add with PHP.
It all worked fine, but now that all the content is added, my file is about 700 KB, which is not big but seems big enough to make Apache crash when trying to save the file:
child pid 1278 exit signal Segmentation fault
I see in CacheGrind that a lot of time in my application is spent in calls to CFPropertyList->import() and CFDictionary->toXML(), so where could the bottleneck be?
Am I making too many changes at once? Should I load() and save() between changes more often to avoid saving too many changes at once?
Any clue?
Comments (2)
I do not think it's the size that causes the problem, but rather a bug in PHP. Segfaults occur only when there is a serious bug in PHP itself.
The next steps:
When you implement a searchNode() function on a document of unknown size, you should always use a "depth" parameter to avoid stepping ever deeper into the document and calling your function an enormous number of times in a recursive loop.
Otherwise you can create an effectively unbounded recursion, which also causes a segfault in PHP rather than ending with a fatal error or warning.
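A minimal sketch of that idea, using a hypothetical searchNode() over a plain nested array (the function name and structure are assumptions for illustration, not part of CFPropertyList's API): the $depth guard turns runaway recursion into a catchable exception instead of a stack exhaustion that segfaults PHP.

```php
<?php
// Hypothetical searchNode(): recursively look up $key in a nested array,
// with a $depth guard so excessive recursion fails loudly instead of
// blowing the stack and crashing the PHP process.
function searchNode(array $node, string $key, int $depth = 0, int $maxDepth = 100)
{
    if ($depth > $maxDepth) {
        throw new RuntimeException("searchNode: maximum depth of $maxDepth exceeded");
    }
    foreach ($node as $k => $value) {
        if ($k === $key) {
            return $value;
        }
        if (is_array($value)) {
            $found = searchNode($value, $key, $depth + 1, $maxDepth);
            if ($found !== null) {
                return $found;
            }
        }
    }
    return null; // not found in this subtree
}

// Example: find a key in a plist-like nested structure.
$tree = ['dict' => ['CFBundleVersion' => '1.2.3', 'items' => ['a', 'b']]];
var_dump(searchNode($tree, 'CFBundleVersion')); // string(5) "1.2.3"
```

The same pattern applies to any function you write that walks a property list of unknown nesting: pass the depth down on every recursive call and pick a limit well above anything a legitimate document would reach.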