Is there a limit on the size of data uploaded to Plone?
I developed a Plone site and I am using the csvreplicata add-on to upload data from CSV files. I managed to upload a file of about 6,000 rows, but when I try to upload a CSV with about 120,000 rows the Plone site hangs. Does anyone know if there is a limit on the size of data uploaded?
Thanks in advance
1 Answer
Looking at the product code, I can't see any transaction savepoints, so I think the real issue here is that your server runs out of memory because of the huge size of the single transaction, and it hangs because it starts swapping. Try to monitor your server's memory usage and, if necessary, add this code every N elements processed during your import:
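(The snippet itself did not survive on this page; below is a minimal sketch using the standard Zope transaction API, where the per-row loop, the import_row helper, and the batch size N are hypothetical placeholders for csvreplicata's own import logic.)

    import transaction

    N = 1000  # hypothetical batch size; tune it against your memory budget
    for i, row in enumerate(reader):  # "reader" stands in for the add-on's csv.reader loop
        import_row(row)               # placeholder for the per-row import logic
        if i and i % N == 0:
            # Flush pending changes from memory to the ZODB's temporary
            # storage on disk, without committing the transaction.
            transaction.savepoint(optimistic=True)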
This creates a "sub-transaction" within the main transaction, moving pending data from memory to the hard disk.
Be sure to choose an appropriate value for N: with very large transactions you risk saving memory only to run out of disk space, because each savepoint multiplies the amount of pending data stored on the hard disk.
More info: http://docs.zope.org/zope3/Book/zodb/savepoints/show.html