Simple MySQL SELECT query exceeds allowed memory size of 256 MB
EDIT: For anyone finding this at a later date: CI will use a lot of memory in these sorts of circumstances, since it creates an object for every row (using result_array() doesn't seem much better), so the best bet is just to use PHP's built-in mysql functions. If you're using MySQLi, you can access the connection link like so:
$this->db->conn_id
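For illustration, a minimal sketch of that approach (this assumes CI's mysqli driver, so conn_id is a mysqli object; the unbuffered MYSQLI_USE_RESULT mode means PHP only ever holds the current row, not the whole result set):

$mysqli = $this->db->conn_id; // the underlying mysqli connection object

$result = $mysqli->query(
    'SELECT * FROM graphic_files WHERE graphic_file_enabled = 0 AND graphic_file_style_id = 5',
    MYSQLI_USE_RESULT // unbuffered: rows stay on the server until fetched
);

while ($row = $result->fetch_assoc()) {
    // ... process one row at a time here ...
}

$result->free(); // an unbuffered result must be freed before the next query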
I'm trying to run a script through the command line (testing what will be a cronjob). What the script does is irrelevant, since it fails at the first SELECT and doesn't get any further.
I am using CodeIgniter 2.0.3.
My table looks like this:
CREATE TABLE IF NOT EXISTS `graphic_files` (
`graphic_file_id` int(11) unsigned NOT NULL AUTO_INCREMENT,
`graphic_file_style_id` tinyint(2) unsigned NOT NULL,
`graphic_file_fm_id` bigint(20) unsigned DEFAULT NULL,
`graphic_file_config_line` varchar(255) NOT NULL,
`graphic_file_config_line_hash` varchar(32) NOT NULL,
`graphic_file_location` varchar(255) DEFAULT NULL,
`graphic_file_pack_id` int(10) unsigned NOT NULL,
`graphic_file_enabled` tinyint(1) NOT NULL,
`graphic_file_alternative` tinyint(1) NOT NULL,
`graphic_file_version` decimal(4,2) NOT NULL,
`graphic_file_time_added` int(11) unsigned NOT NULL,
`graphic_file_time_modified` int(11) unsigned NOT NULL,
`graphic_file_size` int(11) unsigned NOT NULL,
PRIMARY KEY (`graphic_file_id`),
KEY `graphic_file_style_id` (`graphic_file_style_id`),
KEY `graphic_file_fm_id` (`graphic_file_fm_id`),
KEY `graphic_file_config_line_hash` (`graphic_file_config_line_hash`),
KEY `graphic_file_pack_id` (`graphic_file_pack_id`),
KEY `graphic_file_enabled` (`graphic_file_enabled`),
KEY `graphic_file_version` (`graphic_file_version`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 AUTO_INCREMENT=240752 ;
There are 240,000 rows.
I am trying to select around 120,000 of them with this query:
SELECT * FROM graphic_files WHERE graphic_file_enabled = 0 AND graphic_file_style_id = 5
But I get an allowed memory size error like so:
Allowed memory size of 268435456 bytes exhausted (tried to allocate 92 bytes) in xxx/codeigniter_2.0.3/database/drivers/mysqli/mysqli_result.php on line 167
I realise the simple answer is that I'm out of memory, but that seems ridiculous for simply doing a SELECT query, especially with an allowed memory size as high as 256 MB.
Can anybody suggest a reason for this? Could it be to do with CodeIgniter and the way it builds the results object?
2 Answers
OK, assuming we're dealing with the bare minimum (just using the declared sizes here), each row of data is 624 bytes. That's a bit of an understatement, because many of those variable-width fields take additional space to note how large they actually are. Given that all of them are going to be converted to an internal PHP representation, we can probably tack on a bit more there (results are returned as an array, possibly a hash depending on settings). In actuality we're probably looking at a total of almost 2 KB per record when all is said and done.
Also, we're dealing with 120,000 of them. 120000 * 2048 = 245760000 bytes = 234.4 MB of data.
PHP has its overhead, and so does CodeIgniter. Together, that's more than enough to push you over the memory limit.
If you want a better estimate of how much memory is being used, crank up the memory limit, run your select query, and then check memory_get_usage().
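Something like this (a rough sketch; the 1 GB limit is arbitrary, just high enough for the script to survive long enough to measure):

ini_set('memory_limit', '1024M'); // temporarily raised, for measurement only

$before = memory_get_usage();
$query  = $this->db->query('SELECT * FROM graphic_files WHERE graphic_file_enabled = 0 AND graphic_file_style_id = 5');
$rows   = $query->result_array();
$after  = memory_get_usage();

echo 'Result set used roughly ' . round(($after - $before) / 1048576, 1) . " MB\n";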
To cut down on memory usage, you can thin down the number of rows being selected by adding extra WHERE clauses, select only the columns you need instead of all of them, or use a LIMIT clause. If you go the LIMIT route, you can still process all the records and all the columns, but in chunks: each SELECT returns a limited number of rows, say 100, and each subsequent call resumes where the previous one left off, as sketched below. That way you never have more than 100 rows of data in memory at a given time.
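A rough sketch of that chunked approach (column names are taken from the table above; which columns you actually need is up to you). Here each batch resumes from the last primary key seen rather than using an OFFSET, which amounts to the same thing but stays fast on later batches:

$chunk   = 100;
$last_id = 0;

do {
    $query = $this->db->query(
        'SELECT graphic_file_id, graphic_file_location
           FROM graphic_files
          WHERE graphic_file_enabled = 0
            AND graphic_file_style_id = 5
            AND graphic_file_id > ?
          ORDER BY graphic_file_id
          LIMIT ' . (int) $chunk,
        array($last_id)
    );
    $rows = $query->result_array();

    foreach ($rows as $row) {
        // ... process the row ...
        $last_id = $row['graphic_file_id'];
    }

    $query->free_result(); // release this batch before fetching the next
} while (count($rows) === $chunk);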
With big data you need to use the raw MySQL resource instead of the CI functions.