How do I insert large files into a MySQL database using PHP?

Posted 2024-07-12 13:25:57

I want to upload a large file of maximum size 10MB to my MySQL database. Using .htaccess I changed PHP's own file upload limit to "10485760" = 10MB. I am able to upload files up to 10MB without any problem.

But I can not insert the file in the database if it is more than 1 MB in size.

I am using file_get_contents to read all file data and pass it to the insert query as a string to be inserted into a LONGBLOB field.

But files bigger than 1 MB are not added to the database, although I can use print_r($_FILES) to make sure that the file is uploaded correctly. Any help will be appreciated and I will need it within the next 6 hours. So, please help!

Comments (7)

爱已欠费 2024-07-19 13:25:57

You will want to check the MySQL configuration value "max_allowed_packet", which might be set too small, preventing the INSERT (which is large itself) from happening.

Run the following from a mysql command prompt:

mysql> show variables like 'max_allowed_packet';

Make sure it's large enough. For more information on this config option, see

MySQL max_allowed_packet

This also impacts mysql_escape_string() and mysql_real_escape_string() in PHP, limiting the size of the string they can create.
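If the value turns out to be too small, it can be raised in the server's my.cnf (my.ini on Windows). A sketch of the relevant fragment; 16M is only an illustrative value and must exceed your largest single INSERT:

```ini
# my.cnf -- [mysqld] section; restart the server after changing this.
# 16M is an example value: it must be larger than your biggest blob INSERT.
[mysqld]
max_allowed_packet=16M
```

It can also be changed at runtime with `SET GLOBAL max_allowed_packet = 16777216;`, but that does not survive a server restart.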

浅笑轻吟梦一曲 2024-07-19 13:25:57

As far as I know, it's generally quicker and better practice not to store the file in the db, as the db will get massive very quickly and slow down. It's best to store the file in a directory and just store the file's location in the db.

We do this for images/pdfs/mpegs etc. in the CMS we have at work: we create a folder for the file, named from the url-safe filename, and store the folder name in the db. It's then easy to write out its url in the presentation layer.
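A minimal sketch of that approach. The uploads/ directory, the `files` table and its columns are illustrative assumptions, not from the original answer:

```php
<?php
// Turn an arbitrary uploaded filename into a url-safe one.
function url_safe_name(string $original): string {
    $base = pathinfo($original, PATHINFO_FILENAME);
    $ext  = strtolower(pathinfo($original, PATHINFO_EXTENSION));
    $slug = strtolower(trim(preg_replace('/[^A-Za-z0-9]+/', '-', $base), '-'));
    return $ext !== '' ? "$slug.$ext" : $slug;
}

// Move the upload out of PHP's temp dir and store only its path in the db.
// The uploads/ directory and column names are illustrative assumptions.
if (isset($_FILES['upload'])) {
    $name = url_safe_name($_FILES['upload']['name']);
    move_uploaded_file($_FILES['upload']['tmp_name'], 'uploads/' . $name);
    // INSERT INTO files (name, path) VALUES (?, ?) via a prepared statement,
    // storing 'uploads/' . $name instead of the file contents.
}
```

The db row then stays tiny no matter how large the file is, and the web server streams the file itself.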

Some PHP extensions for MySQL have issues with LONGBLOB and LONGTEXT data types. The extensions may not support blob streaming (posting the blob one segment at a time), so they have to post the entire object in one go.

So if PHP's memory limit or MySQL's packet size limit restrict the size of an object you can post to the database, you may need to change some configuration on either PHP or MySQL to allow this.

You didn't say which PHP extension you're using (there are at least three for MySQL), and you didn't show any of the code you're using to post the blob to the database.
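For what it's worth, the mysqli extension does let you send a blob in pieces via mysqli_stmt::send_long_data(), so the whole file never has to sit in one query string (each piece is still subject to max_allowed_packet). A sketch; the credentials and the `files` table are assumptions:

```php
<?php
// Sketch: stream an uploaded file into a LONGBLOB with mysqli, 512 KB at a time.
// Connection credentials and the `files` table are illustrative assumptions.
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$stmt   = $mysqli->prepare('INSERT INTO files (name, data) VALUES (?, ?)');

$null = null;
$stmt->bind_param('sb', $name, $null);   // 'b' = blob, supplied via send_long_data()
$name = $_FILES['upload']['name'];

$fp = fopen($_FILES['upload']['tmp_name'], 'rb');
while (!feof($fp)) {
    $stmt->send_long_data(1, fread($fp, 512 * 1024)); // parameter index 1 = the blob
}
fclose($fp);

$stmt->execute();
$stmt->close();
```

This requires a live MySQL server, so treat it as a starting point rather than a drop-in solution; the older mysql_* extension has no equivalent.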

╭⌒浅淡时光〆 2024-07-19 13:25:57

The best answer is to use an implementation that is better and also works around that issue.
You can read an article here. Store 10MB, 1000MB, it doesn't matter. The implementation chunks the file into many smaller pieces and stores them in multiple rows. This helps with loading and fetching, so memory doesn't become an issue either.

小糖芽 2024-07-19 13:25:57

You could use MySQL's LOAD_FILE function to store the file, but you still have to obey the max_allowed_packet value and the fact that the file must be on the same server as the MySQL instance.
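For reference, a LOAD_FILE-based INSERT looks roughly like this (the table and path are assumptions; the MySQL user also needs the FILE privilege, and the secure_file_priv setting may restrict which directory is readable):

```sql
-- Runs on the MySQL server itself; the path must be readable by mysqld.
INSERT INTO files (name, data)
VALUES ('report.pdf', LOAD_FILE('/var/lib/mysql-files/report.pdf'));
```

If LOAD_FILE cannot read the file it silently returns NULL rather than raising an error, which is worth checking for.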

疑心病 2024-07-19 13:25:57

You don't say what error you're getting (use mysql_error() to find out), but I suspect you may be hitting the maximum packet size.

If this is the case, you'd need to change max_allowed_packet in your MySQL configuration.

对你的占有欲 2024-07-19 13:25:57

You don't say what error you're getting (use mysql_error() to find out), but I suspect you may be hitting the maximum packet size.

If this is the case, you'd need to change max_allowed_packet in your MySQL configuration.

Well, I have the same problem. Data cannot be written to the mysql database chunk by chunk in an "IO mode" like this:

loop:
   read $data from file
   write $data to blob
end loop
close file
close blob

A solution seems to be to create a table holding multi-part blobs, e.g.:
create table data_details
(
    id int primary key auto_increment,
    chunk_number int not null,
    data_part blob
);
???
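Reading such a file back is then a matter of selecting the parts in order. A sketch, assuming a chunk table like the one above plus an extra file_id column (an added assumption: without it there is no way to tell which chunks belong to the same file):

```sql
-- Reassemble one file's chunks in order; file_id = 42 is an illustrative value.
SELECT data_part
FROM data_details
WHERE file_id = 42
ORDER BY chunk_number;
-- The client concatenates the rows; each row stays under max_allowed_packet.
```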
