I have an 18MB backup of a MySQL table. How do I restore such a large SQL file?
I use a WordPress plugin called 'Shopp'. It stores product images in the database rather than in the filesystem as standard; I didn't think anything of this until now.
I have to move servers, and so I made a backup, but restoring the backup is proving a horrible task. I need to restore one table called wp_shopp_assets, which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this it is better to go to command line. phpMyAdmin gets tricky when lots of data is involved because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using "mysqldump" ---
mysqldump --add-drop-table -uSQLUSER -pPASSWORD -hSQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using "scp" ---
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using "mysql" ---
mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
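With the placeholders filled in for this case, the sequence might look like the following. Everything here is hypothetical — the user "wpuser", the hosts, the destination path, and the database name "wordpress" are made up for illustration, so substitute your own. The script only prints the three commands so you can review them before running each one on the right server:

```shell
# All credentials, hosts, and paths are made up -- substitute your own.
DB=wordpress
TABLE=wp_shopp_assets
FILE=wp_shopp_assets.sql

# 1. On server1: dump just the one table (mysqldump prompts for the password).
dump_cmd="mysqldump --add-drop-table -uwpuser -p -h olddbhost.example.com $DB $TABLE > $FILE"

# 2. Copy the dump file from server1 to server2.
copy_cmd="scp $FILE henry@newserver.example.com:/home/henry/"

# 3. On server2: load the dump into the new database.
load_cmd="mysql -uwpuser -p $DB < $FILE"

printf '%s\n' "$dump_cmd" "$copy_cmd" "$load_cmd"
```

Using `-p` without a password after it makes both tools prompt interactively, which avoids leaving the password in your shell history.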
Try HeidiSQL http://www.heidisql.com/
EDIT: Just to clarify, this is a desktop application; you will connect to your database server remotely. You won't be limited by PHP's max script runtime or upload size limits.
Use BigDump.
Create a folder on your server which is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the http://www.ozerov.de/bigdump.php importer file and add it to that directory after reading the instructions and filling out your config information.
FTP the .SQL file to that folder alongside the bigdump script, then go to your browser and navigate to that folder.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or if this is an issue, I recommend the other comment about SSH and the mysql -u -p -n -f method!
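The idea behind BigDump — import the dump in pieces small enough that each one finishes before any timeout — can also be sketched on the command line. The credentials and file names below are made up, and the dump itself is a tiny stand-in; note that line-based splitting is only safe when no single SQL statement spans a chunk boundary:

```shell
# Stand-in for the real 18MB dump: 1200 one-line INSERT statements.
seq 1 1200 | sed 's/.*/INSERT INTO wp_shopp_assets VALUES (&);/' > dump.sql

# Cut the dump into 500-line pieces: chunk_aa, chunk_ab, chunk_ac.
# mysqldump writes each extended INSERT on one line, so this usually keeps
# statements whole -- but a multi-line CREATE TABLE can be cut in half,
# so check the chunk boundaries before importing for real.
split -l 500 dump.sql chunk_

# Import each piece separately (credentials are hypothetical).
for f in chunk_*; do
    echo "mysql -uwpuser -p wordpress < $f"   # drop 'echo' to actually run it
done
```

Each chunk is a fresh connection and a fresh PHP-free import, so no single request has to survive the whole 18MB.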
Even though this is an old post, I would like to add that it is not recommended to use database storage for images once you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to move the Shopp installation to file storage for images before transferring.
You can use this free plug-in to help you. Always back up your files and database before performing this action.
What I do is open the file in a code editor, then copy and paste it into a SQL window within phpMyAdmin. Sounds silly, but I swear by it for large files.