How can I export 20,000 contacts from a MySQL database with PHP and import them into a desktop address book?

Posted 2024-08-09 06:48:46


I need to write a script to export contact data for printing on stickers for mailing. There could be up to 20 000 records in the database table.

The main problem is that the number of records could be that high, and the site is hosted on a shared server, so exporting the whole 20k records would presumably kill the script or stall it. If the numbers were not so high, I would simply export all the data into an hCard file.

The solution needs to be in PHP, and the resulting file must be usable by MS Office for use to print out address stickers.

All ideas are welcome!

Comments (5)

第七度阳光i 2024-08-16 06:48:46

I'm presuming that you DO have access to the MySQL server. Why not connect to MySQL directly? It should save you a LOT of time. If the operation takes too long or you expect performance issues, schedule it for midnight.

You can export directly to CSV like this:

SELECT * INTO OUTFILE 'result.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM my_table;

Load it into Word through mail merge and off you go!
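
One caveat: INTO OUTFILE writes the file on the MySQL server itself and needs the FILE privilege, which shared hosts often withhold. Below is a minimal sketch of the same export done from PHP instead, streaming rows into a CSV with fputcsv(); the credentials, the contacts table and its columns are placeholder assumptions, not anything from the question.

<?php
// Sketch only: connection details and the `contacts` columns are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false, // fetch row by row to keep memory flat
]);

$out = fopen('contacts.csv', 'w');
fputcsv($out, ['name', 'street', 'city', 'zip']); // header row for the Word mail merge

$stmt = $pdo->query('SELECT name, street, city, zip FROM contacts');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row); // quotes and escapes each field as needed
}
fclose($out);

Word's mail-merge wizard can then use contacts.csv directly as the data source for the address labels.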

七七 2024-08-16 06:48:46

20k records should be very fast to export to a CSV file. If your shared hosting is so resource-starved that it can't process 20k records in under a couple of seconds, then you've got bigger problems.

蓝海 2024-08-16 06:48:46

Work in batches (see the sketch after this list)...

  1. Load 50 addresses into $i.csv.
  2. Reload your site (tip: header()), and repeat around 400 times.
  3. Copy the 400 csv files into one big one (even by using Notepad).
  4. Open it with Excel, for example.
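
A rough sketch of that batching idea, assuming a PDO connection and a hypothetical contacts table and export.php script name; each request appends one batch to a single CSV (so nothing has to be stitched together afterwards) and then redirects to itself for the next batch.

<?php
// Sketch only: table, columns and script name are hypothetical.
$batchSize = 50;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$pdo  = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$stmt = $pdo->prepare('SELECT name, street, city, zip FROM contacts ORDER BY id LIMIT :lim OFFSET :off');
$stmt->bindValue(':lim', $batchSize, PDO::PARAM_INT);
$stmt->bindValue(':off', $offset, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

$out = fopen('contacts.csv', 'a'); // append, so all batches land in one file
foreach ($rows as $row) {
    fputcsv($out, $row);
}
fclose($out);

if (count($rows) === $batchSize) {
    // more rows may remain: bounce back to this script for the next batch
    header('Location: export.php?offset=' . ($offset + $batchSize));
    exit;
}
echo 'Export finished.';
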
捎一片雪花 2024-08-16 06:48:46

It depends on the host, but you can generally allow for longer script execution times by using set_time_limit. This, coupled with dumping the data into a CSV file, is one way. As longneck has stated, exporting 20k records should take less than the 30 seconds usually allotted for scripts to run.
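
For example (assuming the host lets the script override these settings; many shared hosts lock them down):

set_time_limit(300);             // allow up to 5 minutes instead of the usual 30 seconds
ini_set('memory_limit', '256M'); // assumption: only needed if all rows are held in memory at once
// ...run the CSV export here...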

宫墨修音 2024-08-16 06:48:46

Code example (I've already had to do this):

$static_amount = 100; // process 100 entries at a time
for ($i = 0; $i < $mysql->count(); $i += $static_amount)
{
    // fetch one batch of rows by id range (double quotes so the variables interpolate)
    $toFile[$i] = $mysql->query("SELECT * FROM table WHERE id >= $i AND id < " . ($i + $static_amount));
    sleep(10); // very important: give the shared server a pause between batches
}

OR

// remember the batch size and how far we already got between requests via cookies
$amount  = isset($_COOKIE['amount']) ? (int) $_COOKIE['amount'] : 100;
$already = isset($_COOKIE['alreadyPerformed']) ? (int) $_COOKIE['alreadyPerformed'] : 0;

// the (int) casts above make the cookie values safe to interpolate into SQL
$toFile = $mysql->query("SELECT * FROM table WHERE id > $already LIMIT $amount");

setcookie('amount', $amount);
setcookie('alreadyPerformed', $already + $amount);