How to run 3 PHP scripts every 30 minutes, at different times, as a cron job?
Well,
I have 3 different PHP scripts that are currently run by cron, but that means I have 3 entries in my cron file, and I would like to have only one. I will try to explain:
1 - The first file runs a query against my database and saves some data in the database. (It takes a few seconds)
2 - Script number 2 creates 2 new configuration files inside my web server. (It takes more than 1 minute to execute)
3 - The third file opens an SSH connection and executes some commands on the server: it uploads the files generated in step 2 and reloads the server.
The current situation:
All 3 scripts need to run every 30 minutes, so I run the first one, run the second one 5 minutes later, and finally run the third one 15 minutes after the second.
It works perfectly, but I would like to do everything inside a single PHP script. Is that possible, with only one call in my cron?
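The staggered setup described above corresponds to three crontab entries along these lines (the paths and the exact minute offsets are my assumptions, not from the question):

```shell
# Current setup: three separate entries, staggered by hand,
# repeating every 30 minutes (paths are hypothetical)
0,30  * * * * php /var/www/jobs/script1.php   # DB query (a few seconds)
5,35  * * * * php /var/www/jobs/script2.php   # write config files (~1 min)
20,50 * * * * php /var/www/jobs/script3.php   # SSH upload + server reload
```

The goal is to collapse these into a single entry, e.g. `*/30 * * * * php /var/www/jobs/stage1.php`, with the sequencing handled inside PHP instead of by hand-tuned offsets.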
Comments (1)
If the scripts all depend on the previous one having completed its work, you'd want to run only the first one at fixed intervals, then have that script execute the other two in sequence.
stage1.php:
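The stage1.php code block appears to have been dropped from this copy of the answer. A minimal sketch of the idea in PHP — the paths under `/var/www/jobs/` and the helper name `runChain` are my own placeholders, not from the original:

```php
<?php
// stage1.php - invoked by a single cron entry every 30 minutes.
// After doing stage 1's own work (the database query), it runs the
// remaining scripts in order and stops the chain on the first failure.

/**
 * Run each shell command in sequence, stopping at the first one that
 * exits non-zero. Returns true only if every command succeeded.
 */
function runChain(array $commands): bool {
    foreach ($commands as $cmd) {
        $output = [];
        $exitCode = 0;
        exec($cmd, $output, $exitCode);
        if ($exitCode !== 0) {
            fwrite(STDERR, "'$cmd' exited with code $exitCode, aborting\n");
            return false;
        }
    }
    return true;
}

// ... stage 1's database work would go here ...

// Then chain stages 2 and 3 (hypothetical paths):
// $ok = runChain([
//     'php ' . escapeshellarg('/var/www/jobs/script2.php'),
//     'php ' . escapeshellarg('/var/www/jobs/script3.php'),
// ]);
// exit($ok ? 0 : 1);
```

Using `exec()` keeps each stage running in its own PHP process, just as it does today under cron; `include`-ing the other scripts into stage1.php would also work if they are written to tolerate being included.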
That way, the second and third scripts will NOT start running until after the first one has completed whatever it has to do, and you get the opportunity to abort the whole thing if something failed in an earlier stage.
Any other way would be very racy. Consider the following sequence:
For whatever reason, script #1 runs long, to the point that script #2 kicks in and starts transferring the backup file BEFORE the backup has completed. The file transfer takes extra long because the network is busy, so now you have the backup still running, it's being transferred to the other server, AND the cleanup script has gone and deleted the .tar.gz that script #1 was still making.