Scrapy spider that uses DjangoItem won't run properly under Celery

Posted on 2022-09-04 05:57:58

I wrote a spider with Scrapy. The data it scrapes is stored in a Django project's database via DjangoItem. When I run the spider on its own, everything works fine, but I want to add it to a task queue and run it periodically, using Celery for task management:

Here is my Celery task:

# coding_task.py
from celery import Celery
from collector.collector.crawl_agent import crawl

# Celery app using a local Redis instance as both broker and result backend
app = Celery('coding.net', backend='redis', broker='redis://localhost:6379/0')
app.config_from_object('celery_config')


@app.task
def period_task():
    crawl()
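
The celery_config module loaded by config_from_object is not shown in the question. Here is a minimal sketch of what it might contain, assuming the crawl is scheduled through celery beat (the schedule name and interval below are purely illustrative):

# celery_config.py -- hypothetical example, not part of the original question
from datetime import timedelta

CELERY_TIMEZONE = 'UTC'

# Periodically enqueue the crawl task (Celery 3.x-style setting names)
CELERYBEAT_SCHEDULE = {
    'crawl-coding-net': {
        'task': 'coding_task.period_task',
        'schedule': timedelta(hours=1),
    },
}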

collector.collector.crawl_agent.crawl is a spider that uses DjangoItem; the item looks like this:

import os

import django

# Point Django at the project settings before importing any models
os.environ['DJANGO_SETTINGS_MODULE'] = 'RaPo3.settings'
django.setup()

import scrapy
from scrapy_djangoitem import DjangoItem
from xxx.models import Collection


class CodingItem(DjangoItem):
    django_model = Collection
    amount = scrapy.Field(default=0)
    role = scrapy.Field()
    type = scrapy.Field()
    duration = scrapy.Field()
    detail = scrapy.Field()
    extra = scrapy.Field()
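
For context, collector.collector.crawl_agent.crawl is not shown in the question either; presumably it starts the spider in-process, roughly like the sketch below (the spider class and its import path are hypothetical):

# crawl_agent.py -- rough sketch of what crawl() might look like
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

from collector.collector.spiders.coding import CodingSpider  # hypothetical spider module


def crawl():
    # Run the spider in the current process using the Scrapy project settings
    process = CrawlerProcess(get_project_settings())
    process.crawl(CodingSpider)
    process.start()  # blocks until the crawl finishes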

With this in place, running celery -A coding_task worker --loglevel=info --concurrency=1 produces the following error:

[2016-11-16 17:33:41,934: ERROR/Worker-1] Process Worker-1
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/billiard/process.py", line 292, in _bootstrap
    self.run()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 292, in run
    self.after_fork()
  File "/usr/local/lib/python2.7/site-packages/billiard/pool.py", line 395, in after_fork
    self.initializer(*self.initargs)
  File "/usr/local/lib/python2.7/site-packages/celery/concurrency/prefork.py", line 80, in process_initializer
    signals.worker_process_init.send(sender=None)
  File "/usr/local/lib/python2.7/site-packages/celery/utils/dispatch/signal.py", line 151, in send
    response = receiver(signal=self, sender=sender, **named)
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 152, in on_worker_process_init
    self._close_database()
  File "/usr/local/lib/python2.7/site-packages/celery/fixups/django.py", line 181, in _close_database
    funs = [self._db.close_connection]  # pre multidb
AttributeError: 'module' object has no attribute 'close_connection'
[2016-11-16 17:33:41,942: INFO/MainProcess] Connected to redis://localhost:6379/0
[2016-11-16 17:33:41,957: INFO/MainProcess] mingle: searching for neighbors
[2016-11-16 17:33:42,962: INFO/MainProcess] mingle: all alone
/usr/local/lib/python2.7/site-packages/celery/fixups/django.py:199: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-11-16 17:33:42,968: WARNING/MainProcess] /usr/local/lib/python2.7/site-packages/celery/fixups/django.py:199: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-11-16 17:33:42,968: WARNING/MainProcess] celery@MacBook-Pro.local ready.
[2016-11-16 17:33:42,969: ERROR/MainProcess] Process 'Worker-1' pid:2777 exited with 'exitcode 1'
[2016-11-16 17:33:42,991: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/celery/worker/__init__.py", line 208, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 378, in start
    return self.obj.start()
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 271, in start
    blueprint.start(self)
  File "/usr/local/lib/python2.7/site-packages/celery/bootsteps.py", line 127, in start
    step.start(parent)
  File "/usr/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 766, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python2.7/site-packages/celery/worker/loops.py", line 50, in asynloop
    raise WorkerLostError('Could not start worker processes')
WorkerLostError: Could not start worker processes

I suspect the Django environment is conflicting with Celery's? If I remove everything DjangoItem-related from the item, it all runs fine.

How can I get this Celery + Scrapy task that uses DjangoItem to run? Thanks!

Originally posted at: https://github.com/celery/cel...

Comments (1)

涫野音 2022-09-11 05:57:58

Since nobody answered, I'll answer my own question. It was resolved on GitHub.

This happens because Celery 3.1 has compatibility problems with Django 1.7 and above. Once the Scrapy item sets os.environ['DJANGO_SETTINGS_MODULE'] = 'RaPo3.settings', Celery treats the task as part of a Django project and activates its Django fixup; in Celery 3.1 that fixup calls django.db.close_connection(), which no longer exists in newer Django versions, which is exactly the AttributeError shown in the traceback above.

The most direct fix is to upgrade Celery to 4.0 or later, which resolves this compatibility issue.
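
For example (the exact version pin is only an illustration):

pip install "celery>=4.0" --upgrade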

If you use Redis as the Celery backend, then after upgrading to 4.0 you may hit a timeout error when tasks try to connect to Redis. This is caused by an outdated version of the Python redis library; make sure you upgrade the redis Python library to 2.10.4 or later.

pip install redis==2.10.5 --upgrade
