Scrapy raises exceptions.KeyError: 'd' when running a spider — how can I fix it?

Posted on 2022-09-02 00:27:09

I wrote a spider with the Scrapy framework on Windows and everything ran fine. I then tried to deploy it to a server running Ubuntu.
After installing Python, Scrapy, and the other dependencies, running the spider fails with the error below.
I have googled around quite a bit but still can't solve it. Could anyone take a look? Thanks!

Unhandled Error
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1237, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1099, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 71, in crawl
    self.engine = self._create_engine()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 83, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 67, in __init__
    self.scraper = Scraper(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/scraper.py", line 70, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 56, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 34, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/media.py", line 33, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/images.py", line 57, in from_settings
    return cls(store_uri)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/files.py", line 160, in __init__
    self.store = self._get_store(store_uri)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/files.py", line 180, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
exceptions.KeyError: 'd'
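For what it's worth, the last frame of the traceback looks up the URI scheme of the store path configured for the images/files pipeline. A minimal sketch, assuming the project's settings.py still carries a Windows-style path such as IMAGES_STORE = 'D:\\images' (the exact setting value here is only a guess), shows how that path would be parsed into the scheme 'd' on Ubuntu:

    import os
    from urlparse import urlparse   # Python 2.7, matching the traceback; urllib.parse on Python 3

    # Hypothetical store setting carried over from the Windows machine,
    # e.g. IMAGES_STORE = 'D:\\images' in settings.py
    store_uri = 'D:\\images'

    # Scrapy treats an absolute filesystem path as the 'file' scheme;
    # on Linux a drive-letter path is NOT absolute, so the value falls
    # through to URL parsing and the drive letter becomes the scheme.
    print(os.path.isabs(store_uri))      # True on Windows, False on Ubuntu
    print(urlparse(store_uri).scheme)    # 'd' -> STORE_SCHEMES['d'] -> KeyError: 'd'

If that is indeed the cause, pointing the store setting at a path that exists on the server (for example IMAGES_STORE = '/home/youruser/images', a placeholder path) should make the pipeline resolve the 'file' scheme instead of 'd'.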

Comments (1)

泪之魂 2022-09-09 00:27:09

I'm running into the same problem and can't solve it either.
