Python + Mechanize not working with Delicious

Posted 2024-10-08 19:52:51

I'm using Mechanize and Beautiful Soup to scrape some data off Delicious:

from mechanize import Browser
from BeautifulSoup import BeautifulSoup

mech = Browser()
url = "http://www.delicious.com/varunsrin"
page = mech.open(url)
html = page.read()

soup = BeautifulSoup(html)
print soup.prettify()

This works for most sites I throw it at, but fails on Delicious with the following output:

Traceback (most recent call last):
  File "C:\Users\Varun\Desktop\Python-3.py", line 7, in <module>
    page = mech.open(url)
  File "C:\Python26\lib\site-packages\mechanize\_mechanize.py", line 203, in open
    return self._mech_open(url, data, timeout=timeout)
  File "C:\Python26\lib\site-packages\mechanize\_mechanize.py", line 255, in _mech_open
    raise response
httperror_seek_wrapper: HTTP Error 403: request disallowed by robots.txt
C:\Program Files (x86)\ActiveState Komodo IDE 6\lib\support\dbgp\pythonlib\dbgp\client.py:1360: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  child = getattr(self.value, childStr)
C:\Program Files (x86)\ActiveState Komodo IDE 6\lib\support\dbgp\pythonlib\dbgp\client.py:456: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  return apply(func, args)

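The 403 here is raised by mechanize itself rather than by the server: by default, Browser fetches the site's robots.txt before opening a URL and refuses any path it disallows, which is exactly what "request disallowed by robots.txt" reports. A minimal sketch for inspecting those rules directly, assuming the host still serves a robots.txt (plain urllib2 performs no robots check of its own):

import urllib2

# Fetch the robots.txt that mechanize is enforcing, to see which
# paths are disallowed (assumes the host still serves this file).
print urllib2.urlopen("http://www.delicious.com/robots.txt").read()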


Answer by 送你一个梦 (2024-10-15 19:52:51):

Take some of the tips for emulating a browser with python+mechanize from here. Adding addheaders and set_handle_robots appears to be the minimum required. With the code below, I get output:

from mechanize import Browser
from BeautifulSoup import BeautifulSoup

br = Browser()
# Don't honour robots.txt; mechanize obeys it by default, which is
# what raised the 403 above.
br.set_handle_robots(False)
# Present a real browser's User-Agent string instead of mechanize's default.
br.addheaders = [('User-agent', 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.1) Gecko/2008071615 Fedora/3.0.1-1.fc9 Firefox/3.0.1')]

url = "http://www.delicious.com/varunsrin"
page = br.open(url)
html = page.read()

soup = BeautifulSoup(html)
print soup.prettify()

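If the open now succeeds but the parsed page still looks wrong, a quick sanity check is to inspect the response before parsing further. A small sketch, reusing the page and soup variables from the block above (mechanize responses expose the HTTP status as .code, in the urllib2 style):

# Reuses `page` and `soup` from the answer's snippet above.
print page.code                # expect 200 once the robots.txt check is off
title = soup.find("title")     # BeautifulSoup 3 returns a Tag or None
if title:
    print title.string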