pandas read_html fails to parse: lxml not found
I definitely have lxml installed, but pandas read_html thinks it is not. (I am on Windows 10, Python 3.8.)
My code:
# problem: lxml not found, please install it
import pandas as pd
import lxml # IS installed
url = 'https://harrypotter.fandom.com/wiki/Yvonne'
df = pd.read_html(url) # lxml not found, please install it
print(df.head())
Traceback (most recent call last):
File "C:/Python/Python38/OurStuff/AI/NLP/knowledge-scraper.py", line 5, in
pd = pd.read_html(url) # lxml not found, please install it
File "C:\Python\Python38\lib\site-packages\pandas\util_decorators.py", line 311, in wrapper
return func(*args, **kwargs)
File "C:\Python\Python38\lib\site-packages\pandas\io\html.py", line 1098, in read_html
return _parse(
File "C:\Python\Python38\lib\site-packages\pandas\io\html.py", line 902, in _parse
parser = _parser_dispatch(flav)
File "C:\Python\Python38\lib\site-packages\pandas\io\html.py", line 859, in _parser_dispatch
raise ImportError("lxml not found, please install it")
ImportError: lxml not found, please install it
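
The usual causes of this error (not stated in the original post, so treat them as assumptions) are that the script is run by a different interpreter or environment than the one lxml was installed into, or that lxml's compiled lxml.etree module fails to import even though "import lxml" succeeds. A minimal diagnostic sketch along those lines, with a BeautifulSoup fallback that assumes the beautifulsoup4 and html5lib packages are available:

# Diagnostic sketch (hedged, not from the original post): show which
# interpreter is running this script and whether lxml.etree really
# imports under it.
import sys
import pandas as pd

print("interpreter:", sys.executable)
print("pandas:", pd.__version__)

try:
    from lxml import etree  # pandas' lxml-based parser needs lxml.etree
    print("lxml.etree:", etree.__version__, "loaded from", etree.__file__)
except ImportError as exc:
    print("lxml.etree failed to import:", exc)
    # Install into *this* interpreter, e.g.:
    #   python -m pip install lxml

# Workaround: ask read_html to use the BeautifulSoup parser instead
# (requires beautifulsoup4 and html5lib to be installed).
url = 'https://harrypotter.fandom.com/wiki/Yvonne'
tables = pd.read_html(url, flavor='bs4')
print(len(tables), "tables found")
print(tables[0].head())  # read_html returns a list of DataFrames

Running "python -m pip install lxml" with the same python that runs the script rules out the multiple-interpreters case; note also that read_html returns a list of DataFrames, so the original print(df.head()) would need to index the list even once the ImportError is resolved.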