How do I scrape _ngcontent-c0?
I'm trying to write my first ever scraper and I'm facing a problem. All of the tutorials I've watched use tags to catch the part of the page you want to scrape. This is my code so far; I'm trying to scrape the title, date, and country of each story:
import requests
import csv
from bs4 import BeautifulSoup
from itertools import zip_longest

# Fetch the stories-by-country page and parse it with BeautifulSoup
result = requests.get("https://www.cdc.gov/globalhealth/healthprotection/stories-from-the-field/stories-by-country.html?Sort=Date%3A%3Adesc")
source = result.content
soup = BeautifulSoup(source, "lxml")
--------------------------NOW COMES MY PROBLEM------------------------------------------
When I start looking for where to scrape a title, for example "CDC Vietnam uses Technology Innovations to Improve COVID-19 Response", I find it sitting inside a span tag marked with _ngcontent-c0, like this!
When I try the code I learned:
title = soup.find_all("span__ngcontent-c0",{"class": ##I don't know what goes here!})
Of course it doesn't work. I searched and found that this _ngcontent-c0 is actually an Angular attribute, but I don't know how to scrape it! Any help?
Comments (1)
This page needs JavaScript to render all the content you want to scrape. The Angular app calls an API to get that content, so just request the API directly. You need to do something like this:
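What follows is only a minimal sketch of that approach: the endpoint URL and the JSON key names ("items", "title", "date", "country") are placeholders, not the real ones. Open the page with your browser's DevTools, watch the Network tab (XHR/Fetch filter), copy the JSON request the Angular app actually makes, and adjust the names to match the response you see.

import requests

# Hypothetical endpoint -- replace with the real JSON URL copied from the
# DevTools Network tab.
API_URL = "https://www.cdc.gov/example/stories-api.json"

response = requests.get(API_URL)
response.raise_for_status()
data = response.json()

# The key names below are assumptions; adjust them to match the real JSON.
rows = []
for story in data.get("items", []):
    title = story.get("title")
    date = story.get("date")
    country = story.get("country")
    rows.append((title, date, country))
    print(title, date, country)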
OUTPUT: one line per story with its title, date, and country (the exact values depend on the live API response).
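Since your script already imports csv, here is one way the rows collected by the loop above could then be written out; rows is assumed to be the list of (title, date, country) tuples built in that sketch.

import csv

# rows is assumed to come from the loop in the previous sketch.
with open("stories.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Title", "Date", "Country"])  # header row
    writer.writerows(rows)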
I hope I have been able to help you.