You can download a list of up to 500 URLs for free through this online tool:
XML Sitemap Generator
...just select "text list" after the tool crawls your site.
Some webmasters offer Sitemaps, which are essentially XML lists of every URL on the domain. However, there is no general solution except crawling. If you do use a crawler, please obey robots.txt.
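To make those two points concrete, here is a minimal sketch in Python, assuming the site publishes a standard /robots.txt and /sitemap.xml (the domain and paths below are assumptions, not part of the original answer): it checks robots.txt with the standard-library urllib.robotparser before touching the sitemap, then pulls every `<loc>` entry out of the XML.

```python
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

BASE = "https://example.com"  # hypothetical domain; replace with the real target

# Respect robots.txt: load it and ask before fetching anything.
robots = urllib.robotparser.RobotFileParser()
robots.set_url(BASE + "/robots.txt")
robots.read()

sitemap_url = BASE + "/sitemap.xml"
if robots.can_fetch("*", sitemap_url):
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    # Standard sitemaps put each URL in a <loc> element under this namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]
    print(f"{len(urls)} URLs listed in the sitemap")
else:
    print("robots.txt disallows fetching the sitemap")
```

Note that large sites often publish a sitemap index instead, whose `<loc>` entries point at further sitemap files; the same parsing applies to each of those in turn.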
It seems there is no royal road to web crawling, so I will just stick with my current approach...
Also, I found that most search engines only expose the first 1,000 results anyway.
This is called wide recon in pentesting and bug bounty.
The Process:
Good luck.