Checking broken links on a page in parallel using Selenium and Java
The task is to check for broken links on a webpage using Selenium RC and Java. It can be done simply by:
a) click on link A
b) wait for page to open
c) focus on this window
d) verify text present on this page
e) Close this window
Then follow steps a to e for link B, link C, ... link N.
This process is sequential.
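The sequential loop above might look like the following with the Selenium RC client API (the link locators, window id, and expected text are placeholder assumptions; a Selenium server must be running on localhost:4444):

```java
import com.thoughtworks.selenium.DefaultSelenium;

public class SequentialLinkCheck {
    public static void main(String[] args) {
        DefaultSelenium selenium =
                new DefaultSelenium("localhost", 4444, "*firefox", "http://example.com/");
        selenium.start();
        selenium.open("/");

        String[] linkLabels = {"Link A", "Link B", "Link C"}; // placeholder link texts
        for (String label : linkLabels) {
            selenium.click("link=" + label);          // a) click the link (opens a new window)
            selenium.waitForPopUp("popup", "30000");  // b) wait for the new window to open
            selenium.selectWindow("popup");           // c) focus on this window
            if (!selenium.isTextPresent("Expected text")) { // d) verify text present
                System.out.println("Broken: " + label);
            }
            selenium.close();                         // e) close this window
            selenium.selectWindow(null);              // return focus to the main window
        }
        selenium.stop();
    }
}
```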
Is it possible to open all the links in parallel in new windows and verify whether they are broken, i.e. a more professional way?
Kindly advise (if possible, with a sample of code).
Since a browser can realistically only click one link at a time, that's all you'll be able to do with Selenium. All it's doing is manipulating the browser as a user might.
If you're not concerned with AJAX at all, your better bet is probably to do this outside of Selenium with HTTPClient. There you could fetch the source and all links and issue a HEAD request to see if you get a 404 (no need to assert text is present). You could do this in parallel and not need to wait for the browser at all.
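A minimal sketch of that approach, using only the JDK's `HttpURLConnection` rather than Apache HttpClient (the regex-based link extraction and the sample HTML are simplifications for illustration; a real checker would parse the page with an HTML parser):

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.*;
import java.util.concurrent.*;
import java.util.regex.*;

public class BrokenLinkChecker {

    // Pull absolute href values out of raw HTML with a simple regex.
    static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = Pattern.compile("href=\"(https?://[^\"]+)\"").matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    // Issue a HEAD request; treat any status >= 400 (or a failed connection) as broken.
    static boolean isBroken(String link) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(link).openConnection();
            conn.setRequestMethod("HEAD");
            conn.setConnectTimeout(5000);
            conn.setReadTimeout(5000);
            return conn.getResponseCode() >= 400;
        } catch (Exception e) {
            return true; // unreachable counts as broken
        }
    }

    public static void main(String[] args) throws Exception {
        String html = "<a href=\"http://example.com/\">home</a>"
                    + "<a href=\"http://no-such-host.invalid/\">dead</a>";
        List<String> links = extractLinks(html);

        // Check every link in parallel instead of one browser window at a time.
        ExecutorService pool = Executors.newFixedThreadPool(10);
        Map<String, Future<Boolean>> results = new LinkedHashMap<>();
        for (String link : links) {
            results.put(link, pool.submit(() -> isBroken(link)));
        }
        for (Map.Entry<String, Future<Boolean>> e : results.entrySet()) {
            System.out.println(e.getKey() + " broken=" + e.getValue().get());
        }
        pool.shutdown();
    }
}
```

Since each HEAD request runs on its own thread, N links take roughly as long as the slowest single request instead of the sum of all of them.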