Getting XML from an https URL
I'm trying to get data from multiple XML feeds and copy that data to my local database. I've tried looking into SimpleXML and some other things I've found on the internet, but I'm wondering what the best route to take with something like this is.
I'm looking for something that will not only get the XML from the secure location but also convert it to a series of arrays.
3 Answers
This is a pretty simple process that you can accomplish with cURL and some sort of XML-to-array class. I'll give you details on both here.
The PHP:
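The code from the original answer didn't survive in this copy, so here is a minimal sketch of the approach it describes: fetch the feed over https with cURL, then convert the response to nested arrays (in this sketch by round-tripping a SimpleXML parse through JSON; the feed URL is illustrative):

    <?php
    // Fetch the raw XML over https with cURL.
    $ch = curl_init('https://example.com/feed.xml'); // hypothetical feed URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body as a string
    $xml = curl_exec($ch);
    curl_close($ch);

    // Parse with SimpleXML, then round-trip through JSON to turn the
    // object graph into plain nested arrays.
    $data = json_decode(json_encode(simplexml_load_string($xml)), true);

    print_r($data);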
And there you have it.
When using SimpleXML you get the XML as an object that contains arrays of your nodes. I personally prefer SimpleXML over any other XML parser because of its simplicity and overall performance. It's easy to use as a DOM manipulator when modifying data, and just as easy to retrieve only a few nodes when used as an XPath parser.
If you're traversing a very large XML file you may be better off with a straight-up SAX parser that loads lazily, but since you're reading over a network I believe the performance differences are negligible.
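A rough illustration of both uses; the feed URL and the item/title node names are assumptions about the feed's structure:

    <?php
    // Load a feed into a SimpleXMLElement. Requires allow_url_fopen
    // (and openssl for https); otherwise fetch with cURL first.
    $feed = simplexml_load_file('https://example.com/feed.xml'); // hypothetical URL

    // Walk nodes as object properties, like a lightweight DOM...
    foreach ($feed->item as $item) {      // assumes <item> children
        echo (string) $item->title, "\n"; // assumes a <title> node
    }

    // ...or grab just the nodes you need with XPath.
    $titles = $feed->xpath('//item/title');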
Use cURL (http://php.net/manual/en/ref.curl.php) to fetch the data from the https location, then use simplexml_load_string() to load it.
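A minimal sketch of that two-step flow, with the parse-failure check that simplexml_load_string() makes possible (the URL is again illustrative):

    <?php
    $ch = curl_init('https://example.com/feed.xml'); // hypothetical feed URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);

    // simplexml_load_string() returns false if the document doesn't parse.
    $xml = simplexml_load_string($response);
    if ($xml === false) {
        die('Could not parse the feed');
    }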