SPARQL Query and Inference
I have some RDF and RDFS files, and I want to use the Jena SPARQL implementation to query them. My code looks like this:
// model of my RDF data file
Model model = ModelFactory.createMemModelMaker().createDefaultModel();
model.read(inputStream1, null);
// model of my ontology (WordNet) file
Model onto = ModelFactory.createOntologyModel(OntModelSpec.RDFS_MEM_RDFS_INF);
onto.read(inputStream2, null);
String queryString =
"PREFIX rdf:<http://www.w3.org/1999/02/22-rdf-syntax-ns#> "
+ "PREFIX wn:<http://www.webkb.org/theKB_terms.rdf/wn#> "
+ "SELECT ?person "
+ "WHERE {"
+ " ?person rdf:type wn:Person . "
+ " }";
Query query = QueryFactory.create(queryString);
QueryExecution qe = QueryExecutionFactory.create(query, ????);
ResultSet results = qe.execSelect();
ResultSetFormatter.out(System.out, results, query);
qe.close();
I have a WordNet ontology in an RDF file, and I want to use this ontology in my query so that inferencing happens automatically (when I query for Person, the query should also return e.g. Man, Woman).
So how do I link the ontology to my query? Please help me.
Update: now I have two models. Which one should I run my query against?
QueryExecution qe = QueryExecutionFactory.create(query, ????);
Thanks in advance.
1 Answer
The key is to recognise that, in Jena, Model is one of the central abstractions. An inferencing model is just a Model in which some of the triples are present because they are entailed by inference rules, rather than read in from the source document. Thus you only need to change the first line of your example, where you create the model initially. While you can create inference models directly, it's often easiest just to create an OntModel with the required degree of inference support. If you want a different reasoner, or OWL support, you can select a different OntModelSpec constant. Be aware that large and/or complex models can make for slow queries.

Update (following the edit of the original question)

To reason over two models, you want their union. You can do this through OntModel's sub-model facility. I would change your example as follows (note: I haven't tested this code, but it should work):
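The code snippet the answer refers to did not survive extraction. A plausible reconstruction, keeping the asker's variable names (inputStream1, inputStream2, queryString) and using Jena's OntModel.addSubModel method, might look like this sketch (untested, as the answerer notes):

```java
// Plain model holding the instance data (no inference)
Model model = ModelFactory.createDefaultModel();
model.read(inputStream1, null);

// Ontology model with RDFS inference over the WordNet schema;
// note it must be typed as OntModel (not Model) so addSubModel is visible
OntModel onto = ModelFactory.createOntologyModel(OntModelSpec.RDFS_MEM_RDFS_INF);
onto.read(inputStream2, null);

// The union: add the data model as a sub-model of the inferencing model,
// so entailments are computed over schema and data together
onto.addSubModel(model);

// Run the query against the inferencing union model
Query query = QueryFactory.create(queryString);
QueryExecution qe = QueryExecutionFactory.create(query, onto);
ResultSet results = qe.execSelect();
ResultSetFormatter.out(System.out, results, query);
qe.close();
```

With this setup, a query for ?person rdf:type wn:Person will also match resources whose types are RDFS subclasses of wn:Person (e.g. Man, Woman), because the reasoner adds the entailed rdf:type triples.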