
I'm going to create an ontology containing more than 500 000 simple facts and reason over it with HermiT or Pellet (it might grow to, say, 1 000 000 facts). I'd like to save the inferred results to an RDF datastore, in order to process them with SPARQL.

Is such an ontology considered large or small? Can I perform the procedure described above using the OWL API and a reasoner (HermiT or Pellet)?
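For concreteness, here is roughly the pipeline I have in mind (a minimal sketch against OWL API 4 and HermiT; the file names input.owl / inferred.ttl and the choice of inferred-axiom generators are just placeholders, and exact signatures differ slightly between OWL API versions):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.semanticweb.HermiT.ReasonerFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.formats.TurtleDocumentFormat;
import org.semanticweb.owlapi.model.*;
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import org.semanticweb.owlapi.util.*;

public class MaterializeInferences {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        OWLOntology ontology = manager.loadOntologyFromOntologyDocument(new File("input.owl"));

        // Create a HermiT reasoner over the loaded ontology
        // (Pellet exposes the same OWLReasoner interface via its own factory).
        OWLReasoner reasoner = new ReasonerFactory().createReasoner(ontology);

        // Pick the ABox inferences to materialize: class and property assertions.
        List<InferredAxiomGenerator<? extends OWLAxiom>> generators = new ArrayList<>();
        generators.add(new InferredClassAssertionAxiomGenerator());
        generators.add(new InferredPropertyAssertionGenerator());

        // Fill a fresh ontology with the inferred axioms and save it as Turtle,
        // ready to be loaded into an RDF store and queried with SPARQL.
        OWLOntology inferred = manager.createOntology();
        new InferredOntologyGenerator(reasoner, generators)
                .fillOntology(manager.getOWLDataFactory(), inferred);
        manager.saveOntology(inferred, new TurtleDocumentFormat(),
                IRI.create(new File("inferred.ttl").toURI()));

        reasoner.dispose();
    }
}
```

Materializing the inferences up front like this would keep the SPARQL side to plain triple-pattern matching, with no reasoner in the query path.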

geeknet
  • One million axioms is not small, but it's not necessarily challenging. Scalability of ontologies depends more on their complexity than on their size. What is the size and complexity of the TBox for this ontology? – Ignazio Feb 15 '19 at 13:53
  • Also, what OWL profile do you need? For reasoning over facts you often only need the RL subset of OWL. If this is the case, then you can get stricter performance guarantees by using an explicitly RL reasoner. – Chris Mungall Feb 18 '19 at 19:33
  • That's the issue: I'm unable to evaluate the complexity of the TBox because I need to unite a number of datasets. Also, for one dataset I've got the DL expressivity ALEOJ(P). I guess it needs a bit of simplification. – geeknet Feb 19 '19 at 06:23
  • Yes, it's my lack of experience in the field. I've read all the "hello world" guides about OWL, but I haven't managed to find any guide on designing linked-data applications, i.e. how OWL and RDF relate and how to build a knowledge-based system. I'm not sure that I need OWL reasoning for the whole task. After some thought, I've decided to start from RDF datasets and a SPARQL endpoint to avoid performance issues (see the sketch below); I'm going to apply OWL reasoning in advance for some small datasets. – geeknet Feb 19 '19 at 06:38
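To make that starting point concrete, here is a minimal sketch of querying the materialized triples with Apache Jena (the file name inferred.ttl and the count query are placeholders, not part of the original question):

```java
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.ResultSet;
import org.apache.jena.query.ResultSetFormatter;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.RDFDataMgr;

public class QueryInferred {
    public static void main(String[] args) {
        // Load the materialized (inferred) triples into an in-memory model;
        // a TDB dataset or a remote SPARQL endpoint would work the same way.
        Model model = RDFDataMgr.loadModel("inferred.ttl");

        // Placeholder query: count all triples in the materialized graph.
        String sparql = "SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }";
        try (QueryExecution qe = QueryExecutionFactory.create(sparql, model)) {
            ResultSet results = qe.execSelect();
            ResultSetFormatter.out(System.out, results);
        }
    }
}
```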

0 Answers