
I am trying to implement a simple reasoning operator using Apache Flink in Scala. I can already read data as a stream from a .csv file, but I cannot cope with RDF and OWL data processing. Here is my code to load data from the .csv:

import org.apache.flink.table.api.Types
import org.apache.flink.table.sources.CsvTableSource

val csvTableSource = CsvTableSource
  .builder
  .path("src/main/resources/data.stream")
  .field("subject", Types.STRING)
  .field("predicate", Types.STRING)
  .field("object", Types.STRING)
  .fieldDelimiter(";")
  .build()

Could anyone show me an example of loading this data with Flink using RDF and OWL? As I understand it, an RDF stream contains dynamic data, while OWL is for static data. I have to create a simple reasoning operator that I can query for information, e.g. who is a friend of a friend.
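To illustrate the kind of inference I mean, friend-of-a-friend can be sketched as a self-join over (subject, predicate, object) triples. This is plain Scala collections rather than Flink, and the data and names are made up for illustration:

```scala
// Friend-of-a-friend as a self-join over (subject, predicate, object) triples.
// Plain Scala collections for illustration; in Flink this would become a join
// over two streams/tables of triples.
val triples = Seq(
  ("alice", "knows", "bob"),
  ("bob",   "knows", "carol")
)

// Rule: foaf(x, z) :- knows(x, y), knows(y, z)
val foaf = for {
  (x, p1, y)  <- triples if p1 == "knows"
  (y2, p2, z) <- triples if p2 == "knows" && y2 == y
} yield (x, z)
// foaf: Seq(("alice", "carol"))
```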

Any help will be appreciated.

  • "an RDF stream contains dynamic data and the OWL is for static" - ehm, no. One is RDF data, the other is OWL data. Nothing more, nothing less. And obviously, you'll need a parser for RDF and/or OWL data in Apache Flink. This has to be implemented first; afterwards you can continue – UninformedUser Jul 08 '18 at 18:47
  • By the way, there is already at least one project dealing with RDF/OWL in Apache Spark and Flink, called SANSA – UninformedUser Jul 08 '18 at 18:48
  • @AKSW, could you please show a simple example of how it can be implemented? I do not understand what happens after parsing. Could you also please explain the difference between OWL and RDF? – xijox Jul 09 '18 at 06:24
  • What do you mean by "an example"? This isn't just a few lines of code. Apache Flink can read lines of text; each line would then have to be parsed. Obviously, this only works that simply if you have the RDF file in N-Triples format. But as said before, SANSA-RDF already does this and provides an API to load RDF into Apache Flink. – UninformedUser Jul 09 '18 at 09:39
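To make the line-by-line parsing idea from the comments concrete, here is a minimal sketch assuming N-Triples input. `parseNTriple` is a hypothetical helper, and the whitespace split is deliberately naive (it breaks on literals containing spaces); the Flink hookup is shown only as a comment:

```scala
// Naive N-Triples line parser: "<s> <p> <o> ." -> Triple(s, p, o).
// Hypothetical helper for illustration; breaks on literals containing spaces.
case class Triple(subject: String, predicate: String, obj: String)

def parseNTriple(line: String): Option[Triple] = {
  val body = line.trim.stripSuffix(".").trim
  body.split("\\s+", 3) match {
    case Array(s, p, o) => Some(Triple(s, p, o.trim))
    case _              => None
  }
}

// In Flink this could then be applied per input line, e.g.:
//   env.readTextFile("data.nt").flatMap(parseNTriple(_))
```

For anything beyond toy data, a real RDF parser (e.g. via SANSA-RDF, as suggested above) is the safer route.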

0 Answers