Introduction
The challenge I bring to you today is: to implement a real-time REST API (GET, POST, PUT, DELETE, etc.) to query and update any SPARQL endpoint using the Django REST Framework, so that a frontend application (I am using React) can request and use the serialized data provided by the REST API.
Please note that I'm using Django because I would like to implement web AND mobile applications in the future, but for now I will just implement it in a React web application.
Specifications
The REST API should be able to:
- Perform (read or update) queries to a SPARQL endpoint via HTTP requests.
- Serialize the response to a standardized JSON RDF table, or an RDF graph, depending on the HTTP response.
- Store the serialized response in a Python object.
- Provide an endpoint that serves the serialized response to a frontend application (such as React).
- Handle incoming requests from the frontend application, "translate" them, and execute them as SPARQL queries.
- Send back the response to the frontend application's request.
ALL OF THIS while performing all queries and updates in real time.
What I mean by a real-time API:
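To make the "translate" requirement above concrete, here is a minimal sketch of how a frontend payload could be turned into a SPARQL query string. The payload shape (`subject`/`predicate`/`object` keys) and the function name are my own assumptions for illustration, not part of any existing API:

```python
def build_select_query(filters: dict, limit: int = 10) -> str:
    """Translate a hypothetical frontend JSON payload into a SPARQL SELECT.

    Any of 'subject', 'predicate', 'object' may be given as IRIs; missing
    terms become variables. This is a sketch, not a full query builder
    (no literal handling, no escaping).
    """
    s = f"<{filters['subject']}>" if "subject" in filters else "?s"
    p = f"<{filters['predicate']}>" if "predicate" in filters else "?p"
    o = f"<{filters['object']}>" if "object" in filters else "?o"
    return f"SELECT * WHERE {{ {s} {p} {o} }} LIMIT {limit}"

# Example: ask for everything known about a (hypothetical) resource
print(build_select_query({"subject": "http://example.com/Alice"}))
# SELECT * WHERE { <http://example.com/Alice> ?p ?o } LIMIT 10
```

A real implementation would also need to validate and escape the incoming values before interpolating them into the query.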
- A SPARQL query is executed from the REST API to a SPARQL endpoint via an HTTP request.
- The REST API reads the HTTP response generated from the request.
- The REST API serializes the response to the corresponding format.
- This serialized response is stored locally in a Python object for future use.
(Note: All the triples from the SPARQL endpoint in the query now exist both in the SPARQL endpoint as well as in a Python object, and are consistent both locally and remotely.)
- The triples are then (hypothetically) modified or updated, either locally or remotely.
- Now the local triples are out of sync with the remote triples.
- The REST API now becomes aware of this update (maybe through Listener/Observer objects?).
- The REST API then automatically syncs the triples, either through an update query request (if the changes were made locally) or by updating the Python object with the response from a query request (if the update was made remotely).
- Finally, both (the SPARQL endpoint and the Python object) should share the latest updated triples and, therefore, be in sync.
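The sync loop described in the steps above could be sketched roughly as follows. Here an in-memory set stands in for the SPARQL endpoint, and the dirty-flag scheme is my own assumption; a real implementation would issue SPARQL UPDATE/SELECT requests and would need some way to detect remote changes (polling, or a changes feed if the triple store exposes one):

```python
class SyncedStore:
    """Sketch of keeping a local copy of triples in sync with a remote set."""

    def __init__(self, remote: set):
        self.remote = remote          # stand-in for the SPARQL endpoint
        self.local = set(remote)      # local Python copy of the triples
        self.dirty_local = False      # set when local triples are modified

    def update_local(self, triple):
        self.local.add(triple)
        self.dirty_local = True       # local edit -> remote is now stale

    def sync(self):
        if self.dirty_local:
            # push local changes (would be a SPARQL UPDATE request)
            self.remote |= self.local
            self.dirty_local = False
        else:
            # pull remote changes (would be a SPARQL SELECT/CONSTRUCT)
            self.local = set(self.remote)

remote = {("ex:Alice", "ex:knows", "ex:Bob")}
store = SyncedStore(remote)
store.update_local(("ex:Alice", "ex:knows", "ex:Carol"))
store.sync()
print(store.remote == store.local)  # True after syncing
```

This only models the mechanics; the hard part of the question (becoming *aware* of remote updates without polling) depends on what the specific triple store offers.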
Previous Attempts
I have currently been able to query a SPARQL endpoint using the SPARQLWrapper package (for executing the queries), and the RDFLib and JSON packages for serializing and instantiating Python objects from the response, like this:
    import json
    from SPARQLWrapper import GET, JSON, JSONLD, POST, SPARQLWrapper

    class Store(object):
        def __init__(self, query_endpoint, update_endpoint=None):
            self.query_endpoint = query_endpoint
            self.update_endpoint = update_endpoint
            self.sparql = SPARQLWrapper(query_endpoint, update_endpoint)

        def graph_query(self, query: str, format=JSONLD, only_conneg=True):
            """Run a graph query (DESCRIBE/CONSTRUCT) and return parsed JSON-LD."""
            results = self.query(query, format, only_conneg)
            # rdflib's Graph.serialize() returns str in v6+ and bytes in v5
            results_json = results.serialize(format=format)
            if isinstance(results_json, bytes):
                results_json = results_json.decode('utf-8')
            return json.loads(results_json)

        def query(self, query: str, format=JSON, only_conneg=True):
            """Run a read query via GET and return the converted response."""
            self.sparql.resetQuery()
            self.sparql.setMethod(GET)
            self.sparql.setOnlyConneg(only_conneg)
            self.sparql.setQuery(query)
            self.sparql.setReturnFormat(format)
            return self.sparql.queryAndConvert()

        def update_query(self, query: str, only_conneg=True):
            """Run an update query (INSERT/DELETE) via POST."""
            self.sparql.resetQuery()
            self.sparql.setMethod(POST)
            self.sparql.setOnlyConneg(only_conneg)
            self.sparql.setQuery(query)
            self.sparql.query()

    store = Store('http://www.example.com/sparql/Example')
    print(store.query("""SELECT ?s WHERE {?s ?p ?o} LIMIT 1"""))
    print(store.graph_query("""DESCRIBE <http://www.example.com/sparql/Example/>"""))
The Challenge
The previous code can already:
- Perform (read or update) queries to a SPARQL endpoint via HTTP requests.
- Serialize the response to a standardized JSON RDF table, or an RDF graph, depending on the HTTP response.
- Store the serialized response in a Python object.
But still fails to implement these other aspects:
- Provide an endpoint that serves the serialized response to a frontend application (such as React).
- Handle incoming requests from the frontend application, "translate" them, and execute them as SPARQL queries.
- Send back the response to the frontend application's request.
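For the two missing endpoint-related items, one way I imagine the glue could look is a framework-agnostic handler function that a Django REST Framework `APIView`'s `get`/`post` methods would delegate to. Everything here (the payload keys, `StubStore`, `handle_request`) is hypothetical and only illustrates the shape of the bridge between the frontend request and the `Store` class above:

```python
class StubStore:
    """Stands in for the SPARQLWrapper-based Store during this sketch."""

    def query(self, sparql: str):
        # A real Store would hit the endpoint; here we return a canned
        # SPARQL-results-JSON-shaped dict.
        return {"head": {"vars": ["s"]},
                "results": {"bindings": [{"s": {"value": "ex:thing"}}]}}

def handle_request(store, payload: dict) -> dict:
    """Translate a frontend payload into a SPARQL call and wrap the reply.

    In DRF this logic would live inside an APIView, with the return dict
    passed to a Response object.
    """
    sparql = payload.get("query")
    if not sparql:
        return {"status": 400, "error": "missing 'query'"}
    return {"status": 200, "data": store.query(sparql)}

response = handle_request(StubStore(), {"query": "SELECT ?s WHERE {?s ?p ?o}"})
print(response["status"])  # 200
```

Keeping the translation/execution logic out of the view class like this would also make it reusable for the mobile frontend later.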
And last, but not least, it fails completely to implement the real-time aspect of this challenge.
The Questions:
- How would you implement this?
- Is this really the best approach?
- Can the already working code be optimized?
- Is there something that already does this?