Background
I am working on a project where multiple independent system services need to talk to a central service via a well-defined REST API (i.e. via HTTP GET, PUT, and POST requests). There are some complex sets of operations we would like to reduce to a single function and make available to all the services acting as clients of the REST service (e.g. GET from several different endpoints, combine the results, and execute a POST and PUT to a different/new endpoint).
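For illustration, a composite operation of the kind described above might look like the following Python sketch. All endpoint names and payload fields here are made up, and the HTTP layer is injected rather than implemented, since the point is the combine-and-forward sequence that every client would otherwise duplicate:

```python
# Hypothetical composite operation: GET from two endpoints, combine the
# results, and POST the merged payload to a third endpoint. Endpoint
# paths and field names below are placeholders, not the real API.

def combine_results(status: dict, config: dict) -> dict:
    """Merge the bodies of two hypothetical GET responses into the
    payload expected by a hypothetical POST endpoint."""
    return {
        "node": status.get("node_id"),
        "state": status.get("state"),
        "settings": config.get("settings", {}),
    }

def sync_node(http_get, http_post) -> dict:
    """The composite operation, with the HTTP calls passed in so the
    pure combine step stays testable in any client."""
    status = http_get("/nodes/self/status")   # hypothetical endpoint
    config = http_get("/nodes/self/config")   # hypothetical endpoint
    payload = combine_results(status, config)
    http_post("/sync/jobs", payload)          # hypothetical endpoint
    return payload
```

Today, each client service would have to re-implement this whole sequence in its own language, which is what motivates factoring it into a shared library.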
Problem
All of these system services are written in different programming languages (C, C++, Go, Node.js, and Python), and we'd like to re-use the same functions across these projects. So far, the most viable approach seems to be to write a C library and make it available to the other projects.
- However, using C as the language to house the code means we miss out on certain features most other/newer languages support, and it will overly complicate the code.
- If we were to write the library in C++ instead, we would then have to deal with binding generators, which adds still more complexity.
- We could potentially create an additional REST service (we're not allowed to modify the current one) that acts as an intermediate "agent" to translate requests for the other system services, but we'd rather go with a library than add another service, if possible (since adding a "proxy" service to the mix would force us to re-define/re-factor the authentication/security model).
Question
Is there a more elegant/maintainable way to allow code re-use amongst multiple projects and programming languages, since all of them just need to issue and interpret the same HTTP calls in the same way? For example, is this something that protocol buffers (or something similar) could manage?