
I'm indexing my PostgreSQL data for Elasticsearch using the Logstash JDBC Input Plugin. I have two tables called REQUEST and ASSIGNMENT listed below.

How can I use Logstash to index the two tables into a single Elasticsearch document of type REQUEST, with a nested array containing all child ASSIGNMENT records?

Table: REQUEST

REQUEST_ID | POC
---------- | ----------------
1234       | Jon Snow
1256       | Tyrion Lannister


Table: ASSIGNMENT

ASSIGN_ID | REQUEST_ID | STATUS  | CREATED
--------- | ---------- | ------- | ----------
2345      | 1234       | New     | 2017-01-06
2364      | 1234       | Working | 2017-03-12
2399      | 1234       | Working | 2017-05-20
5736      | 1256       | New     | 2017-06-28

This is what I want the Elasticsearch document to look like (a sample of the _source value from a search result):

"_source": {
  "request_id": "1234",
  "poc": "Jon Snow",
  "assignments": [
    {
      "assign_id": "2345",
      "status": "New",
      "created": "2017-01-06"
    },
    {
      "assign_id": "2364",
      "status": "Working",
      "created": "2017-03-12"
    },
    {
      "assign_id": "2399",
      "status": "Working",
      "created": "2017-05-20"
    }
  ]
}
EricPSU
  • The one idea I have is to concatenate all of my `assignment` data into one column in a Postgres view and then use Logstash Grok to manipulate the data into the desired JSON format, but I don't know if that is possible – EricPSU Aug 07 '17 at 13:11
  • I have the same problem. Actually I read the request and create an empty array "assignments". Now i can't put the assignments as multiple object in the array. The last assignment of request override the previous one – Daniele Licitra Feb 12 '18 at 16:52
  • @DanieleLicitra - I gave up on this approach and just ended up having a separate index for Request and Assignment. I'd still rather nest my data like above, but I needed something that worked. – EricPSU Feb 12 '18 at 19:19
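The "one column in a Postgres view" idea from the first comment can be sketched without Grok: let Postgres aggregate the child rows into a JSON string, then have Logstash's json filter parse that string back into a real array. This is an untested sketch, assuming PostgreSQL 9.4+ (for json_agg / json_build_object) and the Logstash jdbc input and json filter plugins; the connection string, credentials, and index name are placeholders.

```
input {
  jdbc {
    # Placeholder connection details
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # Build the nested structure in SQL: one row per request,
    # with all assignments aggregated into a JSON array string
    statement => "
      SELECT r.request_id,
             r.poc,
             json_agg(json_build_object(
               'assign_id', a.assign_id,
               'status',    a.status,
               'created',   a.created
             ) ORDER BY a.created) AS assignments_json
      FROM request r
      LEFT JOIN assignment a ON a.request_id = r.request_id
      GROUP BY r.request_id, r.poc"
  }
}

filter {
  # json_agg arrives as a plain string column; parse it into an array field
  json {
    source => "assignments_json"
    target => "assignments"
    remove_field => ["assignments_json"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "requests"
    # Reuse the primary key so re-runs update rather than duplicate
    document_id => "%{request_id}"
  }
}
```

Using the primary key as document_id also sidesteps the overwrite problem from the second comment: each request maps to exactly one document, and its full assignments array is rebuilt on every sync instead of assignments clobbering each other.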

0 Answers