
I want to stream data from LogStash to a MS SQL Server working table with a schema something like this:

CREATE TABLE [dbo].[WorkingTable] (
    [MessageId] bigint,       -- Auto-incrementing primary key.
    [Data]      nvarchar(MAX) -- JSON data.
)

I want the [Data] field to contain the entire message from LogStash, serialized as a JSON string, and I don't want to have to use sprintf to format it explicitly.

How can this be done using the JDBC output plugin or perhaps some other plugin?

chrismon
    I have never used the jdbc output, but perhaps the [json codec](https://www.elastic.co/guide/en/logstash/current/plugins-codecs-json.html) might help? – baudsp Sep 06 '18 at 09:21
  • @baudsp Thank you yes this is what I believe will be needed! If nobody beats me to it I will post an answer that demonstrates this solution. – chrismon Sep 06 '18 at 23:25
  • Spoke too soon.. no support for codecs in the logstash-output-jdbc plugin. I was also looking at json_encode filter but it seems to support encoding single fields only, and not an entire event. – chrismon Sep 06 '18 at 23:45
  • Have you tried the unofficial plugin: https://github.com/theangryangel/logstash-output-jdbc – Polynomial Proton Sep 12 '18 at 20:32

1 Answer

output {
  jdbc {
    driver_jar_path => 'jar file path'
    connection_string => "jdbc:sqlserver://sqlserver:port;databaseName=databaseName;user=user;password=pass"
    enable_event_as_json_keyword => true
    statement => ["INSERT INTO [dbo].[WorkingTable] (MessageId, Data) VALUES(?, ?)", "%{[Data][MessageId]}", "@event"]
  }
}

Set enable_event_as_json_keyword to true and then use the magic @event keyword in the statement; the plugin replaces it with the entire event serialized as a JSON string, which is exactly what the [Data] column needs.
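For intuition, here is a minimal Python sketch of what the @event substitution produces conceptually: the whole event serialized as one JSON string, suitable for an nvarchar(MAX) column. The event dict and its fields are illustrative assumptions, not output from the plugin itself.

```python
import json

# Hypothetical Logstash event, represented here as a plain dict.
event = {
    "@timestamp": "2018-09-06T09:21:00.000Z",
    "message": "user logged in",
    "Data": {"MessageId": 42},
}

# Conceptually what enable_event_as_json_keyword does: the "@event"
# statement parameter is replaced with the entire event serialized
# to a single JSON string, with no sprintf-style formatting needed.
data_column = json.dumps(event)

print(data_column)
```

The resulting string round-trips cleanly, so the stored [Data] value can later be parsed back (for example with SQL Server 2016+'s OPENJSON) to recover individual event fields.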

zmihaylov