
I use the logstash-input-jdbc plugin to sync my data from MySQL to Elasticsearch. However, when I looked at the data in Elasticsearch, I found that all fields of date type had changed format from "yyyy-MM-dd" to "yyyy-MM-dd'T'HH:mm:ss.SSSZ". I have nearly 200 fields of type date, so I want to know how to configure Logstash so that it outputs the format "yyyy-MM-dd" instead of "yyyy-MM-dd'T'HH:mm:ss.SSSZ".

wangjinhao
1 Answer


Elasticsearch stores dates as UTC timestamps:

Internally, dates are converted to UTC (if the time-zone is specified) and stored as a long number representing milliseconds-since-the-epoch.

Queries on dates are internally converted to range queries on this long representation, and the result of aggregations and stored fields is converted back to a string depending on the date format that is associated with the field.
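The round trip described above can be sketched in Python (the date value and rendering are illustrative, not taken from Elasticsearch itself): a date-only string is parsed at midnight UTC, stored as epoch milliseconds, and rendered back as a full timestamp, which is why "yyyy-MM-dd" comes out as "yyyy-MM-dd'T'HH:mm:ss.SSSZ".

```python
from datetime import datetime, timezone

# "2019-01-07" has no time component, so it is interpreted as
# midnight UTC and stored as milliseconds since the epoch.
d = datetime.strptime("2019-01-07", "%Y-%m-%d").replace(tzinfo=timezone.utc)
epoch_millis = int(d.timestamp() * 1000)
print(epoch_millis)  # 1546819200000

# On the way out, the long is formatted back as a full timestamp,
# not the original date-only string.
rendered = datetime.fromtimestamp(epoch_millis / 1000, tz=timezone.utc)
print(rendered.strftime("%Y-%m-%dT%H:%M:%S.000Z"))  # 2019-01-07T00:00:00.000Z
```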

So if you want to retain the yyyy-MM-dd format, you'll have to store it as a keyword (which you then won't be able to do range queries on).

You can change Kibana's display to only show the yyyy-MM-dd format, but note that it will convert the date to the timezone of the viewer which may result in a different day than you entered in the input field.

If you want to ingest the date as a string, you'll need to create a mapping for the index in question to prevent default date processing.
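As a minimal sketch of such a mapping, here is the request body you would PUT when creating the index, before ingesting any data. The index name (`my_index`) and field name (`order_date`) are examples only, and the structure shown matches Elasticsearch 7.x (older versions nest the properties under a type name):

```python
import json

# Hypothetical mapping that stores order_date as keyword, so
# Elasticsearch keeps the literal "yyyy-MM-dd" string instead of
# parsing it as a date (at the cost of range queries).
mapping = {
    "mappings": {
        "properties": {
            "order_date": {"type": "keyword"}
        }
    }
}

# Body to PUT to /my_index before indexing documents.
print(json.dumps(mapping, indent=2))
```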

Alcanzar
  • I've already stored these fields as keyword in ES, but it still shows the 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format – wangjinhao Jan 07 '19 at 23:33
  • You need to look at the mapping for the index and make sure it's stored as keyword (GET /index/_mapping in the developer tools). Also, if you changed the mapping, you need to refresh the index pattern in Kibana to get it to pick up the change – Alcanzar Jan 08 '19 at 17:56