Questions tagged [ingest]

41 questions
0 votes, 2 answers

If the file is deleted, delete from the ElasticSearch index

I'm trying to write a piece of code that will be responsible for deleting an indexed file from the Elasticsearch index; along with the indexed file I pass md5(file name) as the id value. I need to make sure that when deleting a file from a…
SplendX • 1 • 1
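
A minimal Dev Tools sketch of the delete step, assuming (hypothetically) an index named files and documents indexed with _id set to md5(file name); if only the file name is known, _delete_by_query on a keyword field is the usual fallback — index, id, and field names below are placeholders:

# delete by the md5-based _id (index name and id value are placeholders)
DELETE /files/_doc/9e107d9d372bb6826bd81d3542a419d6

# or, when only the file name is known (field name is a placeholder)
POST /files/_delete_by_query
{
  "query": { "term": { "file_name.keyword": "report.pdf" } }
}
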
0 votes, 1 answer

Accessing metadata fields within the set processor of an ingest pipeline.yml in Elasticsearch

I have to write an ingest pipeline for Elasticsearch within a pipeline.yml file. I was able to retrieve my field with grok and was able to divide it with the split processor. Now, I want to assign each value of the resulting array from the split…
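
A simulate sketch of that last step, where the field and target names below are placeholders: after the split processor turns the field into an array, a Mustache template such as {{raw.0}} in a set processor should address individual elements (a script processor is the alternative if templates don't fit):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "grok": { "field": "message", "patterns": ["%{GREEDYDATA:raw}"] } },
      { "split": { "field": "raw", "separator": "\\|" } },
      { "set": { "field": "first_value", "value": "{{raw.0}}" } },
      { "set": { "field": "second_value", "value": "{{raw.1}}" } }
    ]
  },
  "docs": [ { "_source": { "message": "alpha|beta" } } ]
}
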
0 votes, 0 answers

NoAliveNodes Elasticsearch-PHP (Ingest-Attachment)

I'm new to programming. When I use "return $client->ingest()->putPipeline($params);" in my PHP-ES code it outputs a "NoAliveNodes" error, but when using, for example, "return $client->index($params);" it works! Plugin Ingest-Attachment installed, version…
SplendX • 1 • 1
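
For reference, $client->ingest()->putPipeline() only wraps a REST request like the one below (pipeline id and field name are placeholders), so reproducing it in Dev Tools can help separate a client connectivity issue ("NoAliveNodes" usually points at the configured hosts/port) from a pipeline issue:

PUT _ingest/pipeline/attachment
{
  "description": "Extract attachment information",
  "processors": [
    { "attachment": { "field": "data" } }
  ]
}
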
0 votes, 1 answer

How to add ingest node to k8s cluster with bitnami/elasticsearch helm chart

We are using the bitnami/elasticsearch Helm chart to install an Elasticsearch cluster. We start with the command: helm install --namespace esn elasticsearch bitnami/elasticsearch -f es_values_bitnami.yaml. es_values_bitnami.yaml is: ## global: storageClass:…
Alex Nozzy • 31 • 4
0 votes, 1 answer

Use Ingestion Pipeline to split between two indexes

I have documents containing the field "Status", which can have three values: "Draft", "In Progress", or "Approved". I am trying to pass this document through an ingest pipeline, and if the status is equal to "Approved" then it should add it in the B…
Ashish Mishra • 145 • 3 • 13
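
A sketch of one way to do this (index and pipeline names are placeholders): an ingest pipeline can rewrite the _index metadata field with a conditional set processor, so documents whose Status is "Approved" land in a different index than the one named in the request:

PUT _ingest/pipeline/route-by-status
{
  "processors": [
    {
      "set": {
        "if": "ctx.Status == 'Approved'",
        "field": "_index",
        "value": "approved-docs"
      }
    }
  ]
}

# a document sent to "drafts" but matching the condition ends up in "approved-docs"
PUT drafts/_doc/1?pipeline=route-by-status
{ "Status": "Approved", "title": "example" }
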
0 votes, 1 answer

How to ingest a PDF into Elasticsearch

I added the Ingest Attachment Processor plugin to Elastic. Then I created a very simple PDF file and tried to ingest this file's content into Elastic (see commands below). But the attempt to find a word from the file fails (see third answer near the…
Frank Mehlhop • 1,480 • 4 • 25 • 48
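
The usual flow looks roughly like this (pipeline name, index name, and the base64 payload are placeholders): define a pipeline with the attachment processor, index the PDF as a base64-encoded field through that pipeline, then search the extracted text under attachment.content:

PUT _ingest/pipeline/pdf-attachment
{
  "processors": [ { "attachment": { "field": "data" } } ]
}

# "data" must hold the file content base64-encoded, not a file path
PUT docs/_doc/1?pipeline=pdf-attachment
{ "data": "<base64-encoded PDF bytes>" }

GET docs/_search
{ "query": { "match": { "attachment.content": "word" } } }
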
0 votes, 1 answer

Azure Synapse Copy pipeline for ingesting complex XML

I have a copy pipeline set up that connects to an SFTP server (on Azure Synapse). I have used it to copy CSV files and this works fine, but now I have the complex task of pulling in a multi-layered XML file and converting it to something usable in the…
0 votes, 1 answer

Is there a way to get notified of deleted documents in ElasticSearch?

I am aware of the ingest pipeline for ingested documents, or the workaround of marking documents as deleted / moving them to a different index with a rollover policy. But is there a way to directly get notified and react upon deleted documents?…
Shlomi Uziel • 868 • 7 • 15
0 votes, 0 answers

Ingest utility doesn't insert NULL value in a column of integer type

I am reading a CSV file through a named pipe. In the CSV file the field2 column is blank, which needs to be inserted into a table column as NULL. The table column is of type integer, but when I try to run the ingest I get an error that says…
vineeth • 641 • 4 • 11 • 25
0 votes, 1 answer

Ingest utility with delete statement in DB2 doesn't show number of rows deleted

When I run the ingest utility with the delete statement it gives the number of rows inserted as 0 and doesn't show the number of rows deleted. Is there any option to show the number of rows deleted? I have included the output message of the ingest…
vineeth • 641 • 4 • 11 • 25
0 votes, 1 answer

How to split a field into words with an ingest pipeline in Kibana

I have created an ingest pipeline as below to split a field into words: POST _ingest/pipeline/_simulate { "pipeline": { "description": "String cutting processing", "processors": [ { "split": { …
YNR • 867 • 2 • 13 • 28
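
A compact simulate sketch for the whitespace case (field name and sample text are placeholders); the split processor's separator is a regular expression, so "\\s+" breaks the string into words:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "String cutting processing",
    "processors": [
      { "split": { "field": "content", "separator": "\\s+" } }
    ]
  },
  "docs": [ { "_source": { "content": "split this field into words" } } ]
}
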
0 votes, 1 answer

Ingesting from large JSON files to Kusto from blob - expanding array of objects

I am trying to ingest a JSON file into Kusto (.zip file), and further process the JSON using update policies. Approach 1: the file has the following contents { "id": "id0", "logs": [ { "timestamp": "2021-05-26T11:33:26.182Z", "message":…
0 votes, 1 answer

How to configure Elasticsearch ingest pipelines using Dockerfile and/or Docker-Compose?

I have written pipeline files for Logstash, but my current client is opposed to using Logstash and wants to ingest Filebeat-generated logs directly into Elasticsearch. Fine, if that is really what he wants. But I cannot find a complementary pipeline…
mphare • 1
0 votes, 0 answers

Call an API sequentially where the first response is empty

Can anyone tell me if there is a way in which calling an API sequentially would make the result list grow sequentially rather than retrieving all data at once? For example, the first call would return an empty list, the second one item, and so on until…
davidJay • 13 • 1 • 5
0 votes, 1 answer

Debugging Elastic ingest pipelines with the grok processor

I have an Elastic ingest pipeline with a grok processor defined along with error handling: { "my_ingest" : { "description" : "parse multiple patterns", "processors" : [ { "grok" : { "field" : "message", …
enigmatic • 1 • 2
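
A debugging sketch (the pattern, field names, and sample messages are placeholders): running the pipeline through _simulate?verbose=true shows each processor's output, and an on_failure block on the grok processor tags documents that match none of the patterns instead of failing the whole request:

POST _ingest/pipeline/_simulate?verbose=true
{
  "pipeline": {
    "description": "parse multiple patterns",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{IP:client} %{WORD:method} %{URIPATHPARAM:request}"],
          "on_failure": [
            { "set": { "field": "grok_failed", "value": true } }
          ]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "55.3.244.1 GET /index.html" } },
    { "_source": { "message": "a line that matches no pattern" } }
  ]
}
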