
Hello, I'm new to Elasticsearch and I'm trying to build a search in Node.js with it. I get an error, and passing the data through JSON.stringify() didn't help; what I found online says the problem is with the data format. In search.js I try to create an index and bulk-insert JSON data into it. The data comes from an API through a parser; I can post the parser code if needed, but I think the error is in search.js. If you have worked with Elasticsearch in Node.js, can you help me with the code, please?

My code (search.js):

const es = require('elasticsearch');
const search = async (search_keyword, data) => {
    const esClient = new es.Client({
        host: 'localhost:9200',
        log: 'trace'
    });
    await esClient.indices.create({
        index: 'test',
        body: data,
    });
    await esClient.search({
        index: 'test',
        body: {
            query: {
                multi_match: {
                    query: search_keyword,
                    fields: ["id", "name", "snippet"],
                    operator: "or"
                }
            }
        }
    })
};
module.exports = search;

server.js

const express = require("express");
const app = express();
const parser = require('./parser/parser');
const search = require('./elasticsearch/search');

(async () => {
    const result = await parser();
    for (const item of result) {
        const data = JSON.parse(item.data);
        const output = data.items.map(({id, name, snippet}) => ({
            id,
            name,
            snippet
        }));
        //console.log(output);
        try {
            await search("JS", output);
        } catch (err) {
            console.log(err);
        }
    }
})();
const server = app.listen(3001, () => {
    console.log("listening on port %s...", server.address().port);
});

Error:

StatusCodeError: [not_x_content_exception] Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes
    at respond (D:\Programming\test-task\node_modules\elasticsearch\src\lib\transport.js:349:15)
    at checkRespForFailure (D:\Programming\test-task\node_modules\elasticsearch\src\lib\transport.js:306:7)
    at HttpConnector.<anonymous> (D:\Programming\test-task\node_modules\elasticsearch\src\lib\connectors\http.js:173:7)
    at IncomingMessage.wrapper (D:\Programming\test-task\node_modules\lodash\lodash.js:4949:19)
    at IncomingMessage.emit (events.js:215:7)
    at endReadableNT (_stream_readable.js:1183:12)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  status: 500,
  displayName: 'InternalServerError',
  message: '[not_x_content_exception] Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes',
  path: '/test',
  query: {},
  body: {
    error: {
      root_cause: [Array],
      type: 'not_x_content_exception',
      reason: 'Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes'
    },
    status: 500
  },
  statusCode: 500,
  response: '{"error":{"root_cause":[{"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}],"type":"not_x_content_excep
tion","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"},"status":500}',
  toString: [Function],
  toJSON: [Function]
}
Nikita Kagan

1 Answer


I think the problematic part is

await esClient.indices.create({
    index: 'test',
    body: data,
});

because data is the output you're passing to the search function:

const output = data.items.map(({id, name, snippet}) => ({
    id,
    name,
    snippet
}));

which is guaranteed to be an array, and the client.indices.create function does not accept that. Its body is for index settings and mappings, not documents. A valid .create usage is sketched below; see if it helps you further.
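
For reference, a minimal sketch of a valid call, assuming Elasticsearch 7.x and the legacy elasticsearch client from the question (the mappings are illustrative, matching the fields you search on):

// Sketch only: `indices.create` takes settings/mappings in `body`,
// never an array of documents.
await esClient.indices.create({
    index: 'test',
    body: {
        mappings: {
            properties: {
                id: { type: 'keyword' },
                name: { type: 'text' },
                snippet: { type: 'text' }
            }
        }
    }
});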


P.S.: The search function does multiple things at once:

  1. it instantiates the client
  2. creates an index (with the hope of indexing docs -- which fails; check out _bulk indexing instead, sketched after this list)
  3. and performs the search.
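
For illustration, a hedged sketch of bulk indexing with the same client (the helper name bulkIndex is mine, not an API; flatMap needs Node 11+): the _bulk body alternates an action line with the document itself.

// Sketch only: index an array of docs in one _bulk request.
const bulkIndex = async (index, docs) => {
    const body = docs.flatMap(doc => [
        { index: { _index: index, _id: doc.id } },  // action line
        doc                                         // document source
    ]);
    const resp = await esClient.bulk({ body, refresh: 'wait_for' });
    if (resp.errors) console.error('some docs failed to index');
    return resp;
};

// usage: await bulkIndex('test', output);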

All of this combined violates a principle called separation of concerns. See, the search function failed but it shouldn't have because there's nothing wrong with search per se. Even if it didn't fail, other people (or you in 6 months' time) reading the code will wonder why multiple unrelated actions are being performed there. So a bit of refactoring would be reasonable.
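
For instance, one possible split, with each concern in its own function (names are illustrative, not from the question; the client is created once and reused):

const es = require('elasticsearch');

// one shared client instance
const esClient = new es.Client({ host: 'localhost:9200' });

// index creation, done once at startup
const createIndex = (index) => esClient.indices.create({ index });

// searching, and nothing else
const search = (index, keyword) =>
    esClient.search({
        index,
        body: {
            query: {
                multi_match: {
                    query: keyword,
                    fields: ['id', 'name', 'snippet'],
                    operator: 'or'
                }
            }
        }
    });

module.exports = { esClient, createIndex, search };

Together with the bulkIndex sketch above, server.js can then create the index once, bulk-insert the parsed docs, and only then call search.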

Joe - GMapsBook.com