
I am integrating two systems and have to copy data from a client table on one server into a table on another server, without any business logic or data modification, once per week. On every run I have to insert all of the data. So I wrote the Camel configuration which I attached below. It works for a small amount of data, but when the clients table has more than 20,000 rows I get the exception java.lang.OutOfMemoryError: GC overhead limit exceeded. I tried changing the Java memory settings with "set JAVA_OPTS=-Dfile.encoding=UTF-8 -Xms2048m -Xmx16384m -XX:PermSize=1012m -XX:MaxPermSize=2048m -XX:+UseConcMarkSweepGC -XX:-UseGCOverheadLimit", but it does not help.
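
The attached configuration is not reproduced in the post, but a route along the following lines shows the same failure mode (the table and data source names here are only placeholders). Without outputType=StreamList the jdbc: endpoint collects the whole result set into one in-memory list before the rest of the route runs, so a large table ends up in memory in a single exchange regardless of how the heap is sized.

import org.apache.camel.builder.RouteBuilder;

public class NaiveCopyRoute extends RouteBuilder {

    @Override
    public void configure() {
        from("timer://weekly?repeatCount=1")
                .setBody(constant("select * from CLIENTS"))  // table name is a placeholder
                .to("jdbc:sourceDataSource")                 // whole result set returned as one ArrayList of Maps
                .log("Fetched ${body.size()} rows");         // at this point every row is already in memory
    }
}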

1 Answer


I am working on the same tool. To query and insert large tables I use the Camel JDBC StreamList output. In this example I work with two data sources: dataSource1 for the query and dataSource2 for the insert. The query against dataSource1 is processed as a stream and the body is then split row by row, so not all the data is handled at once; each row is written to a file, and a prepared insert statement is built and executed against dataSource2.

private void createRoute(String tableName) {

    this.from("timer://timer1?repeatCount=1") //
            .setBody(this.simple("delete from " + tableName)) //
            .to("jdbc:dataSource2") //
            .setBody(this.simple("select * from " + tableName)) //
            .to("jdbc:dataSource1?outputType=StreamList") //
            // .to("log:stream") //
            .split(this.body())//
            .streaming() //
            // .to("log:row") //
            .multicast().to("direct:file", "direct:insert") //
            .end().log("End");

    this.from("direct:file")
            // .marshal(jsonFormat) //
            .marshal(new CsvDataFormat().setDelimiter(';'))//
            .to("stream:file?fileName=output/" + tableName + ".txt").log("Data: ${body}");

    this.from("direct:insert").process(new Processor() {

        @Override
        public void process(Exchange exchange) throws Exception {
            StringBuilder insert = new StringBuilder("insert into ").append(tableName).append(" (");
            StringBuilder values = new StringBuilder("values(");

            LinkedHashMap<?, ?> body = (LinkedHashMap<?, ?>) exchange.getIn().getBody();

            Iterator<String> i = (Iterator<String>) body.keySet().iterator();

            while (i.hasNext()) {
                String key = i.next();
                insert.append(key);
                values.append(":?" + key);
                exchange.getOut().getHeaders().put(key, body.get(key));
                if (i.hasNext()) {
                    insert.append(",");
                    values.append(",");
                } else {
                    insert.append(") ");
                    values.append(") ");
                }
            }

            String sql = insert.append(values).toString();
            exchange.getOut().setBody(sql);
        }
    }) //
            .log("SQL: ${body}")//
            .to("jdbc:dataSource2?useHeadersAsParameters=true");