I have the following code:
//sftp
from(String.format("sftp://%s@%s:%d/%s?password=%s&delete=true",
        sftpConfiguration.getUsername(),
        sftpConfiguration.getHost(),
        sftpConfiguration.getPort(),
        sftpConfiguration.getSourcePath(),
        sftpConfiguration.getPassword()))
    .process(new Processor() {
        @Override
        public void process(Exchange exchange) {
            try {
                isCsv = CSV.equals(new Tika().detect(exchange.getIn().getBody(InputStream.class), fileName));
            } catch (IOException e) {
                ....
                return;
            }
            final List<Map<String, Object>> parsedLines = misCsvParser.parse(exchange.getIn().getBody(InputStream.class), fileName);
            ...
        }
    })
    .to("seda:parsed_csv");

from("seda:parsed_csv")
    .to(String.format("sftp://%s@%s:%d/%s?password=%s",
        sftpConfiguration.getUsername(),
        sftpConfiguration.getHost(),
        sftpConfiguration.getPort(),
        sftpConfiguration.getDestPathRejected(),
        sftpConfiguration.getPassword()));
Briefly, this code takes files from sftpConfiguration.getSourcePath(), analyzes them, and then sends them to sftpConfiguration.getDestPathRejected().
On my local machine it always works well, but on the test environment our QA noticed interesting behaviour: sometimes a file loses some data. For example, the input file is 20 MB, but after analysis the application uploads a file smaller than 20 MB, e.g. 6 MB. Moreover, if I repeat the operation for the same file, the size changes again; it can become 8 MB, and so on. So the behaviour is not predictable.
I assume that something is wrong with resource handling, but I have no idea what exactly.
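One thing I noticed is that the processor calls exchange.getIn().getBody(InputStream.class) twice, once for Tika and once for the CSV parser, so the second read may continue from wherever Tika left the stream. Below is a minimal sketch of the kind of buffering I'm considering as a workaround; it assumes Camel's type converter can materialize the whole remote file body as a byte[], uses java.io.ByteArrayInputStream, and reuses the same isCsv, CSV, fileName and misCsvParser fields as above:

    .process(new Processor() {
        @Override
        public void process(Exchange exchange) throws Exception {
            // Hypothetical: buffer the whole body once so Tika and the CSV
            // parser each get their own fresh stream instead of sharing one.
            byte[] data = exchange.getIn().getBody(byte[].class);

            isCsv = CSV.equals(new Tika().detect(new ByteArrayInputStream(data), fileName));

            final List<Map<String, Object>> parsedLines =
                    misCsvParser.parse(new ByteArrayInputStream(data), fileName);
            // ... same handling of parsedLines as before ...

            // Put the buffered bytes back so the next endpoint writes the full file.
            exchange.getIn().setBody(data);
        }
    })

I'm not sure this addresses the root cause, though, and buffering a 20 MB file in memory per message may not be ideal.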
Any ideas?