
I'm trying to create zip files based on the file extension, which follows the pattern filename.{NUMBER}. I'm reading a folder, grouping the files by that .{number} suffix, and then creating a single .zip file per suffix, for example:

folder /

file.01

file2.01

file.02

file2.02

folder -> /processed

file.01.zip which contains -> file.01, file2.01

file.02.zip which contains -> file.02, file2.02
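The grouping key here is everything after the first dot, e.g. "01" for file.01. A minimal sketch of that extraction (the class and method names are mine, for illustration only):

```java
public class SuffixExtractor {

    // Returns the numeric suffix after the first '.', e.g. "01" for "file.01".
    static String warehouseId(String fileName) {
        return fileName.substring(fileName.indexOf('.') + 1);
    }

    public static void main(String[] args) {
        System.out.println(warehouseId("file.01"));  // prints "01"
        System.out.println(warehouseId("file2.02")); // prints "02"
    }
}
```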

What I've done is use an outboundGateway, split the files, enrich the headers by reading the file extension, and then aggregate on that header, but it doesn't seem to work properly.

public IntegrationFlow integrationFlow() {
    return flow
            .handle(Ftp.outboundGateway(FTPServers.PC_LOCAL.getFactory(), AbstractRemoteFileOutboundGateway.Command.MGET, "payload")
                    .fileExistsMode(FileExistsMode.REPLACE)
                    .filterFunction(ftpFile -> {
                        int extensionIndex = ftpFile.getName().indexOf(".");
                        // require at least one digit after the dot ("[0-9]+", not "[0-9]*",
                        // so a bare trailing dot doesn't match)
                        return extensionIndex != -1 && ftpFile.getName().substring(extensionIndex).matches("\\.[0-9]+");
                    })
                    .localDirectory(new File("/tmp")))
            .split() // receiving an iterator, creates a message for each file
            .enrichHeaders(headerEnricherSpec -> headerEnricherSpec.headerExpression("warehouseId", "payload.getName().substring(payload.getName().indexOf('.') + 1)"))
            .aggregate(aggregatorSpec -> aggregatorSpec.correlationExpression("headers['warehouseId']"))
            .transform(new ZipTransformer())
            .log(message -> {
                log.info(message.getHeaders().toString());
                return message;
            });
}

It's giving me a single message containing all the files; I'd expect 2 messages.
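For context: the default aggregator release is driven by the sequence-size header that .split() sets, so once the correlation key is overridden with a custom header, groups may complete at the wrong moment or not at all. One hedged alternative (a sketch only; the timeout value is illustrative, not a recommendation) is an explicit timeout-based release:

```java
// Illustrative fragment only - drop-in replacement for the .aggregate(...)
// step above, releasing each warehouseId group after a period of inactivity.
.aggregate(aggregatorSpec -> aggregatorSpec
        .correlationExpression("headers['warehouseId']")
        .groupTimeout(5_000)               // flush a group 5s after its last message (value illustrative)
        .sendPartialResultOnExpiry(true)   // emit the partial group instead of discarding it
        .expireGroupsUponCompletion(true)) // let the same warehouseId start a fresh group later
```

Whether a timeout fits depends on how files arrive; the answer below takes a different route.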


1 Answer


Due to the nature of this DSL, I have a dynamic number of files, so I couldn't count the messages (files) ending with the same number, and I don't think a timeout would be a good release strategy, so I just wrote the code on my own, without writing to disk:


.<List<File>, List<Message<ByteArrayOutputStream>>>transform(files -> {
    HashMap<String, ZipOutputStream> zipOutputStreamHashMap = new HashMap<>();
    HashMap<String, ByteArrayOutputStream> zipByteArrayMap = new HashMap<>();
    ArrayList<Message<ByteArrayOutputStream>> messageList = new ArrayList<>();
    files.forEach(file -> {
        String warehouseId = file.getName().substring(file.getName().indexOf('.') + 1);
        ZipOutputStream warehouseStream = zipOutputStreamHashMap.computeIfAbsent(warehouseId,
                s -> new ZipOutputStream(zipByteArrayMap.computeIfAbsent(s, s1 -> new ByteArrayOutputStream())));
        // try-with-resources closes the input stream even if writing an entry fails
        try (FileInputStream inputStream = new FileInputStream(file)) {
            warehouseStream.putNextEntry(new ZipEntry(file.getName()));
            byte[] bytes = new byte[4096];
            int length;
            while ((length = inputStream.read(bytes)) >= 0) {
                warehouseStream.write(bytes, 0, length);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
    zipOutputStreamHashMap.forEach((s, zipOutputStream) -> {
        try {
            zipOutputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
    zipByteArrayMap.forEach((key, byteArrayOutputStream) ->
            messageList.add(MessageBuilder.withPayload(byteArrayOutputStream).setHeader("warehouseId", key).build()));

    return messageList;
})
.split()
.transform(ByteArrayOutputStream::toByteArray)
.handle(Ftp.outboundAdapter(FTPServers.PC_LOCAL.getFactory(), FileExistsMode.REPLACE)
......
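The core of this approach, stripped of the Spring Integration plumbing, is "group by suffix, then zip each group in memory". A self-contained sketch of just that logic (class and method names are mine; it takes filename-to-content pairs instead of java.io.File, so nothing touches disk):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class GroupedZipper {

    // Groups entries by the suffix after the first '.' and builds one zip per group.
    static Map<String, byte[]> zipBySuffix(Map<String, byte[]> files) {
        Map<String, ByteArrayOutputStream> buffers = new LinkedHashMap<>();
        Map<String, ZipOutputStream> zips = new LinkedHashMap<>();
        files.forEach((name, content) -> {
            String warehouseId = name.substring(name.indexOf('.') + 1);
            ZipOutputStream zip = zips.computeIfAbsent(warehouseId,
                    id -> new ZipOutputStream(
                            buffers.computeIfAbsent(id, i -> new ByteArrayOutputStream())));
            try {
                zip.putNextEntry(new ZipEntry(name));
                zip.write(content);
                zip.closeEntry();
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        zips.values().forEach(zip -> {
            try {
                zip.close(); // writes the zip's central directory into the buffer
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        Map<String, byte[]> result = new LinkedHashMap<>();
        buffers.forEach((id, buf) -> result.put(id, buf.toByteArray()));
        return result;
    }

    public static void main(String[] args) throws IOException {
        Map<String, byte[]> files = new LinkedHashMap<>();
        files.put("file.01", "a".getBytes());
        files.put("file2.01", "b".getBytes());
        files.put("file.02", "c".getBytes());
        Map<String, byte[]> zips = zipBySuffix(files);
        System.out.println(zips.keySet()); // prints "[01, 02]"
        // Count the entries in the "01" archive.
        int entries = 0;
        ZipInputStream in = new ZipInputStream(new ByteArrayInputStream(zips.get("01")));
        while (in.getNextEntry() != null) {
            entries++;
        }
        System.out.println(entries); // prints "2"
    }
}
```

Each resulting byte[] can then be sent onward (e.g. to the FTP outbound adapter above) with the map key as the "warehouseId" header.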