I'm trying to come up with a configuration to parse a fixed-length feed file coming from the mainframe.
Since the first position of every line of that feed identifies the record type, I'm trying to build a minimal skeleton BeanIO (2.1) configuration (just the first field of every record for now) to represent that file's format.
The sample file to parse (just the first field, i.e. the record type code, of every line for now; per the builder below, 1 = file header, 5 = batch header, 6 = batch entry, 8 = batch control, 9 = file control):
1
5
6
6
6
8
5
6
8
9
When I run the BeanIO unmarshaller to parse the above minimal mainframe file, the following exception is thrown:
org.beanio.UnexpectedRecordException: End of stream reached, expected record 'batchControl'
    at org.beanio.internal.parser.UnmarshallingContext.newUnsatisfiedRecordException(UnmarshallingContext.java:367)
    at org.beanio.internal.parser.Group.unmarshal(Group.java:127)
    at org.beanio.internal.parser.DelegatingParser.unmarshal(DelegatingParser.java:39)
    at org.beanio.internal.parser.RecordCollection.unmarshal(RecordCollection.java:42)
    at org.beanio.internal.parser.Group.unmarshal(Group.java:140)
    at org.beanio.internal.parser.BeanReaderImpl.internalRead(BeanReaderImpl.java:106)
    at org.beanio.internal.parser.BeanReaderImpl.read(BeanReaderImpl.java:67)
    at com.pru.globalpayments.feeds.downstream.dailycashreport.acquire.provider.sftp.unmarshal.AchFileUnmarshallerService.unmarshalAchReturnFile(AchFileUnmarshallerService.java:76)
    at com.pru.globalpayments.feeds.downstream.dailycashreport.acquire.provider.sftp.unmarshal.AchFileUnmarshallerServiceTest.testSuccessfulUnmarshallingOfMinimalFileToSkeletonAchObjectGraph(AchFileUnmarshallerServiceTest.java:175)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Here are the relevant files:
AchFileUnmarshallerService.java
@Service
@Slf4j
public class AchFileUnmarshallerService {

    public static final String BEANIO_PARSER_FORMAT = "fixedlength";
    public static final String UNMARSHALLER_STREAM_NAME = "sftp-ach";

    public AchFileCollection unmarshalAchReturnFile(File filetoParse, StreamBuilder streamBuilder) {
        Assert.notNull(filetoParse, "A valid file must be provided for parsing.");
        Assert.state(Files.exists(filetoParse.toPath()), "A valid ACH file must exist at the provided path to attempt parsing.");

        String fileName = filetoParse.getAbsolutePath();
        log.info("Starting unmarshaling '{}' to a corresponding ACH structure.", fileName);

        // StreamBuilder streamBuilder = new StreamBuilder(UNMARSHALLER_STREAM_NAME)
        //         .format(BEANIO_PARSER_FORMAT)
        //         .parser(new FixedLengthParserBuilder())
        //         .addGroup(AchFileCollection.class)
        //         .occurs(1, -1)
        //         //.strict()
        //         ;

        StreamFactory factory = StreamFactory.newInstance();
        factory.define(streamBuilder);
        BeanReader beanioReader = factory.createReader(UNMARSHALLER_STREAM_NAME, fileName);

        /*
        beanioReader.setErrorHandler(new BeanReaderErrorHandler() {
            @Override
            public void handleError(BeanReaderException ex) throws Exception {
                for (int i = 0; i < ex.getRecordCount(); i++) {
                    RecordContext context = ex.getRecordContext(i);
                    log.info(context.toString());
                }
                throw new BeanReaderException(ex.toString(), ex);
            }
        });
        */

        return (AchFileCollection) beanioReader.read();
    }
}
AchFileUnmarshallerServiceTest.java
@SpringJUnitConfig(classes = {
        DcrDataFactoryApplication.class,
        /* FileAcquisitionConfig.class, */
        /* FileDistributionConfig.class */
})
@EnableConfigurationProperties
@PropertySource(value = "application.yml", factory = YamlPropertySourceFactory.class)
@Slf4j
class AchFileUnmarshallerServiceTest {

    @Autowired
    private AchFileUnmarshallerService achFileUnmarshallerService;

    private StreamBuilder streamBuilder() {
        StreamBuilder streamBuilder = new StreamBuilder(AchFileUnmarshallerService.UNMARSHALLER_STREAM_NAME)
                .format(AchFileUnmarshallerService.BEANIO_PARSER_FORMAT)
                .minOccurs(1)
                .addGroup(new GroupBuilder("achFile").type(AchFile.class).order(1).occurs(1, -1)
                        .addRecord(new RecordBuilder("fileHeader", AchFileHeader.class).order(1).occurs(1, 1)
                                .addField(new FieldBuilder("recordTypeCode").at(0).length(1).required().literal("1")))
                        .addGroup(new GroupBuilder("batchRecords").type(AchBatch.class).order(2).occurs(0, -1)
                                .collection(List.class)
                                .addRecord(new RecordBuilder("batchHeader", AchBatchHeader.class).order(1).occurs(1, 1)
                                        .addField(new FieldBuilder("recordTypeCode").at(0).length(1).required().literal("5")))
                                .addRecord(new RecordBuilder("batchEntries", AchBatchEntry.class).order(2).occurs(0, -1)
                                        .collection(List.class)
                                        .addField(new FieldBuilder("recordTypeCode").at(0).length(1).required().literal("6")))
                                .addRecord(new RecordBuilder("batchControl", AchBatchFooter.class).order(3).occurs(1, 1)
                                        .addField(new FieldBuilder("recordTypeCode").at(0).length(1).required().literal("8"))))
                        .addRecord(new RecordBuilder("fileControl", AchFileFooter.class).order(3).occurs(1, 1)
                                .addField(new FieldBuilder("recordTypeCode").at(0).length(1).required().literal("9"))));
        return streamBuilder;
    }

    @Test
    @SneakyThrows
    void testSuccessfulUnmarshallingOfMinimalFileToSkeletonAchObjectGraph() {
        log.info("Ready to unmarshall.");
        File fileToUnmarshall = new ClassPathResource("skeleton.ach.file.txt").getFile();
        AchFileCollection root = achFileUnmarshallerService.unmarshalAchReturnFile(fileToUnmarshall, streamBuilder());
        log.info("Finished unmarshalling {} file into the following structure {}.", fileToUnmarshall.getName(), root);
    }
}
Domain model classes (with records containing just a single field, as explained earlier):
AchFile.java:

@Data
//@Group
public class AchFile {
    //@Record
    private AchFileHeader fileHeader;
    //@Group
    private List<AchBatch> batchRecords = new ArrayList<>();
    //@Record
    private AchFileFooter fileControl;
}

AchFileFooter.java:

@Data
public class AchFileFooter {
    //@Field(name = "recordTypeCode", ordinal = 1, at = 0, length = 1, required = true, literal="9")
    private String recordTypeCode;
}

AchFileHeader.java:

@Data
public class AchFileHeader {
    //@Field(name = "recordTypeCode", ordinal = 1, at = 0, length = 1, required = true, literal="1")
    private String recordTypeCode;
}

AchFileCollection.java (for now, unused):

@Data
//@Group(minOccurs = 1, maxOccurs = -1)
public class AchFileCollection {
    //@Group
    private List<AchFile> achFiles = new ArrayList<>();
}

AchBatch.java:

@Data
//@Group
public class AchBatch {
    //@Record
    private AchBatchHeader batchHeader;
    //@Record
    private List<AchBatchEntry> batchEntries = new ArrayList<>();
    //@Record
    private AchBatchFooter batchControl;
}

AchBatchHeader.java:

@Data
//@Record(order = 1, minOccurs = 0, maxOccurs = 1)
public class AchBatchHeader {
    //@Field(name = "recordTypeCode", ordinal = 1, at = 0, length = 1, required = true, literal="5")
    private String recordTypeCode;
}

AchBatchFooter.java:

@Data
//@Record(order = 3, minOccurs = 0, maxOccurs = 1)
public class AchBatchFooter {
    //@Field(name = "recordTypeCode", ordinal = 1, at = 0, length = 1, required = true, literal="8")
    private String recordTypeCode;
}

AchBatchEntry.java:

@Data
//@Record(order = 2, minOccurs = 0, maxOccurs = -1)
public class AchBatchEntry {
    //@Field(name = "recordTypeCode", ordinal = 1, at = 0, length = 1, required = true, literal="6")
    private String recordTypeCode;
}
N.B.: eventually, when I add more fields to each record type, the configuration will migrate into the domain model via annotations (hence the commented-out annotations above). For now I wanted to get it working through BeanIO's Java builder API, all in one place, for testing.
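For reference, the annotation-driven variant I expect to end up with looks roughly like this for one record type (a sketch only, mirroring the commented-out annotations above; nothing annotation-based is wired up yet):

// Sketch of the planned annotation-driven mapping for the batch header record.
// Mirrors the commented-out annotation above; not active in the current setup.
import lombok.Data;
import org.beanio.annotation.Field;
import org.beanio.annotation.Record;

@Data
@Record(order = 1, minOccurs = 0, maxOccurs = 1)
public class AchBatchHeader {
    @Field(name = "recordTypeCode", ordinal = 1, at = 0, length = 1, required = true, literal = "5")
    private String recordTypeCode;
}

The builder in the test would then shrink to something close to the commented-out block in the service, i.e. .addGroup(AchFileCollection.class) with the record layout coming from the annotated classes.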
My assumption is that once the BeanReader is configured with the input topology via the StreamFactory and is given the input data to process (in the form of a file), it would match the two together: iterate the lines of the input file, recognize which record type each line represents, build the resulting object representation of that input file, and return it properly hydrated. Or is that too optimistic and not how the BeanIO reader works, so I need to do something more manually?
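In code terms, my expectation boils down to roughly this (a sketch only, reusing the stream name, builder, file name and logger from above):

// Sketch of the behaviour I'm assuming (same stream name, builder and file as above).
StreamFactory factory = StreamFactory.newInstance();
factory.define(streamBuilder());

BeanReader reader = factory.createReader(UNMARSHALLER_STREAM_NAME, fileName);
try {
    Object root;
    while ((root = reader.read()) != null) {
        // each read() should return one fully hydrated top-level object, with the
        // record type code in position 0 of each line deciding which record/group
        // definition that line maps to
        log.info("Unmarshalled '{}': {}", reader.getRecordName(), root);
    }
} finally {
    reader.close();
}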
What am I doing wrong to get the above exception and how do I straighten it out?
TIA.