
I have to extract 20k rows from my DB, but for some reason I only get 2k (1k per IJ_COD, which is my PK). I thought about modifying the pageSize, but that doesn't seem relevant: I have another Reader and Writer with the same configuration (and a simpler query) that extracts 300k rows without any problem.

@Bean("trackReader")
@JobScope
public ItemReader<HappyFridayTrackUser> trackReader() throws Exception {

    JdbcPagingItemReader<HappyFridayTrackUser> databaseReader = new JdbcPagingItemReader<>();
    final SqlPagingQueryProviderFactoryBean sqlPagingQuery = new SqlPagingQueryProviderFactoryBean();

    sqlPagingQuery.setDataSource(dataSource);


    sqlPagingQuery.setSelectClause("select T.ij_cod, T.creation_date, U.msisdn, u.flag_master, T.cd_user, P.title ");
    sqlPagingQuery.setFromClause("from SJ4_VODAFONE_BUSINESS.VODA_HAPPY_MONDAY_TRACK T, SJ4_VODAFONE_BUSINESS.USERS U, sj4_vodafone_business.promo_catalog P ");
    sqlPagingQuery.setWhereClause("where p.ij_cod = t.ij_cod and t.cd_user = u.cd_user and t.ij_cod in( "
            + "select IJ_COD "
            + "from SJ4_VODAFONE_BUSINESS.PROMO_CATALOG "
            + "where CAT_LEVEL_1 = 'monday' and PROMO_ID like 'MONDAY_%' "
            + "and END_DATE >trunc((sysdate -6), 'DD') "
            + "and END_DATE <= trunc((sysdate),'DD'))");

    sqlPagingQuery.setSortKey("ij_cod");
    databaseReader.setQueryProvider(sqlPagingQuery.getObject());
    databaseReader.setDataSource(dataSource);
    databaseReader.setPageSize(1000);
    databaseReader.setRowMapper(new BeanPropertyRowMapper<>(HappyFridayTrackUser.class));

    return databaseReader;
}//close trackReader
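For reference, the way JdbcPagingItemReader pages may matter here: page 1 is essentially ORDER BY sortKey with a row limit, and every later page adds WHERE sortKey > :lastValue. If the sort key is not unique per row, every row that shares the last key value of a page but was not included in it is skipped. This is a minimal, self-contained simulation with plain collections (no database; the data, class name, and values are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class PagingSketch {

    // Simulates keyset paging over rows already sorted by key: page 1 takes
    // the first pageSize rows; each later page starts strictly AFTER the
    // last key value seen (WHERE key > :lastValue), like JdbcPagingItemReader.
    static List<Integer> readAllPages(int[][] rows, int pageSize) {
        List<Integer> readIds = new ArrayList<>();
        Integer lastKey = null;
        while (true) {
            List<int[]> page = new ArrayList<>();
            for (int[] r : rows) {
                if ((lastKey == null || r[0] > lastKey) && page.size() < pageSize) {
                    page.add(r);
                }
            }
            if (page.isEmpty()) break;
            for (int[] r : page) readIds.add(r[1]);
            lastKey = page.get(page.size() - 1)[0]; // next page: key > lastKey
        }
        return readIds;
    }

    public static void main(String[] args) {
        // {sortKey, rowId}: five rows share sort key 10, one row has key 20.
        int[][] rows = {{10, 1}, {10, 2}, {10, 3}, {10, 4}, {10, 5}, {20, 6}};
        // With page size 3, rows 4 and 5 are never read: page 2 asks for
        // sortKey > 10 and jumps straight to key 20.
        System.out.println(readAllPages(rows, 3)); // prints [1, 2, 3, 6]
    }
}
```

The same pattern with pageSize=1000 and only a couple of distinct IJ_COD values would cap the output at roughly 1k rows per key, which matches the symptom described.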


@Bean("trackWriter")
@StepScope
public FlatFileItemWriter<HappyFridayTrackUser> trackWriter(){

    LocalDate localDate = LocalDate.now();
    DateTimeFormatter formatter = DateTimeFormatter.ofPattern("dd-LLLL-yyyy");
    String formattedString = localDate.format(formatter);

    Resource outputResource = new FileSystemResource(pathExportTrack + formattedString + ".csv");

    FlatFileItemWriter<HappyFridayTrackUser> writer = new FlatFileItemWriter<>();

    writer.setResource(outputResource);
    writer.setAppendAllowed(true);
    writer.setLineAggregator(new DelimitedLineAggregator<HappyFridayTrackUser>() {
        {
            setDelimiter("|");
            setFieldExtractor(new BeanWrapperFieldExtractor<HappyFridayTrackUser>() {
                {
                    setNames(new String[] {"ijCod", "creationDate", "msisdn",
                            "flagMaster", "cdUser", "title"});
                }
            });
        }
    });

    return writer;
}//close trackWriter

Does anyone have any suggestions about this?

jeprubio
s2theergio
  • Are you sure your query returns more than 2k items? Have you tried executing it in a SQL client to see the result? – Mahmoud Ben Hassine Jun 12 '19 at 09:43
  • Yes @MahmoudBenHassine, I tried it and it returns exactly 20,288 rows. – s2theergio Jun 12 '19 at 09:51
  • ok thanks. I'm not sure the page size plays a role here, especially since you said your other reader works fine with the same config. I can't see from what you shared what could be the cause of your issue, can you enable debug mode and share the queries that Spring Batch is sending to your DB? Your issue is probably a duplicate of https://stackoverflow.com/questions/35358905/spring-batch-jdbcpagingitemreader-seems-not-executing-all-the-items – Mahmoud Ben Hassine Jun 12 '19 at 11:46
  • UPDATE: I changed the sortKey to another column name and it worked. I'm not sure what sortKey does; I thought it just sorted the data, but maybe that's not right... – s2theergio Jun 12 '19 at 13:19
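A note on the resolution above: the sortKey is not just an ORDER BY clause, it is also the paging cursor, so it must be unique per row for the reader to see every row. When no single column is unique, SqlPagingQueryProviderFactoryBean accepts a composite key via setSortKeys. A hedged config fragment, not standalone code (it assumes the sqlPagingQuery variable from the question; the choice of t.cd_user as tie-breaker is an assumption, any column combination that is unique per row works):

```java
// Requires spring-batch-infrastructure on the classpath.
import java.util.LinkedHashMap;
import java.util.Map;
import org.springframework.batch.item.database.Order;

Map<String, Order> sortKeys = new LinkedHashMap<>();
sortKeys.put("t.ij_cod", Order.ASCENDING);
sortKeys.put("t.cd_user", Order.ASCENDING); // assumed tie-breaker: unique per row
sqlPagingQuery.setSortKeys(sortKeys);       // instead of setSortKey("ij_cod")
```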

0 Answers