I am trying to write data from a database to a CSV file via a Java thread. For the writing I am using the OpenCSV jar. The problem I am facing is that sometimes values in the CSV file get corrupted, as shown below in lines 1 and 4. I have no idea why this is happening. The values coming from the database are all fine (as can be seen in the logs), but in the CSV file they are not.
[E[EcoUnit 01] [Segment B/1] [2017-12-29 22:13:23.047] [ventilation air humidity] [70.18]
[EcoUnit 01] [Segment B/1] [2017-10-25 22:21:36.583] [ventilation air humidity] [69.65]
[EcoUnit 01] [Segment B/1] [2017-10-25 22:22:36.59] [ventilation air humidity] [69.33]
[EcoUnit 01] [Segment B/017-11-14 12:02:48.013] [ventilation fan] [30]
I would be really grateful if anyone could suggest why this is happening. The code is as follows:
List<String> values = new ArrayList<String>();
fw = new FileWriter(file);
writer = new CSVWriter(fw);
writer.writeNext(headers);
values.add(doc.getFieldValue("Unit_Label").toString());
values.add(doc.getFieldValue("Segment_Label").toString());
values.add("[" + doc.getFieldValue("datestring").toString() + "]");
values.add(doc.getFieldValue("Item_Label").toString());
values.add(doc.getFieldValue("Value").toString());
writer.writeNext(values.toArray(new String[]{}));
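For reference, this is the basic single-threaded OpenCSV pattern I started from, as a minimal, self-contained sketch (assuming opencsv 3.x or later, where the package is com.opencsv; the row values are copied from the sample output above):

import com.opencsv.CSVWriter;

import java.io.FileWriter;
import java.io.IOException;

public class SingleThreadCsvSketch {
    public static void main(String[] args) throws IOException {
        String[] headers = {"Unit", "Segment", "Date", "Item", "Value"};
        // try-with-resources flushes and closes the writer even if an exception is thrown
        try (CSVWriter writer = new CSVWriter(new FileWriter("out.csv"))) {
            writer.writeNext(headers);
            writer.writeNext(new String[]{
                    "EcoUnit 01",
                    "Segment B/1",
                    "[2017-10-25 22:21:36.583]",
                    "ventilation air humidity",
                    "69.65"
            });
        }
    }
}

In my actual code, however, the file, the FileWriter and the CSVWriter are shared instance fields, as shown in the full function below.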
Here is the complete code of the function responsible for creating the file and writing into it:
public void createAndFillFile(String startDateStr, String endDateStr, int fileNumber, SolrDocumentList results) {
    try {
        String startDateParts[] = startDateStr.split(" ");
        String startDate = startDateParts[0];
        String endDateParts[] = endDateStr.split(" ");
        String endDate = endDateParts[0];
        if (fileNumber == 1) {
            Date date = new Date();
            SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH-mm-ss");
            String currentDate = dateFormat.format(date); // This line could be removed and the formatted date used directly below
            zipFile = currentDate + ".zip";
            dir = new File("C:" + File.separator + "EcotronDownloadable" + File.separator + currentDate);
            dir.mkdir();
            path = dir.getAbsolutePath() + File.separator;
            file = new File(path + startDate + "_" + endDate + "_" + fileNumber + ".csv");
            fw = new FileWriter(file);
            writer = new CSVWriter(fw);
            writer.writeNext(headers);
        }
        synchronized (file) {
            for (SolrDocument doc : results) {
                List<String> values = new ArrayList<String>();
                Thread.sleep(1);
                long fileLength = file.length();
                if (fileLength < maxFileSize) {
                    values.add(doc.getFieldValue("Unit_Label").toString());
                    values.add(doc.getFieldValue("Segment_Label").toString());
                    values.add("[" + doc.getFieldValue("datestring").toString() + "]");
                    values.add(doc.getFieldValue("Item_Label").toString());
                    values.add(doc.getFieldValue("Value").toString());
                    //log.trace(values);
                    writer.writeNext(values.toArray(new String[]{}));
                } else {
                    fw.flush();
                    fw.close();
                    // writer.close();
                    j = j + 1;
                    file = new File(path + startDate + "_" + endDate + "_" + j + ".csv");
                    fw = new FileWriter(file);
                    writer = new CSVWriter(fw);
                    writer.writeNext(headers);
                    values.add(doc.getFieldValue("Unit_Label").toString());
                    values.add(doc.getFieldValue("Segment_Label").toString());
                    values.add("[" + doc.getFieldValue("datestring").toString() + "]");
                    values.add(doc.getFieldValue("Item_Label").toString());
                    values.add(doc.getFieldValue("Value").toString());
                    //log.trace(values);
                    writer.writeNext(values.toArray(new String[]{}));
                }
            }
        }
        // fw.flush();
        // fw.close();
        // writer.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
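One thing I noticed while re-reading the code: the synchronized(file) block locks on the file field, but that same field is reassigned inside the block when the file rolls over, so I am not sure all threads are actually locking the same monitor. For comparison, here is a minimal sketch of the pattern I understand to be the usual one, locking on a dedicated final object that is never reassigned (the class and method names are just placeholders, not my real code):

import com.opencsv.CSVWriter;

import java.io.FileWriter;
import java.io.IOException;

public class SharedCsvWriter {
    // Dedicated lock: final and never reassigned, so every thread contends on the same monitor
    private final Object lock = new Object();
    private CSVWriter writer; // guarded by lock

    public SharedCsvWriter(String path) throws IOException {
        writer = new CSVWriter(new FileWriter(path));
    }

    // Hypothetical helper: all threads funnel their rows through here one at a time
    public void writeRow(String[] row) {
        synchronized (lock) {
            writer.writeNext(row);
        }
    }

    public void close() throws IOException {
        synchronized (lock) {
            writer.close();
        }
    }
}

I have not yet restructured my code this way, so I cannot say whether it fixes anything; I am mainly asking whether the reassignment of file inside synchronized(file) could explain the interleaved rows shown above.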