I have tried similar suggestions from Stack Overflow, but the issue persists.
I am executing the following command from Java:
import java.io.BufferedReader;
import java.io.FileWriter;
import java.io.InputStreamReader;

public static void main(String[] args) throws Exception {
    try {
        String line;
        //String[] commandToExecute = {"sh", "MultiDel_FI.sh", " < abcd.log 2>&1 &"};
        String[] commandToExecute = {"sh", "MultiDel_FI.sh"};
        Process p = Runtime.getRuntime().exec(commandToExecute);

        // Read the script's stdout while the process runs; calling waitFor()
        // before draining the stream can deadlock once the pipe buffer fills.
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(p.getInputStream()));
        FileWriter writer = new FileWriter("redirect.log");
        while ((line = reader.readLine()) != null) {
            writer.write(line + System.lineSeparator());
        }
        writer.close();

        p.waitFor();
        p.destroy();
    } catch (Exception e) {
        e.printStackTrace(); // don't swallow the exception silently
    }
}
The MultiDel_FI.sh file is placed on the edge node with the following content:
hadoop fs -rm -R -skipTrash 'hdfs://<path>/abc_767' 'hdfs://<path>/abc_768' 'hdfs://<path>/abc_769' ... (many more paths)
When I run the code, it successfully removes the folders from the HDFS location. I need to capture the command's output to identify which folders were deleted successfully and which were not. I have tried various options with /bin/sh and writing the input stream to a file, but it always produces an empty file. Any suggestions, please?
Running the command directly from the Unix shell redirects the output properly:
sh MultiDel_FI.sh > abcd.log 2>&1
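For reference, here is how I understand that shell redirection maps onto ProcessBuilder (a minimal sketch, not yet verified in my environment; the class name RedirectSketch is just a placeholder). If hadoop fs writes some of its messages to stderr, as error diagnostics usually are, then reading only getInputStream() would miss them, and merging the two streams should capture everything the shell version captures:

import java.io.File;

public class RedirectSketch {
    public static void main(String[] args) throws Exception {
        // Equivalent of: sh MultiDel_FI.sh > abcd.log 2>&1
        ProcessBuilder pb = new ProcessBuilder("sh", "MultiDel_FI.sh");
        pb.redirectErrorStream(true);            // the "2>&1" part: merge stderr into stdout
        pb.redirectOutput(new File("abcd.log")); // the "> abcd.log" part
        Process p = pb.start();
        int exitCode = p.waitFor();              // no streams to drain; the OS writes the file
        System.out.println("exit code: " + exitCode);
    }
}

With this approach there is no stream-reading loop to get wrong, since the redirection is handled by the operating system, just as when the script is run from the shell directly.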