I need to write a perl script to read gzipped files from a text file list of their paths and then concatenate them together and output to a new gzipped file. ( I need to do this in perl as it will be implemented in a pipeline) I am not sure how to accomplish the zcat and concatenation part, as the file sizes would be in Gbs, I need to take care of the storage and run time as well.
So far I can think of it as -
use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError) ;
#-------check the input file specified-------------#
$num_args = $#ARGV + 1;
if ($num_args != 1) {
print "\nUsage: name.pl Filelist.txt \n";
exit;
$file_list = $ARGV[0];
#-------------Read the file into arrray-------------#
my @fastqc_files; #Array that contains gzipped files
use File::Slurp;
my @fastqc_files = $file_list;
#-------use the zcat over the array contents
my $outputfile = "combined.txt"
open(my $combined_file, '>', $outputfile) or die "Could not open file '$outputfile' $!";
for my $fastqc_file (@fastqc_files) {
open(IN, sprintf("zcat %s |", $fastqc_file))
or die("Can't open pipe from command 'zcat $fastqc_file' : $!\n");
while (<IN>) {
while ( my $line = IN ) {
print $outputfile $line ;
}
}
close(IN);
my $Final_combied_zip = new IO::Compress::Gzip($combined_file);
or die "gzip failed: $GzipError\n";
Somehow I am not able to get it to run. Also if anyone can guide on the correct way to output this zipped file.
Thanks!