
I'm trying to check the size of backed-up files on our S3 bucket using `os.path.getsize`. I don't think this function works through the S3 wrapper.

Any alternative?

Here is my code:

def check_created_file(self, destination):
    if destination is not None:
        # List the remote files and extract the path column
        files = os.popen("hdfs dfs -ls %s | awk '{print $8}'" % (destination)).read().split("\n")
        # Drop the "Found N items" header line and the trailing empty string
        files = files[1:-1]
        for file in files:
            if os.path.getsize(file) > 0:
                print file
            else:
                print " [*] WARNING "
                print "%s is empty" % (file)
    else:
        raise Exception("Remote dir is not specified")

the error I am getting:

OSError: [Errno 2] No such file or directory
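`os.path.getsize` only inspects the local filesystem, so it raises `OSError` for paths that exist only on HDFS/S3. As a sketch of one alternative (not from the original question; the field layout below assumes the standard `hdfs dfs -ls` output format), the size column that `hdfs dfs -ls` already prints can be parsed instead of calling `getsize` at all:

```python
def parse_hdfs_ls(output):
    """Parse `hdfs dfs -ls` output into a {path: size_in_bytes} dict.

    Assumes the usual 8-column listing format:
    perms  replication  owner  group  size  date  time  path
    """
    sizes = {}
    for line in output.splitlines():
        fields = line.split()
        # Data lines have exactly 8 fields; the "Found N items"
        # header line does not, so it is skipped automatically.
        if len(fields) == 8:
            sizes[fields[7]] = int(fields[4])
    return sizes


# Hypothetical usage with the output of os.popen("hdfs dfs -ls %s" % destination):
sample = (
    "Found 2 items\n"
    "-rw-r--r--   3 hdfs hadoop       1234 2018-08-14 08:55 /backup/a.txt\n"
    "-rw-r--r--   3 hdfs hadoop          0 2018-08-14 08:55 /backup/b.txt"
)
for path, size in sorted(parse_hdfs_ls(sample).items()):
    if size == 0:
        print("[*] WARNING: %s is empty" % path)
```

For S3 specifically, the comment below suggests `boto`, whose object metadata exposes the size directly without shelling out to `hdfs`.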
  • `hdfs` is not in your system path and `os.popen` fails with `OSError`. Try specifying the full path to the executable, e.g. `/usr/bin/hdfs`, or better, use `boto`: https://stackoverflow.com/a/5498841/1869597 – Jordan Jambazov Aug 14 '18 at 08:55
  • The failure is in the `getsize` function; the `hdfs` command is working fine. – Alex Aug 14 '18 at 08:59

0 Answers