
I am trying to create a file/directory in HDFS using Python. To be clear, I am running a Hadoop Streaming job with the mapper written in Python, and this mapper is trying to create a file in HDFS. I have read that there are several Python frameworks for this, but my interest is in Hadoop Streaming specifically. So, is there any way to accomplish this in Hadoop Streaming?

CiscoJavaHadoop

4 Answers


You can run HDFS commands from within a Python script:

import subprocess

def run_cmd(args_list):
    proc = subprocess.Popen(args_list, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    (output, errors) = proc.communicate()
    if proc.returncode:
        raise RuntimeError('Error running command: {0}'.format(' '.join(args_list)))
    return (output, errors)

And then call it:

(out, errors) = run_cmd(['hdfs', 'dfs', '-mkdir', path_HDFS_to_create_folder])
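As a self-contained sketch of the same pattern that runs without a cluster (here `echo` stands in for the `hdfs` binary, and the target path in the comment is hypothetical):

```python
import subprocess

def run_cmd(args_list):
    """Run a command; return (stdout, stderr), or raise on a non-zero exit."""
    proc = subprocess.Popen(args_list, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output, errors = proc.communicate()
    if proc.returncode:
        raise RuntimeError('Command failed: {0}'.format(' '.join(args_list)))
    return output, errors

# On a real cluster this would be:
#   run_cmd(['hdfs', 'dfs', '-mkdir', '/user/me/new_dir'])
out, err = run_cmd(['echo', 'mkdir simulated'])
```

The same wrapper works for any `hdfs dfs` subcommand; only the argument list changes.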

There is no way to create a file directly from a plain Python script, but it is possible to create a directory using Pydoop or Snakebite.

See: https://www.geeksforgeeks.org/creating-files-in-hdfs-using-python-snakebite/

Zak_Stack

Solution using subprocess, inspired by this answer to the "Create HDFS file" question.

from subprocess import Popen, PIPE

# run_cmd is defined as in the linked answer
(ret, out, err) = run_cmd(['hdfs', 'dfs', '-touchz', '/directory/filename'])
phwt
Zak_Stack

It is possible to create a file using:

from subprocess import Popen, PIPE

# Define a run_cmd function that runs a native Hadoop/Linux command
def run_cmd(args_list):
    """
    Run a Linux command and return (return_code, stdout, stderr).
    """
    print('Running system command: {0}'.format(' '.join(args_list)))
    proc = Popen(args_list, stdout=PIPE, stderr=PIPE)
    s_output, s_err = proc.communicate()
    s_return = proc.returncode
    return s_return, s_output, s_err

(ret, out, err) = run_cmd(['hdfs', 'dfs', '-touchz', filename])
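Since `run_cmd` returns the exit status, it is worth checking it before assuming the file was created. A minimal sketch that runs without a cluster (`ls` on a nonexistent path stands in for the `hdfs` call; the path name is made up):

```python
from subprocess import Popen, PIPE

def run_cmd(args_list):
    """Run a command and return (return_code, stdout, stderr)."""
    proc = Popen(args_list, stdout=PIPE, stderr=PIPE)
    s_output, s_err = proc.communicate()
    return proc.returncode, s_output, s_err

# On a real cluster: run_cmd(['hdfs', 'dfs', '-touchz', filename])
ret, out, err = run_cmd(['ls', 'no_such_file_xyz_12345'])
if ret != 0:
    print('command failed:', err.decode().strip())
```

A nonzero `ret` means the HDFS operation failed; the reason is in `err`, not `out`.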
Zak_Stack
    Please edit your [other answer](https://stackoverflow.com/a/73828803/2308683)(s) rather than post multiple different ones – OneCricketeer Sep 30 '22 at 16:30