
I'm trying to read a file on my HDFS server from my Python app deployed with Docker. During development I have no problem, but in production I get this error:

Erreur: HTTPConnectionPool(host='dnode2', port=9864): Max retries exceeded with url: /webhdfs/v1/?op=OPEN&namenoderpcaddress=namenode:9000&offset=0 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f1af13d45d0>: Failed to establish a new connection: [Errno -2] Name or service not known'))

Note that I connect with an IP address, not that "dnode2" hostname, and I don't use that port either!
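For reference, here is roughly what my read looks like (a simplified sketch; the namenode address, port and file path below are placeholders, not my real values). It shows where the "dnode2" hostname comes from: the namenode answers the OPEN request with a redirect to a datanode, addressed by the hostname that datanode advertises, and the client follows that redirect.

```python
import requests

# Placeholders -- not my real addresses.
NAMENODE = "http://192.168.1.10:9870"   # namenode HTTP address
FILE_PATH = "/user/data/sample.csv"

# Step 1: ask the namenode to OPEN the file, without following the redirect.
resp = requests.get(
    f"{NAMENODE}/webhdfs/v1{FILE_PATH}",
    params={"op": "OPEN"},
    allow_redirects=False,
)

# Step 2: the namenode replies with a 307 whose Location header points at a
# *datanode* hostname, e.g. http://dnode2:9864/webhdfs/v1/...&offset=0 --
# this is the name my container apparently has to be able to resolve.
datanode_url = resp.headers["Location"]

# Step 3: fetch the file content from the datanode.
data = requests.get(datanode_url).content
print(data[:100])
```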


1 Answer


Sorry, but it turned out I just needed to add a static mapping in my /etc/hosts, and it works now!
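Concretely, the container could not resolve the dnode2 hostname that the namenode's redirect points at, so I mapped it to the datanode's IP. Something like this (the IP is a placeholder; use your datanode's real address):

```
# /etc/hosts inside the application container
# placeholder IP -- replace with the datanode's actual address
192.168.1.12   dnode2
```

With Docker Compose, the same mapping can instead be declared with extra_hosts on the service, so the file doesn't have to be edited by hand inside the container.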

  • As it's currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Community Apr 15 '22 at 12:17
  • While it may work, you shouldn't be editing your hosts file to fix hadoop or Docker networking problems. – OneCricketeer Apr 16 '22 at 13:09