
I'm trying to create DC/OS services that download artifacts (custom config files, etc.) from HDFS. I was using a simple FTP server for this before, but I wanted to switch to HDFS. Using "hdfs://" in an artifact URI is allowed, but it doesn't work correctly.

The artifact fetch fails with an error because there is no "hadoop" command on the agent. Weird. I read that I need to provide my own Hadoop installation for this.

So I downloaded Hadoop and set up the necessary environment variables in /etc/profile. I can run "hadoop" without any problem when SSH'ing into the node, but the service still fails with the same error.

It seems that environment variables configured in the service definition are applied only after the artifact fetch, because they have no effect on it at all. It also looks like services completely ignore the /etc/profile file.

So my question is: how do I set everything up so that my service can fetch artifacts stored on HDFS?
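For reference, a minimal Marathon-style service definition with an HDFS artifact looks roughly like this (the service id, namenode host/port, and paths below are placeholders, not values from my setup):

```json
{
  "id": "/my-service",
  "cmd": "./start.sh",
  "cpus": 0.5,
  "mem": 256,
  "fetch": [
    { "uri": "hdfs://namenode:9000/configs/app.conf", "extract": false }
  ],
  "env": {
    "HADOOP_HOME": "/opt/hadoop"
  }
}
```

Note that setting HADOOP_HOME in the "env" block like this is exactly what does not help: those variables are only visible to the task, not to the fetcher that runs before it.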

Tomasz

1 Answer


The Mesos fetcher supports local Hadoop clients. Please check your agent configuration, and in particular your --hadoop_home setting.
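As a sketch of what that could look like on an agent node (all paths here are examples, adjust to your installation): the fetcher shells out to the hadoop binary, so it is the Mesos agent process itself, not the task, that needs to find it.

```shell
# On every Mesos agent node (example paths, not from the question):

# Option 1: point the agent at the local Hadoop installation explicitly.
mesos-agent --hadoop_home=/opt/hadoop # ...plus your other agent flags

# Option 2: leave --hadoop_home empty and rely on the fallback to
# HADOOP_HOME / PATH -- but these must be set in the environment of
# the *agent process*, not in /etc/profile or the service definition,
# since the agent does not read login-shell profiles.
export HADOOP_HOME=/opt/hadoop
export PATH="$HADOOP_HOME/bin:$PATH"

# Restart the agent afterwards so it picks up the new environment.
```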

js84
  • From the documentation it seems that, in the case of an empty --hadoop_home, it looks at the HADOOP_HOME and PATH variables. These are correctly set. – Tomasz Feb 05 '18 at 10:03