1

I am installing HAWQ on RedHat servers provisioned on Amazon EC2. I already have HDP 2.3 set up on the cluster, and I have cloned HAWQ from GitHub.

First I run `./configure --prefix=/opt/hawq`.

Next, I run `make`. The dependencies compile correctly when I run `make` from the root folder of incubator-hawq, but the following error occurs when make moves on to compiling the src folder:

make[2]: Entering directory `/root/incubator-hawq/src/port'
gcc -O3 -std=gnu99  -Wall -Wmissing-prototypes -Wpointer-arith  -Wendif-labels -Wformat-security -fno-strict-aliasing -fwrapv -fno-aggressive-loop-optimizations  -I/usr/include/libxml2 -I../../src/port -DFRONTEND -I../../src/include -D_GNU_SOURCE  -I/root/incubator-hawq/depends/libhdfs3/build/install/usr/local/hawq/include -I/root/incubator-hawq/depends/libyarn/build/install/usr/local/hawq/include  -c -o copydir.o copydir.c
In file included from copydir.c:25:0:
../../src/include/storage/fd.h:61:23: fatal error: hdfs/hdfs.h: No such file or directory
#include "hdfs/hdfs.h"
^
compilation terminated.
make[2]: *** [copydir.o] Error 1
make[2]: Leaving directory `/root/incubator-hawq/src/port'
make[1]: *** [all] Error 2
make[1]: Leaving directory `/root/incubator-hawq/src'
make: *** [all] Error 2

I know the compiler cannot find hdfs/hdfs.h, but since the dependency (libhdfs3) compiled successfully, I don't understand why this particular file isn't found. Please help if you have come across the same problem, as I am pretty much stuck here.

Raman
  • You should note that `gcc` compiles C code, not C++. – πάντα ῥεῖ Aug 30 '16 at 01:57
  • `gcc` stands for GNU Compiler Collection and can compile C, C++, Objective-C, Fortran, Java, Ada, and Go. Check [here](https://gcc.gnu.org/). I have updated the tag to `c` from `c++` BTW. – Raman Aug 30 '16 at 02:15
  • Do you actually think I don't know what I'm talking about? I don't need you to explain that to me! For compiling C sources you use `gcc`; for C++ sources it's `g++`. The tag update may have been the correct reaction. Though I'm wondering if your question fits Stack Overflow at all, because it's not about a specific programming problem but about installing a package in your particular environment. It's unlikely you'll get concise answers here. There may be better sites to ask at on SE. – πάντα ῥεῖ Aug 30 '16 at 02:20
  • Thanks for your suggestion, I'll put it up on Server Fault as well. – Raman Aug 30 '16 at 02:44
  • Please don't crosspost questions; remove this one and post it again at Server Fault (sounds more promising indeed). Sorry for my snarky comment above, but your response just sounded too smart-alec for me (got me upset a bit). – πάντα ῥεῖ Aug 30 '16 at 02:47
  • See also [Is cross-posting a question on multiple Stack Exchange sites permitted?](http://meta.stackexchange.com/questions/64068/is-cross-posting-a-question-on-multiple-stack-exchange-sites-permitted-if-the-qu) –  Aug 30 '16 at 02:49
  • Thanks, I'll keep the above points in mind. – Raman Aug 30 '16 at 02:52

2 Answers

2

Could you check whether the file /root/incubator-hawq/depends/libhdfs3/build/install/usr/local/hawq/include/hdfs/hdfs.h exists? If not, then it is likely a build defect; please open a defect with the HAWQ team or email dev@hawq.incubator.apache.org. Thanks.
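
For example, a quick check like this should tell you (the path is taken from the -I flag in your compile command; adjust it if your checkout lives elsewhere):

# Does the header exist where the failing compile command looks for it?
ls -l /root/incubator-hawq/depends/libhdfs3/build/install/usr/local/hawq/include/hdfs/hdfs.h

# If not, search the whole dependency tree for it
find /root/incubator-hawq/depends -name hdfs.h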

Ming Li
1

Do you have a folder incubator-hawq/depends/libhdfs3/build/install after running make?

The problem seems to be that the libhdfs3 dependency was not built successfully. There are a couple of possible reasons for that: an old version of gcc (< 4.7), or a configuration error in libhdfs3.
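
A quick way to rule out the compiler is just to check its version; anything reporting 4.7 or newer should be fine:

# check the installed compiler versions
gcc --version
g++ --version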

To test this, you could try building libhdfs3 on its own:

cd incubator-hawq/depends/libhdfs3
mkdir build_debug && cd build_debug
cmake ..
make

If that builds successfully, then there is some other reason for your problem. In that case, could you paste more lines of the build output?

Another possible reason is that you used a different configuration prefix earlier. In that case, you should run `make distclean` before configuring again.
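
For example, a minimal sequence using the prefix from your question:

cd /root/incubator-hawq
make distclean                   # remove artifacts from the previous configuration
./configure --prefix=/opt/hawq   # reconfigure with the prefix you want
make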

xunzhang