
I have written my own map and reduce functions as OpenCL kernels, following the general MapReduce model that Hadoop implements (Hadoop itself being written in Java).

How can I use my own map/reduce code written in C/OpenCL with Hadoop on a multi-node cluster?

I have already asked this kind of question before but didn't get any response. Any link to a tutorial would be useful.

I am willing to read on my own; I just can't find any resources on this topic. ANY kind of help would be appreciated. Thank you for your time and concern.

sandeep.ganage
  • One option is to use JNI or take a look at http://stackoverflow.com/questions/15495698/how-to-use-hadoop-mapreuce-framework-for-an-opencl-application/15509904#15509904 – alexeipab May 02 '13 at 14:04

1 Answer


HDFS is a file system; you can read and write HDFS data from any language, not just Java.
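
As a rough sketch: Hadoop ships a C API, libhdfs, so C/OpenCL host code can talk to HDFS directly. Assuming hdfs.h and libhdfs are on your include/link paths and CLASSPATH contains the Hadoop jars (details vary by Hadoop version), writing a file looks roughly like this (the path /tmp/example.txt is just a placeholder):

    /* Minimal libhdfs sketch: write a small file to HDFS. */
    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>   /* O_WRONLY, O_CREAT */
    #include "hdfs.h"

    int main(void) {
        hdfsFS fs = hdfsConnect("default", 0);  /* use fs.defaultFS from config */
        if (!fs) { fprintf(stderr, "hdfsConnect failed\n"); return 1; }

        hdfsFile out = hdfsOpenFile(fs, "/tmp/example.txt",
                                    O_WRONLY | O_CREAT, 0, 0, 0);
        if (!out) { fprintf(stderr, "hdfsOpenFile failed\n"); return 1; }

        const char *msg = "hello from C\n";
        hdfsWrite(fs, out, (void *)msg, strlen(msg));
        hdfsFlush(fs, out);
        hdfsCloseFile(fs, out);
        hdfsDisconnect(fs);
        return 0;
    }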

Because HDFS data is distributed across multiple machines (and replicated for availability), each node of the cluster can process its local share of the data, including feeding it to GPU computation.

For more information, see Hadoop Streaming.
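
Hadoop Streaming is the usual way to plug non-Java code into Hadoop: it runs any executable as the mapper or reducer, feeding input records on stdin and reading tab-separated key/value pairs from stdout. Your OpenCL host program can live inside such an executable. A minimal word-count-style mapper in C might look like this (the reducer follows the same stdin/stdout contract):

    /* Minimal Hadoop Streaming mapper sketch in C:
       reads text lines from stdin, emits "word\t1" pairs on stdout. */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char line[4096];
        while (fgets(line, sizeof line, stdin)) {
            /* An OpenCL host program would instead batch the input
               and launch its kernels here, then print key\tvalue pairs. */
            for (char *tok = strtok(line, " \t\n"); tok != NULL;
                 tok = strtok(NULL, " \t\n"))
                printf("%s\t1\n", tok);
        }
        return 0;
    }

Compile the mapper and reducer on (or for) the cluster nodes, then submit the job with the streaming jar; the jar's exact location and the input/output paths below are placeholders that vary by installation:

    hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -input /user/you/input \
        -output /user/you/output \
        -mapper ./mapper \
        -reducer ./reducer \
        -file mapper -file reducer

Note that each node running a GPU task needs its own OpenCL runtime and device; Hadoop only handles shipping the executables and splitting the data.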