
My Hadoop setup is running perfectly well; I am using hadoop-0.20.2.
The problem is that I am not able to move the /tmp/hsperfdata_hadoop directory, so after a while /tmp fills up with data and my processes time out or get killed.
I need to move this folder; please help me with its configuration.

Debugger

2 Answers


Try adding -Djava.io.tmpdir= to your Java options.

I found this in the error log output of my mapreduce job when it failed due to this directory running out of space.
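As a sketch of how that option might be applied in hadoop-0.20.2 (the target directory /data/hadoop-tmp is a hypothetical path; substitute one on a partition with enough free space):

```shell
# In conf/hadoop-env.sh: redirect the daemons' JVM temp directory.
# /data/hadoop-tmp is a hypothetical directory with more free space.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.io.tmpdir=/data/hadoop-tmp"
```

For the per-task JVMs spawned by MapReduce jobs, the same flag would presumably go into the mapred.child.java.opts property in conf/mapred-site.xml rather than hadoop-env.sh.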

Elaine

A couple of links you may find interesting to read:

From looking through those pages, this is a Java thing (storage of performance counters), which may or may not be easily configurable to another location. Is your /tmp directory full, or just relatively small?

Can you run df -h from your command line to see how much space is assigned to that partition, and dig into /tmp to discover what else might be consuming the space?
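The checks suggested above might look like this (a sketch using standard Linux tools; the sort -h flag assumes GNU coreutils):

```shell
# How much space does the filesystem holding /tmp have, and how full is it?
df -h /tmp

# Which entries under /tmp are the big consumers? Largest first.
du -sh /tmp/* 2>/dev/null | sort -rh | head
```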

I'm surprised that your OS and other applications are not adversely affected by the lack of temporary storage space.

Community
Chris White
  • Thank you, your links are very helpful. Still, I didn't find anything to change its path; if you find anything, please let me know. I don't want to disable hsperfdata. – Debugger Mar 22 '12 at 07:47