What happens if I store a 5 GB file in HDFS but there are only 3 DataNodes with 1 GB each?
Let's say I store a 3 GB file in HDFS on 4 DataNodes with 1 GB each. After processing I have some results.txt. What happens to the blocks of the processed file that are stored on the DataNodes? If I want to store another 3 GB file to process, there won't be enough space for it. Or do those blocks get deleted after processing? Should I delete them myself?
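For context, here is a minimal sketch of the capacity arithmetic involved, assuming the common HDFS defaults of a 128 MB block size (`dfs.blocksize`) and a replication factor of 3 (`dfs.replication`); the helper name is mine, not part of Hadoop:

```python
# Sketch: how HDFS splits a file into blocks and how much raw
# cluster capacity the replicas consume. Defaults assumed:
# 128 MB block size, replication factor 3 — adjust to your cluster.
BLOCK_SIZE = 128 * 1024**2   # dfs.blocksize default
REPLICATION = 3              # dfs.replication default

def raw_usage(file_bytes, block_size=BLOCK_SIZE, replication=REPLICATION):
    """Return (num_blocks, raw_bytes_consumed) for a file of file_bytes."""
    num_blocks = -(-file_bytes // block_size)  # ceiling division
    return num_blocks, file_bytes * replication

GB = 1024**3
blocks, raw = raw_usage(3 * GB)
print(blocks, raw / GB)  # 24 blocks, 9.0 GB of raw capacity
```

With these defaults a 3 GB file needs 9 GB of raw capacity across the cluster, which already exceeds 4 DataNodes of 1 GB each; even with replication set to 1 it would barely fit, leaving no room for results.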

Eduardo Yáñez Parareda

SliceOfPig