I have a sequence of chained Hadoop jobs, each of which needs a DistributedCache file.
The driver class (Controller) takes the output of the previous job, modifies a file, places that file in the DistributedCache, and starts the next job.
After the first job completes (i.e., during the second job), I get this error:
java.io.IOException:
The distributed cache object hdfs://xxxx/xx/x/modelfile2#modelfile2
changed during the job from 11/8/12 11:55 PM to 11/8/12 11:55 PM
Does anyone know what the problem might be?