If I understand correctly, the following piece of code should run in parallel:
import logging
import time

import ray

@ray.remote
class Worker:
    ...
    def train(self, item, i):
        time.sleep(i)
        logging.info(f'{i} {item}')
        ...

worker = Worker.remote()
list = ['a', 'b', 'c']
# one train call per item: 'a' sleeps 3s, 'b' sleeps 2s, 'c' sleeps 1s
results = ray.get([worker.train.remote(item, len(list) - idx) for idx, item in enumerate(list)])
print(results)
logging.info("successful print")
This should output:
1 c
2 b
3 a
[1,2,3]
However, this outputs:
3 a
2 b
[3,2,1]
I am new to using Ray and am unable to understand this behaviour. If anyone could point me in the right direction, that'd be great!