Questions tagged [ray]

Ray is a library for writing parallel and distributed Python applications. It scales from your laptop to a large cluster, has a simple yet flexible API, and provides high performance out of the box.

Its API provides a simple way to take arbitrary Python functions and classes and execute them in a distributed setting; a minimal usage sketch follows the list of libraries below.

Ray also includes a number of powerful libraries:

  • Cluster Autoscaling: Automatically configure, launch, and manage clusters and experiments on AWS or GCP.
  • Hyperparameter Tuning: Automatically run experiments, tune hyperparameters, and visualize results with Ray Tune.
  • Reinforcement Learning: RLlib is a state-of-the-art platform for reinforcement learning research as well as reinforcement learning in practice.
  • Distributed Pandas: Modin provides a faster dataframe library with the same API as Pandas.
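
A minimal sketch of the core task and actor API described above; the function and class names are illustrative, not part of Ray itself:

    import ray

    ray.init()  # start Ray on the local machine

    # An ordinary function becomes a remote task.
    @ray.remote
    def square(x):
        return x * x

    # An ordinary class becomes a remote actor.
    @ray.remote
    class Counter:
        def __init__(self):
            self.n = 0

        def increment(self):
            self.n += 1
            return self.n

    # .remote() returns object refs immediately; ray.get() blocks for the results.
    futures = [square.remote(i) for i in range(4)]
    print(ray.get(futures))  # [0, 1, 4, 9]

    counter = Counter.remote()
    print(ray.get(counter.increment.remote()))  # 1
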
702 questions
0 votes, 1 answer

Ray-triangle intersection works in both directions

I don't know why, but my ray-triangle intersection algorithms (Möller–Trumbore and Watertight) work in both directions: from origin to end, which is how it should work, and from end to origin, which is the problem. The white cross in the screenshot is an unnecessary intersection. I tried…
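
For context, the usual reason a Möller–Trumbore implementation reports hits behind the ray origin is a missing check that the hit distance t is positive; a minimal sketch of the algorithm with that check (variable names are illustrative):

    import numpy as np

    def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
        """Möller–Trumbore: return distance t along the ray, or None if no hit."""
        edge1 = v1 - v0
        edge2 = v2 - v0
        pvec = np.cross(direction, edge2)
        det = np.dot(edge1, pvec)
        if abs(det) < eps:          # ray is parallel to the triangle plane
            return None
        inv_det = 1.0 / det
        tvec = origin - v0
        u = np.dot(tvec, pvec) * inv_det
        if u < 0.0 or u > 1.0:
            return None
        qvec = np.cross(tvec, edge1)
        v = np.dot(direction, qvec) * inv_det
        if v < 0.0 or u + v > 1.0:
            return None
        t = np.dot(edge2, qvec) * inv_det
        if t <= eps:                # hit lies behind the origin; without this test
            return None             # the ray "works in both directions"
        return t
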
0 votes, 0 answers

ImportError: cannot import name 'RolloutWorker' (ray.rllib)

I want to import one of the agents in ray.rllib, 'ddpg', but an error pops up. Can anyone help me? I am using Google Colab.
Peter Kim
0 votes, 1 answer

Decreasing action sampling frequency for one agent in a multi-agent environment

I'm using RLlib for the first time, trying to train on a custom multi-agent RL environment, and would like to train a couple of PPO agents on it. The implementation hiccup I need to figure out is how to alter the training for one special agent…
sh0831
0 votes, 1 answer

Using Ray-Tune with sklearn's RandomForestClassifier

Putting together different base and documentation examples, I have managed to come up with this: X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2) def objective(config, reporter): for i in range(config['iterations']): …
LeggoMaEggo
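
A minimal sketch of one way to wrap an sklearn classifier in a Tune trainable, assuming a Ray 1.x-era API where tune.report is available (newer releases moved reporting into the session/train modules); the dataset and search space are placeholders:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from ray import tune

    # Placeholder data; the question uses its own X, y.
    X, y = make_classification(n_samples=1000, n_features=20)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

    def objective(config):
        clf = RandomForestClassifier(
            n_estimators=config["n_estimators"],
            max_depth=config["max_depth"],
        )
        clf.fit(X_train, y_train)
        # Report held-out accuracy back to Tune.
        tune.report(mean_accuracy=clf.score(X_test, y_test))

    analysis = tune.run(
        objective,
        config={
            "n_estimators": tune.grid_search([50, 100, 200]),
            "max_depth": tune.grid_search([4, 8, None]),
        },
    )
    print(analysis.get_best_config(metric="mean_accuracy", mode="max"))
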
0 votes, 0 answers

Using Ray from Flask - init() fails (with core dump)

I'm trying to use Ray from a Flask web application. The whole thing runs in a Docker container. The Ray version is 0.8.6, Flask 1.1.2. When I start the web application, Ray seems to try to init twice, and then the process crashes. I added the…
Bernd H
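
One common pattern that avoids double initialisation is to call ray.init exactly once at startup rather than per request, with ignore_reinit_error=True as a safety net, and to disable Flask's debug reloader (which imports the module twice); a hedged sketch:

    import ray
    from flask import Flask

    app = Flask(__name__)

    # Initialise Ray once at startup, not per request; ignore_reinit_error=True
    # turns an accidental second init() into a no-op instead of a crash.
    ray.init(ignore_reinit_error=True)

    @ray.remote
    def heavy_task(x):
        return x * x

    @app.route("/compute/<int:x>")
    def compute(x):
        return {"result": ray.get(heavy_task.remote(x))}

    if __name__ == "__main__":
        # Flask's reloader imports the module a second time, which would call
        # ray.init() again; disabling it avoids that.
        app.run(use_reloader=False)
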
0 votes, 0 answers

Open question: is high-level parallelising of many multi-threaded serial jobs across a cluster using the joblib backend possible?

I am totally new to Ray and have a question regarding it being a potential solution. I am optimising an image modelling code and have successfully optimised it to run on a single machine, using multi-threaded numpy operations. Each image generation…
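
Ray does ship a joblib backend (ray.util.joblib), so joblib-style parallelism can be pointed at a cluster; a minimal sketch, assuming a cluster is already running and with generate_image as a placeholder for one multi-threaded serial job:

    import joblib
    import ray
    from ray.util.joblib import register_ray

    ray.init(address="auto")  # connect to an already-running cluster; omit for local use
    register_ray()            # register "ray" as a joblib backend

    def generate_image(i):
        # placeholder for one image-generation job
        return i * i

    with joblib.parallel_backend("ray"):
        results = joblib.Parallel(n_jobs=-1)(
            joblib.delayed(generate_image)(i) for i in range(100)
        )
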
0 votes, 0 answers

Python ray: Wrong sequence of task results | How to index/convert an ObjectID to a string?

My basic Python ray knowledge is slowly growing, but now I could use some theoretical help. To keep things simple, I converted my situation to this example: I want to count from 1 to 100 using distributed computing with ray. 10 tasks were started to…
Rene Munsch
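
On the ordering question: ray.get preserves the order of the object refs passed to it, so collecting the refs in submission order is usually enough; a minimal sketch (the chunking into 10 tasks is illustrative):

    import ray

    ray.init()

    @ray.remote
    def count_chunk(start, stop):
        return list(range(start, stop))

    # Launch 10 tasks covering 1..100; keep the ObjectRefs in submission order.
    refs = [count_chunk.remote(i, i + 10) for i in range(1, 101, 10)]

    # ray.get returns results in the same order as the refs, regardless of
    # which task happened to finish first.
    numbers = [n for chunk in ray.get(refs) for n in chunk]
    assert numbers == list(range(1, 101))
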
0 votes, 1 answer

Is there a way to train a PPOTrainer on one environment, then finish training it on a slightly modified environment?

I'm attempting to first train a PPOTrainer for 250 iterations on a simple environment, and then finish training it on a modified environment. (The only difference between the environments would be a change in one of the environment configuration…
sbrand
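
With the older ray.rllib.agents API referenced here, one approach is to checkpoint the trainer and restore it into a second trainer built on the modified environment; a hedged sketch, with the environment name and config key as placeholders:

    from ray.rllib.agents.ppo import PPOTrainer

    # "MyEnv" and "difficulty" stand in for the user's registered environment
    # and whatever env_config entry changes between the two phases.
    trainer = PPOTrainer(env="MyEnv", config={"env_config": {"difficulty": 1}})
    for _ in range(250):
        trainer.train()
    checkpoint = trainer.save()  # path to the saved checkpoint

    # Second trainer on the modified environment, initialised from the checkpoint.
    trainer2 = PPOTrainer(env="MyEnv", config={"env_config": {"difficulty": 2}})
    trainer2.restore(checkpoint)
    for _ in range(100):
        trainer2.train()
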
0 votes, 2 answers

python ray AttributeError: 'function' has no attribute 'remote'

I'm trying to use the ray module on existing code, depending on whether an env variable is true or not. This is what I've done so far. This code structure is similar to mine, but not exactly, due to its size. import os if os.getenv("PARALLEL"): import…
AnotherBrick
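
A common way to keep a single code path whether or not Ray is enabled is a small wrapper that only applies ray.remote when the env variable is set, so .remote() is never called on an undecorated function; a hedged sketch using the PARALLEL variable from the excerpt:

    import os

    USE_RAY = bool(os.getenv("PARALLEL"))

    if USE_RAY:
        import ray
        ray.init()

    def maybe_remote(func):
        # Only decorate when Ray is enabled; otherwise .remote does not exist,
        # which is exactly the AttributeError in the question title.
        return ray.remote(func) if USE_RAY else func

    @maybe_remote
    def work(x):
        return x * x

    if USE_RAY:
        results = ray.get([work.remote(i) for i in range(10)])
    else:
        results = [work(i) for i in range(10)]
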
0 votes, 0 answers

Function is not found in imported file, yet still works

The line here calls tune.run: https://github.com/ray-project/ray/blob/90b05983d616900f1ccd325bfe235cd3c38ef174/python/ray/util/sgd/torch/examples/tune_example.py#L51 However, the tune variable points to ray.tune which points to this __init__.py…
user3180
0 votes, 1 answer

Ray seems to be caching the imported .py even after code terminates

I am testing ray on 1 head node and 1 cluster node. I started the head node with: ray start --head --redis-port=6379 and the cluster node with: ray start --address=':6379' At both the head node and the cluster node, there is f.py &…
0 votes, 0 answers

Read Large XML File Using Python and Ray

I have seen many questions and answers regarding reading large XML files, but none of them seem to have provided true multi-core performance to a python process parsing XML. I have just started using an excellent framework for managing…
0 votes, 2 answers

Can you run python code with RAY in AWS Lambda, remotely from an IDE (e.g. PyCharm)?

Keen to run a library of python code, which uses "RAY", on AWS Lambda / a serverless infrastructure. Is this possible? What I am after: - Ability to run python code (with RAY library) on serverless (AWS Lambda), utilising many CPUs/GPUs - Run the…
cwse
0 votes, 1 answer

How to ensure each worker uses exactly one CPU?

I'm implementing SEED using ray, and therefore, I define a Worker class as follows import numpy as np import gym class Worker: def __init__(self, worker_id, env_name, n): import os os.environ['OPENBLAS_NUM_THREADS'] = '1' …
Maybe
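
In Ray the CPU budget of a task or actor is declared with num_cpus, and capping the BLAS thread pool keeps numpy from spawning extra threads inside each worker; a minimal sketch (the Worker body is abbreviated):

    import os
    # Limit BLAS threads before numpy is imported so each worker really uses one core.
    os.environ["OPENBLAS_NUM_THREADS"] = "1"

    import numpy as np
    import ray

    ray.init()

    @ray.remote(num_cpus=1)   # Ray schedules at most one of these per CPU slot
    class Worker:
        def __init__(self, worker_id):
            self.worker_id = worker_id

        def rollout(self, size):
            return float(np.random.rand(size).sum())

    workers = [Worker.remote(i) for i in range(4)]
    results = ray.get([w.rollout.remote(1000) for w in workers])
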
0 votes, 1 answer

Storing and retrieving an object in ray.io

I have a ray cluster running on a machine as below: ray start --head --redis-port=6379 I have two files that need to run on the cluster. Producer p_ray.py: import ray ray.init(address='auto', redis_password='5241590000000000') @ray.remote class…
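
One way to store an object on the cluster and retrieve it from a separate script is a named, detached actor; a hedged sketch assuming a recent Ray version (the named-actor API in 0.8.x differed):

    import ray

    ray.init(address="auto")  # connect to the running cluster

    @ray.remote
    class ObjectStore:
        def __init__(self):
            self.data = {}

        def put(self, key, value):
            self.data[key] = value

        def get(self, key):
            return self.data.get(key)

    # Producer: a named, detached actor survives after this script exits.
    store = ObjectStore.options(name="store", lifetime="detached").remote()
    ray.get(store.put.remote("answer", 42))

    # Consumer (can run later, in a different script on the same cluster):
    store = ray.get_actor("store")
    print(ray.get(store.get.remote("answer")))  # 42
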