Questions tagged [ray]

Ray is a library for writing parallel and distributed Python applications. It scales from your laptop to a large cluster, has a simple yet flexible API, and provides high performance out of the box.

At its core, Ray's API provides a simple way to take arbitrary Python functions and classes and execute them in a distributed setting.
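
A minimal sketch of that core API, assuming a local ray.init(); the square function and Counter actor are made up for illustration:

    import ray

    ray.init()  # start a local Ray runtime (or connect to an existing cluster)

    # Remote function: .remote() returns an object ref immediately; the work runs in a worker process.
    @ray.remote
    def square(x):
        return x * x

    futures = [square.remote(i) for i in range(4)]
    print(ray.get(futures))  # [0, 1, 4, 9]

    # Remote actor: a stateful class whose methods run in their own worker process.
    @ray.remote
    class Counter:
        def __init__(self):
            self.count = 0

        def increment(self):
            self.count += 1
            return self.count

    counter = Counter.remote()
    print(ray.get(counter.increment.remote()))  # 1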

Ray also includes a number of powerful libraries:

  • Cluster Autoscaling: Automatically configure, launch, and manage clusters and experiments on AWS or GCP.
  • Hyperparameter Tuning: Automatically run experiments, tune hyperparameters, and visualize results with Ray Tune (see the sketch after this list).
  • Reinforcement Learning: RLlib is a state-of-the-art library for reinforcement learning, both in research and in practice.
  • Distributed Pandas: Modin provides a faster dataframe library with the same API as Pandas.
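
The sketch below gives a taste of the Tune API mentioned above. It assumes a Ray 1.x-era release (tune.run plus tune.report; newer releases moved to the Tuner API), and the objective function and its lr parameter are made up for illustration; metrics passed to tune.report are what Tune's built-in loggers write out, including to TensorBoard.

    from ray import tune

    def objective(config):
        # stand-in for a real training loop
        score = (config["lr"] - 0.1) ** 2
        tune.report(score=score)  # reported metrics are logged (CSV, JSON, TensorBoard)

    analysis = tune.run(
        objective,
        config={"lr": tune.grid_search([0.01, 0.1, 1.0])},
    )
    print(analysis.get_best_config(metric="score", mode="min"))
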
702 questions
-1 votes, 1 answer

How to log to TensorBoard from ray.tune when using the class API?

None of the examples in the docs show how logging happens with the class API. Do you still use tune.track.log even with the class API?
mathtick
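
For context on what the question calls the class API: Tune's Trainable logs whatever dict the training step returns, so tune.track.log is not needed there; Tune's built-in loggers (including the TensorBoard one) pick the metrics up automatically. A minimal sketch, assuming a Ray 1.x-era release where the methods are setup/step (older releases used _setup/_train); the trainable and its lr parameter are made up for illustration:

    from ray import tune

    class MyTrainable(tune.Trainable):
        def setup(self, config):
            # called once per trial; config holds the sampled hyperparameters
            self.lr = config["lr"]
            self.score = 0.0

        def step(self):
            # one training iteration; the returned dict is what gets logged
            self.score += self.lr
            return {"score": self.score}

    tune.run(
        MyTrainable,
        stop={"training_iteration": 5},
        config={"lr": tune.uniform(0.001, 0.1)},
    )
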
-1 votes, 1 answer

Is there any OpenAI Gym compliant interface implementation for continuous action spaces?

Is there any OpenAI Gym compliant interface implementation for continuous action spaces? If so, does it support multi-agent environments? I'm working on a multi-agent DDPG implementation, but I couldn't find a suitable baseline environment.
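
For reference on the terms in this question: in a Gym-compliant environment a continuous action space is expressed as a gym.spaces.Box, and RLlib adds multi-agent support on top via its MultiAgentEnv base class. A minimal single-agent sketch, assuming the classic gym API (reset returns only the observation, step returns a 4-tuple); the environment itself is made up:

    import gym
    import numpy as np
    from gym import spaces

    class ToyContinuousEnv(gym.Env):
        """A made-up environment with a continuous (Box) action space."""

        def __init__(self):
            self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(2,), dtype=np.float32)
            self.observation_space = spaces.Box(low=-np.inf, high=np.inf, shape=(2,), dtype=np.float32)
            self.state = np.zeros(2, dtype=np.float32)

        def reset(self):
            self.state = np.zeros(2, dtype=np.float32)
            return self.state

        def step(self, action):
            self.state = self.state + np.asarray(action, dtype=np.float32)  # toy dynamics
            reward = -float(np.linalg.norm(self.state))   # closer to the origin is better
            done = bool(np.linalg.norm(self.state) > 10.0)
            return self.state, reward, done, {}
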
-1 votes, 1 answer

(Mac OS) No matching distribution found for ray

So I've been trying to pip install ray in a conda environment. Here's the command I use: pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org -U ray I get this error: ERROR: Could not find a version that satisfies the requirement…
-1 votes, 1 answer

Where does Ray.Tune create the model vs. apply the perturbed hyperparameters?

I am new to using ray.tune. I already have my network written in a modular format and now I am trying to incorporate ray.tune, but I do not know where to initialize the model (vs updating the perturbed hyperparameters) so that the model and the…
LaMaster90
-1 votes, 1 answer

Cython class initialization in ray

I'm using Ray (https://github.com/ray-project/ray) and Cython 0.29 to parallelize some existing code, and I decided to define a cdef class with one of my Cython functions as its method to ease running the code for multiple Actors in parallel. The…
greg_pro
-2 votes, 2 answers

How to receive real-time logs from PIDs in the terminal?

I have a computational program written in Python using the Ray package with the following output: Actor(Play001,69a6825d641b461327313d1c01000000). This process uses the following pid: pid = 87972. In the Ray dashboard I can view the logs. Snippets is…
-2 votes, 1 answer

Best multiprocessing technique to speed up this code

I am trying to learn more about parallelisation to speed up this classification code. I literally started reading about it less than 24 hours ago (to share some context). I am wondering which multiprocessing technique will be the best to tackle this…
Carla
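
One common pattern for questions like the one above is Ray's drop-in replacement for multiprocessing.Pool. A minimal sketch, assuming a Ray version that ships ray.util.multiprocessing; the classify function is a made-up stand-in for the real per-item work:

    from ray.util.multiprocessing import Pool  # Ray-backed stand-in for multiprocessing.Pool

    def classify(item):
        # stand-in for the real per-item classification work
        return item % 2

    pool = Pool()  # starts a local Ray runtime (or connects to a running cluster)
    labels = pool.map(classify, range(1000))
    print(sum(labels))
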
-2 votes, 1 answer

Parallelizing via Ray issue

@ray.remote def parallel1(a): Gbr1 = ExponentialSmoothing(endog=series1['Price'],trend = True,damped_trend = True).fit(disp = False) pr1 = list(Gbr1.forecast(136).values)[-1] return pr1 @ray.remote def…
Gregory
-2 votes, 1 answer

No module named 'ray.rllib.agents.ppo.ppo_policy'

I have this code: from copy import deepcopy import json import ray try: from ray.rllib.agents.agent import get_agent_class except ImportError: from ray.rllib.agents.registry import get_agent_class from ray.rllib.agents.ppo.ppo_policy import…
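
The try/except import in the excerpt is the usual way to cope with RLlib modules moving between releases, and the error in the title typically means the same thing happened to the PPO policy module. A hedged sketch of that pattern; the exact module and class names have shifted across RLlib versions, so treat both paths as assumptions to check against the installed version:

    try:
        # path used by older RLlib releases
        from ray.rllib.agents.ppo.ppo_policy import PPOTFPolicy
    except ImportError:
        # path used after the module was renamed
        from ray.rllib.agents.ppo.ppo_tf_policy import PPOTFPolicy
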
-3 votes, 1 answer

Python Ray: Any good resources other than the website about the Ray API for Python?

I would like to learn more about the Ray API but was hoping to find some easy-to-read information regarding this Python module. The website is great, but I find it a bit of a challenge to follow. Any suggestions are appreciated! Thank you