
I'm using a distributed Selenoid cloud infrastructure to distribute my automated tests. To keep track of the Selenoid instances, I have a selenoid table in MySQL that tracks which instances are enabled and disabled.

I have this table modeled as a Python SQLAlchemy model (with Alembic handling the migrations).

I'm now in the situation where I need to make some custom api calls to the selenoid instances. I was thinking about adding the api interface directly to the model. This way I can query for my selenoid instances and then immediately make api calls with the results.

Example of what I'm thinking:

selenoid_instances = session.query(Selenoids).all()

for selenoid in selenoid_instances:
    videos = selenoid.get_videos()
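
Sketched out, the model might look something like this (the column names, the shape of the /video listing response, and the urllib-based helper are just illustrative):

```python
# Illustrative sketch: a SQLAlchemy model with an API helper attached.
# Column names and the /video listing shape are assumptions.
import json
from urllib.request import urlopen

from sqlalchemy import Boolean, Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Selenoids(Base):
    __tablename__ = "selenoid"

    id = Column(Integer, primary_key=True)
    host = Column(String(255), nullable=False)  # e.g. "selenoid-1.example.com:4444"
    enabled = Column(Boolean, default=True)

    def base_url(self) -> str:
        # Root URL of this instance's HTTP API.
        return f"http://{self.host}"

    def get_videos(self):
        # Selenoid serves recorded videos under /video; the exact listing
        # response format here is an assumption -- adapt to the real API.
        with urlopen(f"{self.base_url()}/video", timeout=5) as resp:
            return json.load(resp)
```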

Is this a good design practice or a bad idea?

Ryan Nygard

1 Answer


DISCLAIMER: I am one of the core Selenoid contributors.

We have two free open-source solutions that do not require a database at all but still allow you to distribute requests across multiple Selenoid instances.

  1. Ggr - this distributes Selenium requests across instances and provides other features such as video recording and downloading recorded videos.
  2. Ggr UI - this automatically fetches launched browser sessions from multiple Selenoid instances. The main use case for this daemon is showing all running browser sessions for the entire cluster in the same UI.

So instead of using a MySQL database, I would just query these daemons directly. Also, if you need a next-generation Selenium solution, take a look at Moon (but it is closed-source and commercial).
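
For example, querying such a daemon could be a few lines of Python. Both Selenoid and Ggr UI expose an HTTP /status endpoint returning JSON; the host name below is made up:

```python
# Minimal sketch: read cluster state from a daemon's /status endpoint
# instead of keeping it in MySQL. Host names are placeholders.
import json
from urllib.request import urlopen

def status_url(host: str) -> str:
    # Both Selenoid and Ggr UI serve their state over HTTP at /status.
    return f"http://{host}/status"

def fetch_status(host: str) -> dict:
    # Returns the parsed /status JSON (Selenoid reports fields such as
    # "total", "used", "queued" and a per-browser breakdown).
    with urlopen(status_url(host), timeout=5) as resp:
        return json.load(resp)
```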

If you still have any questions about any of these tools, don't hesitate to ask in the Telegram support channel.

vania-pooh