
I created a script that, for a given game situation, tries to find the best possible move. It does this by simulating every possible move and quantifying each one, then picking the move that leads to the fastest victory. To speed it up, I've used PHP's pthreads extension in the following way: each time the main thread needs to find a move (let's call this a JOB), it calculates all the possible moves at the current depth, starts a Pool, and submits each possible move to it (let's call each one a TASK), so the threads develop the game tree for each move separately, down through all the remaining depths.

This would look something like this:

1. Get a new JOB with 10 possible moves
2. Create a new Pool
3. Add each possible move to the Pool as a TASK
4. The TASKs work concurrently, and each returns an integer as its result, stored in a Volatile object
5. The main thread selects a single move and performs it

... then everything repeats from step 1 until the fight is complete
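The flow above can be sketched as a minimal pthreads skeleton (requires a ZTS PHP build with ext-pthreads; the class and property names here are illustrative, not the actual code):

```php
<?php
// Minimal sketch of the JOB/TASK flow described above (hypothetical names).

class MoveTask extends Threaded {
    public $moveId;

    public function __construct(int $moveId) {
        $this->moveId = $moveId;
    }

    public function run() {
        // Develop the game tree for this move and quantify it (stubbed here).
        $score = $this->moveId * 10;

        // Store the result in the shared Volatile under the lock.
        $dataSet = $this->worker->getDataSet();
        $dataSet->synchronized(function ($dataSet) use ($score) {
            $dataSet['results'][$this->moveId] = $score;
        }, $dataSet);
    }
}

class MoveWorker extends Worker {
    private $dataSet;
    public function __construct(Volatile $dataSet) { $this->dataSet = $dataSet; }
    public function getDataSet(): Volatile { return $this->dataSet; }
}

// (1) A new JOB: one TASK per possible move.
$shared = new Volatile();
$shared['results'] = [];

$pool = new Pool(6, MoveWorker::class, [$shared]);
foreach (range(0, 9) as $moveId) {
    $pool->submit(new MoveTask($moveId));
}
while ($pool->collect());
$pool->shutdown();

// The main thread now picks the best move from $shared['results'].
```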

Right now, the TASKs each use their own cache: while they work, they build up cached results and reuse them, but they do not share these caches with each other, and caches do not carry over from one JOB to the next. I tried to resolve this, and in a way managed, but I don't think this is the intended way, because it makes everything WAY slower.

What I tried is the following: I created a class that stores all the cache hashes in arrays, and before creating the Pool I add an instance of it to a Volatile object. When a task starts, it retrieves this cache and uses it for read/write operations, and when the task finishes, it merges its copy back into the instance held in the Volatile object. This works, in the sense that caches created in JOB 1 can be seen in JOB 2, but it makes the whole process much slower than it was when each thread used only its own cache, built while growing the tree and destroyed when the thread finished. Am I doing this wrong, or is what I want simply not achievable? Here's my code:

class BattlefieldWork extends Threaded {
    public $taskId;

    public $innerIterator;
    public $thinkAhead;
    public $originalBattlefield;
    public $iteratedBattlefield;
    public $hashes;

    public function __construct($taskId, $thinkAhead, $innerIterator, Battlefield $originalBattlefield, Battlefield $iteratedBattlefield) {
        $this->taskId = $taskId;
        $this->innerIterator = $innerIterator;
        $this->thinkAhead = $thinkAhead;
        $this->originalBattlefield = $originalBattlefield;
        $this->iteratedBattlefield = $iteratedBattlefield;
    }

    public function run() {
        $result = 0;
    
        $dataSet = $this->worker->getDataSet();
        $HashClassShared = null;
        $dataSet->synchronized(function ($dataSet) use(&$HashClassShared) {
            $HashClassShared = $dataSet['hashes'];
        }, $dataSet);
        $myHashClass = clone $HashClassShared;

        $thinkAhead = $this->thinkAhead;
        $innerIterator = $this->innerIterator;
        $originalBattlefield = $this->originalBattlefield;
        $iteratedBattlefield = $this->iteratedBattlefield;
    
        // the actual recursive function that builds the tree and quantifies the move, using the cache cloned above
        $result = $this->performThinkAheadMoves($thinkAhead, $innerIterator, $originalBattlefield, $iteratedBattlefield, $myHashClass);
    
        // retrieve the shared cache again, store this thread's result, and merge my local cache into it
        $HashClassShared = null;
        $dataSet->synchronized(function($dataSet) use ($result, &$HashClassShared) {
            // store this thread's result
            $dataSet['results'][$this->taskId] = $result;
            // merge the cache collected in this thread with the cache stored in the `Volatile` object
            $HashClassShared = $dataSet['hashes'];
            $HashClassShared = $HashClassShared->merge($myHashClass);
        }, $dataSet);
    }
}
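One detail worth double-checking in `run()` above: the merge result is only assigned to the local `$HashClassShared`. Since `HashClass` does not extend `Threaded`, pthreads stores it in the `Volatile` by serialization, so each read can hand back a fresh copy, and mutating that copy may never reach the shared store. A hedged sketch of an explicit write-back (assuming `HashClass` stays a plain, serializable class):

```php
// Sketch: write the merged cache back into the Volatile explicitly,
// since reading $dataSet['hashes'] may return an unserialized copy.
$dataSet->synchronized(function ($dataSet) use ($result, $myHashClass) {
    $dataSet['results'][$this->taskId] = $result;

    $merged = $dataSet['hashes'];   // read (a copy of) the shared cache
    $merged->merge($myHashClass);   // fold in this task's local cache
    $dataSet['hashes'] = $merged;   // persist the merged cache for later tasks/jobs
}, $dataSet);
```

Note that every such write re-serializes the whole cache while holding the lock; that cost grows with the cache and forces the tasks to queue up behind each other, which by itself could account for a large slowdown.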
    

This is how I create my tasks, my Volatile, and my Pool:

class Battlefield {
    /* ... */

    public function step() {
      /* ... */
      /* get the possible moves for the current depth (0) and store them in an array named $moves */

      // $nextInnerIterator is an int indicating which hero must act after the current move
      // $StartingBattlefield is the zero-point Battlefield used for quantification
      foreach($moves as $moveid => $move) {
          $moves[$moveid]['quantify'] = new BattlefieldWork($moveid, self::$thinkAhead, $nextInnerIterator, $StartingBattlefield, $this);
      }

      $Volatile = new Volatile();
      $Volatile['results'] = array();
      $Volatile['hashes'] = $this->HashClass;

        
      $pool = new Pool(6, 'BattlefieldWorker', [$Volatile]);
      foreach ($moves as $moveid => $move) {
          if (is_a($moves[$moveid]['quantify'], 'BattlefieldWork')) {
              $pool->submit($moves[$moveid]['quantify']);
          }
      }

      while ($pool->collect());
      $pool->shutdown();

      $HashClass = $Volatile['hashes'];
      $this->HashClass = $Volatile['hashes'];

      foreach ($Volatile['results'] as $moveid => $partialResult) {
          $moves[$moveid]['quantify'] = $partialResult;
      }

      /* The moves are ordered based on quantify, one is selected, and then if the battle is not yet finished, step is called again */
    }
}

And here is how I am merging two hash classes:

class HashClass {
    public $id = null;
    public $cacheDir;

    public $battlefieldHashes = array();
    public $battlefieldCleanupHashes = array();
    public $battlefieldMoveHashes = array();

    public function merge(HashClass $HashClass) {
        $this->battlefieldCleanupHashes = array_merge($this->battlefieldCleanupHashes, $HashClass->battlefieldCleanupHashes);
        $this->battlefieldMoveHashes = array_merge($this->battlefieldMoveHashes, $HashClass->battlefieldMoveHashes);
    
        return $this;
    }
}
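One property of `array_merge()` worth keeping in mind for the `merge()` method above: on a string-key collision the right-hand array wins, while numeric keys are renumbered and appended, so a numerically keyed hash array grows with duplicates on every merge. A quick stand-alone illustration (plain PHP, no pthreads needed):

```php
<?php
// array_merge() semantics that matter for a hash cache.

$a = ['abc123' => 5, 'def456' => 7];
$b = ['abc123' => 9, 'fed999' => 1];

// String keys: the right-hand side wins on collision.
$merged = array_merge($a, $b);
// ['abc123' => 9, 'def456' => 7, 'fed999' => 1]

// The union operator keeps the left-hand value instead, which may be
// preferable if the first-computed cache entry should win.
$union = $a + $b;
// ['abc123' => 5, 'def456' => 7, 'fed999' => 1]

// Numeric keys: array_merge() renumbers and appends, so "duplicate"
// entries accumulate instead of being deduplicated.
$grown = array_merge([10 => 'h1'], [10 => 'h1']);
// [0 => 'h1', 1 => 'h1'] — two entries after the merge
```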

I've benchmarked each part of the code to see where I am losing time, but everything seems fast enough not to warrant the slowdown I am experiencing. What I am thinking is that the problem lies in the threads: sometimes it seems that no work is being done at all, as if they are waiting on some other thread. Any insight into what the problem could be would be greatly appreciated.
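One way to test the "threads are waiting" suspicion (a sketch only; `microtime()` timestamps around the critical sections of `run()`, variable names illustrative) is to measure lock-plus-copy time against actual search time per task:

```php
// Sketch: inside BattlefieldWork::run(), separate time spent acquiring the
// Volatile's lock and copying the cache from time spent in the tree search.
$t0 = microtime(true);
$dataSet->synchronized(function ($dataSet) use (&$HashClassShared) {
    $HashClassShared = $dataSet['hashes'];
}, $dataSet);
$myHashClass = clone $HashClassShared;
$lockAndCopyTime = microtime(true) - $t0;

$t1 = microtime(true);
$result = $this->performThinkAheadMoves(/* ... */);
$searchTime = microtime(true) - $t1;

// If $lockAndCopyTime rivals $searchTime, the shared cache (lock contention
// plus the clone/serialize cost) is the bottleneck, not the search itself.
error_log(sprintf('task %d: lock+copy %.3fs, search %.3fs',
    $this->taskId, $lockAndCopyTime, $searchTime));
```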

Adam Baranyai
    Have you considered using `Beanstalkd` as a queue rather than trying to implement it yourself? PHP is not really a great language to write a multithreaded application in. – Geoffrey Jan 16 '19 at 11:37
  • @Geoffrey I hadn't heard of `Beanstalkd` before, but I checked it out now, and unfortunately this script runs in a Windows environment, so it is out of the question for me :( – Adam Baranyai Jan 16 '19 at 11:44

0 Answers