// Resolve the URL alias to its internal path, then read the route parameters.
$path = \Drupal::service('path.alias_manager')->getPathByAlias($url);
$params = \Drupal\Core\Url::fromUri("internal:" . $path)->getRouteParameters();

I get the node id from the above code.
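
For a standard node path the route parameters come back keyed by 'node', so (assuming the alias points at a canonical node route) the ID can be read like this:

// The canonical node route exposes the node ID under the 'node' key.
$node_id = $params['node'] ?? NULL;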

// Look up taxonomy terms whose name matches $term.
$terms = \Drupal::entityTypeManager()
  ->getStorage('taxonomy_term')
  ->loadByProperties(['name' => $term]);

I get the tag id from the above code.
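
loadByProperties() returns an array of term entities keyed by term ID, so (assuming each name matches exactly one term) the ID can be pulled out like this:

// Assumes a single match; take the first loaded term and read its ID.
$term_entity = reset($terms);
$tag_id = $term_entity ? $term_entity->id() : NULL;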

function updateTaxonomy($url_tag_array) {
  foreach ($url_tag_array as $node_id => $tag) {
    $node = \Drupal\node\Entity\Node::load($node_id);
    // Skip nodes that no longer exist.
    if ($node !== NULL) {
      $node->field_ga_tag->target_id = $tag;
      $node->save();
    }
  }
}

Finally I update the taxonomy term reference on the nodes with the above code.
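
To make the expected input clear (the values here are only an example), the array maps node IDs to taxonomy term IDs:

// Sample input: node ID => term ID (made-up values).
updateTaxonomy([
  12 => 7,
  34 => 7,
  56 => 9,
]);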

It all works, but it takes a huge amount of time: it exceeds the maximum execution time and runs for more than 15 minutes.

Is there a way to do the update with fewer queries? Is it possible to use raw batch queries?
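
For example, I imagine batching the loads could look roughly like this (just a sketch; the function name and chunk size of 50 are placeholders), although each node still needs its own save():

function updateTaxonomyChunked(array $url_tag_array) {
  // Load nodes 50 at a time instead of issuing one load query per node.
  foreach (array_chunk(array_keys($url_tag_array), 50) as $chunk) {
    $nodes = \Drupal\node\Entity\Node::loadMultiple($chunk);
    foreach ($nodes as $node_id => $node) {
      $node->field_ga_tag->target_id = $url_tag_array[$node_id];
      $node->save();
    }
  }
}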

Abel

1 Answer


You can try running this via cron instead of a single request. Also, validate the input so that duplicate entries are not processed twice.
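
A rough sketch of that idea (the module name, state key, and slice size are assumptions, not part of the original code): store the pending updates and process a small slice on each cron run so no single request hits the execution time limit.

/**
 * Implements hook_cron().
 */
function mymodule_cron() {
  // Pending updates assumed to have been saved earlier as node_id => term_id.
  $pending = \Drupal::state()->get('mymodule.pending_tag_updates', []);

  // Process only a small slice per cron run to stay within the time limit.
  foreach (array_slice($pending, 0, 50, TRUE) as $node_id => $tag) {
    $node = \Drupal\node\Entity\Node::load($node_id);
    if ($node !== NULL) {
      $node->field_ga_tag->target_id = $tag;
      $node->save();
    }
    unset($pending[$node_id]);
  }

  \Drupal::state()->set('mymodule.pending_tag_updates', $pending);
}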