
I'm getting the following fatal error when running the PHP script below:

Waiting for job to complete

Fatal error: Uncaught Error: Call to undefined method
Google\Cloud\BigQuery\CopyJobConfiguration::reload()
in /opt/bitnami/apache2/htdocs/test.php:53 Stack trace: #0 [internal function]:
{closure}() #1 /opt/bitnami/apache2/htdocs/vendor/google/cloud/src/Core/ExponentialBackoff.php(74):
call_user_func_array(Object(Closure), Array) #2 /opt/bitnami/apache2/htdocs/test.php(58):
Google\Cloud\Core\ExponentialBackoff->execute(Object(Closure)) #3 /opt/bitnami/apache2/htdocs/test.php(36):
copy_table('aaaa', 'bbbb', 'cccc', 'dddd', 'eeee') #4 {main} thrown in /opt/bitnami/apache2/htdocs/test.php on line 54


<?php

header("Cache-Control: no-store, no-cache, must-revalidate"); // HTTP/1.1
header("Cache-Control: post-check=0, pre-check=0", false);
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // Date in the past
header("Pragma: no-cache"); // HTTP/1.0
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");

ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);

# Includes the autoloader for libraries installed with composer

require "vendor/autoload.php";  

use Google\Cloud\BigQuery\BigQueryClient;
use Google\Cloud\ServiceBuilder;
use Google\Cloud\Core\ExponentialBackoff;

$client = new Google_Client();
putenv('GOOGLE_APPLICATION_CREDENTIALS='.dirname(__FILE__) . '/.ssh/xxx.json');
$client->useApplicationDefaultCredentials();

# Your Google Cloud Platform project stuff
$projectId = 'aaaa';
$datasetIdin = 'bbbb';
$datasetIdout = 'cccc';
$tableIdin = "dddd";
$tableIdout = "eeee";

$gcloud = new ServiceBuilder([ 
    'projectId' => $projectId
]);

copy_table($projectId, $datasetIdin, $datasetIdout, $tableIdin, $tableIdout);

function copy_table($projectId, $datasetIdin, $datasetIdout, $tableIdin, $tableIdout)
{
    $bigQuery = new BigQueryClient([
        'projectId' => $projectId,
    ]);
    $datasetin = $bigQuery->dataset($datasetIdin);
    $datasetout = $bigQuery->dataset($datasetIdout);
    $sourceTable = $datasetin->table($tableIdin);
    $destinationTable = $datasetout->table($tableIdout);
    $job = $sourceTable->copy($destinationTable);

    // poll the job until it is complete
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($job) {
        print('Waiting for job to complete' . PHP_EOL);
        $job->reload();
        if (!$job->isComplete()) {
            throw new Exception('Job has not yet completed', 500);
        }
    });
    // check if the job has errors
    if (isset($job->info()['status']['errorResult'])) {
        $error = $job->info()['status']['errorResult']['message'];
        printf('Error running job: %s' . PHP_EOL, $error);
    } else {
        print('Table copied successfully' . PHP_EOL);
    }

echo "table copied"; 
}

?>

It looks as though the appropriate class has not been loaded? The composer.json I am using is:

{
    "require": {
        "google/cloud": "^0.47.0",
        "google/apiclient": "^2.0"
    }
}

Any ideas as to why I'm hitting this error? Thanks!

  • You have a low accept rate. Important on SO: you have to mark accepted answers by using the tick on the left of the posted answer, below the voting. This will increase your rate. See how this works by visiting this link: http://meta.stackoverflow.com/questions/5234/how-does-accepting-an-answer-work#5235 – Pentium10 Dec 16 '17 at 22:43

1 Answer


Something is off in your approach.

$job = $sourceTable->copy($destinationTable);

This call doesn't return a Job. From the docs:

Returns a copy job configuration to be passed to either Google\Cloud\BigQuery\Table::runJob() or Google\Cloud\BigQuery\Table::startJob().

So you need to add code that passes this configuration to one of those functions.

http://googlecloudplatform.github.io/google-cloud-php/#/docs/google-cloud/v0.47.0/bigquery/table?method=copy
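
As a minimal sketch (variable names taken from the question, and assuming the v0.47.0 API where Table::runJob() submits the configuration and waits for the job to finish), the copy would look roughly like this:

$copyConfig = $sourceTable->copy($destinationTable);

// runJob() submits the copy configuration and blocks until the job completes,
// returning a Job whose status can then be inspected.
$job = $sourceTable->runJob($copyConfig);

if (isset($job->info()['status']['errorResult'])) {
    printf('Error running job: %s' . PHP_EOL, $job->info()['status']['errorResult']['message']);
} else {
    print('Table copied successfully' . PHP_EOL);
}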

  • Thanks! I'm using the php function (or very close to it) at https://cloud.google.com/bigquery/docs/tables#copy-table, so I can't see what other code I need to add? – Chris Jones Dec 17 '17 at 06:29
  • On every page where the tutorial is off, you can signal it on the feedback page; the link is at the top of the page. You need to add something like `$jobConfig = $sourceTable->copy($destinationTable); $job = $table->startJob($jobConfig);` – Pentium10 Dec 17 '17 at 07:09
  • I have just left feedback! Thanks @Pentium10! A further question - what does the variable $table in $job = $table->startJob($jobConfig); refer to? Must that be a pre-declared variable? – Chris Jones Dec 17 '17 at 07:29
  • I think it's just a method exposed on the table entity; it can be your `$sourceTable`. I don't have a shell to test with, but try it out. – Pentium10 Dec 17 '17 at 07:39
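
Putting the comments together, a sketch of the corrected copy_table() from the question might look like the following (untested, as noted above; it keeps the original polling loop and simply passes the configuration returned by copy() to startJob() on the source table):

function copy_table($projectId, $datasetIdin, $datasetIdout, $tableIdin, $tableIdout)
{
    $bigQuery = new BigQueryClient([
        'projectId' => $projectId,
    ]);
    $sourceTable = $bigQuery->dataset($datasetIdin)->table($tableIdin);
    $destinationTable = $bigQuery->dataset($datasetIdout)->table($tableIdout);

    // copy() only builds the job configuration; startJob() submits it
    // and returns a Job that supports reload() and isComplete().
    $jobConfig = $sourceTable->copy($destinationTable);
    $job = $sourceTable->startJob($jobConfig);

    // Poll the job until it is complete, as in the original script.
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($job) {
        print('Waiting for job to complete' . PHP_EOL);
        $job->reload();
        if (!$job->isComplete()) {
            throw new Exception('Job has not yet completed', 500);
        }
    });

    // Check whether the job finished with errors.
    if (isset($job->info()['status']['errorResult'])) {
        $error = $job->info()['status']['errorResult']['message'];
        printf('Error running job: %s' . PHP_EOL, $error);
    } else {
        print('Table copied successfully' . PHP_EOL);
    }
}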