
My pipeline is very simple.

    Pipeline p = Pipeline.create();
    p.apply("Read", BigtableIO.read()
        .withBigtableOptions(BIGTABLE_OPTIONS)
        .withKeyRange(keyRange)
        .withTableId("myTable"));
    p.run().waitUntilFinish();
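For context, here is a sketch of what the full program looks like with a consuming step attached to the read (some runners may not execute a `Read` that has no downstream consumer, as mentioned in the comments). `BIGTABLE_OPTIONS` and `keyRange` are the same objects as above and are assumed to be defined elsewhere; the class name is hypothetical. This cannot run without a live Bigtable instance and credentials, so treat it purely as an illustration:

```java
import com.google.bigtable.v2.Row;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class TimeSeriesLoad {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();
    p.apply("Read", BigtableIO.read()
            .withBigtableOptions(BIGTABLE_OPTIONS) // assumed defined elsewhere
            .withKeyRange(keyRange)                // assumed defined elsewhere
            .withTableId("myTable"))
     // Attach a consumer so the read is not left dangling.
     .apply("LogKeys", ParDo.of(new DoFn<Row, Void>() {
       @ProcessElement
       public void processElement(ProcessContext c) {
         System.out.println(c.element().getKey().toStringUtf8());
       }
     }));
    p.run().waitUntilFinish();
  }
}
```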

But whenever I run it, I constantly receive:

Exception in thread "main" java.lang.IllegalStateException: **Not started**
    at com.google.common.base.Preconditions.checkState(Preconditions.java:459)
    at io.grpc.internal.ClientCallImpl.request(ClientCallImpl.java:344)
    at io.grpc.ForwardingClientCall.request(ForwardingClientCall.java:52)
    at io.grpc.ForwardingClientCall.request(ForwardingClientCall.java:52)
    at io.grpc.ForwardingClientCall.request(ForwardingClientCall.java:52)
    at io.grpc.stub.ClientCalls.startCall(ClientCalls.java:276)
    at io.grpc.stub.ClientCalls.asyncUnaryRequestCall(ClientCalls.java:249)
    at io.grpc.stub.ClientCalls.futureUnaryCall(ClientCalls.java:186)
    at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:132)
    at com.google.bigtable.admin.v2.BigtableTableAdminGrpc$BigtableTableAdminBlockingStub.getTable(BigtableTableAdminGrpc.java:381)
    at com.google.cloud.bigtable.grpc.BigtableTableAdminGrpcClient.getTable(BigtableTableAdminGrpcClient.java:58)
    at org.apache.beam.sdk.io.gcp.bigtable.BigtableServiceImpl.tableExists(BigtableServiceImpl.java:82)
    at org.apache.beam.sdk.io.gcp.bigtable.BigtableIO$Read.validate(BigtableIO.java:294)
    at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:578)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:482)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:486)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$400(TransformHierarchy.java:235)
    at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:210)
    at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:409)
    at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:520)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:294)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:281)

I already tried running it inside my IDE and from the command line, and on every attempt I keep getting this error.

I'm running in a Windows environment.

Any help would be appreciated.

Thanks.

  • What runner are you using to run this? The DirectRunner? – Pablo Aug 29 '17 at 17:56
  • I want to run it locally so I'm not providing any. I just run: mvn exec:java -Dexec.mainClass="xx.xx.ingestion.TimeSeriesLoad" to execute the main program. – Flávio Lúcio Pereira Aug 29 '17 at 18:01
  • I'm not completely sure of what could be your problem. One thing that happens in the Dataflow runner (perhaps others) is that `Read` transforms won't execute if they are not consumed. Have you tried adding a ParDo after the read in your pipeline? – Pablo Aug 29 '17 at 21:33

1 Answer


I believe this is an issue in the Cloud Bigtable client that has been fixed in a later version. Beam 2.1.0 uses the newer client, so please try Beam 2.1.0 and let us know if the problem persists.
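If the project is built with Maven, the upgrade amounts to bumping the Beam version of the GCP IO module (which pulls in the fixed Bigtable client transitively). A hypothetical `pom.xml` fragment, assuming the standard Beam artifact IDs:

```xml
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
  <version>2.1.0</version>
</dependency>
```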

jkff
  • Yep. It works fine now. I just updated the version to 2.1.0 and it's working as expected. Thank you for your help. – Flávio Lúcio Pereira Aug 30 '17 at 13:13
  • I'm the author of the java Cloud Bigtable client. We did indeed have a problem in the Cloud Bigtable (cbt) java client version 0.9.6 that Beam 2.0.0 used. This was fixed in the 0.9.7 cbt java client that Beam 2.1.0 uses. For some context: everything worked fine when invoking operations in GCP in the 0.9.6 release, so we didn't notice the problem. The problem was a bit nuanced. This was the Pull Request that fixed the problem: https://github.com/GoogleCloudPlatform/cloud-bigtable-client/commit/9ad8d0c545a3c8c2beaf80f8a9e1111227f9a033 – Solomon Duskis Sep 01 '17 at 14:53