Questions tagged [google-cloud-bigtable]

Google Cloud Bigtable is a fast, fully managed, massively scalable NoSQL database service designed for applications requiring terabytes to petabytes of data. Cloud Bigtable doesn't require you to sacrifice speed, scale, or cost efficiency when your applications grow.

Cloud Bigtable is exposed to applications through a supported extension to the Apache HBase 1.0 Java library.

628 questions
0 votes, 1 answer

Row timestamps in Bigtable - when are they updated?

The definition of TimestampRangeFilter from Bigtable's Go API is: TimestampRangeFilter returns a filter that matches any rows whose timestamp is within the given time bounds. Is the row timestamp updated when: Any column value is written/changed…
user01380121 • 527 • 6 • 18
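One point worth noting for the question above: Bigtable (like HBase) keeps timestamps per cell, not per row, so a timestamp-range filter matches cells whose write timestamps fall inside the bounds. In the HBase Java API used elsewhere on this page, the analogous bound is a minimal sketch like the following (the one-hour window is an arbitrary example):

```java
import org.apache.hadoop.hbase.client.Scan;

public class TimeRangeScanSketch {
    public static Scan scanLastHour() throws java.io.IOException {
        long now = System.currentTimeMillis();
        // Timestamps are per cell: a cell's timestamp is assigned when that
        // cell is written (or supplied explicitly by the writer). There is no
        // separate row-level timestamp that gets "updated".
        // setTimeRange(min, max) keeps only cells with min <= ts < max.
        Scan scan = new Scan();
        scan.setTimeRange(now - 3_600_000L, now);
        return scan;
    }
}
```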
0 votes, 2 answers

checkAndPut always succeeds

I am keeping a counter in Bigtable, and want to put some data in when the counter reaches X. I am currently doing it like this: Put put = new Put(EXAMPLE_ROW_KEY); put.addColumn(EXAMPLE_COLUMN_FAMILY, NEW_QUALIFIER, NEW_VALUE); boolean success =…
safyia • 190 • 2 • 11
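A common cause of a checkAndPut that "always succeeds" (or always fails) is a byte-level mismatch in the expected value: the comparison is byte-for-byte, and a counter maintained with Increment is stored as an 8-byte big-endian long. A minimal sketch, with ROW, CF, and the qualifiers as placeholder names:

```java
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;

public class CheckAndPutSketch {
    // Placeholder identifiers, standing in for the question's constants.
    static final byte[] ROW = Bytes.toBytes("row-1");
    static final byte[] CF = Bytes.toBytes("cf");
    static final byte[] COUNTER_QUAL = Bytes.toBytes("counter");
    static final byte[] NEW_QUAL = Bytes.toBytes("data");

    public static boolean putWhenCounterEquals(Table table, long x, byte[] newValue)
            throws IOException {
        Put put = new Put(ROW);
        put.addColumn(CF, NEW_QUAL, newValue);
        // checkAndPut applies the Put only if the current value of
        // cf:counter equals the expected bytes. Encode the expected long
        // with Bytes.toBytes(long) to match how Increment stores it;
        // passing a String-encoded number would never match.
        return table.checkAndPut(ROW, CF, COUNTER_QUAL, Bytes.toBytes(x), put);
    }
}
```

Note also that checkAndPut with a null expected value checks for the *absence* of the cell, which succeeds on any row where the column has never been written.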
0 votes, 1 answer

Bigtable design and querying with respect to number of column families

From Cloud Bigtable schema design documentation: Grouping data into column families allows you to retrieve data from a single family, or multiple families, rather than retrieving all of the data in each row. Group data as closely as you can to get…
mmziyad • 298 • 1 • 4 • 16
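The grouping advice quoted above can be made concrete with a small sketch using the HBase 1.x admin API (table and family names here are illustrative, not from the question): put columns that are read together into the same family, so a Get or Scan can use addFamily to fetch only what it needs.

```java
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import java.io.IOException;

public class FamilyDesignSketch {
    public static void createTable(Admin admin) throws IOException {
        HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("Table1"));
        // Small, frequently-read columns grouped together:
        desc.addFamily(new HColumnDescriptor("profile"));
        // Larger, rarely-read data kept in its own family, so reads of
        // "profile" never have to touch it:
        desc.addFamily(new HColumnDescriptor("activity"));
        admin.createTable(desc);
    }
}
```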
0 votes, 1 answer

Is it possible to read from BigTable async?

The Bigtable HBase API has the Table.get() function to read a list of gets. Sometimes this action can take quite a long time (like 100 ms). I wonder if there is any way we can get a future for it so that we can use this 100 ms to do something else?…
Justin Zhang • 4,433 • 1 • 12 • 9
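Since the HBase 1.x Table interface only exposes a blocking get(List&lt;Get&gt;), one application-level workaround is to run the call on an executor and receive a Future, freeing the calling thread for other work in the meantime. A minimal sketch of that pattern (the pool size is an arbitrary choice):

```java
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncGetSketch {
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    // Submit the blocking multi-get to the pool; the caller can do other
    // work and call future.get() only when the results are needed.
    public Future<Result[]> getAsync(Table table, List<Get> gets) {
        return pool.submit(() -> table.get(gets));
    }
}
```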
0 votes, 1 answer

Google App Engine APIs

I am new to Google App Engine. I am building a web app using Python and the Django REST framework. I am using Cloud Bigtable, Cloud SQL and Cloud Storage for my databases and storage. I wanted to know if I have to use the Google App Engine APIs or can…
0 votes, 1 answer

Column lookup and update in Bigtable

Assume that we have a Bigtable structure as follows: Table1: cf1 [“col1”, “col2”], cf2[“colX”, “colY”] Query from hbase client API: Get getT = new Get(Bytes.toBytes(rowKey)); getT.addColumn(byteArray_cf1, byteArray_col1); Result rt =…
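The truncated excerpt above can be completed as roughly the following sketch (the connection is assumed to already exist, and the table/family/column names follow the question's Table1/cf1/col1 layout):

```java
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;

public class ColumnLookupSketch {
    public static byte[] readCol1(Connection connection, String rowKey) throws IOException {
        byte[] cf1 = Bytes.toBytes("cf1");
        byte[] col1 = Bytes.toBytes("col1");
        try (Table table = connection.getTable(TableName.valueOf("Table1"))) {
            // Restrict the Get to a single column so only cf1:col1 is
            // fetched from the server, not the whole row.
            Get get = new Get(Bytes.toBytes(rowKey));
            get.addColumn(cf1, col1);
            Result result = table.get(get);
            return result.getValue(cf1, col1); // null if the cell is absent
        }
    }
}
```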
0 votes, 1 answer

What happens to an empty tablet?

Let's say I create a table with 10 initial split points and therefore 10 initial tablets, and after a while one of them gets to max size and is auto-split. Assuming my key is partition_counter, and my counter keeps increasing, I'll be inserting into…
minimo • 1,396 • 1 • 9 • 9
0 votes, 1 answer

Unable to encode element with HBaseMutationCoder

I'm trying to apply mutations (Increments) to Bigtable via Dataflow using cloud-bigtable-client (https://github.com/GoogleCloudPlatform/cloud-bigtable-client). Here is a high-level summary of what my job does: PCollection somedata =…
0 votes, 1 answer

GoogleCredentials.getApplicationDefault NoSuchMethodError

I'm trying to use the cloud bigtable v2 API and while I know it works in my Apache Beam job, when I try to use the API directly I encounter this error: Caused by: java.lang.NoSuchMethodError:…
0 votes, 1 answer

Perform a google.cloud.happybase Bigtable RowKeyRegexFilter Scan

UPDATE: This only happens with Google Cloud Bigtable Emulator, not with actual development or production BigTable instances (Google Cloud SDK 149.0.0) I'm trying to do row filtering by Key regex filter, everything is working like a charm (filter…
danius • 2,664 • 27 • 33
0 votes, 1 answer

PHP Cloud Datastore and Cloud Dataflow

I plan to keep raw data from IoT devices in Cloud Datastore via GAE Flex (PHP). I also want to bring that data to BigQuery via Cloud Dataflow. However, I cannot find standard or official documentation describing how to read and dump…
0 votes, 1 answer

How to read data from Bigtable in Google Cloud Dataproc

I am trying to read data from Bigtable in Google Cloud Dataproc. Below is the code I am using to read data from Bigtable. PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create(); …
0 votes, 1 answer

How can I batch checkAndDelete in BigTable using HBase API?

I understand that we can do checkAndDelete on BigTable using the HBase API. I wonder if there is a way for me to submit a list of checkAndDelete so that I don't need to do them one by one? I mostly worry about the performance of non-batching…
Justin Zhang • 4,433 • 1 • 12 • 9
0 votes, 1 answer

BigtableConfiguration vs BigtableOptions when running the Bigtable emulator

I have some working code. It uses a BigtableConfiguration object to get a connection to Bigtable, like so: var connection = BigtableConfiguration.connect("myProject", "myCluster") The connection returned is of type…
Ciaran Archer • 12,316 • 9 • 38 • 55
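For the emulator case, one approach (a sketch, assuming a recent bigtable-hbase client that honors the BIGTABLE_EMULATOR_HOST environment variable) is to keep the same BigtableConfiguration.connect call and let the environment redirect it, rather than building a BigtableOptions by hand; the project and instance ids then act only as placeholders:

```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import org.apache.hadoop.hbase.client.Connection;

public class EmulatorConnectSketch {
    public static Connection connect() {
        // With the local emulator running (gcloud beta emulators bigtable
        // start, default localhost:8086) and
        //   BIGTABLE_EMULATOR_HOST=localhost:8086
        // exported, the client routes traffic to the emulator, so this call
        // works unchanged against emulator and real instances alike.
        return BigtableConfiguration.connect("myProject", "myInstance");
    }
}
```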
0 votes, 1 answer

Google Bigtable performance: QPS vs CPU utilization

I am testing a small Bigtable cluster (minimum 3 nodes). I see on the Google console that as the Write QPS level approaches 10K, the CPU utilization approaches the recommended maximum of ~80%. From what I understand, the QPS metric is for the whole…
VS_FF • 2,353 • 3 • 16 • 34