Questions tagged [google-cloud-bigtable]

Google Cloud Bigtable is a fast, fully managed, massively scalable NoSQL database service designed for applications requiring terabytes to petabytes of data. Cloud Bigtable doesn't require you to sacrifice speed, scale, or cost efficiency when your applications grow.

Cloud Bigtable is exposed to applications through a supported extension to the Apache HBase 1.0 Java library.
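Since access goes through the HBase-compatible Java client, a minimal connection sketch looks like the following. This assumes the `bigtable-hbase` 1.x artifact is on the classpath; the project ID, instance ID, table name, and column family below are placeholders, not values from any question on this page.

```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class BigtableHBaseExample {
    public static void main(String[] args) throws Exception {
        // BigtableConfiguration.connect returns a standard HBase Connection
        // backed by Cloud Bigtable; "my-project" and "my-instance" are
        // placeholder identifiers.
        try (Connection connection =
                 BigtableConfiguration.connect("my-project", "my-instance");
             Table table = connection.getTable(TableName.valueOf("orders"))) {
            // Writes use the ordinary HBase Put API.
            Put put = new Put(Bytes.toBytes("row-key-1"));
            put.addColumn(Bytes.toBytes("order-family"),
                          Bytes.toBytes("status"),
                          Bytes.toBytes("NEW"));
            table.put(put);
        }
    }
}
```

Because the client implements the HBase `Connection` interface, code written against HBase 1.0 can generally be pointed at Bigtable by swapping the connection factory.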

628 questions
0 votes · 1 answer

Bigtable import error

I generated a sequence file using Hive and am trying to import it into Bigtable, but my import job fails with the error below. 2015-06-21 00:05:42,584 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:…
0 votes · 1 answer

Will Google Cloud Bigtable be HIPAA compliant?

Will Google Cloud Bigtable be a HIPAA compliant data repository? In particular, will it support on disk encryption? And how much of the data will be stored concurrently with other users?
Joe W · 2,773
0 votes · 1 answer

Unable to connect to Google Bigtable using HBase REST api

Following this example, running the test script "python put_get_with_client.py" results in a 400 error (Bad Request). Bad request java.lang.ClassCastException: org.apache.hadoop.hbase.client.BigtableConnection cannot be cast to…
aliasmrchips · 949
-1 votes · 1 answer

How to set up staging and pre-prod for Google Dataflow jobs?

Say we have a Dataflow job: written with the Apache Beam Java SDK as a Gradle project; it reads a Pub/Sub stream as input, writes results to Bigtable, and writes logs to BigQuery. As with deploying a server, we can easily have staging, pre-prod, and prod…
-1 votes · 1 answer

Will my Bigtable schema result in hotspotting?

Here's my schema. Here's some example data. Rows with the row key structure $PipelineId--$PipelineRunTime will be written less often but with much larger data, though nowhere near the per-row size limit. And rows of this…
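Hotspotting questions like this usually come down to whether consecutive writes share a key prefix. One common mitigation is to salt the key with a small, stable bucket number so sequential run times for different pipelines spread across tablets. The sketch below is illustrative only: `rowKey`, `BUCKETS`, and the `$PipelineId--$PipelineRunTime` layout are taken from the question's pattern, and the bucket count of 8 is an arbitrary example value.

```java
public class SaltedRowKey {
    static final int BUCKETS = 8;

    // Prefix the key with a hash-derived bucket so that keys for different
    // pipelines do not all sort into the same contiguous key range.
    // Math.floorMod keeps the bucket non-negative even for negative hashes.
    static String rowKey(String pipelineId, long runTime) {
        int bucket = Math.floorMod(pipelineId.hashCode(), BUCKETS);
        return String.format("%d--%s--%d", bucket, pipelineId, runTime);
    }

    public static void main(String[] args) {
        System.out.println(rowKey("daily-etl", 1700000000L));
    }
}
```

The trade-off is that a salted scan must fan out over all buckets, so salting only pays off when the write pattern is otherwise monotonic.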
-1 votes · 1 answer

Is BigTable appropriate for inserting single rows very frequently?

We have a streaming solution that takes messages from a pubsub topic and uses DataFlow to stream each message into a BigQuery table. This is a very appropriate use case for BigQuery. We would also like to take a subset of those messages and make…
jamiet · 10,501
-1 votes · 1 answer

Efficient way of deleting an empty row from Google Bigtable

We have set an expiry for columns in Bigtable. Over time, the number of rows holding no data (only keys) has increased. I am looking for an efficient way to delete these empty rows from a table. For example: key: key1 column1:…
deep · 31
-1 votes · 1 answer

Access a specific row with partial information using regex on Bigtable

I'm working with Google Bigtable. I would like to use the convertToRegExpString method to access a specific row key given partial information. I have a row key like this: "AAAAA&BBBBB&CCCCC&DDDDD". I should pass to the method a regex that…
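For a composite key like "AAAAA&BBBBB&CCCCC&DDDDD", a partial-match regex can be built with the standard `java.util.regex` machinery, independently of whatever filter method ultimately consumes it. A sketch, where `partialKeyRegex` is a hypothetical helper and `[^&]+` stands in for the unknown segments:

```java
import java.util.regex.Pattern;

public class RowKeyRegex {
    // Build a regex matching a 4-segment "&"-delimited key when only the
    // first and third segments are known. Pattern.quote guards against
    // regex metacharacters appearing in the known segment values.
    static String partialKeyRegex(String first, String third) {
        return Pattern.quote(first) + "&[^&]+&" + Pattern.quote(third) + "&[^&]+";
    }

    public static void main(String[] args) {
        String regex = partialKeyRegex("AAAAA", "CCCCC");
        System.out.println("AAAAA&BBBBB&CCCCC&DDDDD".matches(regex)); // true
        System.out.println("AAAAA&BBBBB&XXXXX&DDDDD".matches(regex)); // false
    }
}
```

Note that server-side row-key filters typically apply the regex to the full key bytes, so the pattern should cover the entire key, as above, rather than just the known fragment.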
-1 votes · 1 answer

How to fix "Maven Could not create local repository" in Google Cloud Bigtable

I have a problem with Maven: the error says "Could not create local repository". I already tried .m2, but it says there is no repository named m2. I checked, and the .m2 directory doesn't exist because Maven doesn't have permission to create the repository.…
-1 votes · 1 answer

Intermittent errors: java.io.IOException: Failed to advance reader of source: BigtableSource{config=BigtableConfig{projectId=

I've been getting intermittent errors like the following trying to connect to Bigtable to read ~7 million rows of data using Dataflow: java.io.IOException: Failed to advance reader of source: BigtableSource{config=BigtableConfig{projectId= --…
-1 votes · 1 answer

How to convert rows from Bigtable to Avro generic records

I am reading Bigtable into my PCollection and then trying to convert the read records to Avro GenericRecords. Is it possible to convert the Bigtable reads directly to generic records without writing any function in the PCollection? For example: I…
-2 votes · 1 answer

Best design for a table in Bigtable

I have a workload where I need to migrate a table to Bigtable with the following information: a Social Security Number (SSN) and 600 score values per SSN. The number of scores can increase in the coming years. We have about 240 million SSNs. The table…
-2 votes · 1 answer

Bigtable data is removed automatically 30 minutes after insertion

I have a table in Bigtable named "orders" with one column family, "order-family". It returns this configuration: Column Family: order-family, GC Rule: {"gcRule":{"maxAge":"86400s"}}. I can insert data into the "orders" table, but after 30 minutes the…