google-cloud-bigtable

Spark-HBase - GCP template (3/3) - Missing libraries?

Posted by 左心房为你撑大大i on 2021-01-15 19:38:15

Question: I'm trying to test the Spark-HBase connector in the GCP context. I followed the instructions, which ask you to package the connector locally, and I get the following error when submitting the job on Dataproc (after completing those steps).

Command:

gcloud dataproc jobs submit spark --cluster $SPARK_CLUSTER \
    --class com.example.bigtable.spark.shc.BigtableSource \
    --jars target/scala-2.11/cloud-bigtable-dataproc-spark-shc-assembly-0.1.jar \
    --region us-east1 -- $BIGTABLE_TABLE

Error

How to delete a column of a single row in Google Cloud Bigtable with HBase API

Posted by 空扰寡人 on 2020-12-11 04:47:34

Question: I'm using the HBase API to access Google Cloud Bigtable, but whenever I try to delete a column:

Delete delete = new Delete(r.getRow());
delete.addColumn(CF, Bytes.toBytes(d.seqid()));
delete.addColumn(CF, COL_LEASE);
tasksTable.delete(delete);

I get an UnsupportedOperationException:

java.lang.UnsupportedOperationException: Cannot delete single latest cell.
    at com.google.cloud.bigtable.hbase.adapters.DeleteAdapter.throwIfUnsupportedPointDelete(DeleteAdapter.java:85)
    at com.google.cloud
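For context, Cloud Bigtable expresses column deletes as timestamp ranges, so the HBase-style "delete only the newest version" (addColumn with no timestamp) cannot be translated, while "delete all versions" (addColumns) or a delete with an explicit timestamp can. The toy function below models that adapter decision in plain Python; it is illustrative only, not the real DeleteAdapter API.

```python
# Toy model of the Bigtable HBase adapter's point-delete handling.
# Illustrative sketch: the real logic lives in
# com.google.cloud.bigtable.hbase.adapters.DeleteAdapter (Java).

def adapt_column_delete(family, qualifier, timestamp=None, all_versions=False):
    """Translate an HBase column delete into a Bigtable timestamp-range delete.

    Returns a (family, qualifier, start_ts, end_ts) tuple, where (0, None)
    means "all versions". Raises when the delete targets only the latest
    cell, which cannot be expressed as a timestamp range.
    """
    if all_versions:
        # HBase Delete.addColumns(cf, qual): every version -> full range.
        return (family, qualifier, 0, None)
    if timestamp is None:
        # HBase Delete.addColumn(cf, qual): only the newest version.
        # The client does not know the newest timestamp ahead of time,
        # so it must refuse -- the exception seen in the question.
        raise ValueError("Cannot delete single latest cell.")
    # An explicit timestamp becomes the half-open range [ts, ts + 1).
    return (family, qualifier, timestamp, timestamp + 1)
```

So the usual fix for the code in the question is delete.addColumns(CF, qualifier) to remove all versions, or delete.addColumn(CF, qualifier, ts) with an explicit timestamp.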

Google Cloud Bigtable backup and recovery

Posted by 爷，独闯天下 on 2020-08-22 04:34:12

Question: I am new to Google Cloud Bigtable and have a very basic question: does the cloud offering protect my data against user error or application corruption? Google's website says in many places that the data is safe and protected, but it is not clear whether the scenario above is covered, because I did not see any reference to restoring data from a previous point-in-time copy. I am sure someone on this forum knows!

Answer 1: Updated 7/24/2020: Bigtable now supports both backups and

Big table vs Big Query usecase for timeseries data

Posted by 点点圈 on 2020-06-10 05:12:23

Question: I am trying to decide between Bigtable and BigQuery for my time-series use case, and I have gone through https://cloud.google.com/bigtable/docs/schema-design-time-series. The goal is to store Omniture data, which contains information such as a website visitor key (some long key), a cookie id (some long key), and time-series web-hit data for that visitor's IP and cookie. What can I use as the row key for Bigtable? I cannot use the timestamp or the cookie id as a prefix, as I learned from the best practices. But
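One common answer from the schema-design guide is field promotion: start the row key with a high-cardinality, non-sequential field (here the visitor key) and put the timestamp last, so rows for one visitor sort chronologically without hot-spotting writes on a single tablet. A minimal sketch, with made-up field names and widths:

```python
def build_row_key(visitor_key: int, cookie_id: int, ts_millis: int) -> bytes:
    """Field-promotion row key: visitor first, timestamp last.

    Zero-padding keeps the lexicographic (byte) order of the key equal to
    the numeric order of its parts, which Bigtable's sorted scans rely on.
    """
    return f"{visitor_key:020d}#{cookie_id:020d}#{ts_millis:013d}".encode()

# Rows for one visitor/cookie pair are contiguous and time-ordered, so
# "all hits for this visitor in a time window" is a prefix + range scan.
```

Because the key no longer starts with a timestamp, new writes spread across visitors instead of all landing on the "latest" end of the table.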

Why increments are not supported in Dataflow-BigTable connector?

Posted by 狂风中的少年 on 2020-05-13 08:14:32

Question: We have a use case in streaming mode where we want to keep a counter in Bigtable updated from the pipeline (something like the number of items that have finished processing), for which we need the increment operation. Looking at https://cloud.google.com/bigtable/docs/dataflow-hbase, I see that the append/increment operations of the HBase API are not supported by this client. The stated reason is the retry logic in batch mode, but if Dataflow guarantees exactly-once, why would supporting it be a bad idea, since I know
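The heart of the answer is idempotency: Dataflow's exactly-once guarantee covers element processing, but a sink write can still be retried, and a retried put of an absolute value is harmless while a retried increment double-counts. A toy illustration in plain Python (unrelated to the actual client):

```python
# Why retried puts are safe but retried increments are not.

def put(cell: dict, value: int) -> None:
    cell["v"] = value  # absolute write: replaying it changes nothing

def increment(cell: dict, delta: int) -> None:
    cell["v"] = cell.get("v", 0) + delta  # relative write: replays double-count

a, b = {}, {}
put(a, 5)
put(a, 5)        # a retried put still leaves v == 5
increment(b, 5)
increment(b, 5)  # a retried increment leaves v == 10, not 5
```

This is why counter-style pipelines usually pre-aggregate (e.g. a Combine step) and then write absolute sums, rather than issuing increments per element.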