Questions tagged [apache-phoenix]

For questions about Apache Phoenix. For the Elixir web framework, use phoenix-framework.


702 questions
0
votes
1 answer

Type mismatch. VARCHAR and TIMESTAMP with Phoenix

Getting Error: ERROR 203 (22005): Type mismatch. VARCHAR and TIMESTAMP for '2017-08-30 06:21:46.732' SQLState: 22005 ErrorCode: 203 while executing the query below in SQuirreL SQL Client with Apache Phoenix: select * from USER_T where…
Digital
  • 549
  • 1
  • 7
  • 26
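Phoenix does not implicitly coerce a quoted string to a TIMESTAMP column, which is what ERROR 203 is reporting; the usual fix is to wrap the literal in Phoenix's TO_TIMESTAMP function. A minimal sketch, assuming a hypothetical CREATED_AT column on the asker's USER_T table:

```python
# Hedged sketch: CREATED_AT is a hypothetical TIMESTAMP column name.
# Phoenix rejects a bare string literal compared against a TIMESTAMP,
# so the literal must go through TO_TIMESTAMP.

broken = "SELECT * FROM USER_T WHERE CREATED_AT = '2017-08-30 06:21:46.732'"

fixed = (
    "SELECT * FROM USER_T "
    "WHERE CREATED_AT = TO_TIMESTAMP('2017-08-30 06:21:46.732')"
)

print(fixed)
```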
0
votes
2 answers

Phoenix SQL query not working with large dataset

I have 5 million records in HBase and tried to find the total count of records, but I get the following error on the Phoenix command line. Error: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms…
Jain Hemant
  • 150
  • 2
  • 19
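The 60000 ms in that error matches the default HBase RPC timeout, and a full-scan COUNT over 5 million rows can easily exceed it. A common workaround is raising the client-side timeouts in hbase-site.xml on the Phoenix client. The property names below are real Phoenix/HBase settings; the values are illustrative only:

```xml
<!-- hbase-site.xml on the Phoenix client machine; values are illustrative -->
<property>
  <name>phoenix.query.timeoutMs</name>
  <value>600000</value> <!-- Phoenix client-side query timeout -->
</property>
<property>
  <name>hbase.rpc.timeout</name>
  <value>600000</value> <!-- per-RPC timeout (default 60000 ms) -->
</property>
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>600000</value> <!-- scanner lease/timeout period -->
</property>
```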
0
votes
1 answer

Phoenix HBase ROW_TIMESTAMP feature not working when bulk loading

I am using HBase 1.1 and Phoenix 4.9.0. I am mapping my Phoenix table's date field to the HBase timestamp using the Phoenix ROW_TIMESTAMP feature. It works fine with UPSERT queries, but when I bulk load data into it, it doesn't take effect. It takes only…
Rahul
  • 459
  • 2
  • 13
0
votes
0 answers

How to connect to Phoenix/HBase using PHP?

I know this sounds like a naive question, but I have searched a lot and could not find a way to connect to Phoenix from PHP code. Has anyone been successful in connecting to Phoenix and running queries from PHP code? Even if you know the driver to…
Rohit Aila
  • 11
  • 3
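There is no native PHP driver for Phoenix, but the Phoenix Query Server (PQS) speaks Apache Calcite Avatica's JSON protocol over plain HTTP, which any language with an HTTP client can use. The sketch below (in Python, for illustration) only builds the request bodies; the field names follow the Avatica JSON reference and can differ between versions, so treat them as an assumption to verify against your PQS build:

```python
# Hedged sketch of Avatica JSON request bodies for the Phoenix Query Server.
# Field names ("request", "connectionId", "sql", ...) are from the Avatica
# JSON protocol docs and should be checked against your Avatica version.
import json
import uuid

conn_id = str(uuid.uuid4())

open_connection = json.dumps({
    "request": "openConnection",
    "connectionId": conn_id,
})

execute_sql = json.dumps({
    "request": "prepareAndExecute",
    "connectionId": conn_id,
    "statementId": 1,
    "sql": "SELECT * FROM SYSTEM.CATALOG LIMIT 1",
    "maxRowCount": 100,
})

# From PHP, POST these bodies to the PQS endpoint (port 8765 by default)
# with Content-Type: application/json, e.g. via curl or Guzzle.
print(execute_sql)
```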
0
votes
0 answers

Apache Phoenix Array Modification

I am using Phoenix for my project and I am stuck with array types. Suppose I have an array column in my table: I insert data into that array, and after some time, weeks or months, I will need to update the array, basically…
yugantar
  • 1,970
  • 1
  • 11
  • 17
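Phoenix has no statement for updating a single array element in place; the usual pattern is to read the row, modify the array client-side, and UPSERT the whole array back, since Phoenix's UPSERT overwrites the column. A minimal sketch of building such a statement, with hypothetical table and column names (EVENTS, ID, TAGS):

```python
# Hedged sketch: rewrite the whole array column via UPSERT.
# EVENTS / ID / TAGS are hypothetical names, not from the question.

def upsert_array(table, key, column, values):
    """Build an UPSERT that replaces the entire array column value."""
    array_literal = "ARRAY[" + ", ".join(f"'{v}'" for v in values) + "]"
    return f"UPSERT INTO {table} (ID, {column}) VALUES ({key}, {array_literal})"

current = ["a", "b", "c"]
current[1] = "B"  # the client-side modification
sql = upsert_array("EVENTS", 1, "TAGS", current)
print(sql)
```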
0
votes
1 answer

Apache Phoenix 4.7 csvBulkLoad.run() throws ClassNotFoundException

We are running a Spark Streaming job to read from Kafka, convert to CSV, and then write to HBase. I am using the CSVBulkLoad API to run a bulk-load job. The Spark job starts fine and converts to CSV, but csvBulkLoad.run() starts a new MR job but…
0
votes
1 answer

Phoenix View for huge HBase Table

I'm working on a Hortonworks Data Platform 2.6 cluster with HBase 1.1.2 and Phoenix 4.7 installed. I have a huge HBase table with lots of columns, where new columns are sometimes added along with new data (data is added by the HBase API's Put…
D. Müller
  • 3,336
  • 4
  • 36
  • 84
0
votes
0 answers

Apache Phoenix queries taking too long

I am using Apache Phoenix to run some queries, but their performance looks bad compared to what I was expecting. As an example, consider a table like: CREATE TABLE MY_SHORT_TABLE ( MPK BIGINT not null, ... 38 other columns ... CONSTRAINT pk…
ssobreiro
  • 11
  • 1
  • 6
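Queries that filter on non-key columns of a wide table like this force a full scan; a covered secondary index lets Phoenix answer the query entirely from the index. A sketch, where FILTER_COL and PROJ_COL are hypothetical stand-ins for the asker's 38 unnamed columns:

```python
# Hedged sketch: a covered index on MY_SHORT_TABLE.
# FILTER_COL / PROJ_COL are hypothetical column names.
create_index = (
    "CREATE INDEX MY_IDX ON MY_SHORT_TABLE (FILTER_COL) "
    "INCLUDE (PROJ_COL)"
)

# A query touching only FILTER_COL and PROJ_COL can then be served
# from MY_IDX without reading the 39-column base table.
query = "SELECT PROJ_COL FROM MY_SHORT_TABLE WHERE FILTER_COL = 'x'"
print(create_index)
```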
0
votes
1 answer

Phoenix bulk CSV file upload command not understood

I want to bulk upload a CSV file using Phoenix, but I cannot understand the command below. Can you explain it in detail? HADOOP_CLASSPATH=$(hbase mapredcp):/path/to/hbase/conf hadoop jar phoenix--client.jar…
Jain Hemant
  • 150
  • 2
  • 19
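The command in the question can be read piece by piece. The sketch below assembles it with a comment on each part; the jar name is elided in the question (it normally carries a version), and the table/input arguments are hypothetical placeholders, not from the original:

```python
# Hedged breakdown of the Phoenix CSV bulk-load invocation.
# The jar name and --table/--input values are placeholders.

# $(hbase mapredcp) expands to the HBase jars MapReduce needs;
# /path/to/hbase/conf adds hbase-site.xml so the job can find the cluster.
classpath = "$(hbase mapredcp):/path/to/hbase/conf"

jar = "phoenix-<version>-client.jar"  # Phoenix client jar (placeholder name)

# The MapReduce driver class that reads CSV and writes HFiles.
tool = "org.apache.phoenix.mapreduce.CsvBulkLoadTool"

command = (
    f"HADOOP_CLASSPATH={classpath} "  # env var consumed by the hadoop launcher
    f"hadoop jar {jar} {tool} "
    "--table MY_TABLE "               # target Phoenix table (hypothetical)
    "--input /hdfs/path/data.csv"     # CSV input path on HDFS (hypothetical)
)
print(command)
```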
0
votes
0 answers

HBase table index added using Phoenix, but queries take the same time as before adding the index. Do I need to do something else?

I created one HBase table using Phoenix, added data using the Phoenix bulk upload, created an index table, and added references into the index table as well, but when I execute an SQL query it takes the same time as before adding the index table. I…
Jain Hemant
  • 150
  • 2
  • 19
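One likely explanation: Phoenix only uses a global secondary index automatically when the query is covered (every referenced column is in the index); otherwise it falls back to a full table scan, which would produce identical timings. You can check the plan with EXPLAIN and, if needed, force the index with a hint. Table and index names below are hypothetical:

```python
# Hedged sketch: verify and force index usage in Phoenix.
# MY_TABLE / MY_IDX / COL1 / COL2 are hypothetical names.

# Step 1: EXPLAIN shows whether the plan scans MY_IDX or the base table.
explain = "EXPLAIN SELECT COL2 FROM MY_TABLE WHERE COL1 = 'x'"

# Step 2: if the index is skipped, a hint can force it
# (Phoenix then joins back to the base table for uncovered columns).
hinted = "SELECT /*+ INDEX(MY_TABLE MY_IDX) */ COL2 FROM MY_TABLE WHERE COL1 = 'x'"
print(hinted)
```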
0
votes
0 answers

Error while launching Apache Phoenix service

I have installed apache-phoenix-4.10.0-HBase-1.2-bin.tar on my Cloudera QuickStart VM (cloudera-quickstart-vm-5.10.0-0-virtualbox), and I think I am able to launch the Phoenix command-line prompt by executing the command below from the Phoenix bin dir: sudo…
user3521180
  • 1,044
  • 2
  • 20
  • 45
0
votes
0 answers

Phoenix SQL timeout executing count query over 30 million records on HBase table

I have one HBase table that has 30 million records. I want to paginate the records in a web application, but when I execute the Phoenix SQL query it takes 55 seconds or sometimes times out. How do I fix this?
Jain Hemant
  • 150
  • 2
  • 19
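A COUNT over 30 million rows is a full scan and will stay slow, but pagination itself need not depend on it: a common Phoenix pattern is keyset pagination, where the last row key of the previous page becomes a filter for the next page instead of an OFFSET. A sketch with hypothetical table and key-column names:

```python
# Hedged sketch: keyset pagination in Phoenix.
# BIG_TABLE / ROW_KEY are hypothetical names; in practice ROW_KEY is
# the leading primary-key column so the filter is a range scan.

def next_page(table, last_key, page_size):
    """Build the query for the page after the row keyed by last_key."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE ROW_KEY > '{last_key}' "
        f"ORDER BY ROW_KEY "
        f"LIMIT {page_size}"
    )

sql = next_page("BIG_TABLE", "row-0001000", 50)
print(sql)
```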
0
votes
1 answer

Exception when using the saveToPhoenix method to load/save an RDD on HBase

I would like to use the apache-phoenix framework. The problem is that I keep getting an exception telling me that the class HBaseConfiguration can't be found. Here is the code I want to use: import org.apache.spark.SparkContext import…
Omegaspard
  • 1,828
  • 2
  • 24
  • 52
0
votes
2 answers

Using Phoenix to save a DataFrame on HBase

As the title says, I want to save my DataFrame with Phoenix. I have Spark code in Scala that I run in IntelliJ IDEA. It is quite simple: import org.apache.spark.sql.SparkSession import org.apache.phoenix.spark._ object MainTest extends App { …
Omegaspard
  • 1,828
  • 2
  • 24
  • 52
0
votes
0 answers

Spark submit with Phoenix

I'm trying to connect to Phoenix using spark-submit on a secured cluster, and I get this exception: ConnectionQueryServicesImpl: Trying to connect to a secure cluster with keytab:/hbase-secure 17/06/29 11:54:44 ERROR Executor: Exception in task 0.0…