Questions tagged [apache-phoenix]

For questions about Apache Phoenix. For the Elixir web framework, use phoenix-framework.

702 questions
0 votes, 2 answers

Query to get columns from system.catalog table and do a select query

I am trying to build a query to fetch columns from the system.CATALOG table and to continue querying based on the resultset. I looked at a few queries but seem to be unable to find anything that satisfies my requirements. I don't have much to show,…
Satya (153)
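A minimal sketch of the two-step approach this question describes: read a table's column names from Phoenix's SYSTEM.CATALOG metadata table, then build a SELECT from the result. The JDBC execution step is only shown in a comment (it needs a live cluster); `MYTABLE` and the column names are placeholders, not from the question.

```java
// Sketch: discover a table's columns from SYSTEM.CATALOG, then build a
// SELECT over those columns. "MYTABLE" is a placeholder table name.
import java.util.Arrays;
import java.util.List;

public class CatalogQuery {

    // Lists the column names of one table. COLUMN_NAME is NULL for the
    // table-header row in SYSTEM.CATALOG, so that row is filtered out.
    // "CATALOG" is quoted to sidestep the SQL keyword.
    static String columnsSql(String tableName) {
        return "SELECT COLUMN_NAME FROM SYSTEM.\"CATALOG\" "
             + "WHERE TABLE_NAME = '" + tableName + "' "
             + "AND COLUMN_NAME IS NOT NULL "
             + "ORDER BY ORDINAL_POSITION";
    }

    // Second step: a SELECT built from the discovered column list.
    static String selectSql(String tableName, List<String> columns) {
        return "SELECT " + String.join(", ", columns) + " FROM " + tableName;
    }

    public static void main(String[] args) {
        System.out.println(columnsSql("MYTABLE"));
        // In real code the column list would come from executing columnsSql
        // over a Phoenix JDBC connection (jdbc:phoenix:<zk-quorum>).
        System.out.println(selectSql("MYTABLE", Arrays.asList("ID", "NAME")));
    }
}
```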
0 votes, 1 answer

Browsing Hbase data in Hue through Phoenix

I am using CDH 5.4.4 and installed the Phoenix parcel to be able to run SQL on HBase tables. Has anyone tried to browse that data using Hue? I know that since we can connect to Phoenix over JDBC, there must be a way for Hue to connect to it too.
morfious902002 (916)
0 votes, 3 answers

Phoenix error: HBase table undefined, even though it is present

I am trying to access my HBase instance running on my local machine, with ZooKeeper at localhost:2181. I installed phoenix-3.3.1-bin and tried to access an already existing HBase table, but could not. So, simply to test, I created a table using Phoenix…
user553182 (39)
0 votes, 1 answer

Using phoenix-spark plugin to insert an ARRAY Type

I have a problem. I have a Spark RDD that I have to store inside an HBase table. We use the Apache Phoenix layer to talk to the database. There is a column of the table that is defined as an UNSIGNED_SMALLINT ARRAY: CREATE TABLE EXAMPLE (..., Col10…
riccardo.cardin (7,971)
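Setting the phoenix-spark plugin aside, a minimal sketch of how a Phoenix ARRAY column can be written directly: either with Phoenix's ARRAY[...] literal inside an UPSERT, or via JDBC's createArrayOf (shown only in a comment, since it needs a live connection). The key name 'k1' and the (PK, COL10) column list are placeholders; only EXAMPLE, Col10, and UNSIGNED_SMALLINT come from the question.

```java
// Sketch: two ways to write a Phoenix UNSIGNED_SMALLINT ARRAY column
// outside the phoenix-spark plugin. 'k1' and the column list are
// placeholders around the question's EXAMPLE table and Col10 column.
public class ArrayUpsert {

    // Literal form: Phoenix's ARRAY[...] constructor inside an UPSERT.
    static String literalUpsert(short... values) {
        StringBuilder sb = new StringBuilder(
            "UPSERT INTO EXAMPLE (PK, COL10) VALUES ('k1', ARRAY[");
        for (int i = 0; i < values.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append(values[i]);
        }
        return sb.append("])").toString();
    }

    public static void main(String[] args) {
        System.out.println(literalUpsert((short) 1, (short) 2, (short) 3));
        // JDBC form (needs a live Phoenix connection):
        //   java.sql.Array arr = conn.createArrayOf("UNSIGNED_SMALLINT", boxed);
        //   stmt.setArray(2, arr);  // on "UPSERT INTO EXAMPLE VALUES (?, ?)"
    }
}
```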
0 votes, 2 answers

add HBase Timestamp using Phoenix-Spark API

How do I add an HBase timestamp using Phoenix-Spark, similar to the HBase API's Put(rowkey, timestamp.getMillis)? This is my code: val rdd = processedRdd.map(r => Row.fromSeq(r)) val dataframe = sqlContext.createDataFrame(rdd,…
sophie (991)
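On the Phoenix JDBC side (not necessarily the phoenix-spark save path, which this sketch makes no claim about), the cell timestamp of upserts can be pinned with the CurrentSCN connection property, playing the role of Put(rowkey, timestamp) in the raw HBase API. A minimal sketch; the ZooKeeper host in the comment is a placeholder.

```java
// Sketch: pin the HBase cell timestamp of Phoenix upserts by setting the
// CurrentSCN connection property (milliseconds, as in HBase timestamps).
import java.util.Properties;

public class ScnConnection {

    // Connection properties that fix all upserts on this connection
    // to the given timestamp.
    static Properties withTimestamp(long millis) {
        Properties props = new Properties();
        props.setProperty("CurrentSCN", Long.toString(millis));
        return props;
    }

    public static void main(String[] args) {
        Properties props = withTimestamp(1435708800000L);
        System.out.println(props.getProperty("CurrentSCN"));
        // Real use (needs a cluster):
        //   Connection conn =
        //       DriverManager.getConnection("jdbc:phoenix:zk-host", props);
    }
}
```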
0 votes, 1 answer

Does Phoenix-Spark API have a checkAndPut method like HBase API?

I am using Spark 1.3, HBase 1.1 and Phoenix 4.4. I have this in my code: val dataframe = sqlContext.createDataFrame(rdd, schema) dataframe.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> "TEST_SCHEMA.TEST_HTABLE", "zkUrl" ->…
sophie (991)
0 votes, 1 answer

Fetch data from table 1 based on text in one column and different text in a column of another table

I have two tables. Table 1: T1id1 (pk), col1, col2, col3. Table 2: id (pk), T1id1 (FK), col1, col2, col3. I get two texts from the user, where table1.col1 has some text (like '%text1%'), and similarly for table2.col1 I get another text which is not equal to…
Satya (153)
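A sketch of the query shape this question describes: TABLE1 rows whose col1 matches one user-supplied text, joined to TABLE2 rows whose col1 does not match a second text. The table/column names mirror the question; the NOT LIKE reading of "not equal to" is an assumption, since the excerpt is truncated.

```java
// Sketch: two-table join where TABLE1.col1 must match one pattern and the
// joined TABLE2.col1 must NOT match another. Phoenix supports joins and
// LIKE/NOT LIKE; whether NOT LIKE is what the asker meant is an assumption.
public class TwoTableQuery {

    static String sql(String text1, String text2) {
        return "SELECT t1.* FROM TABLE1 t1 "
             + "JOIN TABLE2 t2 ON t2.T1id1 = t1.T1id1 "
             + "WHERE t1.col1 LIKE '%" + text1 + "%' "
             + "AND t2.col1 NOT LIKE '%" + text2 + "%'";
    }

    public static void main(String[] args) {
        System.out.println(sql("text1", "text2"));
    }
}
```

In production code the two texts would be bound as PreparedStatement parameters rather than concatenated, to avoid injection.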
0 votes, 1 answer

Phoenix Hbase: ResultSet.next() runs into StackOverflowError

I am seeing a weird error while iterating over the ResultSet of a query. My code is in Java and I do something like the following: String sql = " SELECT * FROM TABLE_NAME WHERE DATE_TIME > 'something' ORDER BY DATE_TIME DESC LIMIT 1" ResultSet…
Roger (2,823)
0 votes, 1 answer

Apache Phoenix Double datatype issue when writing MapReduce

I'm using Apache Phoenix to create a table in HBase because it provides secondary index features and also SQL-like datatypes. I created a table using Phoenix with columns of both Double and Varchar types. CREATE TABLE INVOICE (ROWKEY VARCHAR NOT NULL…
0 votes, 1 answer

I am using HBase 1.0.0 and Apache Phoenix 4.3.0 on CDH 5.4. When I restart HBase, the region server goes down

I even tried the CDH parcel for Apache Phoenix, but the same problem exists. I added phoenix-4.3.0-server.jar to /usr/lib/hbase/lib/ and tried to restart the HBase cluster, but my region server is not coming up.
voldy (359)
0 votes, 2 answers

Order by and join in SQL, Spark, or MapReduce

I have two tables whose content is as below. Table 1: ID1 ID2 ID3 ID4 NAME DESCR STATUS date 1 -12134 17773 8001300701101 name1 descr1 INACTIVE 20121203 2 -12136 17773 …
Satya (153)
0 votes, 1 answer

Connection error while upgrading from Phoenix 4.0.0 to 4.3.1

Currently using Phoenix 4.0.0-incubating for both client and server. Upgraded to 4.3.1 (most recent). While trying to connect using the command-line client (./sqlline.py), the connection could not be established, throwing the following…
Arun (1,176)
0 votes, 1 answer

Apache Phoenix's behavior with limit

I would appreciate it if someone could help with my question regarding Phoenix's functionality. I created a Phoenix table and inserted 100,000 records (assume these are spread across different region servers). Now, when I issue a select query with a limit…
Sean (11)
0 votes, 1 answer

How to load data into Phoenix from Hive?

Hive has an HBase integration. Given that, would it be possible to insert into a Phoenix table from Hive (as a bulk load/insert overwrite)? What about compound primary keys? Can we generate them in Hive?
Luís Bianchin (2,327)
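One common Hive-to-Phoenix path is to export the Hive table as CSV (INSERT OVERWRITE DIRECTORY ...) and then run Phoenix's MapReduce CsvBulkLoadTool over the export. A minimal sketch that only assembles the command line; the jar name, table name, and paths are placeholders.

```java
// Sketch: assemble the CsvBulkLoadTool invocation used to bulk-load a
// CSV export (e.g. produced by Hive) into a Phoenix table. All names
// and paths below are placeholders.
public class BulkLoadCommand {

    static String command(String jar, String table, String input) {
        return "hadoop jar " + jar
             + " org.apache.phoenix.mapreduce.CsvBulkLoadTool"
             + " --table " + table
             + " --input " + input;
    }

    public static void main(String[] args) {
        System.out.println(
            command("phoenix-client.jar", "MY_TABLE", "/tmp/hive_export"));
        // A compound primary key lives in the Phoenix DDL, e.g.
        //   CREATE TABLE MY_TABLE (A VARCHAR NOT NULL, B BIGINT NOT NULL,
        //     C VARCHAR CONSTRAINT PK PRIMARY KEY (A, B));
        // The CSV rows then simply carry values for both A and B.
    }
}
```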
0 votes, 1 answer

Phoenix - No current connection - HRegion.mutateRowsWithLocks : java.lang.NoSuchMethodError

I am trying to run Phoenix on localhost and can't resolve this error (I can't find where mutateRowsWithLocks is). I would really like to run SQL queries on HBase, so I hope someone can help: org.apache.hadoop.hbase.DoNotRetryIOException: …
tdebroc (1,436)