
Thanks in advance for your help.

I deployed spring-cloud-dataflow-server-yarn with Ambari, but when I start a stream built from jdbc-source-kafka and hdfs-sink-kafka, the deployment never starts running. The stream configuration is:

jdbc-source-kafka --max-rows-per-poll=10 
--query='select t.* from clear_order_report t' 
--password=****** 
--driver-class-name=oracle.jdbc.OracleDriver 
--username=****** --url=jdbc:oracle:thin:@10.48.171.21:1521:******
| hdfs-sink-kafka 
--fs-uri=hdfs://master.99wuxian.com:8020 
--file-name=clear_order_report
--directory=/dataflows/apps/top
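
For reference, this is roughly how the stream is created and deployed from the SCDF shell, assuming the Kafka-0.10 app starters are already registered under the names jdbc-source-kafka and hdfs-sink-kafka; the stream name oracle2hdfs is taken from the YARN application name in the log below, and the masked values are the same placeholders as above:

dataflow:>stream create --name oracle2hdfs --definition "jdbc-source-kafka --max-rows-per-poll=10 --query='select t.* from clear_order_report t' --driver-class-name=oracle.jdbc.OracleDriver --username=****** --password=****** --url=jdbc:oracle:thin:@10.48.171.21:1521:****** | hdfs-sink-kafka --fs-uri=hdfs://master.99wuxian.com:8020 --file-name=clear_order_report --directory=/dataflows/apps/top"
dataflow:>stream deploy --name oracle2hdfs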

I also repackaged jdbc-source-kafka-10-1.1.1.RELEASE.jar to add the Oracle JDBC driver.
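
(Roughly, the repackaging amounts to adding the driver as a dependency in the starter's pom.xml and rebuilding with mvn clean package, so the spring-boot-maven-plugin bundles it into the fat jar. The coordinates below are placeholders for a locally installed ojdbc jar, since the Oracle driver is not published to Maven Central.)

<!-- pom.xml of the rebuilt jdbc-source-kafka app:
     placeholder coordinates for a locally installed Oracle JDBC driver -->
<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc6</artifactId>
    <version>11.2.0.4</version>
</dependency>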

The YARN log is below:
Application Overview
  User: scdf
  Name: scdstream:app:oracle2hdfs
  Application Type: DATAFLOW
  Application Tags:
  Application Priority: 0 (Higher Integer value indicates higher priority)
  YarnApplicationState: ACCEPTED: waiting for AM container to be allocated, launched and register with RM.
  Queue: default
  FinalStatus Reported by AM: Application has not completed yet.
  Started: Thu Feb 09 17:38:33 +0800 2017
  Elapsed: 21hrs, 18mins, 27sec
  Tracking URL: ApplicationMaster
  Log Aggregation Status: NOT_START
  Diagnostics: [Thu Feb 09 17:38:34 +0800 2017] Application is added to the scheduler and is not yet activated. Queue's AM resource limit exceeded. Details: AM Partition = <DEFAULT_PARTITION>; AM Resource Request = <memory:1024, vCores:1>; Queue Resource Limit for AM = <memory:1280, vCores:1>; User AM Resource Limit of the queue = <memory:1280, vCores:1>; Queue AM Resource Usage = <memory:1024, vCores:1>
  Unmanaged Application: false
  Application Node Label expression: <Not set>
  AM container Node Label expression: <DEFAULT_PARTITION>
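
The Diagnostics line explains why the application sits in ACCEPTED: the default queue's AM resource limit is 1280 MB, 1024 MB of it is already used by a running AM, and this application master requests another 1024 MB, so it cannot be activated. One way to give application masters more headroom is to raise the capacity scheduler's AM resource percentage (in Ambari this is set under the YARN Capacity Scheduler config); the value below is only an example to tune for the cluster:

<!-- capacity-scheduler.xml: let application masters use up to 50% of queue
     resources (the default is 0.1); example value, adjust for your cluster -->
<property>
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <value>0.5</value>
</property>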

user3172755
  • What errors do you specifically see in the SCDF-server logs? What you posted above doesn't seem to include any errors pertaining to SCDF itself. Did the SCDF-server start successfully? – Sabby Anandan Feb 15 '17 at 03:37
  • Perhaps you could also make sure the applications can run standalone and connect to the Oracle DB before you register and use them in SCDF? A simple `java -jar ... ` approach could help identify whether the applications function normally. – Sabby Anandan Feb 15 '17 at 03:39
  • Now it's okay; it was caused by a Kafka version issue. The Ambari default integration ships Kafka 0.10.0.0, but the Data Flow provided cloud stream app starters 1.1.1.RELEASE are all built against Kafka 0.10.1.0. – user3172755 Feb 21 '17 at 08:23

0 Answers