My stream definitions are as follows:

IngestToBroker = custom-producer --spring.cloud.stream.bindings.output.content-type=application/json --spring.cloud.stream.bindings.output.producer.headerMode=raw > :Kafkatopic1

DataProcessor = :Kafkatopic1 > custom-processor --spring.cloud.stream.bindings.output.content-type=application/json --spring.cloud.stream.bindings.input.consumer.headerMode=raw --spring.cloud.stream.bindings.output.producer.headerMode=raw > :kafkatopic2

myCountsOne = :DataProcessor.custom-processor > field-value-counter --field-name=messageStatusOne

myCountsTwo = :DataProcessor.custom-processor > field-value-counter --field-name=messageStatusOne

MyFileSink = :kafkatopic2 > file --directory=C:\usr\sdflogs --name=My-File-Sink2

Current Scenario

1) IngestToBroker and DataProcessor work well. I get the required JSON output below, which is sent to Kafkatopic1, then through the processor to kafkatopic2, and finally to the file.

{"messageStatusOne":"RECEIVED","DateTimeOne":"2017.09.26.22.03.34","messageStatusTwo":"PROCESSED","DateTimeTwo":"2017.09.26.22.03.34"}
{"messageStatusOne":"ERROR","DateTimeOne":"2017.09.26.21.06.45","messageStatusTwo":"NOT AVAILABLE","DateTimeTwo":"2017.09.26.21.06.45"}
{"messageStatusOne":"RECEIVED","DateTimeOne":"2017.09.26.21.06.52","messageStatusTwo":"PROCESSED","DateTimeTwo":"2017.09.26.21.06.52"}
{"messageStatusOne":"REVIEW","DateTimeOne":"2017.09.26.21.06.59","messageStatusTwo":"HOLD","DateTimeTwo":"2017.09.26.21.06.59"}
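For reference, this is the tally a field-value-counter over `messageStatusOne` would keep for those four records. A plain-Python sketch (not part of the stream, just to show the expected counts):

```python
import json
from collections import Counter

# The four records emitted above (one JSON document per message).
records = [
    '{"messageStatusOne":"RECEIVED","DateTimeOne":"2017.09.26.22.03.34","messageStatusTwo":"PROCESSED","DateTimeTwo":"2017.09.26.22.03.34"}',
    '{"messageStatusOne":"ERROR","DateTimeOne":"2017.09.26.21.06.45","messageStatusTwo":"NOT AVAILABLE","DateTimeTwo":"2017.09.26.21.06.45"}',
    '{"messageStatusOne":"RECEIVED","DateTimeOne":"2017.09.26.21.06.52","messageStatusTwo":"PROCESSED","DateTimeTwo":"2017.09.26.21.06.52"}',
    '{"messageStatusOne":"REVIEW","DateTimeOne":"2017.09.26.21.06.59","messageStatusTwo":"HOLD","DateTimeTwo":"2017.09.26.21.06.59"}',
]

# Tally the values of the field the sink points at.
counts = Counter(json.loads(r)["messageStatusOne"] for r in records)
print(counts)  # Counter({'RECEIVED': 2, 'ERROR': 1, 'REVIEW': 1})
```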

Custom-Producer and Custom-Processor are standalone apps that I have deployed; the processor transforms the data, and the producer pushes a new JSON record every few seconds.

2) The same is true of all the stream definitions. They all work well and I get the required JSON, EXCEPT for the field-value-counter definitions myCountsOne and myCountsTwo.

Issue 1) Dashboard screenshot of Analytics on my local SCDF server

It seems NOT to be active. There are no drop-downs for Metrics and Visualizations. I really need this to work and I am pretty new to SCDF. What am I missing? How do I get this to work?

2) Also, as a side note, I am trying to configure MarkLogic with SCDF, but will SCDF work with ODBC drivers? Please forgive me if I sound stupid asking all this. Thank you.

Update: I just noticed this error in my Spring Cloud Data Flow startup console:

"Cannot get Jedis connection; nested exception is redis.clients.jedis.exceptions.JedisConnectionException"

Do we need to add any Redis dependencies to my Custom-Processor, which is deployed as a standalone application? My Spring Boot version is 1.5.6.RELEASE. Thank you.

Kenny Weeler

2 Answers

While using Analytics in SCDF, both the SCDF-server and the Analytics-apps must share a common redis-cluster configuration.

I just noticed this error in my Spring Data Flow Startup console "Cannot get Jedis connection; nested exception is redis.clients.jedis.exceptions.JedisConnectionException"

This requirement is enabled by default; if the redis-cluster is not reachable by the SCDF-server, you will notice this error. You may choose to disable it as needed.
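If Redis is not running on its defaults (`localhost:6379`), a sketch of how you could point the server at it with standard Spring Boot Redis properties (hostname below is a placeholder):

```shell
# Example only – "my-redis-host" is a placeholder; adjust to your environment.
java -jar spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar \
  --spring.redis.host=my-redis-host \
  --spring.redis.port=6379
```

The same `spring.redis.*` properties can be passed to the analytics sinks at deploy time.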

Do we need to add any Redis dependencies to my Custom-Processor, which is deployed as a standalone application?

If the processor is just emitting aggregated data to downstream apps, you don't need to. It is actually required only for the analytics-sink applications. In your case, please make sure the SCDF-server and the field-value-counter share the same redis-cluster configuration.

Lastly, the 1.3 release line is actively under development with milestone releases underway. Specifically, the dashboard is entirely re-written - there might be bugs. While it is OK to use this version (and we welcome feedback and bug reports), I'd recommend that you switch to the latest GA release. At the time of writing, 1.2.3.RELEASE is the latest production release.

Sabby Anandan
  • "In your case, please make sure SCDF-server and the field-value-counter shares the same redis-cluster configuration." – How do I check this? And I am using version 1.2.3. – Kenny Weeler Oct 02 '17 at 14:44
  • It depends on the platform. If you're on CF, you bind "redis-service" to server and the sink. If you're on local, you'd run redis on `localhost` and by default, the server and sink will connect to it automatically. If you have to connect to a remote redis, you can override them as simple Spring Boot properties. For instance, here are the [properties](https://github.com/spring-cloud-stream-app-starters/field-value-counter/blob/master/spring-cloud-starter-stream-sink-field-value-counter/README.adoc#options) you can override for field-value-counter. – Sabby Anandan Oct 02 '17 at 15:29
  • I am running this on my local server, based on the stream definitions mentioned above. Could you please tell me what else I need to do to get Redis working? Because I have seen other videos and they didn't have anything specific to be able to generate Analytics data when running locally. Thank you again for your time. – Kenny Weeler Oct 02 '17 at 15:39
  • Start Redis locally by: `redis-server` command in terminal (assuming redis is installed and it is in the path). You'll launch the server normally (it will connect to `localhost` automatically), and when you deploy the stream, the `field-value-counter` will also automatically connect to it. That's all. – Sabby Anandan Oct 02 '17 at 16:17
  • 1) java -jar spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar --spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.brokers=localhost:9092 --spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.zkNodes=localhost:2181 – This is what I use to start my local server. Do you want me to include Redis parameters in this command? – Kenny Weeler Oct 02 '17 at 16:18
  • :( - no. All you need to do is start Redis on `localhost` and on the default `port`; everything else will automatically work. You'd start Redis with the `redis-server` command. That's all! – Sabby Anandan Oct 02 '17 at 21:53
  • When you say default port, you mean the same one the SCDF server is on? – Kenny Weeler Oct 03 '17 at 18:48
  • Just start Redis with its default host/port combination - no need to override is what I was trying to infer in the previous comment. SCDF and the Apps will automatically connect to it. – Sabby Anandan Oct 03 '17 at 18:58
  • I am trying to run Redis on Windows here. I did the following: 1) Download Redis-x64-2.8.2104.zip 2) Extract the zip to a prepared directory 3) Run redis-server.exe – Kenny Weeler Oct 03 '17 at 19:01
  • This is turning into a redis specific discussion at this point. Perhaps you can search online journals on how to run redis in windows. – Sabby Anandan Oct 03 '17 at 19:03
  • It now runs on the default port 6379, and then I started my SCDF server, which by default runs on 9393. Then I create the above stream definitions and deploy. Everything runs smoothly. I get the data I need on the Kafka consumer consoles for both topics and also the file, but when I click the "Analytics" part, nothing happens. The field counters 'myCountsOne' and 'myCountsTwo' don't show up on my Analytics dashboard, and there's no mention of any errors in my log files either. Can you please help? Thank you – Kenny Weeler Oct 03 '17 at 19:05
  • I am pretty sure it isn't about Redis - I have posted a pic of my Redis running on my local. – Kenny Weeler Oct 03 '17 at 19:40
  • Also I have added how my current dashboard looks – Kenny Weeler Oct 03 '17 at 20:09
  • It appears you're mis-using `--field-name` property. Please refer to the [docs](https://github.com/spring-cloud-stream-app-starters/field-value-counter/blob/master/spring-cloud-starter-stream-sink-field-value-counter/README.adoc#options) re: `--field-name` and `--name` and what they are used for. Going by that, you are missing `--name` property in your stream definition. – Sabby Anandan Oct 03 '17 at 20:31
  • Stream create messageCountOne --definition ":DataProcessor.custom-processor > field-value-counter --field-name=messageStatusOne --name=someMsgCount" --deploy – This is what I gave. The stream deployed successfully, but when I try the "field-value-counter list" command, it doesn't display the app on the console. – Kenny Weeler Oct 03 '17 at 21:14
  • No luck. This has been happening way too long! I'm sorry about that. – Kenny Weeler Oct 03 '17 at 21:42

To further simplify the problem at hand, let's try the following stream in your environment. Once you have success with it, you can then review the other streams to see what could be going wrong.

Stream:

dataflow:>stream create foo --definition "http --port=9000 | field-value-counter --fieldName=messageStatusOne --name=bar" --deploy

Data:

dataflow:>http post --target http://localhost:9000 --data {"messageStatusOne":"RECEIVED","DateTimeOne":"2017.09.26.22.03.34","messageStatusTwo":"PROCESSED","DateTimeTwo":"2017.09.26.22.03.34"}
POST (text/plain) http://localhost:9000 {"messageStatusOne":"RECEIVED","DateTimeOne":"2017.09.26.22.03.34","messageStatusTwo":"PROCESSED","DateTimeTwo":"2017.09.26.22.03.34"}
202 ACCEPTED

List:

dataflow:>field-value-counter display --name bar
Displaying values for field value counter 'bar'
╔════════╤═════╗
║ Value  │Count║
╠════════╪═════╣
║RECEIVED│    1║
╚════════╧═════╝
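Conceptually, the counter that `display` renders is just a per-value tally keyed by the counter name. A plain-Python stand-in (not the sink's actual Redis-backed implementation; `handle` is a hypothetical helper for illustration):

```python
import json
from collections import defaultdict

# In-memory stand-in for the Redis-backed counters the sink maintains.
counters = defaultdict(lambda: defaultdict(int))

def handle(counter_name, field_name, payload):
    """Increment the count for the payload's value of field_name."""
    value = json.loads(payload)[field_name]
    counters[counter_name][value] += 1

# The single message POSTed above:
handle("bar", "messageStatusOne",
       '{"messageStatusOne":"RECEIVED","DateTimeOne":"2017.09.26.22.03.34",'
       '"messageStatusTwo":"PROCESSED","DateTimeTwo":"2017.09.26.22.03.34"}')

print(dict(counters["bar"]))  # {'RECEIVED': 1}
```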

Sabby Anandan
  • I am able to get success with this, and am also able to view data on the Analytics dashboard. I guess there's something wrong with my data, or my app, in terms of the 'type' of data being returned? – Kenny Weeler Oct 03 '17 at 22:48
  • Also, I am able to get the data into my Analytics dashboard if I route it directly out through kafkatopic2 into the field counter, i.e. :kafkatopic2 > field-value-counter --fieldName=messageStatusOne --name=MessageOneCounts – Kenny Weeler Oct 03 '17 at 23:41
  • But even then... I am not getting real-time values for the bubble or pie chart. – Kenny Weeler Oct 04 '17 at 00:10
  • My hope was to show a simple example, so you can figure out what's missing on your end. It is hard to drill into all the specifics that you're sharing. Besides that, the whole back-and-forth in comments is not very helpful. It is either the data or the way you're converting the payload. You can troubleshoot by enabling DEBUG logs in each application, and share your observations with the community once you solve it. – Sabby Anandan Oct 04 '17 at 01:13
  • Sure, thank you for your time, Sabby. I will close this thread. – Kenny Weeler Oct 04 '17 at 13:58
  • On a closer look, I'm not sure the role of `--spring.cloud.stream.bindings.output.producer.headerMode=raw` in your stream definition. If it is unintentional, you can try without it. Regardless, the DEBUG logs via `--logging.level.org.springframework.integration=DEBUG` at each app in the DSL should show "pre" and "post" header/payload combinations in the logs. That might reveal the problem. – Sabby Anandan Oct 04 '17 at 17:32