Questions tagged [data-stream]

In Connection-oriented communication, a data stream is a sequence of digitally encoded coherent signals (packets of data or data packets) used to transmit or receive information that is in the process of being transmitted.

In electronics and computer architecture, a data flow determines which data item is scheduled to enter or leave which port of a systolic array, a Reconfigurable Data Path Array or similar pipe network, or other processing unit or block, and at which time.

Often the data stream is seen as the counterpart of an instruction stream, since the von Neumann machine is instruction-stream-driven, whereas its counterpart, the Anti machine, is data stream driven.

The term "data stream" has many more meanings, such as by the definition from the context of systolic arrays.

Wikipedia: http://en.wikipedia.org/wiki/Data_stream

276 questions
2
votes
1 answer

how to send http response using stream

I would like to have a simple API in my http server, so every time I write to HttpResponse I use a stream. So I convert every object into a stream, i.e. object->json->stream Stream> toStream(Object value) { var json = JSON.encode(value); var…
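The snippet above is Dart; as a hedged sketch of the same object -> JSON -> byte-stream idea, here is a minimal Python equivalent (the names `to_stream` and `chunk_size` are illustrative, not from the question):

```python
import json
from typing import Any, Iterator

def to_stream(value: Any, chunk_size: int = 1024) -> Iterator[bytes]:
    """Serialize an object to JSON and yield it as a byte stream in chunks."""
    data = json.dumps(value).encode("utf-8")
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

# Reassembling the chunks recovers the original JSON payload.
payload = {"status": "ok", "items": [1, 2, 3]}
body = b"".join(to_stream(payload, chunk_size=8))
assert json.loads(body) == payload
```

An HTTP framework's response object would consume the generator chunk by chunk instead of joining it.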
kamiseq
  • 593
  • 4
  • 17
2
votes
2 answers

Creating an efficient way of sending integers over a network. TCP

How can I convert integer values to byte arrays and then send them over a byte stream to the client program which converts the byte array back to an integer? My program is a pingpong game. Once run it creates a server which a client connects to over…
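The question doesn't name a language, but as a language-agnostic sketch, the usual approach in Python is the `struct` module with network (big-endian) byte order; the function names here are illustrative:

```python
import struct

def int_to_bytes(value: int) -> bytes:
    # "!i" = network (big-endian) byte order, 32-bit signed int
    return struct.pack("!i", value)

def bytes_to_int(data: bytes) -> int:
    return struct.unpack("!i", data)[0]

# Round trip: the client decodes exactly what the server encoded.
encoded = int_to_bytes(12345)
assert len(encoded) == 4
assert bytes_to_int(encoded) == 12345
```

Over a TCP socket, the 4-byte result would be written with `sock.sendall(encoded)` and read back as exactly 4 bytes on the other side; fixing the byte order means both ends agree regardless of platform.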
xtré
  • 132
  • 1
  • 10
1
vote
1 answer

Is MySQL suitable for storing a data stream arriving at a 250 ms update interval, with 30 bytes of information per update, and serving it to the web?

The title says most of it, I'm wondering if MySQL is suitable, (and if not, what else would do better) for storing this data? It would be most likely 3 to 6 floating point numbers delivered every quarter second. (Somewhere between 1-3 GB per…
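A quick back-of-envelope check of the volume (assuming 30 bytes every 250 ms, as stated in the title) shows the rate is modest, in line with the excerpt's truncated "1-3 GB per…" figure:

```python
# Back-of-envelope sizing for the stream described above (assumed values).
bytes_per_update = 30
updates_per_second = 1 / 0.25          # one update every 250 ms
seconds_per_year = 365 * 24 * 3600

bytes_per_year = bytes_per_update * updates_per_second * seconds_per_year
gb_per_year = bytes_per_year / 1e9     # ~3.78 GB/year at 120 bytes/second
assert round(gb_per_year, 2) == 3.78
```

At 4 inserts per second, write throughput is far below what MySQL handles; the sizing question is mostly about retention and indexing.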
Alex Gosselin
  • 2,942
  • 21
  • 37
1
vote
1 answer

Kafka Consumer with Flask Application

I have a Flask application running to serve the API endpoint; in the same application I also have to implement a Kafka consumer in order to consume events from a Kafka stream. But because the Kafka consumer should run forever, I am using Python threads…
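One common pattern for the threading part is to run the consumer loop in a daemon thread alongside the web server. This sketch uses a stdlib `queue.Queue` as a stand-in for the Kafka topic so it runs without a broker; in the real app the loop body would instead poll a Kafka consumer:

```python
import queue
import threading

events = queue.Queue()          # stands in for the Kafka topic
received = []

def consume() -> None:
    """Long-running consumer loop; in the real app this would poll Kafka."""
    while True:
        event = events.get()
        if event is None:       # sentinel to stop the loop in this demo
            break
        received.append(event)

# daemon=True lets the web server process exit without joining the thread
worker = threading.Thread(target=consume, daemon=True)
worker.start()

for e in ("a", "b", "c"):
    events.put(e)
events.put(None)
worker.join()
assert received == ["a", "b", "c"]
```

Starting the thread before the Flask app's `run()` call keeps the consumer alive for the lifetime of the process.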
1
vote
1 answer

Cloud data fusion Permission denied due to datastream.connectionProfiles.discover

I am trying to create a cloud data fusion replication job from oracle to bigquery. Receiving the below error. Failed to connect to the database due to below error : io.grpc.StatusRuntimeException: PERMISSION_DENIED:…
1
vote
1 answer

Firebase Data stream not visible. No data received in past 48 hrs. (React Native)

I've integrated React Native Firebase analytics with custom events. I see data being present in GA, and in debug view I'm able to view all the events triggered, but I'm unable to view the data in the data stream. GA Data stream: I get No data received in…
1
vote
1 answer

Valid website URL is required in Google Analytics

I try to put the following URL to connect my web application to Google Analytics and it gives the error shown in the screenshot: I tried changing the domain name and it didn't work either. If anyone has any idea how I can make it work, it would…
Ralk
  • 11
  • 2
1
vote
2 answers

Streaming and caching tabular data with fsspec, parquet and Pyarrow

I’m trying to stream data from parquet files stored in Dropbox (but it could be somewhere else: S3, gdrive, etc…) and read it into Pandas, while caching it. For that I’m trying to use fsspec for Python. Following these instructions, that’s what I’m…
Luiz Tauffer
  • 463
  • 6
  • 17
1
vote
1 answer

How to know that an AWS kinesis event has been successfully sent to a client via a lambda function?

I have an architecture where a lambda function delivers the events in a Kinesis stream to a client. If the event is successfully delivered then the event should be popped off of the queue in the Kinesis stream. If the event was not successfully…
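Worth noting: Kinesis is a replayable stream, so records are never "popped"; instead a consumer records a checkpoint (the sequence number of the last successfully processed record) and resumes from there on retry. A minimal sketch of that pattern, with a hypothetical `deliver` standing in for the call to the client:

```python
# Kinesis records are not removed on read; the consumer instead advances a
# checkpoint past records it has delivered successfully.
records = [{"seq": i, "body": f"event-{i}"} for i in range(5)]
checkpoint = -1  # sequence number of the last successfully delivered record

def deliver(record) -> bool:
    # Stand-in for the lambda's call to the client; pretend seq 3 fails.
    return record["seq"] != 3

for rec in records:
    if deliver(rec):
        checkpoint = rec["seq"]
    else:
        break  # stop so the failed record is re-read from the checkpoint

assert checkpoint == 2  # records 0-2 delivered; reprocessing resumes at seq 3
```

With the Kinesis Client Library this bookkeeping is what the checkpointing API does for you; the lambda equivalent is returning a failure so the batch (or remainder, with bisect/partial-batch settings) is retried.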
1
vote
0 answers

What are the practical differences between Kafka Topics & Channels?

Conceptually, I get the difference. As per Kafka docs: [...] a topic is similar to a folder in a filesystem, and the events are the files in that folder. An example topic name could be "payments". Topics in Kafka are always multi-producer…
Rahul
  • 543
  • 1
  • 7
  • 24
1
vote
1 answer

Continuous data generator from Azure Databricks to Azure Event Hubs using Spark with the Kafka API, but no data is streamed

I'm trying to implement a continuous data generator from Databricks to an Event Hub. My idea was to generate some data in a .csv file and then create a data frame with the data. In a loop I call a function that executes a query to stream that data…
1
vote
0 answers

Stream data from Azure Databricks to Azure Event Hub via Kafka API from CSV file

I am new to Azure Databricks and Event Hubs. I have been struggling for days to stream data from Databricks using Spark and the Kafka API to an event hub. The data I want to stream is in a .csv file. The stream is starting, but the Dashboard with the…
1
vote
0 answers

Best practices for ILM and dynamic indices in the logstash elasticsearch output plugin

I am struggling to find a suitable way to handle ILM rollover on dynamic indices from a logstash pipeline to elasticsearch. The pipeline looks like this: input { pipeline { address => "some_pipeline_name" } } output { elasticsearch { …
sbstnmrwld
  • 36
  • 5
1
vote
1 answer

Python program to use Elasticsearch as sink in Apache Flink

I am trying to read data from a Kafka topic, do some processing, and dump the data into Elasticsearch. But I could not find an example in Python to use Elasticsearch as a sink. Can anyone help me with a snippet for the same? # add kafka connector…
1
vote
3 answers

Flink Python Datastream API Kafka Consumer

I'm new to pyflink. I'm trying to write a Python program to read data from a Kafka topic and print the data to stdout. I followed the link Flink Python Datastream API Kafka Producer Sink Serialization. But I keep seeing NoSuchMethodError due to version…