
I'm trying to write a .NET Core API controller action that would basically act like an HTTP proxy:

  • Data comes in POSTed via Request.Body stream
  • It is then POSTed to another backend service, which processes the input stream live and starts sending the result back while still reading and processing input
  • Resulting POST response is read and streamed back to the caller of this 'proxy'
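
The flow above can be sketched roughly as follows. This is a minimal, untested sketch, not a known-good solution: `_httpClient` and `backendUrl` are assumed to exist, and the key detail is `HttpCompletionOption.ResponseHeadersRead`, which lets us start copying the backend response before its body is complete.

```csharp
// Sketch of a pass-through proxy action (assumptions: an injected _httpClient
// field and a backendUrl string; error handling omitted for brevity).
[HttpPost("proxy")]
public async Task ProxyAsync(CancellationToken ct)
{
    using var request = new HttpRequestMessage(HttpMethod.Post, backendUrl)
    {
        // StreamContent forwards Request.Body as it arrives instead of
        // buffering the whole payload first.
        Content = new StreamContent(Request.Body)
    };

    // ResponseHeadersRead makes SendAsync return as soon as the response
    // headers arrive, while the backend is still producing the body.
    using var response = await _httpClient.SendAsync(
        request, HttpCompletionOption.ResponseHeadersRead, ct);

    Response.StatusCode = (int)response.StatusCode;

    // Stream the backend's response body straight back to the caller.
    await response.Content.CopyToAsync(Response.Body, ct);
}
```

Because `StreamContent` over `Request.Body` has no known length, `HttpClient` should fall back to chunked transfer encoding on HTTP/1.1 rather than buffering the request to compute `Content-Length`.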

I tried a couple of things, playing with both HttpClient and the older HttpWebRequest, but I still can't make this work properly; the process deadlocks somewhere.

I would also like to better control, if possible, the HTTP request output buffers, as I observed that data starts to pile up in memory if the backend service can't accept it fast enough (stream.Write() calls don't block immediately and process memory spikes).
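
One way to keep in-flight data bounded is a custom `HttpContent` that pushes the incoming request in small chunks and flushes each one. This is a sketch under assumptions: the class name and buffer size are illustrative, and whether writes actually apply backpressure depends on the underlying handler and socket buffers.

```csharp
using System.Net;
using System.Net.Http;

// Sketch: push-style content that copies from the incoming request stream
// in small, bounded chunks, flushing after each write so data reaches the
// socket promptly instead of accumulating in process memory.
class PassThroughContent : HttpContent
{
    private readonly Stream _source;
    public PassThroughContent(Stream source) => _source = source;

    protected override async Task SerializeToStreamAsync(
        Stream target, TransportContext? context)
    {
        var buffer = new byte[16 * 1024]; // small buffer => less in-flight data
        int read;
        while ((read = await _source.ReadAsync(buffer)) > 0)
        {
            await target.WriteAsync(buffer.AsMemory(0, read));
            await target.FlushAsync();
        }
    }

    protected override bool TryComputeLength(out long length)
    {
        length = -1;
        return false; // unknown length => chunked encoding, no pre-buffering
    }
}
```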

Any ideas or examples?

jazzman
  • I guess my first question is.... why? If I were consuming the API, this would be a huge pain. Why not send all the data back at once, like a normal API? – Casey Crookston Nov 13 '20 at 15:26
  • I'm working with huge amounts of data; memory would crash. Plus, I want processed data to start transferring back to the client immediately. It is a necessary huge pain. – jazzman Nov 13 '20 at 15:36
  • 1
    When working with API's that might return large amounts of data, Paging is the standard way to handle this. If the data is such that paging is not an option, then maybe an API isn't the best tool. You might want to look into a websocket instead, which by design is meant to be an open stream of data. An API is not meant to be an open stram of data. API's by design are a Post/Respond or Send/Recieve model. If you are tying to stream large data via an API, then the deadlock you are seeing is entily normal. API's are not meant for streaming. – Casey Crookston Nov 13 '20 at 15:41

0 Answers