
I'm trying to push requests and responses into Elasticsearch, but after reading the documentation I'm stuck on getting the body of the response. It states: "While enabling buffering is possible, it's discouraged as it can add significant memory and latency overhead. Using a wrapped, streaming approach is recommended if the body must be examined or modified." That part is understandable, since a buffered response might be saved to a file. It then says: "See the ResponseCompression middleware for an example." (Full article)

I checked what's in there and I'm stuck. Should I create a class that implements IHttpResponseBodyFeature?

I've implemented a simple class that implements that interface:

internal class BodyReader : IHttpResponseBodyFeature, IDisposable
{
    private PipeWriter? _writer;
    private bool _disposedValue;

    public Stream Stream { get; } = new MemoryStream();

    // Lazily wrap the MemoryStream so callers that write through the
    // PipeWriter (instead of the Stream) are buffered as well.
    public PipeWriter Writer =>
        _writer ??= PipeWriter.Create(Stream, new StreamPipeWriterOptions(leaveOpen: true));

    public Task CompleteAsync() =>
        _writer?.CompleteAsync().AsTask() ?? Task.CompletedTask;

    public void DisableBuffering()
    {
        // Intentionally a no-op: buffering is the whole point of this feature.
    }

    public Task SendFileAsync(string path, long offset, long? count, CancellationToken cancellationToken = default) =>
        SendFileFallback.SendFileAsync(Stream, path, offset, count, cancellationToken);

    public Task StartAsync(CancellationToken cancellationToken = default) =>
        Task.CompletedTask;

    protected virtual void Dispose(bool disposing)
    {
        if (!_disposedValue)
        {
            if (disposing)
            {
                Stream.Dispose();
            }

            _disposedValue = true;
        }
    }

    public void Dispose()
    {
        Dispose(disposing: true);
        GC.SuppressFinalize(this);
    }
}

And then in middleware:

        // Capture the original feature so it can be restored afterwards.
        var originalBodyFeature = context.Features.Get<IHttpResponseBodyFeature>()!;

        var bodyReader = new BodyReader();
        context.Features.Set<IHttpResponseBodyFeature>(bodyReader);

        try
        {
            await _next(context);

            bodyReader.Stream.Position = 0;
            using (var sr = new StreamReader(bodyReader.Stream))
            {
                // This should be the text response, but unfortunately the
                // variable holds garbage. An encrypted response, maybe?
                var html = sr.ReadToEnd();
            }

            bodyReader.Dispose();
        }
        finally
        {
            context.Features.Set(originalBodyFeature);
        }

It seems the html variable contains garbage (encrypted, maybe?). I also have no idea how to push the response back into the pipe so it still reaches the client.
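For what it's worth, one way to forward the buffered body to the client might be to copy it into the original body feature's stream after reading it; this is only a sketch, assuming the original feature was captured as originalBodyFeature before swapping in BodyReader (as in the finally block above):

```csharp
// Sketch: replay the buffered response to the client through the
// original response body feature that was saved before the swap.
bodyReader.Stream.Position = 0;
await bodyReader.Stream.CopyToAsync(originalBodyFeature.Stream, context.RequestAborted);
```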

I'm not sure whether this approach is good. Maybe I shouldn't use middleware for logging, or maybe my implementation of IHttpResponseBodyFeature is incorrect?

Either way, I need to push both the request and the response into Elasticsearch :)

Artur Siwiak

1 Answer


I asked about this on YARP's GitHub and was told that the cause is not HTTPS but compression (which I had simply forgotten about): https://github.com/microsoft/reverse-proxy/issues/1921#issuecomment-1301287432

Long story short, it was enough to add:

builder.Services.AddReverseProxy()
    .ConfigureHttpClient((context, handler) =>
    {
        // required so the proxy's HttpClient decompresses responses automatically
        handler.AutomaticDecompression = System.Net.DecompressionMethods.All;
    })
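If disabling upstream compression is not an option, another approach (not part of the fix above, just a sketch) is to decompress the buffered bytes yourself based on the response's Content-Encoding header, e.g. with System.IO.Compression.GZipStream. The ReadBody helper below is hypothetical, not an existing API:

```csharp
using System.IO.Compression;

// Hypothetical helper: reads the buffered response stream as text,
// decompressing first when the response was gzip-encoded.
static string ReadBody(Stream buffered, string? contentEncoding)
{
    buffered.Position = 0;
    Stream body = buffered;
    if (string.Equals(contentEncoding, "gzip", StringComparison.OrdinalIgnoreCase))
    {
        body = new GZipStream(buffered, CompressionMode.Decompress, leaveOpen: true);
    }
    using var reader = new StreamReader(body);
    return reader.ReadToEnd();
}
```

Usage inside the middleware would then look something like `ReadBody(bodyReader.Stream, context.Response.Headers.ContentEncoding)`. Note this covers gzip only; brotli and deflate would need their own branches.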

Happy coding :)

Artur Siwiak