
I have a gRPC service and an Envoy proxy gateway.

(screenshot of the setup)

The main problem is that when I try to open more than 6 TCP streams, all further streams stay pending:

(screenshot of the pending streams)

If I understood correctly, this happens because each browser has a limit on concurrent TCP connections per host.

Consequently, I am looking for a way to run multiple streams (e.g. 5) over a single TCP connection.

envoy.yaml

static_resources:
  listeners:
  - name: grpc
    address:
      socket_address: { address: 0.0.0.0, port_value: 8080 }
    filter_chains:
    - filters:
      - name: envoy.filters.network.http_connection_manager
        typed_config:
          "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
          codec_type: AUTO
          stat_prefix: ingress_http
          stream_idle_timeout: 0s
          route_config:
            name: local_route
            virtual_hosts:
            - name: local_service
              domains: ["*"]
              routes:
              - match: { prefix: "/ligric.protobuf.UserApis" }
                route: { cluster: userapis_service, timeout: 0s }
              cors:
                allow_origin_string_match:
                    - exact: http://127.0.0.1:5000
                allow_methods: GET, PUT, DELETE, POST, OPTIONS
                allow_headers: grpc-accept-encoding,content-type,origin,referer,authorization,keep-alive,user-agent,cache-control,content-type,content-transfer-encoding,x-accept-content-transfer-encoding,x-accept-response-streaming,x-user-agent,x-grpc-web
                expose_headers: grpc-status,grpc-message,x-envoy-upstream-service-time,custom-header-1
                max_age: "1728000"
          http_filters:
          - name: envoy.filters.http.cors
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.cors.v3.Cors
          - name: envoy.filters.http.grpc_web
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.grpc_web.v3.GrpcWeb
          - name: envoy.filters.http.router
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router

  clusters:
  # Apis
  - name: userapis_service
    type: logical_dns
    lb_policy: round_robin
    dns_lookup_family: V4_ONLY
    connect_timeout: 0.25s
    typed_extension_protocol_options:
      envoy.extensions.upstreams.http.v3.HttpProtocolOptions:
        "@type": type.googleapis.com/envoy.extensions.upstreams.http.v3.HttpProtocolOptions
        explicit_http_config:
          http2_protocol_options: {}
    load_assignment:
      cluster_name: grpc
      endpoints:
        - lb_endpoints:
            - endpoint:
                address:
                  socket_address:
                    address: host.docker.internal
                    port_value: 50052
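
As the comments below point out, streams are only multiplexed over a single TCP connection when the browser talks HTTP/2 to Envoy, and browsers only negotiate HTTP/2 over TLS (via ALPN). A minimal sketch of what that could look like on the listener, assuming hypothetical certificate paths under /etc/envoy/certs/ and keeping the http_connection_manager filter exactly as above:

      # Goes under the existing filter_chains entry, as a sibling of "filters:".
      transport_socket:
        name: envoy.transport_sockets.tls
        typed_config:
          "@type": type.googleapis.com/envoy.extensions.transport_sockets.tls.v3.DownstreamTlsContext
          common_tls_context:
            # Offer h2 first so the browser can multiplex all gRPC-Web streams on one connection.
            alpn_protocols: ["h2", "http/1.1"]
            tls_certificates:
            - certificate_chain: { filename: "/etc/envoy/certs/cert.pem" }  # placeholder path
              private_key: { filename: "/etc/envoy/certs/key.pem" }         # placeholder path

Whether HTTP/2 was actually negotiated shows up in the Protocol column of the browser's network tab; with HTTP/1.1 the roughly-6-connections-per-host limit still applies, while with HTTP/2 all streams share one connection.
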
limeniye
  • If you have HTTP/1 from the client, as shown in the picture, you cannot have multiple streams in one TCP connection. This can only be done with HTTP/2. – Steffen Ullrich May 30 '23 at 13:39
  • @SteffenUllrich, thank you for your answer. Can you propose something? As a browser platform I'm using WASM (PWA). I already tried to use HTTP/2 with gRPC, but it doesn't work. – limeniye May 30 '23 at 14:44
  • It is hard to propose anything since not much detail is available about the problem. Your question is about having multiple streams in one TCP connection. The answer is that it cannot be done with HTTP/1. To solve the problem, either don't use HTTP/1 or don't require multiple streams in one TCP connection. If you want the latter because you otherwise hit the browser's limit on TCP connections, then maybe limit how many parallel streams you need in your application design in the first place. – Steffen Ullrich May 30 '23 at 14:51
  • This is my Uno Platform project: https://github.com/ligric/ligric This is my proto file on the server side: https://github.com/ligric/ligric/blob/main/src/services/Ligric.Service.CryptoApisService/Ligric.Service.CryptoApisService.Api/protos/futures.proto This is my Startup.cs: https://github.com/ligric/ligric/blob/main/src/services/Ligric.Service.CryptoApisService/Ligric.Service.CryptoApisService.Api/Startup.cs#L112 This is the implementation: https://github.com/ligric/ligric/blob/main/src/services/Ligric.Service.CryptoApisService/Ligric.Service.CryptoApisService.Api/Services/FuturesService.cs – limeniye May 30 '23 at 19:39
  • On the client side I'm just clicking a button to start these 4 streams: https://github.com/ligric/ligric/blob/main/src/ui/Ligric.Business/Clients/Futures/Binance/FuturesCryptoClient.cs#L66 – limeniye May 30 '23 at 19:46
  • And I want to attach different Binance streams. As a possible fix I could change my proto file, for example by adding a method "AddToStream()": instead of calling those streams a second time, I would send a POST request that lets the already-open streams return some more information. But it seems to me that this could be configured in a different way. For example, I saw such functionality as enable_mptcp in Envoy: ```Enable MPTCP (multi-path TCP) on this listener. Clients will be allowed to establish MPTCP connections. Non-MPTCP clients will fall back to regular TCP``` – limeniye May 30 '23 at 19:54
  • MPTCP has nothing to do with multiple logical data streams. It is instead about providing a single TCP stream on top of multiple (physical) links for better throughput and redundancy. – Steffen Ullrich May 30 '23 at 19:58

0 Answers