
I have 3 Kubernetes clusters and 3 databases (PostgreSQL 11) in 3 VPCs (dev, stage, stage2). All databases have INTERNAL IPs only. Connecting from the dev cluster to its database works fine, but connecting from stage or stage2 times out (a minimal reachability check is sketched after the scheme below). All environments use the same auth, user, and port.

Using:

Kubernetes, Docker containers, PostgreSQL 11, .NET application

Scheme:

  • dev: dev_vpc -> k8s + SQL - ok
  • stage: stage_vpc -> k8s + SQL - timeout
  • stage2: stage2_vpc -> k8s + SQL - timeout
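
Since dev works and stage/stage2 time out with identical auth, user, and port, it would help to first rule out the network path itself. Below is a minimal sketch of a reachability check from inside a failing cluster; the kubectl context name `stage` and the private IP `10.0.0.5` are placeholders, substitute your own:

    # Spin up a throwaway pod in the stage cluster and probe the database's
    # private IP with pg_isready (no credentials needed for this check).
    # "stage" and 10.0.0.5 are placeholders - substitute your own values.
    kubectl --context stage run pg-net-test --rm -it --restart=Never \
      --image=postgres:11 -- pg_isready -h 10.0.0.5 -p 5432 -t 5

If this also times out, the problem is VPC routing or firewalls rather than the application; if it reports "accepting connections", the network path is fine and the app configuration is the next suspect.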

appsettings.json (Connection string):

    {
      "DBConnections": {
        "Driver": "PostgreSQL",
        "ConnectionString": "Server={0};Port={1};Database={2};User Id={3};Password={4};",
        "DbName": "test",
        "Url": "localhost",
        "Port": "5432",
        "User": "default_test_user",
        "Password": "12345"
      }
    }
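
With these values, the `{0}`-`{4}` placeholders expand to `Server=localhost;Port=5432;Database=test;User Id=default_test_user;Password=12345;` (with `Server` taken from `Url`, so `localhost` is presumably overridden per environment). The same credentials can be tried with psql from a pod in the failing cluster; a sketch, again with `10.0.0.5` as a placeholder private IP:

    # Try the exact credentials from appsettings.json via psql from inside
    # the stage cluster; 10.0.0.5 stands in for the database's private IP.
    kubectl --context stage run psql-test --rm -it --restart=Never \
      --image=postgres:11 --env=PGPASSWORD=12345 -- \
      psql "host=10.0.0.5 port=5432 dbname=test user=default_test_user" -c 'SELECT 1'

A timeout here (as opposed to an authentication error) again points at the network rather than the credentials.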

Could you please help me with this strange problem?

  • Is it possible for you to connect to the Postgres DB using the public IP? Are you using the SQL proxy? If so, try [connecting without the SQL proxy](https://cloud.google.com/sql/docs/postgres/connect-kubernetes-engine#private-ip), and make sure you are pointing to the correct IP (two checks for this are sketched after these comments). – Kevin Quinzel Aug 17 '20 at 22:24
  • It's a bit of an old question, but were you able to solve this issue? Do you still have the issue with the newest versions? – PjoterS Feb 01 '21 at 08:30
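
Following up on the first comment: the linked doc suggests this is Cloud SQL on GKE, reached over private IP. If so, two quick checks are whether the instance's private IP matches what the app targets, and whether the private services access peering exists in the stage VPC. A sketch, assuming gcloud access; the instance name `stage-sql` is hypothetical:

    # Confirm the instance's private IP (stage-sql is a hypothetical name):
    gcloud sql instances describe stage-sql \
      --format="value(ipAddresses[].ipAddress)"

    # Private-IP Cloud SQL is reached over the private services access
    # peering, so confirm it exists in the stage VPC:
    gcloud services vpc-peerings list --network=stage_vpc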

0 Answers