
Python 3.8.10
streamlit==1.9.0
pandas==1.4.2
psycopg2-binary==2.9.3

I'm loading a Postgres table directly into a Pandas DataFrame with the following code:

df = pd.DataFrame(run_query("SELECT * FROM schema.tablename;"))  

Displaying it with either streamlit.dataframe(df) or streamlit.write(df) loses the column names.

[Screenshot: dataframe displayed with no column names]
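As far as I can tell, the names vanish because psycopg2's fetchall() returns a list of plain tuples with no header information, so pandas only has positional data to work with and falls back to integer column labels. A minimal sketch (using the run_query helper shown further down):

    rows = run_query("SELECT * FROM schema.tablename;")
    print(rows[0])           # a plain tuple of values -- no column names attached
    df = pd.DataFrame(rows)
    print(df.columns)        # RangeIndex(start=0, stop=..., step=1)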

In order to capture the column names, I use this kluge.

import pandas as pd
import psycopg2
import streamlit as st

# Initialize connection.
@st.experimental_singleton
def init_connection():
    return psycopg2.connect(**st.secrets["postgresservername"])

conn = init_connection()

# Perform query.
@st.experimental_memo(ttl=600)
def run_query(query):
    with conn.cursor() as cur:
        cur.execute(query)
        return cur.fetchall()

def load_table_as_dataframe(table):
    # This is super klugy.
    data = run_query("SELECT * FROM schema.{};".format(str(table)))
    columns = run_query("SELECT * FROM information_schema.columns WHERE table_schema = 'schema' AND table_name = '{}';".format(str(table)))
    # Fish out the actual column names (column_name is the 4th field of information_schema.columns).
    columns = [c[3] for c in columns]
    df = pd.DataFrame(data, columns=columns)
    return df

df = load_table_as_dataframe("tablename")

Which works...

[Screenshot: dataframe displayed with the expected column names]
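One way I could presumably trim the kluge is to pull the names from cur.description, which psycopg2 populates after execute(), instead of making the second information_schema round trip. A rough sketch (hypothetical helper name, same conn as above):

    def load_table_as_dataframe_v2(table):
        # cur.description has one entry per result column; entry[0] is the column name.
        with conn.cursor() as cur:
            cur.execute("SELECT * FROM schema.{};".format(str(table)))
            colnames = [desc[0] for desc in cur.description]
            return pd.DataFrame(cur.fetchall(), columns=colnames)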

Is there a better way to collect the needed data (and column names) into a Pandas DataFrame from Postgres within Streamlit?

    Why not use [**`pandas.read_sql`**](https://pandas.pydata.org/docs/reference/api/pandas.read_sql.html)? – Parfait May 17 '22 at 16:14

1 Answer


Using...

    df = pd.read_sql("SELECT * FROM schema.{};".format(str(table)), conn)

...solved the issue. (Thx @parfait)
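For anyone wiring this into the caching setup from the question, a sketch of how it might look (the DBAPI2 connection works here, though pandas documents SQLAlchemy connectables for read_sql and may warn about other DBAPI2 objects; the memoized wrapper mirrors run_query above):

    @st.experimental_memo(ttl=600)
    def load_table_as_dataframe(table):
        # read_sql builds the DataFrame and picks up the column names itself.
        # The table name is still string-formatted in, as in the question;
        # psycopg2.sql.Identifier would be safer for untrusted input.
        return pd.read_sql("SELECT * FROM schema.{};".format(str(table)), conn)

    df = load_table_as_dataframe("tablename")
    st.dataframe(df)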
