I would like some advice on handling the result of a join operation performed with SQLAlchemy, and on serializing it with Pydantic (in FastAPI).
If I am not mistaken, a join on two tables yields a list of tuples of SQLAlchemy models. Here is a mock of that, with like_a_join standing in for my understanding of the result of a join query.
from pydantic import BaseModel
from sqlalchemy import Column, Integer
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class A(BaseModel):
    a: int

    class Config:
        orm_mode = True

class B(BaseModel):
    b: int

    class Config:
        orm_mode = True

class Am(Base):
    __tablename__ = "A"
    a = Column(Integer, primary_key=True, nullable=False)

class Bm(Base):
    __tablename__ = "B"
    b = Column(Integer, primary_key=True, nullable=False)

def like_a_join():
    return [(Am(a=1), Bm(b=1))]
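For context, the actual query I have in mind is roughly the following sketch (session and the join condition are placeholders for my real setup):

# Sketch only: `session` and the join condition stand in for my real query.
results = session.query(Am, Bm).join(Bm, Am.a == Bm.b).all()
# `results` is then a list of (Am, Bm) tuples, which like_a_join() mocks.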
While it is possible to pass a model object to Pydantic with from_orm
for a simple query (as is done, for example, in the FastAPI Postgres cookiecutter, here), it is not obvious to me how best to handle the join/tuple case.
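For a single model, that conversion is straightforward:

# One SQLAlchemy object maps directly to one Pydantic model via from_orm.
print(A.from_orm(Am(a=1)))  # A(a=1)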
Would it make sense to create a class to handle the tuple, like the following?
from typing import List
from dataclasses import dataclass
from pydantic import parse_obj_as

@dataclass
class Cm:
    a: Am
    b: Bm

class C(BaseModel):
    a: A
    b: B

    class Config:
        orm_mode = True

def like_a_join_with_class():
    return [Cm(a=Am(a=1), b=Bm(b=1))]

print(parse_obj_as(List[C], like_a_join_with_class()))
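This prints [C(a=A(a=1), b=B(b=1))], which is what I would expect.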
Would it be better to use a dictionary?
def like_a_join_with_dict():
    return [{"a": Am(a=1), "b": Bm(b=1)}]
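If I understand orm_mode correctly, this should parse the same way as the class-based version:

# Same call as above; the nested A and B models have orm_mode enabled,
# so the plain SQLAlchemy objects in the dict should be accepted.
print(parse_obj_as(List[C], like_a_join_with_dict()))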
The idea is to return the query result from a FastAPI endpoint and have serialization handled automatically.
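Concretely, something like this sketch is what I am aiming for (the path and function name are placeholders):

from fastapi import FastAPI

app = FastAPI()

# Sketch of the intended endpoint: with orm_mode on C, response_model
# should take care of serializing the join result automatically.
@app.get("/join", response_model=List[C])
def read_join():
    return like_a_join_with_class()  # or like_a_join_with_dict()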
Thanks in advance for your help.