This post lands at the exact midpoint of the FastAPI series (12/20). In Part 11 you finished issuing JWT tokens, and now you glue dependencies, SQLModel, authentication, and configuration into one study-friendly task manager API to expose gaps before moving on. Treat this mini service as the moment you pause, breathe, and prove you can repeat “settings + DB + auth” without guessing.
What we will build
- Log in via `/auth/token`.
- Add tasks through `/tasks`.
- Refresh the page and confirm the data persists.
That short flow already exercises settings, the database, and authentication. Each section below shows where these three steps connect.
Key terms for this post
- Mini service: A compact, end-to-end API that bundles auth, DB, and configuration in one repo. “Mini” means you can understand it in under an hour, yet it is honest enough to deploy after adding observability.
- core/config: The `pydantic-settings` module that gathers token TTLs, DB URLs, and other shared values.
- api/deps: A collection of FastAPI `Depends` helpers—sessions, current user, etc.—that every router can reuse.
- Lifespan event: Hooks that run when the FastAPI app starts or stops. This is where you call `init_db()` exactly once before serving the first request so every router sees an initialized database.
- Token-protected router: A router that wraps endpoints with `get_current_user` so the `/tasks` CRUD stays protected.
Practice card
- Estimated time: 75 minutes
- Prereqs: Code from Parts 9–11, Docker/uvicorn experience
- Goal: Connect settings, DB, and auth routers into a working Todo API
Review the project structure
```
app/
├─ core/
│  ├─ config.py
│  └─ security.py
├─ db/
│  ├─ models.py
│  ├─ session.py
│  └─ init.py
├─ api/
│  ├─ deps.py
│  ├─ routes/
│  │  ├─ auth.py
│  │  └─ tasks.py
│  └─ __init__.py
└─ main.py
```
Split folders by concern so tests can swap modules with minimal churn. If you followed earlier parts, core/ arrived in Part 9, db/ in Part 10, and the auth routes in Part 11—this post simply wires them together. Keep imports flowing downward (core → db → api → main) to avoid circular import errors.
The diagram below is a quick legend for how modules pass data around. Each box is a file; arrows show who provides values to whom.
You should be able to explain how components meet using this one diagram. When in doubt, revisit the three-step flow at the top and map each step back to the file that implements it.
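In case the figure does not render in your reader, here is a rough text sketch of the same flow (arrows point from provider to consumer):

```
core/config.py ──settings──▶ core/security.py
      │                            │
      ▼                            ▼
db/session.py ──get_session──▶ api/deps.py ──get_current_user──▶ api/routes/*
      │                                                              │
      └───engine───▶ db/init.py ───init_db()───▶ app/main.py ◀───routers
```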
Configuration and security modules
core/config.py:
```python
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    secret_key: str
    access_token_expire_minutes: int = 30
    database_url: str = "sqlite:///./todo.db"

    model_config = {"env_file": ".env", "env_prefix": "APP_"}


settings = Settings()
```
Example .env (keep it at the project root and never commit it):
```
APP_SECRET_KEY=change-me-now
APP_DATABASE_URL=sqlite:///./todo.db
APP_ACCESS_TOKEN_EXPIRE_MINUTES=30
```
For production, skip the .env file and inject variables before starting uvicorn:
```bash
export APP_SECRET_KEY="prod-secret"
export APP_DATABASE_URL="postgresql://user:pass@db:5432/todo"
uvicorn app.main:app --host 0.0.0.0
```
core/security.py keeps password hashing and JWT generation in one place so auditing and testing stay simple:
```python
from datetime import datetime, timedelta, timezone

from jose import JWTError, jwt
from passlib.context import CryptContext

from core.config import settings

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")


def hash_password(password: str) -> str:
    return pwd_context.hash(password)


def verify_password(plain: str, hashed: str) -> bool:
    return pwd_context.verify(plain, hashed)


def create_access_token(subject: str) -> str:
    expire = datetime.now(timezone.utc) + timedelta(minutes=settings.access_token_expire_minutes)
    payload = {"sub": subject, "exp": expire}
    return jwt.encode(payload, settings.secret_key, algorithm="HS256")


def decode_token(token: str) -> dict | None:
    try:
        return jwt.decode(token, settings.secret_key, algorithms=["HS256"])
    except JWTError:
        return None
```
Database session and models
Define the engine and session dependency in db/session.py:
```python
from typing import Generator

from sqlalchemy import event
from sqlalchemy.orm import sessionmaker
from sqlmodel import Session, create_engine

from core.config import settings

engine = create_engine(
    settings.database_url,
    connect_args={"check_same_thread": False} if "sqlite" in settings.database_url else {},
)

SessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False, class_=Session)


def get_session() -> Generator[Session, None, None]:
    with SessionLocal() as session:
        yield session


if "sqlite" in settings.database_url:
    @event.listens_for(engine, "connect")
    def enable_sqlite_fk(dbapi_connection, connection_record):
        cursor = dbapi_connection.cursor()
        cursor.execute("PRAGMA foreign_keys=ON")
        cursor.close()
```
Then declare models in db/models.py:
```python
from typing import Optional

from sqlmodel import Field, SQLModel


class User(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    email: str = Field(index=True, unique=True)
    hashed_password: str


class UserCreate(SQLModel):
    email: str = Field(min_length=1)
    password: str = Field(min_length=8)


class Task(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    title: str
    done: bool = False
    owner_id: int = Field(foreign_key="user.id")


class TaskCreate(SQLModel):
    title: str = Field(min_length=1, max_length=255)


class TaskRead(SQLModel):
    id: int
    title: str
    done: bool
```
`owner_id` carries a foreign-key constraint so each task stays attached to its owner. SQLite enforces it only when the engine enables foreign keys, so turn on the pragma (as in `db/session.py` above) or switch to PostgreSQL in production.

Gather shared dependencies such as `get_session` and `get_current_user` in `api/deps.py` so every router can reuse them instead of duplicating the logic: `get_session` yields a single SQLModel session per request, and `get_current_user` decodes the JWT and raises 401 before your handler runs if the token is invalid.
```python
# api/deps.py
from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2PasswordBearer
from sqlmodel import Session, select

from core.security import decode_token
from db.models import User
from db.session import get_session

oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/auth/token")


def get_current_user(
    token: str = Depends(oauth2_scheme),
    session: Session = Depends(get_session),
) -> User:
    payload = decode_token(token)
    if not payload or "sub" not in payload:
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid token")
    user = session.exec(select(User).where(User.email == payload["sub"])).first()
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    return user
```
Router composition
`api/routes/auth.py` keeps the token flow from Part 11. It needs both the register and token routes so the scenario tests have real endpoints:
```python
from fastapi import APIRouter, Depends, HTTPException
from fastapi.security import OAuth2PasswordRequestForm
from sqlmodel import Session, select

from core.security import create_access_token, hash_password, verify_password
from db.models import User, UserCreate
from db.session import get_session

auth_router = APIRouter(prefix="/auth", tags=["auth"])


@auth_router.post("/register", status_code=201)
def register_user(payload: UserCreate, session: Session = Depends(get_session)):
    if session.exec(select(User).where(User.email == payload.email)).first():
        raise HTTPException(status_code=400, detail="Email already exists")
    user = User(email=payload.email, hashed_password=hash_password(payload.password))
    session.add(user)
    session.commit()
    session.refresh(user)
    return {"id": user.id, "email": user.email}


@auth_router.post("/token")
def login(
    form_data: OAuth2PasswordRequestForm = Depends(),
    session: Session = Depends(get_session),
):
    user = session.exec(select(User).where(User.email == form_data.username)).first()
    if not user or not verify_password(form_data.password, user.hashed_password):
        raise HTTPException(status_code=401, detail="Invalid credentials")
    return {"access_token": create_access_token(user.email), "token_type": "bearer"}
```
`api/routes/tasks.py` exposes per-user CRUD.
```python
from fastapi import APIRouter, Depends
from sqlmodel import Session, select

from api.deps import get_current_user
from db.models import Task, TaskCreate, TaskRead, User
from db.session import get_session

tasks_router = APIRouter(prefix="/tasks", tags=["tasks"])


@tasks_router.get("", response_model=list[TaskRead])
def list_tasks(
    current_user: User = Depends(get_current_user),
    session: Session = Depends(get_session),
):
    statement = select(Task).where(Task.owner_id == current_user.id)
    return session.exec(statement).all()


@tasks_router.post("", response_model=TaskRead, status_code=201)
def create_task(
    payload: TaskCreate,
    current_user: User = Depends(get_current_user),
    session: Session = Depends(get_session),
):
    record = Task(**payload.model_dump(), owner_id=current_user.id)
    session.add(record)
    session.commit()
    session.refresh(record)
    return record
```
`app/main.py` wires the routers together and runs `init_db()` inside the lifespan hook so the first request does not fail. `init_db()` lives in `db/init.py`, simply calls `SQLModel.metadata.create_all(engine)`, and is idempotent, so it is safe to call on every startup.
```python
# db/init.py
from sqlmodel import SQLModel

from db.session import engine


def init_db() -> None:
    SQLModel.metadata.create_all(engine)
```

```python
# app/main.py
from contextlib import asynccontextmanager

from fastapi import FastAPI

from api.routes.auth import auth_router
from api.routes.tasks import tasks_router
from db.init import init_db


@asynccontextmanager
async def lifespan(app: FastAPI):
    init_db()  # idempotent: safe on every startup
    yield


app = FastAPI(title="Todo API", lifespan=lifespan)
app.include_router(auth_router)
app.include_router(tasks_router)
```
Scenario-driven tests
- Create a user via `/auth/register` or a seed script.
- Issue a token from `/auth/token`.
- Add the `Authorization` header and call the `/tasks` CRUD.
- Keep `.env` on SQLite locally and inject a PostgreSQL URL in production.
Automate the flow with pytest and FastAPI's `TestClient`, plus `dependency_overrides`. Stub-heavy tests shorten the loop:
```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

from api.deps import get_current_user
from db.models import User


def test_list_tasks_401(app: FastAPI):
    client = TestClient(app)
    assert client.get("/tasks").status_code == 401


def test_list_tasks_authed(app: FastAPI, fake_user: User):
    app.dependency_overrides[get_current_user] = lambda: fake_user
    client = TestClient(app)
    response = client.get("/tasks")
    assert response.status_code == 200
    app.dependency_overrides.clear()
```
`fake_user` is just a pytest fixture that returns a stubbed `User`:
```python
import pytest

from db.models import User


@pytest.fixture
def fake_user() -> User:
    return User(id=1, email="[email protected]", hashed_password="stub")
```
⚠️ `dependency_overrides` lives on the app instance, so always clear it after each test or wrap it in a fixture that does the cleanup for you.
Expected output
When the mini service is healthy you should see a flow like this (the commands use HTTPie’s http CLI, but curl/Postman equivalents work too):
:::terminal{title="Mini service verification", showFinalPrompt="false"}
[
{ "cmd": "http POST :8000/auth/token [email protected] password=secret", "output": "HTTP/1.1 200 OK\n{\n \"access_token\": \"eyJhbGciOi...\",\n \"token_type\": \"bearer\"\n}", "delay": 500 },
{ "cmd": "http GET :8000/tasks", "output": "HTTP/1.1 401 Unauthorized\n{\n \"detail\": \"Not authenticated\"\n}", "delay": 400 },
{ "cmd": "http GET :8000/tasks \"Authorization:Bearer eyJhbGciOi...\"", "output": "HTTP/1.1 200 OK\n[]", "delay": 400 },
{ "cmd": "http POST :8000/tasks title='API review' \"Authorization:Bearer eyJhbGciOi...\"", "output": "HTTP/1.1 201 Created\n{\n \"id\": 1,\n \"title\": \"API review\",\n \"done\": false\n}", "delay": 400 }
]
:::
- Checkpoint 1: `/tasks` without headers should respond with HTTP 401 `{"detail": "Not authenticated"}`. If it returns 200, re-check that `Depends(get_current_user)` wraps the route.
- Checkpoint 2: After adding the `Authorization: Bearer <token>` header, the same path should respond with HTTP 200 `[]`. If it still returns 401, issue a new token.
- Checkpoint 3: Creating a task should return HTTP 201 with JSON `{"id": <int>, "title": "...", "done": false}`. If it returns 500, inspect DB logs and confirm migrations ran.
Optional deployment checklist
- Build a Docker image and push to a registry.
- Run DB migration scripts.
- Start `uvicorn` via systemd or an orchestrator.
- Pipe logs and health checks into your monitoring tool.
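For the first item, a minimal Dockerfile sketch might look like this. It assumes a `requirements.txt` at the project root and the `app/` layout shown earlier; adjust names to your repo:

```dockerfile
FROM python:3.12-slim
WORKDIR /srv
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app ./app
# Secrets come from the environment at run time, never baked into the image.
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```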
Practice
Follow along (30 min)
- Match the folder structure, wire settings/DB/router modules, and gate `/tasks` behind token auth.
- Run `uvicorn app.main:app --reload` and confirm `/tasks` returns 401 without a token.
Extend (20 min)
- Write pytest flows for register → token → task creation, using `dependency_overrides` to skip real login when necessary.
Debug (15 min)
- Scenario: `.env` changes but the app still connects to the old DB. Solution: restart the dev server so the `Settings` object reloads, and log `settings.database_url` at startup to confirm.
Done when
- One command boots the server, `/tasks` flips between 401 and 200 depending on the header, and a newly created task survives a browser refresh.
Optional expansion ideas
- Add a WebSocket endpoint for realtime task updates.
- Connect a task queue such as Celery or RQ for long-running jobs.
- Define OAuth2 scopes in OpenAPI to issue role-based tokens.
- Pair with a frontend (React/Vue/Svelte) to expose the UI.
Wrap-up
This checkpoint only replays what you already learned. Because you just re-threaded settings, auth, and DB in one go, the upcoming posts on files, streaming, tests, and observability will stack cleanly. Stabilize this foundation now so the Part 20 capstone feels straightforward.