[FastAPI Series 14] Streaming Responses and Background Tasks


This post introduces streaming responses that drip data over time and background tasks that continue after the API responds. StreamingResponse pushes data chunks from a generator straight into the HTTP response. BackgroundTasks holds callbacks that run after the response finishes. Both patterns help when you add file processing or report generation to the task manager.

Key terms

  1. StreamingResponse: A FastAPI response class that sends generator output chunk by chunk.
  2. Streaming: Sending partial results instead of waiting for the full calculation; ideal for downloads and realtime logs.
  3. text/event-stream: The MIME type for Server-Sent Events (SSE) so browsers keep the connection open and process messages sequentially.
  4. BackgroundTasks: A FastAPI helper that schedules callbacks to run in the same process after the response returns, useful for emails or thumbnails.
  5. Keep-alive timeout: The server setting that controls how long to hold an HTTP connection; you increase it so streaming responses do not drop.

Practice card

  • Estimated time: 50 minutes
  • Prereqs: Media handling from Part 13, basic asyncio
  • Goal: Combine StreamingResponse and BackgroundTasks to split long work

StreamingResponse basics

Streaming means the server drips data without waiting for the full result. Use it for downloads, log tailing, or SSE.

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

def iter_numbers(limit: int):
    for i in range(1, limit + 1):
        yield f"data: {i}\n"

@app.get("/stream")
def stream_numbers(limit: int = 5):
    return StreamingResponse(iter_numbers(limit), media_type="text/event-stream")

text/event-stream is the SSE MIME type. Each event is a data: line followed by a blank line; the client keeps the connection open and processes events in order. Chunking the work keeps users informed during long computations.
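StreamingResponse also accepts async generators, which fit better when you need to pause between events without blocking the event loop. A minimal sketch (iter_numbers_async and collect are illustrative names, not part of FastAPI):

```python
import asyncio
from collections.abc import AsyncIterator

async def iter_numbers_async(limit: int) -> AsyncIterator[str]:
    # Same framing as the sync version: a "data:" line plus a blank line per event
    for i in range(1, limit + 1):
        yield f"data: {i}\n\n"
        await asyncio.sleep(0.01)  # cooperative pause; simulates work between events

async def collect(limit: int) -> list[str]:
    # Drain the generator chunk by chunk, the way StreamingResponse would
    return [chunk async for chunk in iter_numbers_async(limit)]

chunks = asyncio.run(collect(3))
print(chunks)  # ['data: 1\n\n', 'data: 2\n\n', 'data: 3\n\n']
```

In the endpoint you would return StreamingResponse(iter_numbers_async(limit), media_type="text/event-stream") exactly as with the sync generator.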

File download streaming

Send large CSV files without buffering everything:

import csv
from collections.abc import Iterable
from io import StringIO
from itertools import chain

def generate_csv(rows: Iterable[dict]):
    it = iter(rows)
    first = next(it)  # peek one row to discover the column names
    buffer = StringIO()
    writer = csv.DictWriter(buffer, fieldnames=first.keys())
    writer.writeheader()
    for row in chain([first], it):
        writer.writerow(row)
        yield buffer.getvalue()
        buffer.seek(0)
        buffer.truncate(0)

@app.get("/reports.csv")
def export_report():
    return StreamingResponse(
        generate_csv(fetch_rows()),
        media_type="text/csv",
        headers={"Content-Disposition": "attachment; filename=report.csv"},
    )

fetch_rows() stands in for a DB call that yields rows lazily. This pattern limits memory usage and keeps download speed steady.
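As one hedged sketch of what fetch_rows might look like, here is a lazy generator over a database cursor. It uses stdlib sqlite3 with a throwaway in-memory table purely for illustration; the table name and columns are assumptions, not part of the series:

```python
import sqlite3
from collections.abc import Iterator

def fetch_rows(db_path: str = ":memory:") -> Iterator[dict]:
    # Illustrative only: stream rows one at a time instead of fetchall()
    conn = sqlite3.connect(db_path)
    try:
        conn.execute("CREATE TABLE IF NOT EXISTS tasks (id INTEGER, title TEXT)")
        conn.executemany(
            "INSERT INTO tasks VALUES (?, ?)", [(1, "write"), (2, "review")]
        )
        cursor = conn.execute("SELECT id, title FROM tasks")
        columns = [col[0] for col in cursor.description]
        for row in cursor:  # the cursor fetches rows incrementally
            yield dict(zip(columns, row))
    finally:
        conn.close()
```

The key property is that rows flow through generate_csv one at a time, so neither the query result nor the CSV ever sits fully in memory.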

Add BackgroundTasks

BackgroundTasks keeps a queue of callbacks that run after the response is sent.

from fastapi import BackgroundTasks
from pydantic import EmailStr

def notify_user(email: str, report_url: str):
    print(f"{email}, your report is ready: {report_url}")

@app.post("/reports")
async def request_report(bg: BackgroundTasks, email: EmailStr):
    url = await generate_report_async(email)  # your report-building coroutine
    bg.add_task(notify_user, email, url)
    return {"detail": "Report in progress", "url": url}

Remember that background tasks share the same process. Heavy CPU work still belongs on a queue like Celery.
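The "queue of callbacks" idea is simple enough to model in a few lines. This toy class is not FastAPI's real implementation, just a sketch of the semantics: tasks accumulate in order and are drained after the response body has been sent:

```python
from typing import Any, Callable

class MiniBackgroundTasks:
    """Toy model of the BackgroundTasks idea (not FastAPI's actual code):
    a FIFO list of callbacks drained after the response is sent."""

    def __init__(self) -> None:
        self._tasks: list[tuple[Callable[..., Any], tuple, dict]] = []

    def add_task(self, func: Callable[..., Any], *args: Any, **kwargs: Any) -> None:
        self._tasks.append((func, args, kwargs))

    def run_all(self) -> None:
        for func, args, kwargs in self._tasks:  # run in insertion order
            func(*args, **kwargs)

log: list[str] = []
bg = MiniBackgroundTasks()
bg.add_task(log.append, "send email")
bg.add_task(log.append, "build thumbnail")
bg.run_all()
print(log)  # ['send email', 'build thumbnail']
```

Because everything runs in the same process and in order, one slow callback delays the rest, which is exactly why heavy CPU work belongs on an external queue.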

Combine streaming and background work

Upload a file, validate it stream-style, then spin up thumbnails asynchronously:

@app.post("/photos/async")
async def upload_photo(file: UploadFile, bg: BackgroundTasks):
    path = save_temp(file)
    bg.add_task(generate_thumbnail, path)
    return StreamingResponse(
        iter(["Upload complete, thumbnail pending..."]),
        media_type="text/plain",
    )

Keep the immediate response short and let clients poll another endpoint for progress.
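A progress endpoint needs somewhere to record job state. As a minimal sketch, a module-level dict works for a single-process prototype; the function names here are assumptions, and across multiple workers you would swap the dict for Redis or a database:

```python
from uuid import uuid4

# Hypothetical in-memory progress store; fine for one process, not for many workers
_progress: dict[str, str] = {}

def start_job() -> str:
    # Hand the client an opaque id it can poll with
    job_id = uuid4().hex
    _progress[job_id] = "pending"
    return job_id

def set_progress(job_id: str, status: str) -> None:
    # Called from the background task as work advances
    _progress[job_id] = status

def get_progress(job_id: str) -> str:
    return _progress.get(job_id, "unknown")
```

Wire get_progress behind something like @app.get("/jobs/{job_id}"), return the job id from the upload endpoint, and have the background task call set_progress when the thumbnail is done.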

Visual flow

Client → API: POST /reports
API → Worker: add background task
API → Client: immediate response
Worker → Storage: save report
Worker → Mailer: notify user

In production the “worker” can be another container or queue. Use FastAPI’s built-in tasks to prototype the flow, then scale out when needed.

uvicorn settings

Streaming keeps connections open longer. Run uvicorn app.main:app --reload --timeout-keep-alive 75 (adjust to taste) so downloads do not break midstream.

Combining streaming and background callbacks gives users quick feedback while the server finishes heavier work. Next you will cover automated API tests.

Practice

  • Follow along: build /stream or /reports.csv and confirm StreamingResponse emits chunks.
  • Extend: attach a background task so report requests trigger an email log or thumbnail job.
  • Debug: when keep-alive or chunk sizes break the flow, tweak uvicorn settings until the response stabilizes.
  • Done when: streaming works in curl or a browser and background callbacks appear in logs.

Wrap-up

Split long-running work into streaming responses and background callbacks to keep users informed. Once you can separate work like this, reports, media processing, and notifications all become much simpler to build.
