[Python Series 18] Build async foundations with async/await


If you already organized interfaces with type hints, it is time to write asynchronous code that handles I/O efficiently. Since Python 3.5, the async/await keywords let a single thread juggle network and file operations. The vocabulary may look new, but remember the one-line summary: “When you must wait, briefly yield control.” Follow the examples and the model becomes straightforward.

Key terms

  1. Asynchronous: overlapping wait-heavy tasks to reduce total time
  2. Coroutine: an async def function that yields execution with await
  3. Task: an object that schedules a coroutine on the event loop
  4. Event loop: the scheduler that keeps track of waiting tasks and resumes them

Core ideas

Study notes

  • Time required: 70–90 minutes (with practice)
  • Prerequisites: experience with requests/file I/O, understanding of functions and type hints, basic testing habits
  • Goal: define coroutines and overlap I/O waits with asyncio.run, gather, and create_task
  • Tackle the Core sections first; revisit the optional badges later.
  • Asynchronous code overlaps long waits.
  • A coroutine is a function declared with async def; calling it returns a coroutine object.
  • A task registers a coroutine with the event loop so it can run.
  • The event loop schedules and resumes pending tasks.

Code examples

Synchronous vs. asynchronous (Core)

  • Synchronous: waits for the call to complete before moving on.
  • Asynchronous: yields control to the event loop whenever a call must wait, allowing other work to run.
  • Coroutine: created with async def.
  • Task: wraps a coroutine so the loop can execute it.
  • Event loop: the scheduler that wakes tasks once their I/O completes.

CPU-bound workloads still need multiprocessing or C extensions, but async pays off whenever network or disk latency dominates. In short, async fits I/O-bound tasks, not heavy CPU calculations.

Define and run a coroutine (Core)



import asyncio


async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.2)  # stand-in for a network or database call
    return {"id": user_id, "name": "민지"}


async def main():
    user = await fetch_user(1)
    print(user)


asyncio.run(main())

  • async def creates a coroutine.
  • await yields until the result is ready.
  • asyncio.run spins up the event loop and drives main.

Run work concurrently with asyncio.gather (Core)

async def gather_users():
    # reuses fetch_user from the previous example
    results = await asyncio.gather(
        fetch_user(1),
        fetch_user(2),
        fetch_user(3),
    )
    return results

print(asyncio.run(gather_users()))

gather schedules multiple coroutines at once and returns a list of results. While one call waits, the others keep progressing.
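Two details worth knowing: gather returns results in the order you passed the coroutines, regardless of which finishes first, and the return_exceptions flag lets you collect errors instead of having one failure cancel the batch. A small self-contained sketch (might_fail and demo are illustrative names, not from the series):

```python
import asyncio

async def might_fail(n: int) -> int:
    await asyncio.sleep(0.01)
    if n == 2:
        raise ValueError("boom")
    return n

async def demo():
    # return_exceptions=True puts raised exceptions into the result list
    # instead of propagating the first one and cancelling the rest
    return await asyncio.gather(
        might_fail(1), might_fail(2), might_fail(3),
        return_exceptions=True,
    )

print(asyncio.run(demo()))  # [1, ValueError('boom'), 3]
```

With the default return_exceptions=False, the ValueError would propagate out of gather instead.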

Compare runtimes

Asynchronous programming feels abstract, so start by looking at the numbers:

Synchronous run
- call fetch_user three times sequentially
- total time: 0.61s

Asynchronous run
- call all three via asyncio.gather
- total time: 0.22s

The speedup comes from overlapping idle time, not from faster computation.
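You can reproduce numbers like these yourself with time.perf_counter. A minimal sketch, assuming the same 0.2-second fetch_user as above (run_sequential/run_concurrent are names chosen here for illustration):

```python
import asyncio
import time

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.2)  # simulated I/O wait
    return {"id": user_id, "name": "민지"}

async def run_sequential() -> float:
    start = time.perf_counter()
    for i in (1, 2, 3):
        await fetch_user(i)  # each await finishes before the next starts
    return time.perf_counter() - start

async def run_concurrent() -> float:
    start = time.perf_counter()
    # the three sleeps overlap, so the total is roughly one sleep
    await asyncio.gather(fetch_user(1), fetch_user(2), fetch_user(3))
    return time.perf_counter() - start

print(f"sequential: {asyncio.run(run_sequential()):.2f}s")  # ~0.60s
print(f"concurrent: {asyncio.run(run_concurrent()):.2f}s")  # ~0.20s
```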

Tasks and the event loop (Core → Plus)

Once gather makes sense, manipulate task objects directly. Annotate your code with “what am I waiting for now?” to build intuition.

  • Task: registers a coroutine with asyncio.create_task(coro).
  • Event loop: manages waiting work and resumes it when ready.

async def generate_reports(ids):
    # create_task starts each coroutine right away;
    # the loop runs them concurrently while we await in order
    tasks = [asyncio.create_task(fetch_user(i)) for i in ids]
    for task in tasks:
        user = await task  # results are consumed in submission order
        print(user)

asyncio.run(generate_reports([1, 2, 3]))
The flow at a glance: asyncio.run starts the event loop, create_task registers Task1~N, each await hands control off to async I/O (fetch_user), and completed results are consumed (print/log).

The loop rotates through tasks, slipping other work in between long waits.

Beware blocking code (Core)

  • Calls like time.sleep block the loop—replace them with await asyncio.sleep().
  • Move CPU-heavy work to asyncio.to_thread or concurrent.futures so async tasks stay responsive.
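A minimal sketch of the second point, using a blocking stand-in function (crunch is a hypothetical name, not from the series):

```python
import asyncio
import time

def crunch(n: int) -> int:
    # blocking stand-in for CPU-heavy work; called directly,
    # it would freeze the event loop
    time.sleep(0.1)
    return n * n

async def main() -> list[int]:
    # asyncio.to_thread (Python 3.9+) runs the blocking function
    # in a worker thread, so other coroutines keep running
    return await asyncio.gather(
        asyncio.to_thread(crunch, 2),
        asyncio.to_thread(crunch, 3),
    )

print(asyncio.run(main()))  # [4, 9]
```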

Async-friendly libraries (Optional)

  • HTTP: httpx, aiohttp
  • Databases: asyncpg, databases
  • Web frameworks: FastAPI, Quart

Confirm the library offers async variants and avoid mixing sync/async calls in the same layer. When running async code from sync code, stick to a single entry point such as asyncio.run.
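One way to keep that boundary clean is a thin sync wrapper that owns the single asyncio.run call. A sketch under that assumption (fetch_message and fetch_message_sync are illustrative names):

```python
import asyncio

async def fetch_message() -> str:
    await asyncio.sleep(0.01)  # stand-in for real async I/O
    return "done"

def fetch_message_sync() -> str:
    # the only place sync code crosses into async: one entry point
    return asyncio.run(fetch_message())

print(fetch_message_sync())  # done
```

Everything above the wrapper stays async; everything below it stays sync, so the two styles never mix within a layer.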

Why it matters

  • async/await overlaps I/O waits to increase throughput.
  • Knowing how coroutines, tasks, and event loops interact makes architecture decisions clearer.
  • Keep blocking calls and async calls separated so boundaries stay predictable.

Practice

  • Follow along: recreate fetch_user/main and time asyncio.run(main()).
  • Extend: Use asyncio.gather and create_task to run at least three coroutines concurrently and log their completion order; try an httpx async request if you have time.
  • Debug: Intentionally insert time.sleep to freeze the loop, then replace it with await asyncio.sleep to verify the fix.
  • Definition of done: Produce a script that prints the synchronous vs. asynchronous timing difference and exercises two or more asyncio helpers.

Wrap-up

Next, we will explore the standard library, organize pyproject.toml, and practice packaging your code for release.
