• axzxc1236@lemm.ee

    asyncio provides “cooperative concurrency” for Python.

    Let's say you need to download 10 web pages in Python; someone might do

    import requests

    result1 = requests.get(...)
    result2 = requests.get(...)
    ...
    result10 = requests.get(...)
    

    The downside is that each requests.get() call blocks until the HTTP request is done, so if each web page takes 5 seconds to load, your program needs 50 seconds to download all the pages.

    You can do something like spawning 10 threads (a rough sketch follows), but threading has its own downsides.
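
    A minimal sketch of what that threading approach looks like (the URL list is a placeholder I made up, not something from the original comment):

    import requests
    from concurrent.futures import ThreadPoolExecutor

    # placeholder list of pages to fetch
    urls = [f"https://example.com/page{i}" for i in range(10)]

    # spawn a pool of 10 worker threads; each thread blocks on its own requests.get()
    with ThreadPoolExecutor(max_workers=10) as pool:
        results = list(pool.map(requests.get, urls))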

    What cooperative concurrency does is allow these coroutines (tasks) to tell Python to go do something else while a function is waiting for something… I think it's best to read some Python examples: https://docs.python.org/3/library/asyncio-task.html#coroutines
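
    Roughly what that looks like (a toy sketch with asyncio.sleep() standing in for a real network wait; it is not one of the examples from those docs):

    import asyncio

    async def fake_fetch(name: str) -> str:
        # "await" is the point where this coroutine tells Python:
        # "I'm waiting, go run something else in the meantime"
        await asyncio.sleep(5)  # stand-in for a slow network call
        return f"{name} done"

    async def main():
        # run three coroutines cooperatively; total time is ~5 seconds, not 15
        results = await asyncio.gather(
            fake_fetch("page1"),
            fake_fetch("page2"),
            fake_fetch("page3"),
        )
        print(results)

    asyncio.run(main())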

    Below is a sketch that solves the requests.get() problem with asyncio, but it's probably better to use libraries that are built around asyncio.
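
    One way to do it with just asyncio and requests (assuming Python 3.9+ for asyncio.to_thread(); the URLs are placeholders):

    import asyncio
    import requests

    # placeholder URLs, same idea as the sequential example above
    urls = [f"https://example.com/page{i}" for i in range(10)]

    async def main():
        # requests.get() still blocks, so hand each call to a worker thread
        # via asyncio.to_thread() and await all of them concurrently
        responses = await asyncio.gather(*(asyncio.to_thread(requests.get, url) for url in urls))
        print([r.status_code for r in responses])

    asyncio.run(main())

    An asyncio-native HTTP library such as aiohttp or httpx can do the same thing without the hidden worker threads, which is why those are usually the better choice.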