Asynchronous Requests with Python requests

The answer below is not applicable to requests v0.13.0+. The asynchronous functionality was moved to grequests after this question was written. However, you can just replace requests with grequests below and it should work.

I've left this answer as is to reflect the original question, which was about using requests < v0.13.0.

To do multiple tasks asynchronously you have to:

  1. Define a function for what you want to do with each object (your task)
  2. Add that function as an event hook in your request
  3. Call async.map on a list of all the requests / actions
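The key to step 2 is that you pass the function object itself, without parentheses, and the library later calls it with the response. A tiny stdlib-only sketch of that callback idea (the names and items here are made up for illustration):

```python
results = []

# Step 1: define a task to run on each item
def do_something(item):
    results.append(item.upper())

# Step 2/3: register the callable under a hook name, then dispatch it
# for each item, mimicking how the library invokes the hook with a response
hooks = {'response': do_something}
for item in ['a', 'b']:
    hooks['response'](item)

print(results)  # → ['A', 'B']
```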


from requests import async
# If using requests > v0.13.0, use
# from grequests import async

# Placeholder URLs for illustration; substitute your own
urls = ['http://example.com/1', 'http://example.com/2', 'http://example.com/3']

# A simple task to do to each response object
def do_something(response):
    print(response.url)

# A list to hold our things to do via async
async_list = []

for u in urls:
    # The hooks = {... part is where you define what you want to do
    # Note the lack of parentheses following do_something, this is
    # because the response will be used as the first argument automatically
    action_item = async.get(u, hooks={'response': do_something})

    # Add the task to our list of things to do via async
    async_list.append(action_item)

# Do our list of things to do via async
async.map(async_list)

async is now an independent module: grequests.

See also: Ideal method for sending multiple HTTP requests over Python?


Install it:

$ pip install grequests


build a stack:

import grequests

# Placeholder URLs for illustration; substitute your own
urls = ['http://example.com/1', 'http://example.com/2', 'http://example.com/3', 'http://example.com/4', 'http://example.com/5']

rs = (grequests.get(u) for u in urls)

send the stack:

grequests.map(rs)

result looks like

[<Response [200]>, <Response [200]>, <Response [200]>, <Response [200]>, <Response [200]>]

grequests doesn't seem to set a limit on concurrent requests by default, i.e. when multiple requests are sent to the same server. (grequests.map does accept a size argument that caps the pool size, e.g. grequests.map(rs, size=10).)

I tested both requests-futures and grequests. Grequests is faster but brings monkey patching and additional problems with dependencies. requests-futures is several times slower than grequests. I decided to write my own and simply wrapped requests into ThreadPoolExecutor and it was almost as fast as grequests, but without external dependencies.

import requests
import concurrent.futures

def get_urls():
    # Placeholder URLs; substitute your own
    return ["url1", "url2"]

def load_url(url, timeout):
    return requests.get(url, timeout=timeout)

resp_ok = 0
resp_err = 0

with concurrent.futures.ThreadPoolExecutor(max_workers=20) as executor:
    future_to_url = {executor.submit(load_url, url, 10): url for url in get_urls()}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
        try:
            data = future.result()
        except Exception as exc:
            resp_err = resp_err + 1
        else:
            resp_ok = resp_ok + 1
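The snippet above needs live URLs to run. As a self-contained illustration of the same ThreadPoolExecutor pattern, here is a sketch with a stub standing in for requests.get (the URLs, delay, and failure condition are invented for the demo):

```python
import concurrent.futures
import time

# Stand-in for requests.get so the pattern runs without a network
def load_url(url, timeout):
    time.sleep(0.01)  # simulate a little network latency
    if url.endswith("bad"):
        raise ValueError("simulated connection error")
    return "body of " + url

urls = ["http://example.com/a", "http://example.com/b", "http://example.com/bad"]

resp_ok = 0
resp_err = 0
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
    # Map each future back to its URL so we know which one finished
    future_to_url = {executor.submit(load_url, url, 10): url for url in urls}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
        try:
            data = future.result()  # raises if the worker raised
        except Exception:
            resp_err += 1
        else:
            resp_ok += 1

print(resp_ok, resp_err)  # → 2 1
```

as_completed yields futures in completion order, not submission order, which is why the future_to_url dict is needed to recover the originating URL.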
