web crawler – Sending User-agent using Requests library in Python

The User-Agent should be specified as a field in the request headers.

Here is a list of HTTP header fields; you'd probably be interested in the request-specific fields, which include User-Agent.
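If you want to see what User-Agent a plain request sends before overriding it, one quick way is to echo your request headers back, e.g. via httpbin.org (which is also used further down). A minimal sketch, assuming that service is reachable:

import requests

# httpbin echoes the headers it received back as JSON
response = requests.get("https://httpbin.org/headers")
print(response.json()["headers"]["User-Agent"])  # e.g. "python-requests/2.x.x"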

If you're using requests v2.13 or newer

The simplest way to do what you want is to create a dictionary and specify your headers directly, like so:

import requests

url = "SOME URL"

headers = {
    "User-Agent": "My User Agent 1.0",
    "From": "[email protected]",  # This is another valid field
}

response = requests.get(url, headers=headers)
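To check that the header actually went out, you can point the same snippet at an echo endpoint. A minimal sketch, again assuming httpbin.org:

import requests

headers = {"User-Agent": "My User Agent 1.0"}
response = requests.get("https://httpbin.org/headers", headers=headers)
# httpbin reflects the headers it received, so this prints the custom value
print(response.json()["headers"]["User-Agent"])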

If you're using requests v2.12.x or older

Older versions of requests clobbered default headers, so you'd want to do the following to preserve the default headers and then add your own to them.

import requests

url = "SOME URL"

# Get a copy of the default headers that requests would use
headers = requests.utils.default_headers()

# Update the headers with your custom ones.
# You don't have to worry about case sensitivity with
# the dictionary keys, because default_headers uses a custom
# CaseInsensitiveDict implementation within requests' source code.
headers.update(
    {
        "User-Agent": "My User Agent 1.0",
    }
)

response = requests.get(url, headers=headers)
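If you're curious what those defaults contain, you can simply print them; the exact values depend on your requests version, but the shape looks roughly like this:

import requests

print(requests.utils.default_headers())
# Typically something like (values vary by version):
# {'User-Agent': 'python-requests/2.x.x', 'Accept-Encoding': 'gzip, deflate',
#  'Accept': '*/*', 'Connection': 'keep-alive'}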

It's more convenient to use a session; that way you don't have to remember to set the headers on each request:

import requests

session = requests.Session()
session.headers.update({"User-Agent": "Custom user agent"})

session.get("https://httpbin.org/headers")

By default, a session also manages cookies for you. In case you want to disable that, see this question.
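As a quick illustration of that cookie handling (a minimal sketch, again using httpbin.org): a cookie set during one request is sent back automatically on the next request made through the same session:

import requests

session = requests.Session()
# This endpoint sets a cookie; the session stores it in its cookie jar
session.get("https://httpbin.org/cookies/set/sessioncookie/123456789")
# The stored cookie is sent automatically on the follow-up request
response = session.get("https://httpbin.org/cookies")
print(response.json())  # {'cookies': {'sessioncookie': '123456789'}}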


Alternatively, you can simply do it like below:

import requests

url = "SOME URL"

response = requests.post(url, headers={
    "FUser": "your username",  # FUser/FPass are custom, site-specific header fields
    "FPass": "your password",
    "User-Agent": "your custom text for the user agent",
})
