Extensions for Python HTTP libraries to support sending and receiving custom proxy headers during HTTPS CONNECT tunneling.
When making HTTPS requests through a proxy, the connection is established via a CONNECT tunnel. During this process:
- Sending headers to the proxy - Most Python HTTP libraries don't provide an easy way to send custom headers (like `X-ProxyMesh-Country`) to the proxy server during the CONNECT handshake.
- Receiving headers from the proxy - The proxy's response headers from the CONNECT request are typically discarded, making it impossible to read custom headers (like `X-ProxyMesh-IP`) that the proxy sends back.
This library solves both problems for popular Python HTTP libraries.
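For context, here is a minimal sketch of what the CONNECT handshake looks like on the wire. The host, port, header names, and sample response below are illustrative values, not part of this library's API:

```python
# Illustrative sketch of an HTTPS CONNECT handshake carrying a custom
# proxy header. All hosts, ports, and header values are example data.

def build_connect_request(host: str, port: int, proxy_headers: dict) -> str:
    """Build the raw CONNECT request a client sends to the proxy."""
    lines = [f"CONNECT {host}:{port} HTTP/1.1", f"Host: {host}:{port}"]
    lines += [f"{name}: {value}" for name, value in proxy_headers.items()]
    return "\r\n".join(lines) + "\r\n\r\n"

def parse_connect_response(raw: str) -> dict:
    """Parse headers out of the proxy's CONNECT response
    (the part most HTTP libraries discard)."""
    head = raw.split("\r\n\r\n", 1)[0]
    status_line, *header_lines = head.split("\r\n")
    headers = {}
    for line in header_lines:
        name, _, value = line.partition(":")
        headers[name.strip()] = value.strip()
    return headers

request = build_connect_request("httpbin.org", 443, {"X-ProxyMesh-Country": "US"})
print(request)

# A proxy that supports custom headers might answer the CONNECT like this:
sample_response = (
    "HTTP/1.1 200 Connection established\r\n"
    "X-ProxyMesh-IP: 203.0.113.7\r\n\r\n"
)
print(parse_connect_response(sample_response))
```

For comparison, the standard library's `http.client.HTTPSConnection.set_tunnel()` does accept a `headers` argument for the sending side, but it exposes no way to read the proxy's CONNECT response headers, which is the second problem this library addresses.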
| Library | Module | Use Case |
|---|---|---|
| urllib3 | `urllib3_proxy_manager` | Low-level HTTP client |
| requests | `requests_adapter` | Simple HTTP requests |
| aiohttp | `aiohttp_proxy` | Async HTTP client |
| httpx | `httpx_proxy` | Modern HTTP client |
| pycurl | `pycurl_proxy` | libcurl bindings |
| cloudscraper | `cloudscraper_proxy` | Cloudflare bypass |
| autoscraper | `autoscraper_proxy` | Automatic web scraping |
```shell
pip install python-proxy-headers
```

Then install the HTTP library you want to use (e.g., `pip install requests`).
Note: This package has no dependencies by default - install only what you need.
Using `requests`:

```python
from python_proxy_headers.requests_adapter import ProxySession

with ProxySession(proxy_headers={'X-ProxyMesh-Country': 'US'}) as session:
    session.proxies = {'https': 'http://user:pass@proxy.example.com:8080'}
    response = session.get('https://httpbin.org/ip')
    # Proxy headers are merged into response.headers
    print(response.headers.get('X-ProxyMesh-IP'))
```

Using `httpx`:

```python
from python_proxy_headers.httpx_proxy import get

response = get(
    'https://httpbin.org/ip',
    proxy='http://user:pass@proxy.example.com:8080'
)
# Proxy CONNECT response headers are merged into response.headers
print(response.headers.get('X-ProxyMesh-IP'))
```

Using `aiohttp`:

```python
import asyncio
from python_proxy_headers.aiohttp_proxy import ProxyClientSession

async def main():
    async with ProxyClientSession() as session:
        async with session.get(
            'https://httpbin.org/ip',
            proxy='http://user:pass@proxy.example.com:8080'
        ) as response:
            # Proxy headers merged into response.headers
            print(response.headers.get('X-ProxyMesh-IP'))

asyncio.run(main())
```

Using `pycurl`:

```python
import pycurl
from python_proxy_headers.pycurl_proxy import set_proxy_headers, HeaderCapture

c = pycurl.Curl()
c.setopt(pycurl.URL, 'https://httpbin.org/ip')
c.setopt(pycurl.PROXY, 'http://proxy.example.com:8080')

# Add these two lines to any existing pycurl code
set_proxy_headers(c, {'X-ProxyMesh-Country': 'US'})
capture = HeaderCapture(c)

c.perform()
print(capture.proxy_headers)  # Headers from proxy CONNECT response
c.close()
```

Using `cloudscraper`:

```python
from python_proxy_headers.cloudscraper_proxy import create_scraper

# Drop-in replacement for cloudscraper.create_scraper()
scraper = create_scraper(proxy_headers={'X-ProxyMesh-Country': 'US'})
scraper.proxies = {'https': 'http://proxy.example.com:8080'}
response = scraper.get('https://example.com')
# All CloudScraper features (Cloudflare bypass) preserved
```

A test harness is included to verify proxy header functionality:

```shell
# Set your proxy
export PROXY_URL='http://user:pass@proxy.example.com:8080'

# Test all modules
python test_proxy_headers.py

# Test specific modules
python test_proxy_headers.py requests httpx

# Verbose output (show header values)
python test_proxy_headers.py -v
```

For detailed documentation, API reference, and more examples:
- Full Documentation: python-proxy-headers.readthedocs.io
- Example Code: proxy-examples for Python
- scrapy-proxy-headers - Proxy header support for Scrapy
Created by ProxyMesh to help our customers use custom headers to control proxy behavior. Works with any proxy that supports custom headers.
MIT License