Proxy.py Dashboard (#141)

* Remove redundant variables

* Initialize frontend dashboard app (written in typescript)

* Add a WebsocketFrame.text method to quickly build a text frame raw packet. Also close the connection after static file serving; at least Google Chrome seems to hang up instead of closing the connection
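For context, a text-frame builder along these lines can be sketched as follows. This is a hypothetical, minimal version following the RFC 6455 frame layout; the real `WebsocketFrame.text` lives in proxy.py's websocket module and handles more cases:

```python
import struct

def websocket_text_frame(payload: bytes) -> bytes:
    # FIN bit set, opcode 0x1 (text); unmasked, as for server-to-client frames.
    frame = bytearray([0x80 | 0x01])
    length = len(payload)
    if length < 126:
        frame.append(length)                # 7-bit payload length
    elif length < (1 << 16):
        frame.append(126)
        frame += struct.pack('!H', length)  # 16-bit extended length
    else:
        frame.append(127)
        frame += struct.pack('!Q', length)  # 64-bit extended length
    return bytes(frame) + payload

print(websocket_text_frame(b'hello').hex())  # 810568656c6c6f
```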

* Add read_and_build_static_file_response method for reusability in plugins
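A helper of this shape might look like the following sketch. This is illustrative only: proxy.py's actual `read_and_build_static_file_response` is a method on the web server plugin and uses the project's own response-building utilities; the `Connection: close` header mirrors the browser hang-up issue mentioned above:

```python
import mimetypes
import os
import tempfile

def read_and_build_static_file_response(path: str) -> bytes:
    # Read the file and wrap it in a minimal HTTP/1.1 response.
    with open(path, 'rb') as f:
        body = f.read()
    content_type = mimetypes.guess_type(path)[0] or 'application/octet-stream'
    head = (
        b'HTTP/1.1 200 OK\r\n' +
        b'Content-Type: ' + content_type.encode() + b'\r\n' +
        b'Content-Length: ' + str(len(body)).encode() + b'\r\n' +
        b'Connection: close\r\n\r\n'  # close; some browsers hang otherwise
    )
    return head + body

# Demo against a throwaway HTML file.
with tempfile.NamedTemporaryFile(suffix='.html', delete=False) as tmp:
    tmp.write(b'<h1>hi</h1>')
response = read_and_build_static_file_response(tmp.name)
os.unlink(tmp.name)
print(response.split(b'\r\n')[0].decode())  # HTTP/1.1 200 OK
```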

* Tear down websocket connection when opcode CONNECTION_CLOSE is received

* First draft of proxy.py dashboard

* Remove uglify, obfuscator is superb enough

* Correct generic V

* First draft of dashboard

* ProtocolConfig is now Flags

* First big refactor toward no-single-file-module

* Working tests

* Update dashboard for refactored imports

* Remove proxy.py as now we can just call python -m proxy -h

* Fix setup.py for refactored code

* Banner update

* Lint check

* Fix dashboard static serving and no UNDER_TEST constant necessary

* Add support for plugin imports when specified in path/to/module.MyPlugin
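Loading a plugin from a `path/to/module.MyPlugin` style string can be sketched like this (an illustrative loader, not proxy.py's exact implementation; it assumes the module path itself contains no dots):

```python
import importlib.util
import os
import tempfile

def load_plugin(plugin: str):
    # 'path/to/module.MyPlugin' -> module file 'path/to/module.py', class 'MyPlugin'
    module_path, klass_name = plugin.rsplit('.', 1)
    spec = importlib.util.spec_from_file_location(
        os.path.basename(module_path), module_path + '.py')
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return getattr(module, klass_name)

# Demo: write a throwaway plugin module and load its class by path.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, 'myplugin.py'), 'w') as f:
        f.write('class MyPlugin:\n    name = "demo"\n')
    klass = load_plugin(os.path.join(d, 'myplugin') + '.MyPlugin')
    print(klass.name)  # demo
```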

* Update README with instructions to run proxy.py after refactor

* Move dashboard under /dashboard path

* Rename to devtools.ts

* Remove unused code

* Update github workflow for new directory structure

* Update test command too

* Fix coverage generation

* `*.py` is invalid syntax on Windows

* No `*` on Windows

* Enable execution via github zip downloads

* GitHub zip downloads cannot be executed because GitHub puts the project under a folder named after the repository; this breaks the Python interpreter's expectation of finding a `__main__.py` in the archive root
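The underlying rule: Python executes a zip archive directly only when `__main__.py` sits at the archive root, which the nested layout of GitHub's zip downloads violates. A quick demonstration with throwaway archives (the nested folder name is a stand-in):

```python
import os
import subprocess
import sys
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as d:
    good = os.path.join(d, 'good.zip')
    bad = os.path.join(d, 'bad.zip')
    with zipfile.ZipFile(good, 'w') as z:
        z.writestr('__main__.py', 'print("ran")')  # at archive root: runnable
    with zipfile.ZipFile(bad, 'w') as z:
        # GitHub-style nesting under a project-named folder: not runnable
        z.writestr('project-branch/__main__.py', 'print("ran")')
    good_run = subprocess.run([sys.executable, good],
                              capture_output=True, text=True)
    bad_run = subprocess.run([sys.executable, bad],
                             capture_output=True, text=True)

print(good_run.stdout.strip())   # ran
print(bad_run.returncode != 0)   # True
```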

* Forget zip runs for now

* Initialize ProxyDashboard on page load rather than within typescript i.e. on script load

* Enforce eslint with standard style

* Add .editorconfig to make editor compatible with various style requirements (Makefile, Typescript, Python)

* Remove extra empty line

* Add ability to pass headers with HttpRequestRejected exception, also remove proxy agent header for HttpRequestRejected
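A sketch of the exception-with-headers idea (hypothetical shape; proxy.py's real `HttpRequestRejected` lives in its http exception module and builds responses with the project's own utilities):

```python
from typing import Dict, Optional

class HttpRequestRejected(Exception):
    """Reject a request with a status code and optional extra headers."""

    def __init__(self, status_code: int, reason: bytes,
                 headers: Optional[Dict[bytes, bytes]] = None) -> None:
        self.status_code = status_code
        self.reason = reason
        self.headers = headers or {}  # note: no proxy agent header added

    def response(self) -> bytes:
        lines = [b'HTTP/1.1 %d %s' % (self.status_code, self.reason)]
        lines += [k + b': ' + v for k, v in self.headers.items()]
        return b'\r\n'.join(lines) + b'\r\n\r\n'

e = HttpRequestRejected(308, b'Permanent Redirect',
                        headers={b'Location': b'/dashboard/'})
print(e.response().decode().splitlines()[1])  # Location: /dashboard/
```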

* Fix tests

* Move common code under common sub-module

* Move flags under common module

* Move acceptor under core

* Move connection under core submodule

* Move chunk_parser under http

* Move http_parser as http/parser

* Move http_methods as http/methods

* Move http_proxy as http/proxy

* Move web_server as http/server

* Move status_codes as http/codes

* Move websocket as http/websocket

* Move exception under http/exception, also move http/proxy exceptions under http/exceptions

* Move protocol_handler as http/handler

* Move devtools as http/devtools

* Move version under common/version

* Lifecycle is now core Event

* autopep8

* Add core event queue

* Register / unregister handler
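The register/unregister pattern over a core event queue can be sketched as follows (hypothetical names; proxy.py's actual eventing API may differ):

```python
import queue
import threading

class EventQueue:
    """Publishers put events on a shared queue; a dispatcher fans them
    out to registered handlers."""

    def __init__(self) -> None:
        self.q: queue.Queue = queue.Queue()
        self.handlers: dict = {}
        self.lock = threading.Lock()

    def register(self, sub_id, handler) -> None:
        with self.lock:
            self.handlers[sub_id] = handler

    def unregister(self, sub_id) -> None:
        with self.lock:
            self.handlers.pop(sub_id, None)

    def publish(self, event) -> None:
        self.q.put(event)

    def dispatch_one(self) -> None:
        event = self.q.get()
        with self.lock:
            handlers = list(self.handlers.values())
        for handler in handlers:
            handler(event)

events = EventQueue()
seen = []
events.register('sub-1', seen.append)
events.publish({'event': 'work_started'})
events.dispatch_one()
print(seen)  # [{'event': 'work_started'}]
```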

* Enable inspection support for frontend dashboard

* Don't give an illusion of exception for HttpProtocolExceptions

* Update readme for refactored codebase

* DictQueueType everywhere

* Move all websocket API related code under WebsocketApi class

* Inspection enabled on tab switch.

1. Additionally, acceptors are now assigned an int id.
2. Fix tests to match change in constructor.

* Corresponding ends of the work queues can be closed immediately.

Since work queues between AcceptorPool and Acceptor processes are used only
once, close the corresponding ends as soon as possible instead of at shutdown.
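The idea can be sketched with a bare `multiprocessing.Pipe` (illustrative; the real code passes one-time work between AcceptorPool and Acceptor processes): the parent closes its copy of the child's end right after fork, and closes its own end as soon as the one-time exchange is done.

```python
import multiprocessing

def child(work_end):
    payload = work_end.recv()        # one-time payload from parent
    work_end.send(payload.upper())   # reply once, then close
    work_end.close()

def run_once(payload):
    parent_end, child_end = multiprocessing.Pipe()
    p = multiprocessing.Process(target=child, args=(child_end,))
    p.start()
    child_end.close()     # parent's copy of the child's end: close asap
    parent_end.send(payload)
    reply = parent_end.recv()
    parent_end.close()    # this channel is used exactly once
    p.join()
    return reply

if __name__ == '__main__':
    print(run_once('hello'))  # HELLO
```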

* No need of a manager for shared multiprocess Lock.

This unnecessarily creates an additional manager process.
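For comparison, a plain `multiprocessing.Lock()` can be shared with child processes directly, whereas `multiprocessing.Manager().Lock()` would spawn a separate broker process just to proxy lock calls. A minimal sketch:

```python
import multiprocessing

def work(lock, counter):
    with lock:               # direct OS-level lock, no manager proxy
        counter.value += 1

def run_workers(n):
    lock = multiprocessing.Lock()    # shared directly, no Manager() process
    counter = multiprocessing.Value('i', 0, lock=False)
    procs = [multiprocessing.Process(target=work, args=(lock, counter))
             for _ in range(n)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return counter.value

if __name__ == '__main__':
    print(run_workers(4))  # 4
```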

* Move threadless into its own module

* Merge acceptor and acceptor_pool tests

* Defer os.close

* Change content display with tab clicks.

Also ensure relay manager shutdown.

* Remove --cov flags

* Use right type for SyncManager

* Ensure coverage again

* Print help to discover flags, --cov certainly not available on Travis for some reason

* Add pytest-cov to requirements-testing

* Re-add Windows on .travis, also add changelog to README

* Use 3.7 and no pip upgrade since it fails on Travis Windows

* Attempt to fix pip install on Windows

* Disable Windows on Travis, it fails and uses 3.8. Try reporting coverage from GitHub Actions

* Move away from coveralls, use codecov

* Codecov app installation either didn't work or token still needs to be passed

* Remove travis CI

* Use https://github.com/codecov/codecov-action for coverage uploads

* Remove run codecov

* Ha, codecov action only works on Linux, what a mess

* Add cookie.js though unable to use it with es5/es6 modules yet

* Enable testing for Python 3.8, also build dashboard during testing

* No Python 3.8 on GitHub Actions yet

* Autopep8

* Add separate workflows for library (python) and dashboard (node) app

* Type jobs not job

* Add checkout

* Fix parsing node version

* Fix dashboard build on windows

* Show codecov instead of coveralls
Commit e14548252c by Abhinav Singh, 2019-10-28 14:57:33 -07:00, committed via GitHub (parent 3b2b2e5dd5).
98 changed files with 12024 additions and 6268 deletions

.editorconfig (new file)

@@ -0,0 +1,18 @@
root = true
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[Makefile]
indent_size = tab
[*.py]
indent_style = space
indent_size = 4
[*.ts]
indent_style = space
indent_size = 2

.github/workflows/test-dashboard.yml (new file)

@@ -0,0 +1,30 @@
name: Proxy.py Dashboard
on: [push]
jobs:
  build:
    runs-on: ${{ matrix.os }}-latest
    name: Node ${{ matrix.node }} on ${{ matrix.os }}
    strategy:
      matrix:
        os: [macOS, ubuntu, windows]
        node: [10.x]
      max-parallel: 4
      fail-fast: false
    steps:
    - uses: actions/checkout@v1
    - name: Setup Node
      uses: actions/setup-node@v1
      with:
        node-version: ${{ matrix.node }}
    - name: Install Dependencies
      run: |
        cd dashboard
        npm install
        cd ..
    - name: Build Dashboard
      run: |
        cd dashboard
        npm run build
        cd ..

@@ -1,4 +1,4 @@
name: Proxy.py
name: Proxy.py Library
on: [push]
@@ -18,17 +18,17 @@ jobs:
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python }}-dev
architecture: x64
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements-testing.txt
- name: Quality Check
run: |
# The GitHub editor is 127 chars wide
# W504 screams for line break after binary operators
flake8 --ignore=W504 --max-line-length=127 proxy.py plugin_examples.py tests.py setup.py benchmark.py
# mypy compliance check
mypy --strict --ignore-missing-imports proxy.py plugin_examples.py tests.py setup.py benchmark.py
flake8 --ignore=W504 --max-line-length=127 proxy/ tests/ benchmark/ plugin_examples/ dashboard/dashboard.py setup.py
mypy --strict --ignore-missing-imports proxy/ tests/ benchmark/ plugin_examples/ dashboard/dashboard.py setup.py
- name: Run Tests
run: pytest tests.py
run: pytest --cov=proxy tests/
- name: Upload coverage to Codecov
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
run: codecov

.gitignore

@@ -1,10 +1,11 @@
.coverage
.coverage*
.idea
.vscode
.project
.pydevproject
.settings
.mypy_cache
coverage.xml
node_modules
venv
cover

@@ -1,17 +0,0 @@
language: python
env:
- TESTING_ON_TRAVIS=1
matrix:
include:
- name: "Python 3.7 on Xenial Linux"
python: 3.7
- name: "Python 3.7 on macOS"
os: osx
osx_image: xcode11
language: shell
python: 3.7
install:
- pip3 install -r requirements-testing.txt
script: python3 -m coverage run --source=proxy tests.py || python -m coverage run --source=proxy tests.py
after_success:
- coveralls

@@ -1,2 +1,3 @@
include LICENSE
include README.md
include requirements.txt

@@ -2,7 +2,7 @@ SHELL := /bin/bash
NS ?= abhinavsingh
IMAGE_NAME ?= proxy.py
VERSION ?= v$(shell python proxy.py --version)
VERSION ?= v$(shell python -m proxy --version)
LATEST_TAG := $(NS)/$(IMAGE_NAME):latest
IMAGE_TAG := $(NS)/$(IMAGE_NAME):$(VERSION)
@@ -15,7 +15,7 @@ CA_SIGNING_KEY_FILE_PATH := ca-signing-key.pem
.PHONY: all clean test package test-release release coverage lint autopep8
.PHONY: container run-container release-container https-certificates ca-certificates
.PHONY: profile
.PHONY: profile dashboard clean-dashboard
all: clean test
@@ -24,10 +24,14 @@ clean:
find . -name '*.pyo' -exec rm -f {} +
find . -name '*~' -exec rm -f {} +
rm -f .coverage
rm -rf htmlcov dist build .pytest_cache proxy.py.egg-info
rm -rf htmlcov
rm -rf dist
rm -rf build
rm -rf proxy.py.egg-info
rm -rf .pytest_cache
test:
python -m unittest tests
test: lint
python -m unittest tests/*.py
package: clean
python setup.py sdist bdist_wheel
@@ -39,18 +43,21 @@ release: package
twine upload dist/*
coverage:
coverage3 run --source=proxy,plugin_examples tests.py
coverage3 html
pytest --cov=proxy --cov-report=html tests/
open htmlcov/index.html
lint:
flake8 --ignore=W504 --max-line-length=127 proxy.py plugin_examples.py tests.py setup.py benchmark.py
mypy --strict --ignore-missing-imports proxy.py plugin_examples.py tests.py setup.py benchmark.py
flake8 --ignore=W504 --max-line-length=127 proxy/ tests/ benchmark/ plugin_examples/ dashboard/dashboard.py setup.py
mypy --strict --ignore-missing-imports proxy/ tests/ benchmark/ plugin_examples/ dashboard/dashboard.py setup.py
autopep8:
autopep8 --recursive --in-place --aggressive proxy.py
autopep8 --recursive --in-place --aggressive tests.py
autopep8 --recursive --in-place --aggressive plugin_examples.py
autopep8 --recursive --in-place --aggressive proxy/*.py
autopep8 --recursive --in-place --aggressive proxy/*/*.py
autopep8 --recursive --in-place --aggressive tests/*.py
autopep8 --recursive --in-place --aggressive plugin_examples/*.py
autopep8 --recursive --in-place --aggressive benchmark/*.py
autopep8 --recursive --in-place --aggressive dashboard/*.py
autopep8 --recursive --in-place --aggressive setup.py
container:
docker build -t $(LATEST_TAG) -t $(IMAGE_TAG) .
@@ -79,3 +86,9 @@ ca-certificates:
profile:
sudo py-spy -F -f profile.svg -d 3600 proxy.py
dashboard:
pushd dashboard && npm run build && popd
clean-dashboard:
rm -rf public/dashboard

@@ -1,2 +0,0 @@
# See https://devcenter.heroku.com/articles/procfile
web: python3 proxy.py --hostname 0.0.0.0 --port $PORT

README.md

@@ -5,7 +5,7 @@
[![Docker Pulls](https://img.shields.io/docker/pulls/abhinavsingh/proxy.py?color=green)](https://hub.docker.com/r/abhinavsingh/proxy.py)
[![Build Status](https://travis-ci.org/abhinavsingh/proxy.py.svg?branch=develop)](https://travis-ci.org/abhinavsingh/proxy.py/)
[![No Dependencies](https://img.shields.io/static/v1?label=dependencies&message=none&color=green)](https://github.com/abhinavsingh/proxy.py)
[![Coverage](https://coveralls.io/repos/github/abhinavsingh/proxy.py/badge.svg?branch=develop)](https://coveralls.io/github/abhinavsingh/proxy.py?branch=develop)
[![Coverage](https://codecov.io/gh/abhinavsingh/proxy.py/branch/develop/graph/badge.svg)](https://codecov.io/gh/abhinavsingh/proxy.py)
[![Tested With MacOS](https://img.shields.io/static/v1?label=tested%20with&message=mac%20OS%20%F0%9F%92%BB&color=brightgreen)](https://developer.apple.com/library/archive/documentation/IDEs/Conceptual/iOS_Simulator_Guide/Introduction/Introduction.html)
[![Tested With Ubuntu](https://img.shields.io/static/v1?label=tested%20with&message=Ubuntu%20%F0%9F%96%A5&color=brightgreen)](https://developer.apple.com/library/archive/documentation/IDEs/Conceptual/iOS_Simulator_Guide/Introduction/Introduction.html)
@@ -34,14 +34,16 @@ Table of Contents
* [Stable version](#stable-version)
* [Development version](#development-version)
* [Start proxy.py](#start-proxypy)
* [Command Line](#command-line)
* [From command line when installed using PIP](#from-command-line-when-installed-using-pip)
* [From command line using repo source](#from-command-line-using-repo-source)
* [Docker Image](#docker-image)
* [Stable version](#stable-version-from-docker-hub)
* [Development version](#build-development-version-locally)
* [Customize Startup Flags](#customize-startup-flags)
* [Plugin Examples](#plugin-examples)
* [ShortLinkPlugin](#shortlinkplugin)
* [ModifyPostDataPlugin](#modifypostdataplugin)
* [ProposedRestApiPlugin](#proposedrestapiplugin)
* [MockRestApiPlugin](#mockrestapiplugin)
* [RedirectToCustomServerPlugin](#redirecttocustomserverplugin)
* [FilterByUpstreamHostPlugin](#filterbyupstreamhostplugin)
* [CacheResponsesPlugin](#cacheresponsesplugin)
@@ -61,17 +63,20 @@ Table of Contents
* [proxy.WebsocketClient](#proxywebsocketclient)
* [Embed proxy.py](#embed-proxypy)
* [Plugin Developer and Contributor Guide](#plugin-developer-and-contributor-guide)
* [Start proxy.py from repo source](#start-proxypy-from-repo-source)
* [Everything is a plugin](#everything-is-a-plugin)
* [Internal Architecture](#internal-architecture)
* [Internal Documentation](#internal-documentation)
* [Sending a Pull Request](#sending-a-pull-request)
* [Frequently Asked Questions](#frequently-asked-questions)
* [SyntaxError: invalid syntax](#syntaxerror-invalid-syntax)
* [Unable to connect with proxy.py from remote host](#unable-to-connect-with-proxypy-from-remote-host)
* [Basic auth not working with a browser](#basic-auth-not-working-with-a-browser)
* [Docker image not working on MacOS](#docker-image-not-working-on-macos)
* [Unable to load custom plugins](#unable-to-load-custom-plugins)
* [ValueError: filedescriptor out of range in select](#valueerror-filedescriptor-out-of-range-in-select)
* [Flags](#flags)
* [Changelog](#changelog)
Features
========
@@ -100,7 +105,6 @@ Features
0.022 [332] |■
```
- Lightweight
- Distributed as a single file module `~100KB`
- Uses only `~5-20MB` RAM
- No external dependency other than standard Python library
- Programmable
@@ -108,6 +112,10 @@ Features
- Customize proxy and http routing via [plugins](https://github.com/abhinavsingh/proxy.py/blob/develop/plugin_examples.py)
- Enable plugin using command line option e.g. `--plugins plugin_examples.CacheResponsesPlugin`
- Plugin API is currently in development state, expect breaking changes.
- Realtime Dashboard
- Optionally enable bundled dashboard. Available at `http://localhost:8899/dashboard`.
- Inspect, Monitor, Control and Configure `proxy.py` at runtime.
- Extend dashboard using plugins.
- Secure
- Enable end-to-end encryption between clients and `proxy.py` using TLS
- See [End-to-End Encryption](#end-to-end-encryption)
@@ -138,38 +146,32 @@ or from GitHub `master` branch
$ pip install git+https://github.com/abhinavsingh/proxy.py.git@master
or simply `wget` it:
$ wget -q https://raw.githubusercontent.com/abhinavsingh/proxy.py/master/proxy.py
or download from here [proxy.py](https://raw.githubusercontent.com/abhinavsingh/proxy.py/master/proxy.py)
## Development version
$ pip install git+https://github.com/abhinavsingh/proxy.py.git@develop
For `Docker` usage see [Docker Image](#docker-image).
For `Docker` installation see [Docker Image](#docker-image).
Start proxy.py
==============
## Command line
## From command line when installed using PIP
Simply type `proxy.py` on command line to start it with default configuration.
Simply type `proxy` on command line to start it with default configuration.
```
$ proxy.py
...[redacted]... - Loaded plugin <class 'proxy.HttpProxyPlugin'>
$ proxy
...[redacted]... - Loaded plugin proxy.http_proxy.HttpProxyPlugin
...[redacted]... - Starting 8 workers
...[redacted]... - Started server on ::1:8899
```
Things to notice from above logs:
- `Loaded plugin` - `proxy.py` will load `HttpProxyPlugin` by default. It adds `http(s)`
proxy server capabilities to `proxy.py`
- `Loaded plugin` - `proxy.py` will load `proxy.http.proxy.HttpProxyPlugin` by default.
As the name suggests, this core plugin adds `http(s)` proxy server capabilities to `proxy.py`
- `Started N workers` - Use `--num-workers` flag to customize number of `Worker` processes.
- `Started N workers` - Use `--num-workers` flag to customize number of worker processes.
By default, `proxy.py` will start as many workers as there are CPU cores on the machine.
- `Started server on ::1:8899` - By default, `proxy.py` listens on IPv6 `::1`, which
@@ -184,9 +186,9 @@ All the logs above are `INFO` level logs, default `--log-level` for `proxy.py`.
Lets start `proxy.py` with `DEBUG` level logging:
```
$ proxy.py --log-level d
$ proxy --log-level d
...[redacted]... - Open file descriptor soft limit set to 1024
...[redacted]... - Loaded plugin <class 'proxy.HttpProxyPlugin'>
...[redacted]... - Loaded plugin proxy.http_proxy.HttpProxyPlugin
...[redacted]... - Started 8 workers
...[redacted]... - Started server on ::1:8899
```
@@ -199,6 +201,24 @@ As we can see, before starting up:
See [flags](#flags) for full list of available configuration options.
## From command line using repo source
When `proxy.py` is installed using `pip`,
a binary named `proxy` is placed under the `bin` folder.
When running `proxy.py` from the repo source, no such
binary exists, so use the module form instead.
To start `proxy.py` from source code, use:
```
$ git clone https://github.com/abhinavsingh/proxy.py.git
$ cd proxy.py
$ python -m proxy
```
Also see [Plugin Developer and Contributor Guide](#plugin-developer-and-contributor-guide)
if you plan to work with `proxy.py` source code.
## Docker image
#### Stable Version from Docker Hub
@@ -210,7 +230,9 @@ See [flags](#flags) for full list of available configuration options.
$ git clone https://github.com/abhinavsingh/proxy.py.git
$ cd proxy.py
$ make container
$ docker run -it -p 8899:8899 --rm abhinavsingh/proxy.py:v$(./proxy.py -v)
$ docker run -it -p 8899:8899 --rm abhinavsingh/proxy.py:latest
### Customize startup flags
By default `docker` binary is started with IPv4 networking flags:
@@ -230,7 +252,7 @@ For example, to check `proxy.py` version within Docker image:
Plugin Examples
===============
See [plugin_examples.py](https://github.com/abhinavsingh/proxy.py/blob/develop/plugin_examples.py) for full code.
See [plugin_examples](https://github.com/abhinavsingh/proxy.py/tree/develop/plugin_examples) for full code.
All the examples below also work with `https` traffic but require additional flags and certificate generation.
See [TLS Interception](#tls-interception).
@@ -242,8 +264,8 @@ Add support for short links in your favorite browsers / applications.
Start `proxy.py` as:
```
$ proxy.py \
--plugins plugin_examples.ShortLinkPlugin
$ proxy \
--plugins plugin_examples/shortlink.ShortLinkPlugin
```
Now you can speed up your daily browsing experience by visiting your
@@ -271,8 +293,8 @@ Modifies POST request body before sending request to upstream server.
Start `proxy.py` as:
```
$ proxy.py \
--plugins plugin_examples.ModifyPostDataPlugin
$ proxy \
--plugins plugin_examples/modify_post_data.ModifyPostDataPlugin
```
By default plugin replaces POST body content with hardcoded `b'{"key": "modified"}'`
@@ -305,7 +327,7 @@ Note following from the response above:
1. POST data was modified `"data": "{\"key\": \"modified\"}"`.
Original `curl` command data was `{"key": "value"}`.
2. Our `curl` command didn't add any `Content-Type` header,
2. Our `curl` command did not add any `Content-Type` header,
but our plugin did add one `"Content-Type": "application/json"`.
Same can also be verified by looking at `json` field in the output above:
```
@@ -316,7 +338,7 @@ Note following from the response above:
3. Our plugin also added a `Content-Length` header to match length
of modified body.
## ProposedRestApiPlugin
## MockRestApiPlugin
Mock responses for your server REST API.
Use to test and develop client side applications
@@ -325,8 +347,8 @@ without need of an actual upstream REST API server.
Start `proxy.py` as:
```
$ proxy.py \
--plugins plugin_examples.ProposedRestApiPlugin
$ proxy \
--plugins plugin_examples/mock_rest_api.ProposedRestApiPlugin
```
Verify mock API response using `curl -x localhost:8899 http://api.example.com/v1/users/`
@@ -356,9 +378,9 @@ also running on `8899` port.
Start `proxy.py` and enable inbuilt web server:
```
$ proxy.py \
$ proxy \
--enable-web-server \
--plugins plugin_examples.RedirectToCustomServerPlugin
--plugins plugin_examples/redirect_to_custom_server.RedirectToCustomServerPlugin
```
Verify using `curl -v -x localhost:8899 http://google.com`
@@ -390,8 +412,8 @@ By default, plugin drops traffic for `google.com` and `www.google.com`.
Start `proxy.py` as:
```
$ proxy.py \
--plugins plugin_examples.FilterByUpstreamHostPlugin
$ proxy \
--plugins plugin_examples/filter_by_upstream.FilterByUpstreamHostPlugin
```
Verify using `curl -v -x localhost:8899 http://google.com`:
@@ -410,7 +432,7 @@ Above `418 I'm a tea pot` is sent by our plugin.
Verify the same by inspecting logs for `proxy.py`:
```
2019-09-24 19:21:37,893 - ERROR - pid:50074 - handle_readables:1347 - ProtocolException type raised
2019-09-24 19:21:37,893 - ERROR - pid:50074 - handle_readables:1347 - HttpProtocolException type raised
Traceback (most recent call last):
... [redacted] ...
2019-09-24 19:21:37,897 - INFO - pid:50074 - access_log:1157 - ::1:49911 - GET None:None/ - None None - 0 bytes
@@ -423,8 +445,8 @@ Caches Upstream Server Responses.
Start `proxy.py` as:
```
$ proxy.py \
--plugins plugin_examples.CacheResponsesPlugin
$ proxy \
--plugins plugin_examples/cache_responses.CacheResponsesPlugin
```
Verify using `curl -v -x localhost:8899 http://httpbin.org/get`:
@@ -499,8 +521,8 @@ Modifies upstream server responses.
Start `proxy.py` as:
```
$ proxy.py \
--plugins plugin_examples.ManInTheMiddlePlugin
$ proxy \
--plugins plugin_examples/man_in_the_middle.ManInTheMiddlePlugin
```
Verify using `curl -v -x localhost:8899 http://google.com`:
@@ -547,7 +569,7 @@ make https-certificates
Start `proxy.py` as:
```
$ proxy.py \
$ proxy \
--cert-file https-cert.pem \
--key-file https-key.pem
```
@@ -570,7 +592,7 @@ Verify using `curl -x https://localhost:8899 --proxy-cacert https-cert.pem https
TLS Interception
=================
By default, `proxy.py` doesn't decrypt `https` traffic between client and server.
By default, `proxy.py` will not decrypt `https` traffic between client and server.
To enable TLS interception first generate CA certificates:
```
@@ -581,8 +603,8 @@ Lets also enable `CacheResponsePlugin` so that we can verify decrypted
response from the server. Start `proxy.py` as:
```
$ proxy.py \
--plugins plugin_examples.CacheResponsesPlugin \
$ proxy \
--plugins plugin_examples/cache_responses.CacheResponsesPlugin \
--ca-key-file ca-key.pem \
--ca-cert-file ca-cert.pem \
--ca-signing-key-file ca-signing-key.pem
@@ -754,6 +776,16 @@ for all available classes and utility methods.
Plugin Developer and Contributor Guide
======================================
## Start proxy.py from repo source
Contributors must start `proxy.py` from source to verify and develop new features / fixes.
Start `proxy.py` as:
$ git clone https://github.com/abhinavsingh/proxy.py.git
$ cd proxy.py
$ python -m proxy
## Everything is a plugin
As you might have guessed by now, in `proxy.py` everything is a plugin.
@@ -767,30 +799,30 @@ As you might have guessed by now, in `proxy.py` everything is a plugin.
Example, [FilterByUpstreamHostPlugin](#filterbyupstreamhostplugin).
- We also enabled inbuilt web server using `--enable-web-server`.
Inbuilt web server implements `ProtocolHandlerPlugin` plugin.
See documentation of [ProtocolHandlerPlugin](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L793-L850)
for available lifecycle hooks. Use `ProtocolHandlerPlugin` to add
Inbuilt web server implements `HttpProtocolHandlerPlugin` plugin.
See documentation of [HttpProtocolHandlerPlugin](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L793-L850)
for available lifecycle hooks. Use `HttpProtocolHandlerPlugin` to add
new features for http(s) clients. Example,
[HttpWebServerPlugin](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L1185-L1260).
- There also is a `--disable-http-proxy` flag. It disables inbuilt proxy server.
Use this flag with `--enable-web-server` flag to run `proxy.py` as a programmable
http(s) server. [HttpProxyPlugin](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L941-L1182)
also implements `ProtocolHandlerPlugin`.
also implements `HttpProtocolHandlerPlugin`.
## Internal Architecture
- [ProtocolHandler](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L1263-L1440)
- [HttpProtocolHandler](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L1263-L1440)
thread is started with the accepted [TcpClientConnection](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L230-L237).
`ProtocolHandler` is responsible for parsing incoming client request and invoking
`ProtocolHandlerPlugin` lifecycle hooks.
`HttpProtocolHandler` is responsible for parsing incoming client request and invoking
`HttpProtocolHandlerPlugin` lifecycle hooks.
- `HttpProxyPlugin` which implements `ProtocolHandlerPlugin` also has its own plugin
- `HttpProxyPlugin` which implements `HttpProtocolHandlerPlugin` also has its own plugin
mechanism. Its responsibility is to establish connection between client and
upstream [TcpServerConnection](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L204-L227)
and invoke `HttpProxyBasePlugin` lifecycle hooks.
- `ProtocolHandler` threads are started by [Acceptor](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L424-L472)
- `HttpProtocolHandler` threads are started by [Acceptor](https://github.com/abhinavsingh/proxy.py/blob/b03629fa0df1595eb4995427bc601063be7fdca9/proxy.py#L424-L472)
processes.
- `--num-workers` `Acceptor` processes are started by
@@ -799,7 +831,7 @@ and invoke `HttpProxyBasePlugin` lifecycle hooks.
- `AcceptorPool` listens on the server socket and passes the handler to `Acceptor` processes.
Workers are responsible for accepting new client connections and starting
`ProtocolHandler` thread.
`HttpProtocolHandler` thread.
## Sending a Pull Request
@@ -829,55 +861,26 @@ Example:
```
$ pydoc3 proxy
CLASSES
abc.ABC(builtins.object)
HttpProxyBasePlugin
HttpWebServerBasePlugin
DevtoolsWebsocketPlugin
HttpWebServerPacFilePlugin
ProtocolHandlerPlugin
DevtoolsProtocolPlugin
HttpProxyPlugin
HttpWebServerPlugin
TcpConnection
TcpClientConnection
TcpServerConnection
WebsocketClient
ThreadlessWork
ProtocolHandler(threading.Thread, ThreadlessWork)
builtins.Exception(builtins.BaseException)
ProtocolException
HttpRequestRejected
ProxyAuthenticationFailed
ProxyConnectionFailed
TcpConnectionUninitializedException
builtins.object
AcceptorPool
ChunkParser
HttpParser
ProtocolConfig
WebsocketFrame
builtins.tuple(builtins.object)
ChunkParserStates
HttpMethods
HttpParserStates
HttpParserTypes
HttpProtocolTypes
HttpStatusCodes
TcpConnectionTypes
WebsocketOpcodes
contextlib.ContextDecorator(builtins.object)
socket_connection
multiprocessing.context.Process(multiprocessing.process.BaseProcess)
Acceptor
Threadless
threading.Thread(builtins.object)
ProtocolHandler(threading.Thread, ThreadlessWork)
PACKAGE CONTENTS
__main__
common (package)
core (package)
http (package)
main
FILE
/Users/abhinav/Dev/proxy.py/proxy/__init__.py
```
Frequently Asked Questions
==========================
## SyntaxError: invalid syntax
Make sure you are using `Python 3`. Verify the version before running `proxy.py`:
`$ python --version`
## Unable to connect with proxy.py from remote host
Make sure `proxy.py` is listening on correct network interface.
@@ -1077,3 +1080,20 @@ optional arguments:
Proxy.py not working? Report at:
https://github.com/abhinavsingh/proxy.py/issues/new
```
Changelog
=========
- `v2.x`
- No longer ~~a single file module~~.
- Added dashboard app.
- `v1.x`
- `Python3` only.
- Deprecated support for ~~Python 2.x~~.
- Added support for multi accept.
- Added plugin support.
- `v0.x`
- Single file.
- Single threaded server.
For detailed changelog refer

@@ -14,7 +14,10 @@ import sys
import time
from typing import List, Tuple
import proxy
from proxy.common.constants import __homepage__, DEFAULT_BUFFER_SIZE
from proxy.common.utils import build_http_request
from proxy.http.methods import httpMethods
from proxy.http.parser import httpParserStates, httpParserTypes, HttpParser
DEFAULT_N = 1
@@ -27,13 +30,14 @@ def init_parser() -> argparse.ArgumentParser:
'keep-alive connections are opened. Over each opened '
'connection multiple pipelined request / response '
'packets are exchanged with proxy.py web server.',
epilog='Proxy.py not working? Report at: %s/issues/new' % proxy.__homepage__
epilog='Proxy.py not working? Report at: %s/issues/new' % __homepage__
)
parser.add_argument(
'--n', '-n',
type=int,
default=DEFAULT_N,
help='Default: ' + str(DEFAULT_N) + '. See description above for meaning of N.'
help='Default: ' + str(DEFAULT_N) +
'. See description above for meaning of N.'
)
return parser
@@ -42,7 +46,8 @@ class Benchmark:
def __init__(self, n: int = DEFAULT_N) -> None:
self.n = n
self.clients: List[Tuple[asyncio.StreamReader, asyncio.StreamWriter]] = []
self.clients: List[Tuple[asyncio.StreamReader,
asyncio.StreamWriter]] = []
async def open_connections(self) -> None:
for _ in range(self.n):
@@ -53,18 +58,18 @@ class Benchmark:
async def send(writer: asyncio.StreamWriter) -> None:
try:
while True:
writer.write(proxy.build_http_request(
proxy.httpMethods.GET, b'/'
writer.write(build_http_request(
httpMethods.GET, b'/'
))
await asyncio.sleep(0.01)
except KeyboardInterrupt:
pass
@staticmethod
def parse_pipeline_response(response: proxy.HttpParser, raw: bytes, counter: int = 0) -> \
Tuple[proxy.HttpParser, int]:
def parse_pipeline_response(response: HttpParser, raw: bytes, counter: int = 0) -> \
Tuple[HttpParser, int]:
response.parse(raw)
if response.state != proxy.httpParserStates.COMPLETE:
if response.state != httpParserStates.COMPLETE:
# Need more data
return response, counter
@@ -72,22 +77,25 @@ class Benchmark:
# No more buffer left to parse
return response, counter + 1
# For pipelined requests we may have pending buffer, try parse them as responses
pipelined_response = proxy.HttpParser(proxy.httpParserTypes.RESPONSE_PARSER)
return Benchmark.parse_pipeline_response(pipelined_response, response.buffer, counter + 1)
# For pipelined requests we may have pending buffer, try parse them as
# responses
pipelined_response = HttpParser(httpParserTypes.RESPONSE_PARSER)
return Benchmark.parse_pipeline_response(
pipelined_response, response.buffer, counter + 1)
@staticmethod
async def recv(idd: int, reader: asyncio.StreamReader) -> None:
print_every = 1000
last_print = time.time()
num_completed_requests: int = 0
response = proxy.HttpParser(proxy.httpParserTypes.RESPONSE_PARSER)
response = HttpParser(httpParserTypes.RESPONSE_PARSER)
try:
while True:
raw = await reader.read(proxy.DEFAULT_BUFFER_SIZE)
response, total_parsed = Benchmark.parse_pipeline_response(response, raw)
if response.state == proxy.httpParserStates.COMPLETE:
response = proxy.HttpParser(proxy.httpParserTypes.RESPONSE_PARSER)
raw = await reader.read(DEFAULT_BUFFER_SIZE)
response, total_parsed = Benchmark.parse_pipeline_response(
response, raw)
if response.state == httpParserStates.COMPLETE:
response = HttpParser(httpParserTypes.RESPONSE_PARSER)
if total_parsed > 0:
num_completed_requests += total_parsed
# print('total parsed %d' % total_parsed)

dashboard/.eslintrc.json (new file)

@@ -0,0 +1,23 @@
{
  "env": {
    "browser": true,
    "commonjs": true,
    "es6": true
  },
  "extends": [
    "standard"
  ],
  "globals": {
    "Atomics": "readonly",
    "SharedArrayBuffer": "readonly"
  },
  "parser": "@typescript-eslint/parser",
  "parserOptions": {
    "ecmaVersion": 2018
  },
  "plugins": [
    "@typescript-eslint"
  ],
  "rules": {
  }
}

dashboard/dashboard.py (new file)

@@ -0,0 +1,162 @@
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import os
import json
import queue
import logging
import threading
import multiprocessing
import multiprocessing.managers
import uuid
from typing import List, Tuple, Optional, Any
from proxy.http.server import HttpWebServerPlugin, HttpWebServerBasePlugin, httpProtocolTypes
from proxy.http.parser import HttpParser
from proxy.http.websocket import WebsocketFrame
from proxy.http.codes import httpStatusCodes
from proxy.common.utils import build_http_response, bytes_
from proxy.common.types import DictQueueType
from proxy.core.connection import TcpClientConnection
logger = logging.getLogger(__name__)
class ProxyDashboard(HttpWebServerBasePlugin):
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.inspection_enabled: bool = False
self.relay_thread: Optional[threading.Thread] = None
self.relay_shutdown: Optional[threading.Event] = None
self.relay_manager: Optional[multiprocessing.managers.SyncManager] = None
self.relay_channel: Optional[DictQueueType] = None
self.relay_sub_id: Optional[str] = None
def routes(self) -> List[Tuple[int, bytes]]:
return [
# Redirects to /dashboard/
(httpProtocolTypes.HTTP, b'/dashboard'),
# Redirects to /dashboard/
(httpProtocolTypes.HTTPS, b'/dashboard'),
# Redirects to /dashboard/
(httpProtocolTypes.HTTP, b'/dashboard/proxy.html'),
# Redirects to /dashboard/
(httpProtocolTypes.HTTPS, b'/dashboard/proxy.html'),
(httpProtocolTypes.HTTP, b'/dashboard/'),
(httpProtocolTypes.HTTPS, b'/dashboard/'),
(httpProtocolTypes.WEBSOCKET, b'/dashboard'),
]
def handle_request(self, request: HttpParser) -> None:
if request.path == b'/dashboard/':
self.client.queue(
HttpWebServerPlugin.read_and_build_static_file_response(
os.path.join(self.flags.static_server_dir, 'dashboard', 'proxy.html')))
elif request.path in (
b'/dashboard',
b'/dashboard/proxy.html'):
self.client.queue(build_http_response(
httpStatusCodes.PERMANENT_REDIRECT, reason=b'Permanent Redirect',
headers={
b'Location': b'/dashboard/',
b'Content-Length': b'0',
b'Connection': b'close',
}
))
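The redirect branch above queues a fixed 308 response. A hedged sketch of the bytes it produces (the exact header ordering of proxy.py's `build_http_response` is an assumption here):

```python
def build_redirect_response(location: bytes) -> bytes:
    # Approximation of the 308 response queued above; proxy.py's
    # build_http_response may order headers differently.
    return (b'HTTP/1.1 308 Permanent Redirect\r\n'
            b'Location: ' + location + b'\r\n'
            b'Content-Length: 0\r\n'
            b'Connection: close\r\n'
            b'\r\n')
```

`Content-Length: 0` plus `Connection: close` lets the browser follow the `Location` header immediately without waiting for a body.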
def on_websocket_open(self) -> None:
logger.info('app ws opened')
def on_websocket_message(self, frame: WebsocketFrame) -> None:
try:
assert frame.data
message = json.loads(frame.data)
except UnicodeDecodeError:
logger.error(frame.data)
logger.info(frame.opcode)
return
if message['method'] == 'ping':
self.reply_pong(message['id'])
elif message['method'] == 'enable_inspection':
# inspection can only be enabled if --enable-events is used
if not self.flags.enable_events:
self.client.queue(
WebsocketFrame.text(
bytes_(
json.dumps(
{'id': message['id'], 'response': 'not enabled'})
)
)
)
else:
self.inspection_enabled = True
self.relay_shutdown = threading.Event()
self.relay_manager = multiprocessing.Manager()
self.relay_channel = self.relay_manager.Queue()
self.relay_thread = threading.Thread(
target=self.relay_events,
args=(self.relay_shutdown, self.relay_channel, self.client))
self.relay_thread.start()
self.relay_sub_id = uuid.uuid4().hex
self.event_queue.subscribe(
self.relay_sub_id, self.relay_channel)
elif message['method'] == 'disable_inspection':
if self.inspection_enabled:
self.shutdown_relay()
self.inspection_enabled = False
else:
logger.info(frame.data)
logger.info(frame.opcode)
def shutdown_relay(self) -> None:
assert self.relay_manager
assert self.relay_shutdown
assert self.relay_thread
self.relay_shutdown.set()
self.relay_thread.join()
self.relay_manager.shutdown()
self.relay_manager = None
self.relay_thread = None
self.relay_shutdown = None
self.relay_channel = None
self.relay_sub_id = None
def on_websocket_close(self) -> None:
logger.info('app ws closed')
if self.inspection_enabled:
self.shutdown_relay()
def reply_pong(self, idd: int) -> None:
self.client.queue(
WebsocketFrame.text(
bytes_(
json.dumps({'id': idd, 'response': 'pong'}))))
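The ping/pong exchange is a tiny JSON protocol carried over websocket frames: the client sends `{"id": N, "method": "ping"}` and the server echoes the id back with `"response": "pong"`. Stripped of the frame encoding, the server side reduces to (function name illustrative):

```python
import json
from typing import Optional

def handle_dashboard_message(raw: bytes) -> Optional[bytes]:
    """Reply to ping messages; return None for methods handled elsewhere."""
    message = json.loads(raw)
    if message.get('method') == 'ping':
        return json.dumps({'id': message['id'], 'response': 'pong'}).encode()
    return None
```

Echoing the client-chosen `id` is what lets the frontend match the pong against its `lastPingId` and compute round-trip latency.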
@staticmethod
def relay_events(
shutdown: threading.Event,
channel: DictQueueType,
client: TcpClientConnection) -> None:
while not shutdown.is_set():
try:
ev = channel.get(timeout=1)
client.queue(
WebsocketFrame.text(
bytes_(
json.dumps(ev))))
except queue.Empty:
pass
except EOFError:
break
except KeyboardInterrupt:
break
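`relay_events` is a standard queue-drain loop: block on the channel with a short timeout so the shutdown event is noticed promptly rather than hanging in `get()`. A self-contained sketch with a plain `queue.Queue` and an in-memory sink standing in for the websocket client (names hypothetical):

```python
import json
import queue
import threading
import time

def relay(shutdown: threading.Event, channel: queue.Queue, sink: list) -> None:
    while not shutdown.is_set():
        try:
            # Short timeout keeps the loop responsive to shutdown.set().
            ev = channel.get(timeout=0.05)
        except queue.Empty:
            continue
        sink.append(json.dumps(ev))

shutdown = threading.Event()
channel: queue.Queue = queue.Queue()
sink: list = []
t = threading.Thread(target=relay, args=(shutdown, channel, sink))
t.start()
channel.put({'event': 'request_complete'})
for _ in range(200):  # wait (up to ~2s) for the relay to pick up the event
    if sink:
        break
    time.sleep(0.01)
shutdown.set()
t.join()
```

This mirrors `shutdown_relay` above: set the event, then `join()` the thread before tearing down the manager that owns the queue.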

dashboard/package-lock.json (generated, new file)

File diff suppressed because it is too large.

dashboard/package.json (new file)
@@ -0,0 +1,53 @@
{
"name": "proxy.py",
"version": "1.0.0",
"description": "Frontend dashboard for proxy.py",
"main": "index.js",
"scripts": {
"clean": "rm -rf build",
"lint": "eslint --global $ src/*.ts",
"pretest": "npm run clean && npm run lint && tsc --target es5 --outDir build test/test.ts",
"test": "jasmine build/test/test.js",
"build": "npm test && rollup -c",
"start": "pushd ../public && http-server -g true -i false -d false -c-1 --no-dotfiles . && popd",
"watch": "rollup -c -w"
},
"repository": {
"type": "git",
"url": "git+https://github.com/abhinavsingh/proxy.py.git"
},
"author": "Abhinav Singh",
"license": "BSD-3-Clause",
"bugs": {
"url": "https://github.com/abhinavsingh/proxy.py/issues"
},
"homepage": "https://github.com/abhinavsingh/proxy.py#readme",
"devDependencies": {
"@types/jasmine": "^3.4.4",
"@types/jquery": "^3.3.31",
"@types/js-cookie": "^2.2.4",
"@typescript-eslint/eslint-plugin": "^2.5.0",
"@typescript-eslint/parser": "^2.5.0",
"chrome-devtools-frontend": "^1.0.706688",
"eslint": "^6.5.1",
"eslint-config-standard": "^14.1.0",
"eslint-plugin-import": "^2.18.2",
"eslint-plugin-node": "^10.0.0",
"eslint-plugin-promise": "^4.2.1",
"eslint-plugin-standard": "^4.0.1",
"http-server": "^0.11.1",
"jasmine": "^3.5.0",
"jasmine-ts": "^0.3.0",
"jquery": "^3.4.1",
"js-cookie": "^2.2.1",
"jsdom": "^15.2.0",
"ncp": "^2.0.0",
"rollup": "^1.24.0",
"rollup-plugin-copy": "^3.1.0",
"rollup-plugin-javascript-obfuscator": "^1.0.4",
"rollup-plugin-typescript": "^1.0.1",
"ts-node": "^7.0.1",
"typescript": "^3.6.4",
"ws": "^7.2.0"
}
}

@@ -0,0 +1,39 @@
const typescript = require('rollup-plugin-typescript');
const copy = require('rollup-plugin-copy');
const obfuscatorPlugin = require('rollup-plugin-javascript-obfuscator');
module.exports = {
input: 'src/proxy.ts',
output: {
file: '../public/dashboard/proxy.js',
format: 'umd',
name: 'projectbundle',
sourcemap: true
},
plugins: [
typescript(),
copy({
targets: [{
src: 'static/**/*',
dest: '../public/dashboard',
}, {
src: 'src/proxy.html',
dest: '../public/dashboard',
}, {
src: 'src/proxy.css',
dest: '../public/dashboard',
}],
}),
obfuscatorPlugin({
log: false,
sourceMap: true,
compact: true,
stringArray: true,
rotateStringArray: true,
transformObjectKeys: true,
stringArrayThreshold: 1,
stringArrayEncoding: 'rc4',
identifierNamesGenerator: 'mangled',
})
]
};

@@ -0,0 +1,9 @@
let jsdom = require('jsdom');
let WebSocket = require('ws')
const window = new jsdom.JSDOM('<!DOCTYPE html><head><title></title></head><body></body></html>').window;
global.jQuery = global.$ = require('jquery')(window);
global.window = window;
global.document = window.document;
global.WebSocket = WebSocket;

@@ -0,0 +1,11 @@
{
"spec_dir": "spec",
"spec_files": [
"**/*[sS]pec.js"
],
"helpers": [
"helpers/**/*.js"
],
"stopSpecOnExpectationFailure": false,
"random": true
}

dashboard/src/devtools.ts (new file)
@@ -0,0 +1,37 @@
/*
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
*/
const path = require('path')
const fs = require('fs')
const ncp = require('ncp').ncp
ncp.limit = 16
const publicFolderPath = path.join(__dirname, 'public')
const destinationFolderPath = path.join(publicFolderPath, 'devtools')
const publicFolderExists = fs.existsSync(publicFolderPath)
if (!publicFolderExists) {
console.error(publicFolderPath + ' folder doesn\'t exist, make sure you are in the right directory.')
process.exit(1)
}
const destinationFolderExists = fs.existsSync(destinationFolderPath)
if (!destinationFolderExists) {
console.error(destinationFolderPath + ' folder doesn\'t exist, make sure you are in the right directory.')
process.exit(1)
}
const chromeDevTools = path.dirname(require.resolve('chrome-devtools-frontend/front_end/inspector.html'))
console.log(chromeDevTools + ' ---> ' + destinationFolderPath)
ncp(chromeDevTools, destinationFolderPath, (err: any) => {
if (err) {
return console.error(err)
}
console.log('Copy successful!!!')
})

dashboard/src/proxy.css (new file)
@@ -0,0 +1,35 @@
#app {
background-color: #eeeeee;
height: 100%;
}
#app .remove-api-spec {
top: 10px;
right: 15px;
}
.api-path-spec .list-group-item {
padding-left: 50px;
}
#app-header {
padding-top: 10px;
padding-bottom: 5px;
}
#app-body .list-group {
margin-left: 15px;
margin-right: 15px;
margin-bottom: 10px;
}
.sunny-morning-white-gradient{
background-image: linear-gradient(120deg,#eeeeee 0,#ffecd2 100%);
}
.mean-fruit-white-gradient{
background-image:linear-gradient(120deg,#eeeeee 0,#ffefbf 100%);
}
.proxy-data {
display: none;
}

dashboard/src/proxy.html (new file)
@@ -0,0 +1,147 @@
<html lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8">
<link rel="stylesheet" href="proxy.css">
<link rel="stylesheet" href="bootstrap-4.3.1.min.css">
<link rel="stylesheet" href="font-awesome-4.7.0.min.css">
<title>Proxy.py Dashboard</title>
</head>
<body>
<nav class="navbar navbar-expand-lg navbar-dark bg-dark">
<a class="navbar-brand" href="#">
<i class="fa fa-fw fa-dashboard"></i>
PROXY.PY
</a>
<button class="navbar-toggler" type="button" data-toggle="collapse"
data-target="#proxyTopNav" aria-controls="proxyTopNav"
aria-expanded="false" aria-label="Toggle navigation">
<span class="navbar-toggler-icon"></span>
</button>
<div class="collapse navbar-collapse" id="proxyTopNav">
<ul class="navbar-nav ml-auto">
<li class="nav-item">
<a class="nav-link" href="#" id="proxyHome">
<i class="fa fa-fw fa-home"></i>
Home
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" id="proxyApiDevelopment">
<i class="fa fa-fw fa-connectdevelop"></i>
API Development
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" id="proxyInspect">
<i class="fa fa-fw fa-binoculars"></i>
Inspect Traffic
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" id="proxyShortLinks">
<i class="fa fa-fw fa-bolt"></i>
Short Links
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" id="proxyControls">
<i class="fa fa-fw fa-lock"></i>
Traffic Controls
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" id="proxySettings">
<i class="fa fa-fw fa-cogs"></i>
Settings
</a>
</li>
</ul>
</div>
</nav>
<div id="app">
<div id="home" class="proxy-data"></div>
<div id="api-development" class="proxy-data">
<div id="app-header">
<div class="container-fluid">
<div class="row">
<div class="col-6">
<p class="h3">API Development</p>
</div>
<div class="col-6 text-right">
<button type="button" class="btn btn-primary">
<i class="fa fa-fw fa-plus-circle"></i>
Create New API
</button>
</div>
</div>
</div>
</div>
<div id="app-body">
<div class="list-group position-relative">
<a href="#" class="list-group-item default text-decoration-none bg-light"
data-toggle="collapse" data-target="#api-example-com-path-specs" data-parent="#app">
api.example.com <span class="badge badge-info">3 Resources</span>
</a>
<button class="position-absolute fa fa-close ml-auto btn btn-danger remove-api-spec"
title="Delete api.example.com"></button>
<div id="api-example-com-path-specs" class="collapse api-path-spec">
<div class="list-group-item bg-light">
/v1/users/
</div>
<div class="list-group-item bg-light">
/v1/groups/
</div>
<div class="list-group-item bg-light">
/v1/messages/
</div>
</div>
</div>
<div class="list-group position-relative">
<a href="#" class="list-group-item default text-decoration-none bg-light"
data-toggle="collapse" data-target="#my-api" data-parent="#app">
my.api <span class="badge badge-info">1 Resource</span>
</a>
<button class="position-absolute fa fa-close ml-auto btn btn-danger remove-api-spec"
title="Delete my.api"></button>
<div id="my-api" class="collapse api-path-spec">
<div class="list-group-item bg-light">
/api/
</div>
</div>
</div>
</div>
</div>
<div id="inspect-traffic" class="proxy-data"></div>
<div id="short-links" class="proxy-data"></div>
<div id="traffic-controls" class="proxy-data"></div>
<div id="settings" class="proxy-data"></div>
</div>
<nav class="navbar fixed-bottom navbar-dark bg-dark" id="proxyBottomNav">
<div class="mr-auto text-danger">
<a class="text-reset small" href="#" id="proxyServerStatus">
<i class="fa fa-fw fa-signal"></i>
Server Status <i id="proxyServerStatusSummary"></i>
</a>
</div>
<div class="ml-auto text-white-50">
<a class="text-reset small" href="#">
<i class="fa fa-fw fa-copyright"></i>
Abhinav Singh and contributors
</a>
</div>
</nav>
<script src="jquery-3.3.1.slim.min.js"></script>
<script src="popper-1.14.7.min.js"></script>
<script src="bootstrap-4.3.1.min.js"></script>
<script src="js.cookie-v3.0.0-beta.0.min.js"></script>
<script src="renderjson-b31d877.js"></script>
<script src="proxy.js"></script>
<script>
document.addEventListener('DOMContentLoaded', function() {
let proxyDashboard = new window.ProxyDashboard();
}, false);
</script>
</body>
</html>

dashboard/src/proxy.ts (new file)
@@ -0,0 +1,217 @@
/*
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
*/
class ApiDevelopment {
private specs: Map<string, Map<string, JSON>>;
constructor () {
this.specs = new Map()
this.fetchExistingSpecs()
}
private fetchExistingSpecs () {
// TODO: Fetch list of currently configured APIs from the backend
const apiExampleOrgSpec = new Map()
apiExampleOrgSpec.set('/v1/users/', {
count: 2,
next: null,
previous: null,
results: [
{
email: 'you@example.com',
groups: [],
url: 'api.example.org/v1/users/1/',
username: 'admin'
},
{
email: 'someone@example.com',
groups: [],
url: 'api.example.org/v1/users/2/',
username: 'someone'
}
]
})
this.specs.set('api.example.org', apiExampleOrgSpec)
}
}
class WebsocketApi {
private hostname: string = 'localhost';
private port: number = 8899;
private wsPrefix: string = '/dashboard';
private wsScheme: string = 'ws';
private ws: WebSocket;
private wsPath: string = this.wsScheme + '://' + this.hostname + ':' + this.port + this.wsPrefix;
private mid: number = 0;
private lastPingId: number;
private lastPingTime: number;
private readonly schedulePingEveryMs: number = 1000;
private readonly scheduleReconnectEveryMs: number = 5000;
private serverPingTimer: number;
private serverConnectTimer: number;
private inspectionEnabled: boolean;
constructor () {
this.scheduleServerConnect(0)
}
public enableInspection () {
// TODO: Set flag to true only once response has been received from the server
this.inspectionEnabled = true
this.ws.send(JSON.stringify({ id: this.mid, method: 'enable_inspection' }))
this.mid++
}
public disableInspection () {
this.inspectionEnabled = false
this.ws.send(JSON.stringify({ id: this.mid, method: 'disable_inspection' }))
this.mid++
}
private scheduleServerConnect (after_ms: number = this.scheduleReconnectEveryMs) {
this.clearServerConnectTimer()
this.serverConnectTimer = window.setTimeout(
this.connectToServer.bind(this), after_ms)
}
private connectToServer () {
this.ws = new WebSocket(this.wsPath)
this.ws.onopen = this.onServerWSOpen.bind(this)
this.ws.onmessage = this.onServerWSMessage.bind(this)
this.ws.onerror = this.onServerWSError.bind(this)
this.ws.onclose = this.onServerWSClose.bind(this)
}
private clearServerConnectTimer () {
if (this.serverConnectTimer == null) {
return
}
window.clearTimeout(this.serverConnectTimer)
this.serverConnectTimer = null
}
private scheduleServerPing (after_ms: number = this.schedulePingEveryMs) {
this.clearServerPingTimer()
this.serverPingTimer = window.setTimeout(
this.pingServer.bind(this), after_ms)
}
private pingServer () {
this.lastPingId = this.mid
this.lastPingTime = ProxyDashboard.getTime()
this.mid++
// console.log('Pinging server with id:%d', this.last_ping_id);
this.ws.send(JSON.stringify({ id: this.lastPingId, method: 'ping' }))
}
private clearServerPingTimer () {
if (this.serverPingTimer != null) {
window.clearTimeout(this.serverPingTimer)
this.serverPingTimer = null
}
this.lastPingTime = null
this.lastPingId = null
}
private onServerWSOpen (ev: MessageEvent) {
this.clearServerConnectTimer()
ProxyDashboard.setServerStatusSuccess('Connected...')
this.scheduleServerPing(0)
}
private onServerWSMessage (ev: MessageEvent) {
const message = JSON.parse(ev.data)
if (message.id === this.lastPingId) {
ProxyDashboard.setServerStatusSuccess(
String((ProxyDashboard.getTime() - this.lastPingTime) + ' ms'))
this.clearServerPingTimer()
this.scheduleServerPing()
} else {
console.log(message)
}
}
private onServerWSError (ev: MessageEvent) {
ProxyDashboard.setServerStatusDanger()
}
private onServerWSClose (ev: MessageEvent) {
this.clearServerPingTimer()
this.scheduleServerConnect()
ProxyDashboard.setServerStatusDanger()
}
}
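On the client side, latency reporting works by remembering the id and timestamp of the most recent ping and matching them against the next reply; any other message falls through to the generic handler. The same bookkeeping as `lastPingId` / `lastPingTime`, sketched in Python (class and method names are illustrative):

```python
import time
from typing import Optional

class PingTracker:
    """Mirrors the dashboard's lastPingId / lastPingTime bookkeeping."""
    def __init__(self) -> None:
        self.mid = 0
        self.last_ping_id: Optional[int] = None
        self.last_ping_time: Optional[float] = None

    def ping(self) -> dict:
        self.last_ping_id = self.mid
        self.last_ping_time = time.time()
        self.mid += 1
        return {'id': self.last_ping_id, 'method': 'ping'}

    def on_message(self, message: dict) -> Optional[float]:
        # Only the reply to the most recent ping yields an RTT sample.
        if self.last_ping_id is not None and message.get('id') == self.last_ping_id:
            return (time.time() - self.last_ping_time) * 1000.0
        return None
```

In the dashboard, a matched pong also clears the ping timer and schedules the next ping, so the RTT shown in the status bar refreshes once per `schedulePingEveryMs`.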
export class ProxyDashboard {
private websocketApi: WebsocketApi
private apiDevelopment: ApiDevelopment
constructor () {
this.websocketApi = new WebsocketApi()
const that = this
$('#proxyTopNav>ul>li>a').on('click', function () {
that.switchTab(this)
})
this.apiDevelopment = new ApiDevelopment()
}
public static getTime () {
const date = new Date()
return date.getTime()
}
public static setServerStatusDanger () {
$('#proxyServerStatus').parent('div')
.removeClass('text-success')
.addClass('text-danger')
$('#proxyServerStatusSummary').text('')
}
public static setServerStatusSuccess (summary: string) {
$('#proxyServerStatus').parent('div')
.removeClass('text-danger')
.addClass('text-success')
$('#proxyServerStatusSummary').text(
'(' + summary + ')')
}
private switchTab (element: HTMLElement) {
const activeLi = $('#proxyTopNav>ul>li.active')
const activeTabId = activeLi.children('a').attr('id')
const clickedTabId = $(element).attr('id')
const clickedTabContentId = $(element).text().trim().toLowerCase().replace(' ', '-')
activeLi.removeClass('active')
$(element.parentNode).addClass('active')
console.log('Clicked id %s, showing %s', clickedTabId, clickedTabContentId)
$('#app>div.proxy-data').hide()
$('#' + clickedTabContentId).show()
// TODO: Tab ids shouldn't be hardcoded.
// Templatize proxy.html and refer to tab_id via enum or constants
//
// 1. Enable inspection if user moved to inspect tab
// 2. Disable inspection if user moved away from inspect tab
// 3. Do nothing if activeTabId == clickedTabId
if (clickedTabId !== activeTabId) {
if (clickedTabId === 'proxyInspect') {
this.websocketApi.enableInspection()
} else if (activeTabId === 'proxyInspect') {
this.websocketApi.disableInspection()
}
}
}
}
(window as any).ProxyDashboard = ProxyDashboard

File diff suppressed because one or more lines are too long.

Binary file not shown.

@@ -0,0 +1,2 @@
/*! js-cookie v3.0.0-beta.0 | MIT */
!function(e,n){"object"==typeof exports&&"undefined"!=typeof module?module.exports=n():"function"==typeof define&&define.amd?define(n):(e=e||self,function(){var t=e.Cookies,o=e.Cookies=n();o.noConflict=function(){return e.Cookies=t,o}}())}(this,function(){"use strict";function e(){for(var e={},n=0;n<arguments.length;n++){var t=arguments[n];for(var o in t)e[o]=t[o]}return e}function n(e){return e.replace(/(%[\dA-F]{2})+/gi,decodeURIComponent)}return function t(o){function r(n,t,r){if("undefined"!=typeof document){"number"==typeof(r=e(i.defaults,r)).expires&&(r.expires=new Date(1*new Date+864e5*r.expires)),r.expires&&(r.expires=r.expires.toUTCString()),t=o.write?o.write(t,n):encodeURIComponent(String(t)).replace(/%(23|24|26|2B|3A|3C|3E|3D|2F|3F|40|5B|5D|5E|60|7B|7D|7C)/g,decodeURIComponent),n=encodeURIComponent(String(n)).replace(/%(23|24|26|2B|5E|60|7C)/g,decodeURIComponent).replace(/[()]/g,escape);var c="";for(var f in r)r[f]&&(c+="; "+f,!0!==r[f]&&(c+="="+r[f].split(";")[0]));return document.cookie=n+"="+t+c}}var i={defaults:{path:"/"},set:r,get:function(e){if("undefined"!=typeof document&&(!arguments.length||e)){for(var t=document.cookie?document.cookie.split("; "):[],r={},i=0;i<t.length;i++){var c=t[i].split("="),f=c.slice(1).join("=");'"'===f.charAt(0)&&(f=f.slice(1,-1));try{var u=n(c[0]);if(r[u]=(o.read||o)(f,u)||n(f),e===u)break}catch(e){}}return e?r[e]:r}},remove:function(n,t){r(n,"",e(t,{expires:-1}))},withConverter:t};return i}(function(){})});

dashboard/static/popper-1.14.7.min.js (vendored, new file)

File diff suppressed because one or more lines are too long.

@@ -0,0 +1,189 @@
// Copyright © 2013-2014 David Caldwell <david@porkrind.org>
//
// Permission to use, copy, modify, and/or distribute this software for any
// purpose with or without fee is hereby granted, provided that the above
// copyright notice and this permission notice appear in all copies.
//
// THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
// WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
// MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
// SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
// WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION
// OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN
// CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
// Usage
// -----
// The module exports one entry point, the `renderjson()` function. It takes in
// the JSON you want to render as a single argument and returns an HTML
// element.
//
// Options
// -------
// renderjson.set_icons("+", "-")
// This Allows you to override the disclosure icons.
//
// renderjson.set_show_to_level(level)
// Pass the number of levels to expand when rendering. The default is 0, which
// starts with everything collapsed. As a special case, if level is the string
// "all" then it will start with everything expanded.
//
// renderjson.set_max_string_length(length)
// Strings will be truncated and made expandable if they are longer than
// `length`. As a special case, if `length` is the string "none" then
// there will be no truncation. The default is "none".
//
// renderjson.set_sort_objects(sort_bool)
// Sort objects by key (default: false)
//
// Theming
// -------
// The HTML output uses a number of classes so that you can theme it the way
// you'd like:
// .disclosure ("⊕", "⊖")
// .syntax (",", ":", "{", "}", "[", "]")
// .string (includes quotes)
// .number
// .boolean
// .key (object key)
// .keyword ("null", "undefined")
// .object.syntax ("{", "}")
// .array.syntax ("[", "]")
var module;
(module||{}).exports = renderjson = (function() {
var themetext = function(/* [class, text]+ */) {
var spans = [];
while (arguments.length)
spans.push(append(span(Array.prototype.shift.call(arguments)),
text(Array.prototype.shift.call(arguments))));
return spans;
};
var append = function(/* el, ... */) {
var el = Array.prototype.shift.call(arguments);
for (var a=0; a<arguments.length; a++)
if (arguments[a].constructor == Array)
append.apply(this, [el].concat(arguments[a]));
else
el.appendChild(arguments[a]);
return el;
};
var prepend = function(el, child) {
el.insertBefore(child, el.firstChild);
return el;
}
var isempty = function(obj) { for (var k in obj) if (obj.hasOwnProperty(k)) return false;
return true; }
var text = function(txt) { return document.createTextNode(txt) };
var div = function() { return document.createElement("div") };
var span = function(classname) { var s = document.createElement("span");
if (classname) s.className = classname;
return s; };
var A = function A(txt, classname, callback) { var a = document.createElement("a");
if (classname) a.className = classname;
a.appendChild(text(txt));
a.href = '#';
a.onclick = function() { callback(); return false; };
return a; };
function _renderjson(json, indent, dont_indent, show_level, max_string, sort_objects) {
var my_indent = dont_indent ? "" : indent;
var disclosure = function(open, placeholder, close, type, builder) {
var content;
var empty = span(type);
var show = function() { if (!content) append(empty.parentNode,
content = prepend(builder(),
A(renderjson.hide, "disclosure",
function() { content.style.display="none";
empty.style.display="inline"; } )));
content.style.display="inline";
empty.style.display="none"; };
append(empty,
A(renderjson.show, "disclosure", show),
themetext(type+ " syntax", open),
A(placeholder, null, show),
themetext(type+ " syntax", close));
var el = append(span(), text(my_indent.slice(0,-1)), empty);
if (show_level > 0)
show();
return el;
};
if (json === null) return themetext(null, my_indent, "keyword", "null");
if (json === void 0) return themetext(null, my_indent, "keyword", "undefined");
if (typeof(json) == "string" && json.length > max_string)
return disclosure('"', json.substr(0,max_string)+" ...", '"', "string", function () {
return append(span("string"), themetext(null, my_indent, "string", JSON.stringify(json)));
});
if (typeof(json) != "object") // Strings, numbers and bools
return themetext(null, my_indent, typeof(json), JSON.stringify(json));
if (json.constructor == Array) {
if (json.length == 0) return themetext(null, my_indent, "array syntax", "[]");
return disclosure("[", " ... ", "]", "array", function () {
var as = append(span("array"), themetext("array syntax", "[", null, "\n"));
for (var i=0; i<json.length; i++)
append(as,
_renderjson(json[i], indent+" ", false, show_level-1, max_string, sort_objects),
i != json.length-1 ? themetext("syntax", ",") : [],
text("\n"));
append(as, themetext(null, indent, "array syntax", "]"));
return as;
});
}
// object
if (isempty(json))
return themetext(null, my_indent, "object syntax", "{}");
return disclosure("{", "...", "}", "object", function () {
var os = append(span("object"), themetext("object syntax", "{", null, "\n"));
for (var k in json) var last = k;
var keys = Object.keys(json);
if (sort_objects)
keys = keys.sort();
for (var i in keys) {
var k = keys[i];
append(os, themetext(null, indent+" ", "key", '"'+k+'"', "object syntax", ': '),
_renderjson(json[k], indent+" ", true, show_level-1, max_string, sort_objects),
k != last ? themetext("syntax", ",") : [],
text("\n"));
}
append(os, themetext(null, indent, "object syntax", "}"));
return os;
});
}
var renderjson = function renderjson(json)
{
var pre = append(document.createElement("pre"), _renderjson(json, "", false, renderjson.show_to_level, renderjson.max_string_length, renderjson.sort_objects));
pre.className = "renderjson";
return pre;
}
renderjson.set_icons = function(show, hide) { renderjson.show = show;
renderjson.hide = hide;
return renderjson; };
renderjson.set_show_to_level = function(level) { renderjson.show_to_level = typeof level == "string" &&
level.toLowerCase() === "all" ? Number.MAX_VALUE
: level;
return renderjson; };
renderjson.set_max_string_length = function(length) { renderjson.max_string_length = typeof length == "string" &&
length.toLowerCase() === "none" ? Number.MAX_VALUE
: length;
return renderjson; };
renderjson.set_sort_objects = function(sort_bool) { renderjson.sort_objects = sort_bool;
return renderjson; };
// Backwards compatiblity. Use set_show_to_level() for new code.
renderjson.set_show_by_default = function(show) { renderjson.show_to_level = show ? Number.MAX_VALUE : 0;
return renderjson; };
renderjson.set_icons('⊕', '⊖');
renderjson.set_show_by_default(false);
renderjson.set_sort_objects(false);
renderjson.set_max_string_length("none");
return renderjson;
})();

dashboard/test/test.ts (new file)
@@ -0,0 +1,7 @@
import { ProxyDashboard } from "../src/proxy";
describe("test suite", () => {
it("initializes", () => {
expect(new ProxyDashboard()).toBeTruthy();
});
});

dashboard/tsconfig.json (new file)
@@ -0,0 +1,10 @@
{
"compilerOptions": {
"noImplicitAny": true,
"sourceMap": true,
"target": "es5"
},
"include": [
"src/**/*"
]
}

helper/Procfile (new file)
@@ -0,0 +1,9 @@
# proxy.py
# ~~~~~~~~
# ⚡⚡⚡ Fast, Lightweight, Programmable Proxy Server in a single Python file.
#
# :copyright: (c) 2013-present by Abhinav Singh and contributors.
# :license: BSD, see LICENSE for more details.
#
# See https://devcenter.heroku.com/articles/procfile
web: python3 proxy.py --hostname 0.0.0.0 --port $PORT

@@ -11,7 +11,7 @@
# ./chrome_with_proxy <proxy-py-address=localhost:8899>
PROXY_PY_ADDR=$1
if [ -z "$PROXY_PY_ADDR" ]; then
if [[ -z "$PROXY_PY_ADDR" ]]; then
PROXY_PY_ADDR="localhost:8899"
fi
@@ -19,6 +19,6 @@ fi
--no-first-run \
--no-default-browser-check \
--user-data-dir="$(mktemp -d -t 'chrome-remote_data_dir')" \
--proxy-server=$PROXY_PY_ADDR \
--proxy-server=${PROXY_PY_ADDR} \
--ignore-urlfetcher-cert-requests \
--ignore-certificate-errors

@@ -1,3 +1,10 @@
# proxy.py
# ~~~~~~~~
# ⚡⚡⚡ Fast, Lightweight, Programmable Proxy Server in a single Python file.
#
# :copyright: (c) 2013-present by Abhinav Singh and contributors.
# :license: BSD, see LICENSE for more details.
#
# google-fluentd (Stackdriver) log input configuration file
#
# 1. Copy this configuration file as proxy.py.conf under:

package-lock.json (generated, deleted)
@@ -1,51 +0,0 @@
{
"name": "proxy.py",
"version": "1.0.1",
"lockfileVersion": 1,
"requires": true,
"dependencies": {
"async-limiter": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/async-limiter/-/async-limiter-1.0.1.tgz",
"integrity": "sha512-csOlWGAcRFJaI6m+F2WKdnMKr4HhdhFVBk0H/QbJFMCr+uO2kwohwXQPxw/9OCxp05r5ghVBFSyioixx3gfkNQ==",
"dev": true
},
"chrome-devtools-frontend": {
"version": "1.0.702145",
"resolved": "https://registry.npmjs.org/chrome-devtools-frontend/-/chrome-devtools-frontend-1.0.702145.tgz",
"integrity": "sha512-/GtRYVUWETCifkvlfYhrJkPg/SrTVcMwNNj7Q2Ax0LELwCil7AUF5QYY/jMMjv5QYLHSgSzWX0yICK3qqUDkhw==",
"dev": true
},
"chrome-remote-interface": {
"version": "0.28.0",
"resolved": "https://registry.npmjs.org/chrome-remote-interface/-/chrome-remote-interface-0.28.0.tgz",
"integrity": "sha512-md2qSn6rc/fADlN+Blk2UWNg0SGPYjH2s68piaPN9e62HItKm6uWeXXHh0+28Bq10oaWw8fzNAm1itDFJ+nS4w==",
"dev": true,
"requires": {
"commander": "2.11.x",
"ws": "^6.1.0"
}
},
"commander": {
"version": "2.11.0",
"resolved": "https://registry.npmjs.org/commander/-/commander-2.11.0.tgz",
"integrity": "sha512-b0553uYA5YAEGgyYIGYROzKQ7X5RAqedkfjiZxwi0kL1g3bOaBNNZfYkzt/CL0umgD5wc9Jec2FbB98CjkMRvQ==",
"dev": true
},
"ncp": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/ncp/-/ncp-2.0.0.tgz",
"integrity": "sha1-GVoh1sRuNh0vsSgbo4uR6d9727M=",
"dev": true
},
"ws": {
"version": "6.2.1",
"resolved": "https://registry.npmjs.org/ws/-/ws-6.2.1.tgz",
"integrity": "sha512-GIyAXC2cB7LjvpgMt9EKS2ldqr0MTrORaleiOno6TweZ6r3TKtoFQWay/2PceJ3RuBasOHzXNn5Lrw1X0bEjqA==",
"dev": true,
"requires": {
"async-limiter": "~1.0.0"
}
}
}
}

@@ -1,25 +0,0 @@
{
"name": "proxy.py",
"version": "1.0.1",
"description": "Lightweight, Programmable, TLS interceptor Proxy for HTTP(S), HTTP2, WebSockets protocols in a single Python file.",
"main": "proxy.js",
"scripts": {
"start": "node proxy.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/abhinavsingh/proxy.py.git"
},
"author": "Abhinav Singh",
"license": "BSD-3-Clause",
"bugs": {
"url": "https://github.com/abhinavsingh/proxy.py/issues"
},
"homepage": "https://github.com/abhinavsingh/proxy.py#readme",
"devDependencies": {
"chrome-devtools-frontend": "^1.0.702145",
"chrome-remote-interface": "^0.28.0",
"ncp": "^2.0.0"
}
}

@@ -1,305 +0,0 @@
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import json
import os
import tempfile
import time
from typing import Optional, BinaryIO, List, Tuple
from urllib import parse as urlparse
import proxy
from proxy import HttpParser
class ShortLinkPlugin(proxy.HttpProxyBasePlugin):
"""Add support for short links in your favorite browsers / applications.
Enable ShortLinkPlugin and speed up your daily browsing experience.
Example:
* f/ for facebook.com
* g/ for google.com
* t/ for twitter.com
* y/ for youtube.com
* proxy/ for proxy.py internal web servers.
Customize map below for your taste and need.
Paths are also preserved. E.g. t/imoracle will
resolve to http://twitter.com/imoracle.
"""
SHORT_LINKS = {
b'a': b'amazon.com',
b'i': b'instagram.com',
b'l': b'linkedin.com',
b'f': b'facebook.com',
b'g': b'google.com',
b't': b'twitter.com',
b'w': b'web.whatsapp.com',
b'y': b'youtube.com',
b'proxy': b'localhost:8899',
}
def before_upstream_connection(self, request: HttpParser) -> Optional[HttpParser]:
if request.host and request.host != b'localhost' and proxy.DOT not in request.host:
# Avoid connecting to upstream
return None
return request
def handle_client_request(self, request: HttpParser) -> Optional[HttpParser]:
if request.host and request.host != b'localhost' and proxy.DOT not in request.host:
if request.host in self.SHORT_LINKS:
path = proxy.SLASH if not request.path else request.path
self.client.queue(proxy.build_http_response(
proxy.httpStatusCodes.SEE_OTHER, reason=b'See Other',
headers={
b'Location': b'http://' + self.SHORT_LINKS[request.host] + path,
b'Content-Length': b'0',
b'Connection': b'close',
}
))
else:
self.client.queue(proxy.build_http_response(
proxy.httpStatusCodes.NOT_FOUND, reason=b'NOT FOUND',
headers={
b'Content-Length': b'0',
b'Connection': b'close',
}
))
return None
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
class ModifyPostDataPlugin(proxy.HttpProxyBasePlugin):
"""Modify POST request body before sending to upstream server."""
MODIFIED_BODY = b'{"key": "modified"}'
def before_upstream_connection(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
return request
def handle_client_request(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
if request.method == proxy.httpMethods.POST:
request.body = ModifyPostDataPlugin.MODIFIED_BODY
# Update Content-Length header only when request is NOT chunked encoded
if not request.is_chunked_encoded():
request.add_header(b'Content-Length', proxy.bytes_(len(request.body)))
# Enforce content-type json
if request.has_header(b'Content-Type'):
request.del_header(b'Content-Type')
request.add_header(b'Content-Type', b'application/json')
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
class ProposedRestApiPlugin(proxy.HttpProxyBasePlugin):
"""Mock responses for your upstream REST API.
Used to test and develop client side applications
without need of an actual upstream REST API server.
Returns proposed REST API mock responses to the client
without establishing upstream connection.
Note: This plugin won't work if your client is making
HTTPS connection to api.example.com.
"""
API_SERVER = b'api.example.com'
REST_API_SPEC = {
b'/v1/users/': {
'count': 2,
'next': None,
'previous': None,
'results': [
{
'email': 'you@example.com',
'groups': [],
'url': proxy.text_(API_SERVER) + '/v1/users/1/',
'username': 'admin',
},
{
'email': 'someone@example.com',
'groups': [],
'url': proxy.text_(API_SERVER) + '/v1/users/2/',
'username': 'someone',
},
]
},
}
def before_upstream_connection(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
# Return None to disable establishing connection to upstream
# Most likely our api.example.com won't even exist under development scenario
return None
def handle_client_request(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
if request.host != self.API_SERVER:
return request
assert request.path
if request.path in self.REST_API_SPEC:
self.client.queue(proxy.build_http_response(
proxy.httpStatusCodes.OK,
reason=b'OK',
headers={b'Content-Type': b'application/json'},
body=proxy.bytes_(json.dumps(
self.REST_API_SPEC[request.path]))
))
else:
self.client.queue(proxy.build_http_response(
proxy.httpStatusCodes.NOT_FOUND,
reason=b'NOT FOUND', body=b'Not Found'
))
return None
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
class RedirectToCustomServerPlugin(proxy.HttpProxyBasePlugin):
"""Modifies client request to redirect all incoming requests to a fixed server address."""
UPSTREAM_SERVER = b'http://localhost:8899/'
def before_upstream_connection(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
# Redirect all non-https requests to inbuilt WebServer.
if request.method != proxy.httpMethods.CONNECT:
request.set_url(self.UPSTREAM_SERVER)
# Update Host header too, otherwise upstream can reject our request
if request.has_header(b'Host'):
request.del_header(b'Host')
request.add_header(b'Host', urlparse.urlsplit(self.UPSTREAM_SERVER).netloc)
return request
def handle_client_request(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
class FilterByUpstreamHostPlugin(proxy.HttpProxyBasePlugin):
"""Drop traffic by inspecting upstream host."""
FILTERED_DOMAINS = [b'google.com', b'www.google.com']
def before_upstream_connection(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
if request.host in self.FILTERED_DOMAINS:
raise proxy.HttpRequestRejected(
status_code=proxy.httpStatusCodes.I_AM_A_TEAPOT, reason=b'I\'m a tea pot')
return request
def handle_client_request(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
class CacheResponsesPlugin(proxy.HttpProxyBasePlugin):
"""Caches Upstream Server Responses."""
CACHE_DIR = tempfile.gettempdir()
def __init__(
self,
config: proxy.ProtocolConfig,
client: proxy.TcpClientConnection) -> None:
super().__init__(config, client)
self.cache_file_path: Optional[str] = None
self.cache_file: Optional[BinaryIO] = None
def before_upstream_connection(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
# Ideally should only create file if upstream connection succeeds.
self.cache_file_path = os.path.join(
self.CACHE_DIR,
'%s-%s.txt' % (proxy.text_(request.host), str(time.time())))
self.cache_file = open(self.cache_file_path, "wb")
return request
def handle_client_request(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
return request
def handle_upstream_chunk(self,
chunk: bytes) -> bytes:
if self.cache_file:
self.cache_file.write(chunk)
return chunk
def on_upstream_connection_close(self) -> None:
if self.cache_file:
self.cache_file.close()
proxy.logger.info('Cached response at %s', self.cache_file_path)
class ManInTheMiddlePlugin(proxy.HttpProxyBasePlugin):
"""Modifies upstream server responses."""
def before_upstream_connection(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
return request
def handle_client_request(self, request: proxy.HttpParser) -> Optional[proxy.HttpParser]:
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return proxy.build_http_response(
proxy.httpStatusCodes.OK,
reason=b'OK', body=b'Hello from man in the middle')
def on_upstream_connection_close(self) -> None:
pass
class WebServerPlugin(proxy.HttpWebServerBasePlugin):
"""Demonstration of inbuilt web server routing via plugin."""
def routes(self) -> List[Tuple[int, bytes]]:
return [
(proxy.httpProtocolTypes.HTTP, b'/http-route-example'),
(proxy.httpProtocolTypes.HTTPS, b'/https-route-example'),
(proxy.httpProtocolTypes.WEBSOCKET, b'/ws-route-example'),
]
def handle_request(self, request: proxy.HttpParser) -> None:
if request.path == b'/http-route-example':
self.client.queue(proxy.build_http_response(
proxy.httpStatusCodes.OK, body=b'HTTP route response'))
elif request.path == b'/https-route-example':
self.client.queue(proxy.build_http_response(
proxy.httpStatusCodes.OK, body=b'HTTPS route response'))
def on_websocket_open(self) -> None:
proxy.logger.info('Websocket open')
def on_websocket_message(self, frame: proxy.WebsocketFrame) -> None:
proxy.logger.info(frame.data)
def on_websocket_close(self) -> None:
proxy.logger.info('Websocket close')


@@ -0,0 +1,9 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""


@@ -0,0 +1,60 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import os
import tempfile
import time
import logging
from typing import Optional, BinaryIO
from proxy.common.flags import Flags
from proxy.core.connection import TcpClientConnection
from proxy.http.parser import HttpParser
from proxy.http.proxy import HttpProxyBasePlugin
from proxy.common.utils import text_
logger = logging.getLogger(__name__)
class CacheResponsesPlugin(HttpProxyBasePlugin):
"""Caches Upstream Server Responses."""
CACHE_DIR = tempfile.gettempdir()
def __init__(
self,
config: Flags,
client: TcpClientConnection) -> None:
super().__init__(config, client)
self.cache_file_path: Optional[str] = None
self.cache_file: Optional[BinaryIO] = None
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
# Ideally should only create file if upstream connection succeeds.
self.cache_file_path = os.path.join(
self.CACHE_DIR,
'%s-%s.txt' % (text_(request.host), str(time.time())))
self.cache_file = open(self.cache_file_path, "wb")
return request
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
return request
def handle_upstream_chunk(self,
chunk: bytes) -> bytes:
if self.cache_file:
self.cache_file.write(chunk)
return chunk
def on_upstream_connection_close(self) -> None:
if self.cache_file:
self.cache_file.close()
logger.info('Cached response at %s', self.cache_file_path)
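The cache file naming scheme above (upstream host plus a timestamp, under the system temp directory) can be sketched in isolation. `cache_path` below is a hypothetical stand-alone helper mirroring that logic, not part of proxy.py itself:

```python
import os
import tempfile
import time

CACHE_DIR = tempfile.gettempdir()


def cache_path(host: bytes) -> str:
    # Mirror CacheResponsesPlugin: '<host>-<epoch>.txt' under the temp dir.
    return os.path.join(
        CACHE_DIR,
        '%s-%s.txt' % (host.decode('utf-8'), str(time.time())))


p = cache_path(b'example.com')
```

Because the timestamp is part of the name, repeated requests to the same host never clobber earlier cache files.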


@@ -0,0 +1,42 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import Optional
from proxy.http.proxy import HttpProxyBasePlugin
from proxy.http.exception import HttpRequestRejected
from proxy.http.parser import HttpParser
from proxy.http.codes import httpStatusCodes
class FilterByUpstreamHostPlugin(HttpProxyBasePlugin):
"""Drop traffic by inspecting upstream host."""
FILTERED_DOMAINS = [b'google.com', b'www.google.com']
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
if request.host in self.FILTERED_DOMAINS:
raise HttpRequestRejected(
status_code=httpStatusCodes.I_AM_A_TEAPOT, reason=b'I\'m a tea pot',
headers={
b'Connection': b'close',
}
)
return request
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
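The filtering decision above reduces to a membership check against `FILTERED_DOMAINS`, rejecting matches with HTTP 418 as the plugin does. A minimal stand-alone sketch (`check` is a hypothetical helper, not a proxy.py API):

```python
FILTERED_DOMAINS = [b'google.com', b'www.google.com']
I_AM_A_TEAPOT = 418  # status code the plugin raises via HttpRequestRejected


def check(host: bytes):
    # Return the rejection status for filtered hosts, None otherwise.
    if host in FILTERED_DOMAINS:
        return I_AM_A_TEAPOT
    return None
```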


@@ -0,0 +1,35 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import Optional
from proxy.http.proxy import HttpProxyBasePlugin
from proxy.http.parser import HttpParser
from proxy.http.codes import httpStatusCodes
from proxy.common.utils import build_http_response
class ManInTheMiddlePlugin(HttpProxyBasePlugin):
"""Modifies upstream server responses."""
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
return request
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return build_http_response(
httpStatusCodes.OK,
reason=b'OK', body=b'Hello from man in the middle')
def on_upstream_connection_close(self) -> None:
pass


@@ -0,0 +1,87 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import json
from typing import Optional
from proxy.http.parser import HttpParser
from proxy.http.proxy import HttpProxyBasePlugin
from proxy.http.codes import httpStatusCodes
from proxy.common.utils import bytes_, build_http_response, text_
class ProposedRestApiPlugin(HttpProxyBasePlugin):
"""Mock responses for your upstream REST API.
Used to test and develop client side applications
without need of an actual upstream REST API server.
Returns proposed REST API mock responses to the client
without establishing upstream connection.
Note: This plugin won't work if your client is making
HTTPS connection to api.example.com.
"""
API_SERVER = b'api.example.com'
REST_API_SPEC = {
b'/v1/users/': {
'count': 2,
'next': None,
'previous': None,
'results': [
{
'email': 'you@example.com',
'groups': [],
'url': text_(API_SERVER) + '/v1/users/1/',
'username': 'admin',
},
{
'email': 'someone@example.com',
'groups': [],
'url': text_(API_SERVER) + '/v1/users/2/',
'username': 'someone',
},
]
},
}
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
# Return None to disable establishing connection to upstream
# Most likely our api.example.com won't even exist under development
# scenario
return None
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
if request.host != self.API_SERVER:
return request
assert request.path
if request.path in self.REST_API_SPEC:
self.client.queue(build_http_response(
httpStatusCodes.OK,
reason=b'OK',
headers={b'Content-Type': b'application/json'},
body=bytes_(json.dumps(
self.REST_API_SPEC[request.path]))
))
else:
self.client.queue(build_http_response(
httpStatusCodes.NOT_FOUND,
reason=b'NOT FOUND', body=b'Not Found'
))
return None
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
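The mocking above is a dictionary lookup keyed on the raw request path: a hit serializes the spec entry as JSON, a miss yields 404. A stand-alone sketch of that dispatch, with a trimmed spec for illustration:

```python
import json

REST_API_SPEC = {
    b'/v1/users/': {'count': 2, 'results': []},  # trimmed entry for illustration
}


def mock_response(path: bytes):
    # 200 plus the serialized spec on a match, 404 otherwise.
    if path in REST_API_SPEC:
        return 200, json.dumps(REST_API_SPEC[path]).encode('utf-8')
    return 404, b'Not Found'
```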


@@ -0,0 +1,46 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import Optional
from proxy.http.parser import HttpParser
from proxy.http.proxy import HttpProxyBasePlugin
from proxy.http.methods import httpMethods
from proxy.common.utils import bytes_
class ModifyPostDataPlugin(HttpProxyBasePlugin):
"""Modify POST request body before sending to upstream server."""
MODIFIED_BODY = b'{"key": "modified"}'
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
return request
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
if request.method == httpMethods.POST:
request.body = ModifyPostDataPlugin.MODIFIED_BODY
# Update Content-Length header only when request is NOT chunked
# encoded
if not request.is_chunked_encoded():
request.add_header(b'Content-Length',
bytes_(len(request.body)))
# Enforce content-type json
if request.has_header(b'Content-Type'):
request.del_header(b'Content-Type')
request.add_header(b'Content-Type', b'application/json')
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
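The key detail above is keeping `Content-Length` in sync with the replaced body, but only when the request is not chunked encoded. A sketch with a plain dict standing in for the parser's header API (`rewrite_post` is a hypothetical helper):

```python
MODIFIED_BODY = b'{"key": "modified"}'


def rewrite_post(headers: dict, body: bytes, chunked: bool = False):
    # Replace the body wholesale, as the plugin does for POST requests.
    headers = dict(headers)
    body = MODIFIED_BODY
    if not chunked:
        # Content-Length must match the new body, not the original one.
        headers[b'Content-Length'] = str(len(body)).encode('utf-8')
    headers[b'Content-Type'] = b'application/json'
    return headers, body


new_headers, new_body = rewrite_post({b'Content-Type': b'text/plain'}, b'a=1')
```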


@@ -0,0 +1,44 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from urllib import parse as urlparse
from typing import Optional
from proxy.http.proxy import HttpProxyBasePlugin
from proxy.http.parser import HttpParser
from proxy.http.methods import httpMethods
class RedirectToCustomServerPlugin(HttpProxyBasePlugin):
"""Modifies client request to redirect all incoming requests to a fixed server address."""
UPSTREAM_SERVER = b'http://localhost:8899/'
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
# Redirect all non-https requests to inbuilt WebServer.
if request.method != httpMethods.CONNECT:
request.set_url(self.UPSTREAM_SERVER)
# Update Host header too, otherwise upstream can reject our request
if request.has_header(b'Host'):
request.del_header(b'Host')
request.add_header(
b'Host', urlparse.urlsplit(
self.UPSTREAM_SERVER).netloc)
return request
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
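The rewritten `Host` header above is derived from the fixed upstream URL via `urlsplit`; note that `urlsplit` accepts bytes and returns bytes components, so the netloc can be used as a header value directly:

```python
from urllib import parse as urlparse

UPSTREAM_SERVER = b'http://localhost:8899/'

# urlsplit on bytes input yields SplitResultBytes; netloc is bytes too.
host_header = urlparse.urlsplit(UPSTREAM_SERVER).netloc
```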


@@ -0,0 +1,83 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import Optional
from proxy.http.proxy import HttpProxyBasePlugin
from proxy.http.parser import HttpParser
from proxy.http.codes import httpStatusCodes
from proxy.common.constants import DOT, SLASH
from proxy.common.utils import build_http_response
class ShortLinkPlugin(HttpProxyBasePlugin):
"""Add support for short links in your favorite browsers / applications.
Enable ShortLinkPlugin and speed up your daily browsing experience.
Example:
* f/ for facebook.com
* g/ for google.com
* t/ for twitter.com
* y/ for youtube.com
* proxy/ for proxy.py internal web servers.
Customize map below for your taste and need.
Paths are also preserved. E.g. t/imoracle will
resolve to http://twitter.com/imoracle.
"""
SHORT_LINKS = {
b'a': b'amazon.com',
b'i': b'instagram.com',
b'l': b'linkedin.com',
b'f': b'facebook.com',
b'g': b'google.com',
b't': b'twitter.com',
b'w': b'web.whatsapp.com',
b'y': b'youtube.com',
b'proxy': b'localhost:8899',
}
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
if request.host and request.host != b'localhost' and DOT not in request.host:
# Avoid connecting to upstream
return None
return request
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
if request.host and request.host != b'localhost' and DOT not in request.host:
if request.host in self.SHORT_LINKS:
path = SLASH if not request.path else request.path
self.client.queue(build_http_response(
httpStatusCodes.SEE_OTHER, reason=b'See Other',
headers={
b'Location': b'http://' + self.SHORT_LINKS[request.host] + path,
b'Content-Length': b'0',
b'Connection': b'close',
}
))
else:
self.client.queue(build_http_response(
httpStatusCodes.NOT_FOUND, reason=b'NOT FOUND',
headers={
b'Content-Length': b'0',
b'Connection': b'close',
}
))
return None
return request
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_upstream_connection_close(self) -> None:
pass
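The resolution logic above preserves the request path and falls back to `/`, building the `Location` header value byte-wise. A stand-alone sketch with a trimmed map (`resolve` is a hypothetical helper):

```python
SHORT_LINKS = {
    b'f': b'facebook.com',
    b't': b'twitter.com',
}


def resolve(host: bytes, path: bytes) -> bytes:
    # Build the Location header value; default the path to '/'.
    return b'http://' + SHORT_LINKS[host] + (path if path else b'/')
```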


@@ -0,0 +1,47 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import logging
from typing import List, Tuple
from proxy.http.server import HttpWebServerBasePlugin, httpProtocolTypes
from proxy.http.websocket import WebsocketFrame
from proxy.http.parser import HttpParser
from proxy.http.codes import httpStatusCodes
from proxy.common.utils import build_http_response
logger = logging.getLogger(__name__)
class WebServerPlugin(HttpWebServerBasePlugin):
"""Demonstration of inbuilt web server routing via plugin."""
def routes(self) -> List[Tuple[int, bytes]]:
return [
(httpProtocolTypes.HTTP, b'/http-route-example'),
(httpProtocolTypes.HTTPS, b'/https-route-example'),
(httpProtocolTypes.WEBSOCKET, b'/ws-route-example'),
]
def handle_request(self, request: HttpParser) -> None:
if request.path == b'/http-route-example':
self.client.queue(build_http_response(
httpStatusCodes.OK, body=b'HTTP route response'))
elif request.path == b'/https-route-example':
self.client.queue(build_http_response(
httpStatusCodes.OK, body=b'HTTPS route response'))
def on_websocket_open(self) -> None:
logger.info('Websocket open')
def on_websocket_message(self, frame: WebsocketFrame) -> None:
logger.info(frame.data)
def on_websocket_close(self) -> None:
logger.info('Websocket close')
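`routes()` above registers `(protocol, path)` tuples that the inbuilt web server matches against incoming requests. A dispatch over such a table can be sketched as below; the numeric protocol ids here are made up for illustration (the real values live in `httpProtocolTypes`):

```python
# Hypothetical protocol ids for illustration only.
HTTP, HTTPS, WEBSOCKET = 1, 2, 3

ROUTES = [
    (HTTP, b'/http-route-example'),
    (HTTPS, b'/https-route-example'),
    (WEBSOCKET, b'/ws-route-example'),
]


def find_route(protocol: int, path: bytes):
    # Linear scan: a handful of registered routes needs nothing fancier.
    for proto, route in ROUTES:
        if proto == protocol and route == path:
            return route
    return None
```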


@@ -1,37 +0,0 @@
/*
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
*/
const path = require('path');
const fs = require('fs');
const ncp = require('ncp').ncp;
ncp.limit = 16;
const publicFolderPath = path.join(__dirname, 'public');
const destinationFolderPath = path.join(publicFolderPath, 'devtools');
const publicFolderExists = fs.existsSync(publicFolderPath);
if (!publicFolderExists) {
console.error(publicFolderPath + ' folder doesn\'t exist, make sure you are in the right directory.');
process.exit(1);
}
const destinationFolderExists = fs.existsSync(destinationFolderPath);
if (!destinationFolderExists) {
console.error(destinationFolderPath + ' folder doesn\'t exist, make sure you are in the right directory.');
process.exit(1);
}
const chromeDevTools = path.dirname(require.resolve('chrome-devtools-frontend/front_end/inspector.html'));
console.log(chromeDevTools + ' ---> ' + destinationFolderPath);
ncp(chromeDevTools, destinationFolderPath, (err) => {
if (err) {
return console.error(err);
}
console.log('Copy successful!!!');
});

proxy.py (3243 lines changed)

File diff suppressed because it is too large.

proxy/__init__.py (new file, 9 lines)

@@ -0,0 +1,9 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""

proxy/__main__.py (new file, 13 lines)

@@ -0,0 +1,13 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from .main import entry_point
if __name__ == '__main__':
entry_point()

proxy/common/__init__.py (new file, 9 lines)

@@ -0,0 +1,9 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""

proxy/common/constants.py (new file, 78 lines)

@@ -0,0 +1,78 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import os
import time
import ipaddress
from typing import List
from .version import __version__
__description__ = '⚡⚡⚡ Fast, Lightweight, Programmable Proxy Server in a single Python file.'
__author__ = 'Abhinav Singh'
__author_email__ = 'mailsforabhinav@gmail.com'
__homepage__ = 'https://github.com/abhinavsingh/proxy.py'
__download_url__ = '%s/archive/master.zip' % __homepage__
__license__ = 'BSD'
PROXY_PY_DIR = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
PROXY_PY_START_TIME = time.time()
CRLF = b'\r\n'
COLON = b':'
WHITESPACE = b' '
COMMA = b','
DOT = b'.'
SLASH = b'/'
HTTP_1_1 = b'HTTP/1.1'
PROXY_AGENT_HEADER_KEY = b'Proxy-agent'
PROXY_AGENT_HEADER_VALUE = b'proxy.py v' + \
__version__.encode('utf-8', 'strict')
PROXY_AGENT_HEADER = PROXY_AGENT_HEADER_KEY + \
COLON + WHITESPACE + PROXY_AGENT_HEADER_VALUE
# Defaults
DEFAULT_BACKLOG = 100
DEFAULT_BASIC_AUTH = None
DEFAULT_BUFFER_SIZE = 1024 * 1024
DEFAULT_CA_CERT_DIR = None
DEFAULT_CA_CERT_FILE = None
DEFAULT_CA_KEY_FILE = None
DEFAULT_CA_SIGNING_KEY_FILE = None
DEFAULT_CERT_FILE = None
DEFAULT_CLIENT_RECVBUF_SIZE = DEFAULT_BUFFER_SIZE
DEFAULT_DEVTOOLS_WS_PATH = b'/devtools'
DEFAULT_DISABLE_HEADERS: List[bytes] = []
DEFAULT_DISABLE_HTTP_PROXY = False
DEFAULT_ENABLE_DEVTOOLS = False
DEFAULT_ENABLE_EVENTS = False
DEFAULT_EVENTS_QUEUE = None
DEFAULT_ENABLE_STATIC_SERVER = False
DEFAULT_ENABLE_WEB_SERVER = False
DEFAULT_IPV4_HOSTNAME = ipaddress.IPv4Address('127.0.0.1')
DEFAULT_IPV6_HOSTNAME = ipaddress.IPv6Address('::1')
DEFAULT_KEY_FILE = None
DEFAULT_LOG_FILE = None
DEFAULT_LOG_FORMAT = '%(asctime)s - pid:%(process)d [%(levelname)-.1s] %(funcName)s:%(lineno)d - %(message)s'
DEFAULT_LOG_LEVEL = 'INFO'
DEFAULT_NUM_WORKERS = 0
DEFAULT_OPEN_FILE_LIMIT = 1024
DEFAULT_PAC_FILE = None
DEFAULT_PAC_FILE_URL_PATH = b'/'
DEFAULT_PID_FILE = None
DEFAULT_PLUGINS = ''
DEFAULT_PORT = 8899
DEFAULT_SERVER_RECVBUF_SIZE = DEFAULT_BUFFER_SIZE
DEFAULT_STATIC_SERVER_DIR = os.path.join(
os.path.dirname(PROXY_PY_DIR), 'public')
DEFAULT_THREADLESS = False
DEFAULT_TIMEOUT = 10
DEFAULT_VERSION = False
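The `Proxy-agent` header above is assembled byte-wise from the surrounding constants. The sketch below reproduces that composition with a hypothetical version string, since the real `__version__` lives in `proxy/common/version.py`:

```python
COLON = b':'
WHITESPACE = b' '
__version__ = '2.0.0'  # hypothetical version string for illustration

PROXY_AGENT_HEADER_KEY = b'Proxy-agent'
PROXY_AGENT_HEADER_VALUE = b'proxy.py v' + \
    __version__.encode('utf-8', 'strict')
PROXY_AGENT_HEADER = PROXY_AGENT_HEADER_KEY + \
    COLON + WHITESPACE + PROXY_AGENT_HEADER_VALUE
```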

proxy/common/flags.py (new file, 320 lines)

@@ -0,0 +1,320 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import argparse
import ipaddress
import os
import socket
import multiprocessing
import pathlib
from typing import Optional, Union, Dict, List
from .utils import text_
from .types import DictQueueType
from .constants import DEFAULT_LOG_LEVEL, DEFAULT_LOG_FILE, DEFAULT_LOG_FORMAT, DEFAULT_BACKLOG, DEFAULT_BASIC_AUTH
from .constants import DEFAULT_TIMEOUT, DEFAULT_DEVTOOLS_WS_PATH, DEFAULT_DISABLE_HTTP_PROXY, DEFAULT_DISABLE_HEADERS
from .constants import DEFAULT_ENABLE_STATIC_SERVER, DEFAULT_ENABLE_EVENTS, DEFAULT_ENABLE_DEVTOOLS
from .constants import DEFAULT_ENABLE_WEB_SERVER, DEFAULT_THREADLESS, DEFAULT_CERT_FILE, DEFAULT_KEY_FILE
from .constants import DEFAULT_CA_CERT_DIR, DEFAULT_CA_CERT_FILE, DEFAULT_CA_KEY_FILE, DEFAULT_CA_SIGNING_KEY_FILE
from .constants import DEFAULT_PAC_FILE_URL_PATH, DEFAULT_PAC_FILE, DEFAULT_PLUGINS, DEFAULT_PID_FILE, DEFAULT_PORT
from .constants import DEFAULT_NUM_WORKERS, DEFAULT_VERSION, DEFAULT_OPEN_FILE_LIMIT, DEFAULT_IPV6_HOSTNAME
from .constants import DEFAULT_SERVER_RECVBUF_SIZE, DEFAULT_CLIENT_RECVBUF_SIZE, DEFAULT_STATIC_SERVER_DIR
from .constants import COMMA
from .constants import __homepage__
from .version import __version__
def init_parser() -> argparse.ArgumentParser:
"""Initializes and returns argument parser."""
parser = argparse.ArgumentParser(
description='proxy.py v%s' % __version__,
epilog='Proxy.py not working? Report at: %s/issues/new' % __homepage__
)
# Argument names are ordered alphabetically.
parser.add_argument(
'--backlog',
type=int,
default=DEFAULT_BACKLOG,
help='Default: 100. Maximum number of pending connections to proxy server')
parser.add_argument(
'--basic-auth',
type=str,
default=DEFAULT_BASIC_AUTH,
help='Default: No authentication. Specify colon separated user:password '
'to enable basic authentication.')
parser.add_argument(
'--ca-key-file',
type=str,
default=DEFAULT_CA_KEY_FILE,
help='Default: None. CA key to use for signing dynamically generated '
'HTTPS certificates. If used, must also pass --ca-cert-file and --ca-signing-key-file'
)
parser.add_argument(
'--ca-cert-dir',
type=str,
default=DEFAULT_CA_CERT_DIR,
help='Default: ~/.proxy.py. Directory to store dynamically generated certificates. '
'Also see --ca-key-file, --ca-cert-file and --ca-signing-key-file'
)
parser.add_argument(
'--ca-cert-file',
type=str,
default=DEFAULT_CA_CERT_FILE,
help='Default: None. Signing certificate to use for signing dynamically generated '
'HTTPS certificates. If used, must also pass --ca-key-file and --ca-signing-key-file'
)
parser.add_argument(
'--ca-signing-key-file',
type=str,
default=DEFAULT_CA_SIGNING_KEY_FILE,
help='Default: None. CA signing key to use for dynamic generation of '
'HTTPS certificates. If used, must also pass --ca-key-file and --ca-cert-file'
)
parser.add_argument(
'--cert-file',
type=str,
default=DEFAULT_CERT_FILE,
help='Default: None. Server certificate to enable end-to-end TLS encryption with clients. '
'If used, must also pass --key-file.'
)
parser.add_argument(
'--client-recvbuf-size',
type=int,
default=DEFAULT_CLIENT_RECVBUF_SIZE,
help='Default: 1 MB. Maximum amount of data received from the '
'client in a single recv() operation. Bump this '
'value for faster uploads at the expense of '
'increased RAM.')
parser.add_argument(
'--devtools-ws-path',
type=str,
default=DEFAULT_DEVTOOLS_WS_PATH,
help='Default: /devtools. Only applicable '
'if --enable-devtools is used.'
)
parser.add_argument(
'--disable-headers',
type=str,
default=COMMA.join(DEFAULT_DISABLE_HEADERS),
help='Default: None. Comma separated list of headers to remove before '
'dispatching client request to upstream server.')
parser.add_argument(
'--disable-http-proxy',
action='store_true',
default=DEFAULT_DISABLE_HTTP_PROXY,
help='Default: False. Whether to disable proxy.HttpProxyPlugin.')
parser.add_argument(
'--enable-devtools',
action='store_true',
default=DEFAULT_ENABLE_DEVTOOLS,
help='Default: False. Enables integration with Chrome Devtool Frontend.'
)
parser.add_argument(
'--enable-events',
action='store_true',
default=DEFAULT_ENABLE_EVENTS,
help='Default: False. Enables core to dispatch lifecycle events. '
'Plugins can be used to subscribe for core events.'
)
parser.add_argument(
'--enable-static-server',
action='store_true',
default=DEFAULT_ENABLE_STATIC_SERVER,
help='Default: False. Enable inbuilt static file server. '
'Optionally, also use --static-server-dir to serve static content '
'from custom directory. By default, static file server serves '
'from public folder.'
)
parser.add_argument(
'--enable-web-server',
action='store_true',
default=DEFAULT_ENABLE_WEB_SERVER,
help='Default: False. Whether to enable proxy.HttpWebServerPlugin.')
parser.add_argument('--hostname',
type=str,
default=str(DEFAULT_IPV6_HOSTNAME),
help='Default: ::1. Server IP address.')
parser.add_argument(
'--key-file',
type=str,
default=DEFAULT_KEY_FILE,
help='Default: None. Server key file to enable end-to-end TLS encryption with clients. '
'If used, must also pass --cert-file.'
)
parser.add_argument(
'--log-level',
type=str,
default=DEFAULT_LOG_LEVEL,
help='Valid options: DEBUG, INFO (default), WARNING, ERROR, CRITICAL. '
'Both upper and lowercase values are allowed. '
'You may also simply use the leading character e.g. --log-level d')
parser.add_argument('--log-file', type=str, default=DEFAULT_LOG_FILE,
help='Default: sys.stdout. Log file destination.')
parser.add_argument('--log-format', type=str, default=DEFAULT_LOG_FORMAT,
help='Log format for Python logger.')
parser.add_argument('--num-workers', type=int, default=DEFAULT_NUM_WORKERS,
help='Defaults to number of CPU cores.')
parser.add_argument(
'--open-file-limit',
type=int,
default=DEFAULT_OPEN_FILE_LIMIT,
help='Default: 1024. Maximum number of files (TCP connections) '
'that proxy.py can open concurrently.')
parser.add_argument(
'--pac-file',
type=str,
default=DEFAULT_PAC_FILE,
help='A file (Proxy Auto Configuration) or string to serve when '
'the server receives a direct file request. '
'Using this option enables proxy.HttpWebServerPlugin.')
parser.add_argument(
'--pac-file-url-path',
type=str,
default=text_(DEFAULT_PAC_FILE_URL_PATH),
help='Default: %s. Web server path to serve the PAC file.' %
text_(DEFAULT_PAC_FILE_URL_PATH))
parser.add_argument(
'--pid-file',
type=str,
default=DEFAULT_PID_FILE,
help='Default: None. Save parent process ID to a file.')
parser.add_argument(
'--plugins',
type=str,
default=DEFAULT_PLUGINS,
help='Comma separated plugins.')
parser.add_argument('--port', type=int, default=DEFAULT_PORT,
help='Default: 8899. Server port.')
parser.add_argument(
'--server-recvbuf-size',
type=int,
default=DEFAULT_SERVER_RECVBUF_SIZE,
help='Default: 1 MB. Maximum amount of data received from the '
'server in a single recv() operation. Bump this '
'value for faster downloads at the expense of '
'increased RAM.')
parser.add_argument(
'--static-server-dir',
type=str,
default=DEFAULT_STATIC_SERVER_DIR,
help='Default: "public" folder in directory where proxy.py is placed. '
'This option is only applicable when static server is also enabled. '
'See --enable-static-server.'
)
parser.add_argument(
'--threadless',
action='store_true',
default=DEFAULT_THREADLESS,
help='Default: False. When disabled, a new thread is spawned '
'to handle each client connection.'
)
parser.add_argument(
'--timeout',
type=int,
default=DEFAULT_TIMEOUT,
help='Default: ' + str(DEFAULT_TIMEOUT) +
'. Number of seconds after which '
'an inactive connection must be dropped. Inactivity is defined by no '
'data sent or received by the client.'
)
parser.add_argument(
'--version',
'-v',
action='store_true',
default=DEFAULT_VERSION,
help='Prints proxy.py version.')
return parser
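The flag definitions above all follow one pattern: `parser.add_argument` with a type, a `DEFAULT_*` constant as default, and a help string that states that default. A minimal standalone sketch of the same pattern (flag names mirror the real ones; the literal defaults here are illustrative stand-ins, not proxy.py's constants):

```python
import argparse


def init_parser() -> argparse.ArgumentParser:
    # Mirrors the style above: typed flags, explicit defaults,
    # and help strings that state the default value.
    parser = argparse.ArgumentParser(description='flag parsing sketch')
    parser.add_argument('--port', type=int, default=8899,
                        help='Default: 8899. Server port.')
    parser.add_argument('--threadless', action='store_true', default=False,
                        help='Default: False. Enable the threadless event loop.')
    parser.add_argument('--log-level', type=str, default='INFO',
                        help='Valid options: DEBUG, INFO, WARNING, ERROR, CRITICAL.')
    return parser


args = init_parser().parse_args(['--port', '9000', '--threadless'])
```

The real parser wires every flag to a `DEFAULT_*` constant so the CLI and the `Flags` class below stay in sync.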
class Flags:
"""Contains all input flags and inferred input parameters."""
ROOT_DATA_DIR_NAME = '.proxy.py'
GENERATED_CERTS_DIR_NAME = 'certificates'
def __init__(
self,
auth_code: Optional[bytes] = DEFAULT_BASIC_AUTH,
server_recvbuf_size: int = DEFAULT_SERVER_RECVBUF_SIZE,
client_recvbuf_size: int = DEFAULT_CLIENT_RECVBUF_SIZE,
pac_file: Optional[str] = DEFAULT_PAC_FILE,
pac_file_url_path: Optional[bytes] = DEFAULT_PAC_FILE_URL_PATH,
plugins: Optional[Dict[bytes, List[type]]] = None,
disable_headers: Optional[List[bytes]] = None,
certfile: Optional[str] = None,
keyfile: Optional[str] = None,
ca_cert_dir: Optional[str] = None,
ca_key_file: Optional[str] = None,
ca_cert_file: Optional[str] = None,
ca_signing_key_file: Optional[str] = None,
num_workers: int = 0,
hostname: Union[ipaddress.IPv4Address,
ipaddress.IPv6Address] = DEFAULT_IPV6_HOSTNAME,
port: int = DEFAULT_PORT,
backlog: int = DEFAULT_BACKLOG,
static_server_dir: str = DEFAULT_STATIC_SERVER_DIR,
enable_static_server: bool = DEFAULT_ENABLE_STATIC_SERVER,
devtools_event_queue: Optional[DictQueueType] = None,
devtools_ws_path: bytes = DEFAULT_DEVTOOLS_WS_PATH,
timeout: int = DEFAULT_TIMEOUT,
threadless: bool = DEFAULT_THREADLESS,
enable_events: bool = DEFAULT_ENABLE_EVENTS) -> None:
self.threadless = threadless
self.timeout = timeout
self.auth_code = auth_code
self.server_recvbuf_size = server_recvbuf_size
self.client_recvbuf_size = client_recvbuf_size
self.pac_file = pac_file
self.pac_file_url_path = pac_file_url_path
if plugins is None:
plugins = {}
self.plugins: Dict[bytes, List[type]] = plugins
if disable_headers is None:
disable_headers = DEFAULT_DISABLE_HEADERS
self.disable_headers = disable_headers
self.certfile: Optional[str] = certfile
self.keyfile: Optional[str] = keyfile
self.ca_key_file: Optional[str] = ca_key_file
self.ca_cert_file: Optional[str] = ca_cert_file
self.ca_signing_key_file: Optional[str] = ca_signing_key_file
self.num_workers: int = num_workers if num_workers > 0 else multiprocessing.cpu_count()
self.hostname: Union[ipaddress.IPv4Address,
ipaddress.IPv6Address] = hostname
self.family: socket.AddressFamily = socket.AF_INET6 if hostname.version == 6 else socket.AF_INET
self.port: int = port
self.backlog: int = backlog
self.enable_static_server: bool = enable_static_server
self.static_server_dir: str = static_server_dir
self.devtools_event_queue: Optional[DictQueueType] = devtools_event_queue
self.devtools_ws_path: bytes = devtools_ws_path
self.enable_events: bool = enable_events
self.proxy_py_data_dir = os.path.join(
str(pathlib.Path.home()), self.ROOT_DATA_DIR_NAME)
os.makedirs(self.proxy_py_data_dir, exist_ok=True)
self.ca_cert_dir: Optional[str] = ca_cert_dir
if self.ca_cert_dir is None:
self.ca_cert_dir = os.path.join(
self.proxy_py_data_dir, self.GENERATED_CERTS_DIR_NAME)
os.makedirs(self.ca_cert_dir, exist_ok=True)
def tls_interception_enabled(self) -> bool:
return self.ca_key_file is not None and \
self.ca_cert_dir is not None and \
self.ca_signing_key_file is not None and \
self.ca_cert_file is not None
def encryption_enabled(self) -> bool:
return self.keyfile is not None and \
self.certfile is not None
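The two predicates above gate TLS behavior on which flags were supplied: interception needs all four CA inputs, while plain client-facing TLS only needs the server key/cert pair. A standalone sketch of those checks (free functions here are hypothetical, not the real API):

```python
from typing import Optional


def tls_interception_enabled(ca_key_file: Optional[str],
                             ca_cert_dir: Optional[str],
                             ca_signing_key_file: Optional[str],
                             ca_cert_file: Optional[str]) -> bool:
    # Interception needs every CA input: key, cert, signing key
    # and a directory for generated certificates.
    return all(p is not None for p in (
        ca_key_file, ca_cert_dir, ca_signing_key_file, ca_cert_file))


def encryption_enabled(keyfile: Optional[str],
                       certfile: Optional[str]) -> bool:
    # End-to-end TLS with clients only needs the server key/cert pair.
    return keyfile is not None and certfile is not None
```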

proxy/common/types.py Normal file
@@ -0,0 +1,24 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import queue
from typing import TYPE_CHECKING, Dict, Any
from typing_extensions import Protocol
if TYPE_CHECKING:
DictQueueType = queue.Queue[Dict[str, Any]] # pragma: no cover
else:
DictQueueType = queue.Queue
class HasFileno(Protocol):
def fileno(self) -> int:
... # pragma: no cover
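HasFileno is a structural type: any object exposing `fileno() -> int` satisfies it without inheriting from it. A quick sketch of that idea (`typing.Protocol` needs Python 3.8+; older interpreters use `typing_extensions`, as the module above does):

```python
from typing import Protocol


class HasFileno(Protocol):
    def fileno(self) -> int:
        ...


class FakeSocket:
    # Satisfies HasFileno purely by shape; no inheritance needed.
    def fileno(self) -> int:
        return 42


def describe(sock: HasFileno) -> int:
    return sock.fileno()
```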

proxy/common/utils.py Normal file
@@ -0,0 +1,199 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import contextlib
import functools
import ipaddress
import socket
from types import TracebackType
from typing import Optional, Dict, Any, List, Tuple, Type, Callable
from .constants import HTTP_1_1, COLON, WHITESPACE, CRLF
def text_(s: Any, encoding: str = 'utf-8', errors: str = 'strict') -> Any:
"""Utility to ensure text-like usability.
If s is of type bytes, return s.decode(encoding, errors);
if s is an int, return str(s); otherwise return s as-is."""
if isinstance(s, int):
return str(s)
if isinstance(s, bytes):
return s.decode(encoding, errors)
return s
def bytes_(s: Any, encoding: str = 'utf-8', errors: str = 'strict') -> Any:
"""Utility to ensure binary-like usability.
If s is of type str or int, return s.encode(encoding, errors)
(ints are first converted to str); otherwise return s as-is."""
if isinstance(s, int):
s = str(s)
if isinstance(s, str):
return s.encode(encoding, errors)
return s
def build_http_request(method: bytes, url: bytes,
protocol_version: bytes = HTTP_1_1,
headers: Optional[Dict[bytes, bytes]] = None,
body: Optional[bytes] = None) -> bytes:
"""Build and return an HTTP request packet."""
if headers is None:
headers = {}
return build_http_pkt(
[method, url, protocol_version], headers, body)
def build_http_response(status_code: int,
protocol_version: bytes = HTTP_1_1,
reason: Optional[bytes] = None,
headers: Optional[Dict[bytes, bytes]] = None,
body: Optional[bytes] = None) -> bytes:
"""Build and return an HTTP response packet."""
line = [protocol_version, bytes_(status_code)]
if reason:
line.append(reason)
if headers is None:
headers = {}
has_content_length = False
has_transfer_encoding = False
for k in headers:
if k.lower() == b'content-length':
has_content_length = True
if k.lower() == b'transfer-encoding':
has_transfer_encoding = True
if body is not None and \
not has_transfer_encoding and \
not has_content_length:
headers[b'Content-Length'] = bytes_(len(body))
return build_http_pkt(line, headers, body)
def build_http_header(k: bytes, v: bytes) -> bytes:
"""Build and return an HTTP header line for use in a raw packet."""
return k + COLON + WHITESPACE + v
def build_http_pkt(line: List[bytes],
headers: Optional[Dict[bytes, bytes]] = None,
body: Optional[bytes] = None) -> bytes:
"""Build and return an HTTP request or response packet."""
req = WHITESPACE.join(line) + CRLF
if headers is not None:
for k in headers:
req += build_http_header(k, headers[k]) + CRLF
req += CRLF
if body:
req += body
return req
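build_http_response above auto-inserts Content-Length only when a body is present and the caller supplied neither Content-Length nor Transfer-Encoding. A condensed standalone sketch of that rule (`build_response` is a hypothetical name, not the module's API):

```python
from typing import Dict, Optional

CRLF = b'\r\n'


def build_response(status_code: int, reason: bytes,
                   headers: Optional[Dict[bytes, bytes]] = None,
                   body: Optional[bytes] = None) -> bytes:
    headers = dict(headers or {})
    lowered = {k.lower() for k in headers}
    # Only add Content-Length when the caller supplied neither
    # Content-Length nor Transfer-Encoding and a body is present.
    if body is not None and b'content-length' not in lowered \
            and b'transfer-encoding' not in lowered:
        headers[b'Content-Length'] = str(len(body)).encode()
    pkt = b'HTTP/1.1 ' + str(status_code).encode() + b' ' + reason + CRLF
    for k, v in headers.items():
        pkt += k + b': ' + v + CRLF
    pkt += CRLF
    if body:
        pkt += body
    return pkt
```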
def build_websocket_handshake_request(
key: bytes,
method: bytes = b'GET',
url: bytes = b'/') -> bytes:
"""
Build and return a WebSocket handshake request packet.
:param key: Sec-WebSocket-Key header value.
:param method: HTTP method.
:param url: Websocket request path.
"""
return build_http_request(
method, url,
headers={
b'Connection': b'upgrade',
b'Upgrade': b'websocket',
b'Sec-WebSocket-Key': key,
b'Sec-WebSocket-Version': b'13',
}
)
def build_websocket_handshake_response(accept: bytes) -> bytes:
"""
Build and return a WebSocket handshake response packet.
:param accept: Sec-WebSocket-Accept header value
"""
return build_http_response(
101, reason=b'Switching Protocols',
headers={
b'Upgrade': b'websocket',
b'Connection': b'Upgrade',
b'Sec-WebSocket-Accept': accept
}
)
def find_http_line(raw: bytes) -> Tuple[Optional[bytes], bytes]:
"""Find and return the first line ending in CRLF, along with the remaining buffer.
If no CRLF is found, line is None."""
pos = raw.find(CRLF)
if pos == -1:
return None, raw
line = raw[:pos]
rest = raw[pos + len(CRLF):]
return line, rest
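find_http_line's contract is worth pinning down: it returns the first CRLF-terminated line plus the remaining buffer, or `(None, raw)` when no CRLF has arrived yet, so callers can buffer and retry. A standalone copy (`find_line` is a stand-in name):

```python
from typing import Optional, Tuple

CRLF = b'\r\n'


def find_line(raw: bytes) -> Tuple[Optional[bytes], bytes]:
    pos = raw.find(CRLF)
    if pos == -1:
        # No complete line yet; caller buffers and retries later.
        return None, raw
    return raw[:pos], raw[pos + len(CRLF):]
```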
def new_socket_connection(addr: Tuple[str, int]) -> socket.socket:
conn = None
try:
ip = ipaddress.ip_address(addr[0])
if ip.version == 4:
conn = socket.socket(
socket.AF_INET, socket.SOCK_STREAM, 0)
conn.connect(addr)
else:
conn = socket.socket(
socket.AF_INET6, socket.SOCK_STREAM, 0)
conn.connect((addr[0], addr[1], 0, 0))
except ValueError:
pass # does not appear to be an IPv4 or IPv6 address
if conn is not None:
return conn
# try to establish dual stack IPv4/IPv6 connection.
return socket.create_connection(addr)
class socket_connection(contextlib.ContextDecorator):
"""Same as new_socket_connection but as a context manager and decorator."""
def __init__(self, addr: Tuple[str, int]):
self.addr: Tuple[str, int] = addr
self.conn: Optional[socket.socket] = None
super().__init__()
def __enter__(self) -> socket.socket:
self.conn = new_socket_connection(self.addr)
return self.conn
def __exit__(
self,
exc_type: Optional[Type[BaseException]],
exc_val: Optional[BaseException],
exc_tb: Optional[TracebackType]) -> bool:
if self.conn:
self.conn.close()
return False
def __call__(self, func: Callable[..., Any]
) -> Callable[[socket.socket], Any]:
@functools.wraps(func)
def decorated(*args: Any, **kwargs: Any) -> Any:
with self as conn:
return func(conn, *args, **kwargs)
return decorated
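socket_connection above combines contextlib.ContextDecorator with a custom `__call__` so the opened connection is injected as the decorated function's first argument. The same pattern, with a plain list standing in for a real socket:

```python
import contextlib
import functools
from typing import Any, Callable, List


class resource(contextlib.ContextDecorator):
    """Usable both as a context manager and as a decorator that
    passes the opened resource as the first argument."""

    def __init__(self, log: List[str]) -> None:
        self.log = log

    def __enter__(self) -> List[str]:
        self.log.append('open')
        return self.log

    def __exit__(self, *exc: Any) -> bool:
        self.log.append('close')
        return False

    def __call__(self, func: Callable[..., Any]) -> Callable[..., Any]:
        @functools.wraps(func)
        def decorated(*args: Any, **kwargs: Any) -> Any:
            with self as res:
                return func(res, *args, **kwargs)
        return decorated


log: List[str] = []


@resource(log)
def use(res: List[str]) -> None:
    res.append('work')


use()
```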

proxy/common/version.py Normal file
@@ -0,0 +1,11 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
VERSION = (2, 0, 0)
__version__ = '.'.join(map(str, VERSION[0:3]))

proxy/core/__init__.py Normal file
@@ -0,0 +1,9 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""

proxy/core/acceptor.py Normal file
@@ -0,0 +1,233 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import logging
import multiprocessing
import selectors
import socket
import threading
# import time
from multiprocessing import connection
from multiprocessing.reduction import send_handle, recv_handle
from typing import List, Optional, Type, Tuple
from .threadless import ThreadlessWork, Threadless
from .event import EventQueue, EventDispatcher, eventNames
from ..common.flags import Flags
logger = logging.getLogger(__name__)
class AcceptorPool:
"""AcceptorPool.
Pre-spawns worker processes to utilize all cores available on the system. The server socket is
dispatched over a pipe to workers. Each worker accepts incoming client connections and spawns a
separate thread to handle each of them.
"""
def __init__(self, flags: Flags, work_klass: Type[ThreadlessWork]) -> None:
self.flags = flags
self.running: bool = False
self.socket: Optional[socket.socket] = None
self.acceptors: List[Acceptor] = []
self.work_queues: List[connection.Connection] = []
self.work_klass = work_klass
self.event_queue: Optional[EventQueue] = None
self.event_dispatcher: Optional[EventDispatcher] = None
self.event_dispatcher_thread: Optional[threading.Thread] = None
self.event_dispatcher_shutdown: Optional[threading.Event] = None
if self.flags.enable_events:
self.event_queue = EventQueue()
def listen(self) -> None:
self.socket = socket.socket(self.flags.family, socket.SOCK_STREAM)
self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
self.socket.bind((str(self.flags.hostname), self.flags.port))
self.socket.listen(self.flags.backlog)
self.socket.setblocking(False)
logger.info(
'Listening on %s:%d' %
(self.flags.hostname, self.flags.port))
def start_workers(self) -> None:
"""Start worker processes."""
for acceptor_id in range(self.flags.num_workers):
work_queue = multiprocessing.Pipe()
acceptor = Acceptor(
idd=acceptor_id,
work_queue=work_queue[1],
flags=self.flags,
work_klass=self.work_klass,
event_queue=self.event_queue
)
acceptor.start()
logger.debug('Started acceptor process %d', acceptor.pid)
self.acceptors.append(acceptor)
self.work_queues.append(work_queue[0])
logger.info('Started %d workers' % self.flags.num_workers)
def start_event_dispatcher(self) -> None:
self.event_dispatcher_shutdown = threading.Event()
assert self.event_dispatcher_shutdown
assert self.event_queue
self.event_dispatcher = EventDispatcher(
shutdown=self.event_dispatcher_shutdown,
event_queue=self.event_queue
)
self.event_dispatcher_thread = threading.Thread(
target=self.event_dispatcher.run
)
self.event_dispatcher_thread.start()
logger.debug('Thread ID: %d', self.event_dispatcher_thread.ident)
def shutdown(self) -> None:
logger.info('Shutting down %d workers' % self.flags.num_workers)
if self.flags.enable_events:
assert self.event_dispatcher_shutdown
assert self.event_dispatcher_thread
self.event_dispatcher_shutdown.set()
self.event_dispatcher_thread.join()
logger.debug(
'Shutdown of global event dispatcher thread %d successful',
self.event_dispatcher_thread.ident)
for acceptor in self.acceptors:
acceptor.join()
logger.debug('Acceptors shutdown')
def setup(self) -> None:
"""Listen on port, setup workers and pass server socket to workers."""
self.running = True
self.listen()
if self.flags.enable_events:
self.start_event_dispatcher()
self.start_workers()
# Send server socket to all acceptor processes.
assert self.socket is not None
for index in range(self.flags.num_workers):
send_handle(
self.work_queues[index],
self.socket.fileno(),
self.acceptors[index].pid
)
self.work_queues[index].close()
self.socket.close()
class Acceptor(multiprocessing.Process):
"""Socket client acceptor.
Accepts client connections on the received server socket handle and
starts a new work thread for each accepted client.
"""
lock = multiprocessing.Lock()
def __init__(
self,
idd: int,
work_queue: connection.Connection,
flags: Flags,
work_klass: Type[ThreadlessWork],
event_queue: Optional[EventQueue] = None) -> None:
super().__init__()
self.idd = idd
self.work_queue: connection.Connection = work_queue
self.flags = flags
self.work_klass = work_klass
self.event_queue = event_queue
self.running = False
self.selector: Optional[selectors.DefaultSelector] = None
self.sock: Optional[socket.socket] = None
self.threadless_process: Optional[multiprocessing.Process] = None
self.threadless_client_queue: Optional[connection.Connection] = None
def start_threadless_process(self) -> None:
pipe = multiprocessing.Pipe()
self.threadless_client_queue = pipe[0]
self.threadless_process = Threadless(
client_queue=pipe[1],
flags=self.flags,
work_klass=self.work_klass,
event_queue=self.event_queue
)
self.threadless_process.start()
logger.debug('Started process %d', self.threadless_process.pid)
def shutdown_threadless_process(self) -> None:
assert self.threadless_process and self.threadless_client_queue
logger.debug('Stopped process %d', self.threadless_process.pid)
self.threadless_process.join()
self.threadless_client_queue.close()
def start_work(self, conn: socket.socket, addr: Tuple[str, int]) -> None:
if self.flags.threadless and \
self.threadless_client_queue and \
self.threadless_process:
self.threadless_client_queue.send(addr)
send_handle(
self.threadless_client_queue,
conn.fileno(),
self.threadless_process.pid
)
conn.close()
else:
work = self.work_klass(
fileno=conn.fileno(),
addr=addr,
flags=self.flags,
event_queue=self.event_queue
)
work_thread = threading.Thread(target=work.run)
work.publish_event(
event_name=eventNames.WORK_STARTED,
event_payload={'fileno': conn.fileno(), 'addr': addr},
publisher_id=self.__class__.__name__
)
work_thread.start()
def run_once(self) -> None:
assert self.selector and self.sock
with self.lock:
events = self.selector.select(timeout=1)
if len(events) == 0:
return
conn, addr = self.sock.accept()
# now = time.time()
# fileno: int = conn.fileno()
self.start_work(conn, addr)
# logger.info('Work started for fd %d in %f seconds', fileno, time.time() - now)
def run(self) -> None:
self.running = True
self.selector = selectors.DefaultSelector()
fileno = recv_handle(self.work_queue)
self.work_queue.close()
self.sock = socket.fromfd(
fileno,
family=self.flags.family,
type=socket.SOCK_STREAM
)
try:
self.selector.register(self.sock, selectors.EVENT_READ)
if self.flags.threadless:
self.start_threadless_process()
while self.running:
self.run_once()
except KeyboardInterrupt:
pass
finally:
self.selector.unregister(self.sock)
if self.flags.threadless:
self.shutdown_threadless_process()
self.sock.close()
self.running = False
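run_once() above polls the listening socket through a selector with a one-second timeout instead of blocking in accept(), which is what lets the loop notice shutdown between accepts. A standalone sketch of that accept pattern on an ephemeral loopback port:

```python
import selectors
import socket

sel = selectors.DefaultSelector()
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('127.0.0.1', 0))  # ephemeral port
server.listen(1)
server.setblocking(False)
sel.register(server, selectors.EVENT_READ)

# A local client connection makes the listening socket readable,
# so the next timed select() returns instead of timing out.
client = socket.create_connection(server.getsockname())
ready = sel.select(timeout=1)
if ready:
    conn, addr = server.accept()
    conn.close()
client.close()
sel.unregister(server)
server.close()
```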

proxy/core/connection.py Normal file
@@ -0,0 +1,129 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import socket
import ssl
import logging
from abc import ABC, abstractmethod
from typing import NamedTuple, Optional, Union, Tuple
from ..common.constants import DEFAULT_BUFFER_SIZE
from ..common.utils import new_socket_connection
logger = logging.getLogger(__name__)
TcpConnectionTypes = NamedTuple('TcpConnectionTypes', [
('SERVER', int),
('CLIENT', int),
])
tcpConnectionTypes = TcpConnectionTypes(1, 2)
class TcpConnectionUninitializedException(Exception):
pass
class TcpConnection(ABC):
"""TCP server/client connection abstraction.
Main motivation of this class is to provide buffer management
when reading from and writing to the socket.
Implement the connection property abstract method to return
a socket connection object."""
def __init__(self, tag: int):
self.buffer: bytes = b''
self.closed: bool = False
self.tag: str = 'server' if tag == tcpConnectionTypes.SERVER else 'client'
@property
@abstractmethod
def connection(self) -> Union[ssl.SSLSocket, socket.socket]:
"""Must return the socket connection to use in this class."""
raise TcpConnectionUninitializedException() # pragma: no cover
def send(self, data: bytes) -> int:
"""Users must handle BrokenPipeError exceptions"""
return self.connection.send(data)
def recv(self, buffer_size: int = DEFAULT_BUFFER_SIZE) -> Optional[bytes]:
"""Users must handle socket.error exceptions"""
data: bytes = self.connection.recv(buffer_size)
if len(data) == 0:
return None
logger.debug(
'received %d bytes from %s' %
(len(data), self.tag))
# logger.info(data)
return data
def close(self) -> bool:
if not self.closed:
self.connection.close()
self.closed = True
return self.closed
def buffer_size(self) -> int:
return len(self.buffer)
def has_buffer(self) -> bool:
return self.buffer_size() > 0
def queue(self, data: bytes) -> int:
self.buffer += data
return len(data)
def flush(self) -> int:
"""Users must handle BrokenPipeError exceptions"""
if self.buffer_size() == 0:
return 0
sent: int = self.send(self.buffer)
# logger.info(self.buffer[:sent])
self.buffer = self.buffer[sent:]
logger.debug('flushed %d bytes to %s' % (sent, self.tag))
return sent
class TcpServerConnection(TcpConnection):
"""Establishes connection to upstream server."""
def __init__(self, host: str, port: int):
super().__init__(tcpConnectionTypes.SERVER)
self._conn: Optional[Union[ssl.SSLSocket, socket.socket]] = None
self.addr: Tuple[str, int] = (host, int(port))
@property
def connection(self) -> Union[ssl.SSLSocket, socket.socket]:
if self._conn is None:
raise TcpConnectionUninitializedException()
return self._conn
def connect(self) -> None:
if self._conn is not None:
return
self._conn = new_socket_connection(self.addr)
class TcpClientConnection(TcpConnection):
"""An accepted client connection request."""
def __init__(self,
conn: Union[ssl.SSLSocket, socket.socket],
addr: Tuple[str, int]):
super().__init__(tcpConnectionTypes.CLIENT)
self._conn: Optional[Union[ssl.SSLSocket, socket.socket]] = conn
self.addr: Tuple[str, int] = addr
@property
def connection(self) -> Union[ssl.SSLSocket, socket.socket]:
if self._conn is None:
raise TcpConnectionUninitializedException()
return self._conn
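The queue()/flush() pair above is the heart of TcpConnection: writes accumulate in an in-memory buffer and are pushed to the socket only on flush, which keeps whatever send() could not deliver. A minimal sketch of the same buffer management over a socketpair:

```python
import socket


class BufferedConn:
    """Sketch of the buffer management above: queue() appends to an
    in-memory buffer, flush() pushes it to the socket and keeps the
    unsent remainder."""

    def __init__(self, conn: socket.socket) -> None:
        self.conn = conn
        self.buffer = b''

    def queue(self, data: bytes) -> int:
        self.buffer += data
        return len(data)

    def flush(self) -> int:
        if not self.buffer:
            return 0
        sent = self.conn.send(self.buffer)
        self.buffer = self.buffer[sent:]  # keep what wasn't sent
        return sent


left, right = socket.socketpair()
conn = BufferedConn(left)
conn.queue(b'hello ')
conn.queue(b'world')
conn.flush()
received = right.recv(1024)
left.close()
right.close()
```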

proxy/core/event.py Normal file
@@ -0,0 +1,142 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import os
import queue
import time
import threading
import multiprocessing
import logging
from typing import Dict, Optional, Any, NamedTuple, List
from ..common.types import DictQueueType
logger = logging.getLogger(__name__)
EventNames = NamedTuple('EventNames', [
('WORK_STARTED', int),
('WORK_FINISHED', int),
('SUBSCRIBE', int),
('UNSUBSCRIBE', int),
])
eventNames = EventNames(1, 2, 3, 4)
class EventQueue:
"""Global event queue."""
def __init__(self) -> None:
super().__init__()
self.queue = multiprocessing.Manager().Queue()
def publish(
self,
request_id: str,
event_name: int,
event_payload: Dict[str, Any],
publisher_id: Optional[str] = None
) -> None:
"""Publish an event into the queue.
1. Request ID - Globally unique
2. Process ID - Process ID of event publisher.
This will be process id of acceptor workers.
3. Thread ID - Thread ID of event publisher.
When --threadless is enabled, this value will be same for all the requests
received by a single acceptor worker.
When --threadless is disabled, this value will be
Thread ID of the thread handling the client request.
4. Event Timestamp - Time when this event occurred
5. Event Name - One of the defined or custom event name
6. Event Payload - Optional data associated with the event
7. Publisher ID (optional) - Optionally, publishing entity unique name / ID
"""
self.queue.put({
'request_id': request_id,
'process_id': os.getpid(),
'thread_id': threading.get_ident(),
'event_timestamp': time.time(),
'event_name': event_name,
'event_payload': event_payload,
'publisher_id': publisher_id,
})
def subscribe(
self,
sub_id: str,
channel: DictQueueType) -> None:
self.queue.put({
'event_name': eventNames.SUBSCRIBE,
'event_payload': {'sub_id': sub_id, 'channel': channel},
})
class EventDispatcher:
"""Core EventDispatcher.
Provides:
1. A dispatcher module which consumes core events and dispatches
them to EventQueueBasePlugin
2. A publish utility for publishing core events into
global events queue.
Consuming directly from the global events queue outside of the dispatcher
module is not recommended. Python's native multiprocessing queue
doesn't provide fanout functionality, which the core dispatcher module
implements so that several plugins can consume the same published
event at a time.
When --enable-events is used, a multiprocessing.Queue is created and
attached to global Flags. This queue can then be used for
dispatching an Event dict object into the queue.
When --enable-events is used, dispatcher module is automatically
started. Dispatcher module also ensures that queue is not full and
doesn't utilize too much memory in case there are no event plugins
enabled.
"""
def __init__(
self,
shutdown: threading.Event,
event_queue: EventQueue) -> None:
self.shutdown: threading.Event = shutdown
self.event_queue: EventQueue = event_queue
self.subscribers: Dict[str, DictQueueType] = {}
def run(self) -> None:
try:
while not self.shutdown.is_set():
try:
ev = self.event_queue.queue.get(timeout=1)
if ev['event_name'] == eventNames.SUBSCRIBE:
self.subscribers[ev['event_payload']['sub_id']] = \
ev['event_payload']['channel']
elif ev['event_name'] == eventNames.UNSUBSCRIBE:
del self.subscribers[ev['event_payload']['sub_id']]
else:
# logger.info(ev)
unsub_ids: List[str] = []
for sub_id in self.subscribers:
try:
self.subscribers[sub_id].put(ev)
except BrokenPipeError:
unsub_ids.append(sub_id)
for sub_id in unsub_ids:
del self.subscribers[sub_id]
except queue.Empty:
pass
except EOFError:
pass
except KeyboardInterrupt:
pass
except Exception as e:
logger.exception('Event dispatcher exception', exc_info=e)
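The dispatcher loop above implements fanout: each event pulled from the global queue is copied into every subscriber's channel. A single-process analogue using plain `queue.Queue` and a thread (the real implementation uses a multiprocessing Manager queue across processes):

```python
import queue
import threading
from typing import Dict


def dispatch(events: 'queue.Queue[dict]',
             subscribers: Dict[str, 'queue.Queue[dict]'],
             shutdown: threading.Event) -> None:
    # Copy every event from the input queue into each subscriber channel.
    while not shutdown.is_set():
        try:
            ev = events.get(timeout=0.1)
        except queue.Empty:
            continue
        for channel in subscribers.values():
            channel.put(ev)


events: 'queue.Queue[dict]' = queue.Queue()
subs: Dict[str, 'queue.Queue[dict]'] = {'a': queue.Queue(), 'b': queue.Queue()}
stop = threading.Event()
t = threading.Thread(target=dispatch, args=(events, subs, stop), daemon=True)
t.start()
events.put({'event_name': 'work_started'})
got_a = subs['a'].get(timeout=2)
got_b = subs['b'].get(timeout=2)
stop.set()
t.join()
```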

proxy/core/threadless.py Normal file
@@ -0,0 +1,242 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import os
import uuid
import socket
import logging
import asyncio
import selectors
import contextlib
import ssl
import multiprocessing
from multiprocessing import connection
from multiprocessing.reduction import recv_handle
from abc import ABC, abstractmethod
from typing import Dict, Optional, Tuple, List, Union, Generator, Any, Type
from .event import EventQueue, eventNames
from ..common.flags import Flags
from ..common.types import HasFileno
from ..common.constants import DEFAULT_TIMEOUT
logger = logging.getLogger(__name__)
class ThreadlessWork(ABC):
"""Implement ThreadlessWork to hook into the event loop provided by Threadless process."""
def publish_event(
self,
event_name: int,
event_payload: Dict[str, Any],
publisher_id: Optional[str] = None) -> None:
if not self.flags.enable_events:
return
assert self.event_queue
self.event_queue.publish(
self.uid,
event_name,
event_payload,
publisher_id
)
def shutdown(self) -> None:
"""Must close any opened resources and call super().shutdown()."""
self.publish_event(
event_name=eventNames.WORK_FINISHED,
event_payload={},
publisher_id=self.__class__.__name__
)
@abstractmethod
def __init__(
self,
fileno: int,
addr: Tuple[str, int],
flags: Optional[Flags],
event_queue: Optional[EventQueue] = None,
uid: Optional[str] = None) -> None:
self.fileno = fileno
self.addr = addr
self.flags = flags if flags else Flags()
self.event_queue = event_queue
self.uid: str = uid if uid is not None else uuid.uuid4().hex
@abstractmethod
def initialize(self) -> None:
pass # pragma: no cover
@abstractmethod
def is_inactive(self) -> bool:
return False # pragma: no cover
@abstractmethod
def get_events(self) -> Dict[socket.socket, int]:
return {} # pragma: no cover
@abstractmethod
def handle_events(
self,
readables: List[Union[int, HasFileno]],
writables: List[Union[int, HasFileno]]) -> bool:
"""Return True to shut down work."""
return False # pragma: no cover
@abstractmethod
def run(self) -> None:
pass
class Threadless(multiprocessing.Process):
"""Threadless provides an event loop. Use it by implementing the ThreadlessWork interface.
When --threadless option is enabled, each Acceptor process also
spawns one Threadless process. And instead of spawning new thread
for each accepted client connection, Acceptor process sends
accepted client connection to Threadless process over a pipe.
HttpProtocolHandler implements ThreadlessWork class and hooks into the
event loop provided by Threadless.
"""
def __init__(
self,
client_queue: connection.Connection,
flags: Flags,
work_klass: Type[ThreadlessWork],
event_queue: Optional[EventQueue] = None) -> None:
super().__init__()
self.client_queue = client_queue
self.flags = flags
self.work_klass = work_klass
self.event_queue = event_queue
self.works: Dict[int, ThreadlessWork] = {}
self.selector: Optional[selectors.DefaultSelector] = None
self.loop: Optional[asyncio.AbstractEventLoop] = None
@contextlib.contextmanager
def selected_events(self) -> Generator[Tuple[List[Union[int, HasFileno]],
List[Union[int, HasFileno]]],
None, None]:
events: Dict[socket.socket, int] = {}
for work in self.works.values():
events.update(work.get_events())
assert self.selector is not None
for fd in events:
self.selector.register(fd, events[fd])
ev = self.selector.select(timeout=1)
readables = []
writables = []
for key, mask in ev:
if mask & selectors.EVENT_READ:
readables.append(key.fileobj)
if mask & selectors.EVENT_WRITE:
writables.append(key.fileobj)
yield (readables, writables)
for fd in events.keys():
self.selector.unregister(fd)
async def handle_events(
self, fileno: int,
readables: List[Union[int, HasFileno]],
writables: List[Union[int, HasFileno]]) -> bool:
return self.works[fileno].handle_events(readables, writables)
# TODO: Use correct future typing annotations
async def wait_for_tasks(
self, tasks: Dict[int, Any]) -> None:
for work_id in tasks:
# TODO: Resolving one handle_events here can block resolution of
# other tasks
try:
teardown = await asyncio.wait_for(tasks[work_id], DEFAULT_TIMEOUT)
if teardown:
self.cleanup(work_id)
except asyncio.TimeoutError:
self.cleanup(work_id)
def accept_client(self) -> None:
addr = self.client_queue.recv()
fileno = recv_handle(self.client_queue)
self.works[fileno] = self.work_klass(
fileno=fileno,
addr=addr,
flags=self.flags,
event_queue=self.event_queue
)
self.works[fileno].publish_event(
event_name=eventNames.WORK_STARTED,
event_payload={'fileno': fileno, 'addr': addr},
publisher_id=self.__class__.__name__
)
try:
self.works[fileno].initialize()
except ssl.SSLError as e:
logger.exception('ssl.SSLError', exc_info=e)
self.cleanup(fileno)
def cleanup_inactive(self) -> None:
inactive_works: List[int] = []
for work_id in self.works:
if self.works[work_id].is_inactive():
inactive_works.append(work_id)
for work_id in inactive_works:
self.cleanup(work_id)
def cleanup(self, work_id: int) -> None:
# TODO: HttpProtocolHandler.shutdown can call flush which may block
self.works[work_id].shutdown()
del self.works[work_id]
os.close(work_id)
def run_once(self) -> None:
assert self.loop is not None
with self.selected_events() as (readables, writables):
if len(readables) == 0 and len(writables) == 0:
# Remove and shutdown inactive connections
self.cleanup_inactive()
return
# Note that selector from now on is idle,
# until all the logic below completes.
#
# Invoke Threadless.handle_events
# TODO: Only send readable / writables that client originally
# registered.
tasks = {}
for fileno in self.works:
tasks[fileno] = self.loop.create_task(
self.handle_events(fileno, readables, writables))
# Accepted client connection from Acceptor
if self.client_queue in readables:
self.accept_client()
# Wait for Threadless.handle_events to complete
self.loop.run_until_complete(self.wait_for_tasks(tasks))
# Remove and shutdown inactive connections
self.cleanup_inactive()
def run(self) -> None:
try:
self.selector = selectors.DefaultSelector()
self.selector.register(self.client_queue, selectors.EVENT_READ)
self.loop = asyncio.get_event_loop()
while True:
self.run_once()
except KeyboardInterrupt:
pass
finally:
assert self.selector is not None
self.selector.unregister(self.client_queue)
self.client_queue.close()
assert self.loop is not None
self.loop.close()
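wait_for_tasks() above awaits each per-connection task with a timeout and tears down work that either asked for it (returned True) or timed out. A self-contained sketch of that pattern (`handle` and the hard-coded work ids are illustrative stand-ins):

```python
import asyncio
from typing import Dict, List


async def handle(work_id: int, teardown: bool, delay: float = 0.0) -> bool:
    await asyncio.sleep(delay)
    return teardown


async def wait_for_tasks(tasks: Dict[int, 'asyncio.Task[bool]'],
                         timeout: float = 0.5) -> List[int]:
    cleaned: List[int] = []
    for work_id, task in tasks.items():
        try:
            # Teardown when the task asks for it, or on timeout.
            if await asyncio.wait_for(task, timeout):
                cleaned.append(work_id)
        except asyncio.TimeoutError:
            cleaned.append(work_id)
    return cleaned


async def main() -> List[int]:
    tasks = {
        1: asyncio.create_task(handle(1, True)),
        2: asyncio.create_task(handle(2, False)),
        3: asyncio.create_task(handle(3, True, delay=5.0)),  # will time out
    }
    return await wait_for_tasks(tasks)


cleaned = asyncio.run(main())
```

As the TODO in the source notes, awaiting tasks sequentially means one slow handler delays resolution of the others.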

proxy/http/__init__.py Normal file
@@ -0,0 +1,9 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""

@@ -0,0 +1,80 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import NamedTuple, Tuple, List, Optional
from ..common.utils import bytes_, find_http_line
from ..common.constants import CRLF, DEFAULT_BUFFER_SIZE
ChunkParserStates = NamedTuple('ChunkParserStates', [
('WAITING_FOR_SIZE', int),
('WAITING_FOR_DATA', int),
('COMPLETE', int),
])
chunkParserStates = ChunkParserStates(1, 2, 3)
class ChunkParser:
"""HTTP chunked encoding response parser."""
def __init__(self) -> None:
self.state = chunkParserStates.WAITING_FOR_SIZE
self.body: bytes = b'' # Parsed chunks
self.chunk: bytes = b'' # Partial chunk received
# Expected size of the next chunk
self.size: Optional[int] = None
def parse(self, raw: bytes) -> bytes:
more = len(raw) > 0
while more and self.state != chunkParserStates.COMPLETE:
more, raw = self.process(raw)
return raw
def process(self, raw: bytes) -> Tuple[bool, bytes]:
if self.state == chunkParserStates.WAITING_FOR_SIZE:
# Consume prior chunk in buffer
# in case chunk size without CRLF was received
raw = self.chunk + raw
self.chunk = b''
# Extract following chunk data size
line, raw = find_http_line(raw)
# CRLF not received or Blank line was received.
if line is None or line.strip() == b'':
self.chunk = raw
raw = b''
else:
self.size = int(line, 16)
self.state = chunkParserStates.WAITING_FOR_DATA
elif self.state == chunkParserStates.WAITING_FOR_DATA:
assert self.size is not None
remaining = self.size - len(self.chunk)
self.chunk += raw[:remaining]
raw = raw[remaining:]
if len(self.chunk) == self.size:
raw = raw[len(CRLF):]
self.body += self.chunk
if self.size == 0:
self.state = chunkParserStates.COMPLETE
else:
self.state = chunkParserStates.WAITING_FOR_SIZE
self.chunk = b''
self.size = None
return len(raw) > 0, raw
@staticmethod
def to_chunks(raw: bytes, chunk_size: int = DEFAULT_BUFFER_SIZE) -> bytes:
chunks: List[bytes] = []
for i in range(0, len(raw), chunk_size):
chunk = raw[i: i + chunk_size]
chunks.append(bytes_('{:x}'.format(len(chunk))))
chunks.append(chunk)
chunks.append(bytes_('{:x}'.format(0)))
chunks.append(b'')
return CRLF.join(chunks) + CRLF

proxy/http/codes.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import NamedTuple
HttpStatusCodes = NamedTuple('HttpStatusCodes', [
# 1xx
('CONTINUE', int),
('SWITCHING_PROTOCOLS', int),
# 2xx
('OK', int),
# 3xx
('MOVED_PERMANENTLY', int),
('SEE_OTHER', int),
('TEMPORARY_REDIRECT', int),
('PERMANENT_REDIRECT', int),
# 4xx
('BAD_REQUEST', int),
('UNAUTHORIZED', int),
('FORBIDDEN', int),
('NOT_FOUND', int),
('PROXY_AUTH_REQUIRED', int),
('REQUEST_TIMEOUT', int),
('I_AM_A_TEAPOT', int),
# 5xx
('INTERNAL_SERVER_ERROR', int),
('NOT_IMPLEMENTED', int),
('BAD_GATEWAY', int),
('GATEWAY_TIMEOUT', int),
('NETWORK_READ_TIMEOUT_ERROR', int),
('NETWORK_CONNECT_TIMEOUT_ERROR', int),
])
httpStatusCodes = HttpStatusCodes(
100, 101,
200,
301, 303, 307, 308,
400, 401, 403, 404, 407, 408, 418,
500, 501, 502, 504, 598, 599
)
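The NamedTuple-of-ints pattern used here gives enum-like dotted access without an `Enum` dependency. A minimal sketch of the same pattern, trimmed to two codes for brevity:

```python
from typing import NamedTuple

# Same pattern as httpStatusCodes above, trimmed to two fields.
HttpStatusCodes = NamedTuple('HttpStatusCodes', [
    ('OK', int),
    ('NOT_FOUND', int),
])
httpStatusCodes = HttpStatusCodes(200, 404)

print(httpStatusCodes.OK)         # → 200
print(httpStatusCodes.NOT_FOUND)  # → 404
```

Because instances are plain tuples, they stay picklable and cheap to share across the multiprocessing boundaries the server uses.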

proxy/http/devtools.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import threading
import queue
import socket
import time
import secrets
import os
import logging
import json
from typing import Optional, Union, List, Tuple, Dict, Any
from .parser import httpParserStates, httpParserTypes, HttpParser
from .server import HttpWebServerBasePlugin, httpProtocolTypes
from .websocket import WebsocketFrame, websocketOpcodes
from .handler import HttpProtocolHandlerPlugin
from ..common.constants import COLON, PROXY_PY_START_TIME
from ..common.types import HasFileno, DictQueueType
from ..common.utils import bytes_, text_
from ..core.connection import TcpClientConnection
logger = logging.getLogger(__name__)
class DevtoolsWebsocketPlugin(HttpWebServerBasePlugin):
"""DevtoolsWebsocketPlugin handles Devtools frontend websocket requests.
For every connected Devtools frontend instance, a dispatcher thread is
started to drain the global Devtools protocol events queue.
The dispatcher thread is terminated when the frontend disconnects."""
def __init__(self, *args: Any, **kwargs: Any):
super().__init__(*args, **kwargs)
self.event_dispatcher_thread: Optional[threading.Thread] = None
self.event_dispatcher_shutdown: Optional[threading.Event] = None
def start_dispatcher(self) -> None:
self.event_dispatcher_shutdown = threading.Event()
assert self.flags.devtools_event_queue is not None
self.event_dispatcher_thread = threading.Thread(
target=DevtoolsWebsocketPlugin.event_dispatcher,
args=(self.event_dispatcher_shutdown,
self.flags.devtools_event_queue,
self.client))
self.event_dispatcher_thread.start()
def stop_dispatcher(self) -> None:
assert self.event_dispatcher_shutdown is not None
assert self.event_dispatcher_thread is not None
self.event_dispatcher_shutdown.set()
self.event_dispatcher_thread.join()
logger.debug('Event dispatcher shutdown')
@staticmethod
def event_dispatcher(
shutdown: threading.Event,
devtools_event_queue: DictQueueType,
client: TcpClientConnection) -> None:
while not shutdown.is_set():
try:
ev = devtools_event_queue.get(timeout=1)
frame = WebsocketFrame()
frame.fin = True
frame.opcode = websocketOpcodes.TEXT_FRAME
frame.data = bytes_(json.dumps(ev))
logger.debug(ev)
client.queue(frame.build())
except queue.Empty:
pass
except Exception as e:
logger.exception('Event dispatcher exception', exc_info=e)
break
except KeyboardInterrupt:
break
def routes(self) -> List[Tuple[int, bytes]]:
return [
(httpProtocolTypes.WEBSOCKET, self.flags.devtools_ws_path)
]
def handle_request(self, request: HttpParser) -> None:
pass
def on_websocket_open(self) -> None:
self.start_dispatcher()
def on_websocket_message(self, frame: WebsocketFrame) -> None:
if frame.data:
message = json.loads(frame.data)
self.handle_message(message)
else:
logger.debug('No data found in frame')
def on_websocket_close(self) -> None:
self.stop_dispatcher()
def handle_message(self, message: Dict[str, Any]) -> None:
frame = WebsocketFrame()
frame.fin = True
frame.opcode = websocketOpcodes.TEXT_FRAME
if message['method'] in (
'Page.canScreencast',
'Network.canEmulateNetworkConditions',
'Emulation.canEmulate'
):
data = json.dumps({
'id': message['id'],
'result': False
})
elif message['method'] == 'Page.getResourceTree':
data = json.dumps({
'id': message['id'],
'result': {
'frameTree': {
'frame': {
'id': 1,
'url': 'http://proxypy',
'mimeType': 'other',
},
'childFrames': [],
'resources': []
}
}
})
elif message['method'] == 'Network.getResponseBody':
logger.debug('received request method Network.getResponseBody')
data = json.dumps({
'id': message['id'],
'result': {
'body': '',
'base64Encoded': False,
}
})
else:
data = json.dumps({
'id': message['id'],
'result': {},
})
frame.data = bytes_(data)
self.client.queue(frame.build())
class DevtoolsProtocolPlugin(HttpProtocolHandlerPlugin):
"""
DevtoolsProtocolPlugin taps into core `HttpProtocolHandler`
events and converts them into Devtools Protocol json messages.
A DevtoolsProtocolPlugin instance is created per request.
Per request devtool events are queued into a global multiprocessing queue.
"""
frame_id = secrets.token_hex(8)
loader_id = secrets.token_hex(8)
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.id: str = f'{ os.getpid() }-{ threading.get_ident() }-{ time.time() }'
self.response = HttpParser(httpParserTypes.RESPONSE_PARSER)
def get_descriptors(
self) -> Tuple[List[socket.socket], List[socket.socket]]:
return [], []
def write_to_descriptors(self, w: List[Union[int, HasFileno]]) -> bool:
return False
def read_from_descriptors(self, r: List[Union[int, HasFileno]]) -> bool:
return False
def on_client_data(self, raw: bytes) -> Optional[bytes]:
return raw
def on_request_complete(self) -> Union[socket.socket, bool]:
if not self.request.has_upstream_server() and \
self.request.path == self.config.devtools_ws_path:
return False
# Handle devtool frontend websocket upgrade
if self.config.devtools_event_queue:
self.config.devtools_event_queue.put({
'method': 'Network.requestWillBeSent',
'params': self.request_will_be_sent(),
})
return False
def on_response_chunk(self, chunk: bytes) -> bytes:
if not self.request.has_upstream_server() and \
self.request.path == self.config.devtools_ws_path:
return chunk
if self.config.devtools_event_queue:
self.response.parse(chunk)
if self.response.state >= httpParserStates.HEADERS_COMPLETE:
self.config.devtools_event_queue.put({
'method': 'Network.responseReceived',
'params': self.response_received(),
})
if self.response.state >= httpParserStates.RCVING_BODY:
self.config.devtools_event_queue.put({
'method': 'Network.dataReceived',
'params': self.data_received(chunk)
})
if self.response.state == httpParserStates.COMPLETE:
self.config.devtools_event_queue.put({
'method': 'Network.loadingFinished',
'params': self.loading_finished()
})
return chunk
def on_client_connection_close(self) -> None:
pass
def request_will_be_sent(self) -> Dict[str, Any]:
now = time.time()
return {
'requestId': self.id,
'loaderId': self.loader_id,
'documentURL': 'http://proxy-py',
'request': {
'url': text_(
self.request.path
if self.request.has_upstream_server() else
b'http://' + bytes_(str(self.config.hostname)) +
COLON + bytes_(self.config.port) + self.request.path
),
'urlFragment': '',
'method': text_(self.request.method),
'headers': {text_(v[0]): text_(v[1]) for v in self.request.headers.values()},
'initialPriority': 'High',
'mixedContentType': 'none',
'postData': None if self.request.method != b'POST'
else text_(self.request.body)
},
'timestamp': now - PROXY_PY_START_TIME,
'wallTime': now,
'initiator': {
'type': 'other'
},
'type': text_(self.request.header(b'content-type'))
if self.request.has_header(b'content-type')
else 'Other',
'frameId': self.frame_id,
'hasUserGesture': False
}
def response_received(self) -> Dict[str, Any]:
return {
'requestId': self.id,
'frameId': self.frame_id,
'loaderId': self.loader_id,
'timestamp': time.time(),
'type': text_(self.response.header(b'content-type'))
if self.response.has_header(b'content-type')
else 'Other',
'response': {
'url': '',
'status': '',
'statusText': '',
'headers': '',
'headersText': '',
'mimeType': '',
'connectionReused': True,
'connectionId': '',
'encodedDataLength': '',
'fromDiskCache': False,
'fromServiceWorker': False,
'timing': {
'requestTime': '',
'proxyStart': -1,
'proxyEnd': -1,
'dnsStart': -1,
'dnsEnd': -1,
'connectStart': -1,
'connectEnd': -1,
'sslStart': -1,
'sslEnd': -1,
'workerStart': -1,
'workerReady': -1,
'sendStart': 0,
'sendEnd': 0,
'receiveHeadersEnd': 0,
},
'requestHeaders': '',
'remoteIPAddress': '',
'remotePort': '',
}
}
def data_received(self, chunk: bytes) -> Dict[str, Any]:
return {
'requestId': self.id,
'timestamp': time.time(),
'dataLength': len(chunk),
'encodedDataLength': len(chunk),
}
def loading_finished(self) -> Dict[str, Any]:
return {
'requestId': self.id,
'timestamp': time.time(),
'encodedDataLength': self.response.total_size
}

proxy/http/exception.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import Optional, Dict
from .parser import HttpParser
from .codes import httpStatusCodes
from ..common.constants import PROXY_AGENT_HEADER_VALUE, PROXY_AGENT_HEADER_KEY
from ..common.utils import build_http_response
class HttpProtocolException(Exception):
"""Top level HttpProtocolException exception class.
All exceptions raised during execution of Http request lifecycle MUST
inherit HttpProtocolException base class. Implement response() method
to optionally return custom response to client."""
def response(self, request: HttpParser) -> Optional[bytes]:
return None # pragma: no cover
class HttpRequestRejected(HttpProtocolException):
"""Generic exception that can be used to reject the client requests.
Connections can either be dropped/closed or optionally an
HTTP status code can be returned."""
def __init__(self,
status_code: Optional[int] = None,
reason: Optional[bytes] = None,
headers: Optional[Dict[bytes, bytes]] = None,
body: Optional[bytes] = None):
self.status_code: Optional[int] = status_code
self.reason: Optional[bytes] = reason
self.headers: Optional[Dict[bytes, bytes]] = headers
self.body: Optional[bytes] = body
def response(self, _request: HttpParser) -> Optional[bytes]:
if self.status_code:
return build_http_response(
status_code=self.status_code,
reason=self.reason,
headers=self.headers,
body=self.body
)
return None
class ProxyConnectionFailed(HttpProtocolException):
"""Exception raised when HttpProxyPlugin is unable to establish connection to upstream server."""
RESPONSE_PKT = build_http_response(
httpStatusCodes.BAD_GATEWAY,
reason=b'Bad Gateway',
headers={
PROXY_AGENT_HEADER_KEY: PROXY_AGENT_HEADER_VALUE,
b'Connection': b'close'
},
body=b'Bad Gateway'
)
def __init__(self, host: str, port: int, reason: str):
self.host: str = host
self.port: int = port
self.reason: str = reason
def response(self, _request: HttpParser) -> bytes:
return self.RESPONSE_PKT
class ProxyAuthenticationFailed(HttpProtocolException):
"""Exception raised when Http Proxy auth is enabled and
incoming request doesn't present necessary credentials."""
RESPONSE_PKT = build_http_response(
httpStatusCodes.PROXY_AUTH_REQUIRED,
reason=b'Proxy Authentication Required',
headers={
PROXY_AGENT_HEADER_KEY: PROXY_AGENT_HEADER_VALUE,
b'Proxy-Authenticate': b'Basic',
b'Connection': b'close',
},
body=b'Proxy Authentication Required')
def response(self, _request: HttpParser) -> bytes:
return self.RESPONSE_PKT
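The new `headers` parameter on `HttpRequestRejected` lets plugins attach response headers to a rejection. A self-contained sketch of how the exception turns into raw bytes; note that `build_http_response` is simplified here (the real helper lives in `proxy/common/utils.py` and `response()` receives the request):

```python
from typing import Dict, Optional


def build_http_response(status_code: int,
                        reason: bytes = b'',
                        headers: Optional[Dict[bytes, bytes]] = None,
                        body: Optional[bytes] = None) -> bytes:
    # Simplified stand-in for proxy.common.utils.build_http_response.
    lines = [b'HTTP/1.1 %d %s' % (status_code, reason)]
    for key, value in (headers or {}).items():
        lines.append(key + b': ' + value)
    raw = b'\r\n'.join(lines) + b'\r\n\r\n'
    return raw + body if body else raw


class HttpRequestRejected(Exception):
    def __init__(self, status_code=None, reason=None, headers=None, body=None):
        self.status_code, self.reason = status_code, reason
        self.headers, self.body = headers, body

    def response(self) -> Optional[bytes]:
        # Only build a response when a status code was supplied;
        # otherwise the connection is simply dropped.
        if self.status_code:
            return build_http_response(
                self.status_code, self.reason, self.headers, self.body)
        return None


rejection = HttpRequestRejected(
    status_code=403, reason=b'Forbidden',
    headers={b'Connection': b'close'}, body=b'Blocked')
raw = rejection.response()
```

A plugin raises this from its request lifecycle hook; `HttpProtocolHandler` catches `HttpProtocolException`, queues the bytes from `response()`, and tears the connection down.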

proxy/http/handler.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import socket
import selectors
import ssl
import time
import contextlib
import errno
import logging
from abc import ABC, abstractmethod
from typing import Tuple, List, Union, Optional, Generator, Dict
from .parser import HttpParser, httpParserStates, httpParserTypes
from .exception import HttpProtocolException
from ..common.flags import Flags
from ..common.types import HasFileno
from ..core.threadless import ThreadlessWork
from ..core.event import EventQueue
from ..core.connection import TcpClientConnection
logger = logging.getLogger(__name__)
class HttpProtocolHandlerPlugin(ABC):
"""Base HttpProtocolHandler Plugin class.
NOTE: This is an internal plugin and in most cases only useful for core contributors.
If you are looking for proxy server plugins see `<proxy.HttpProxyBasePlugin>`.
Implements various lifecycle events for an accepted client connection.
Following events are of interest:
1. Client Connection Accepted
A new plugin instance is created per accepted client connection.
Add your logic within __init__ constructor for any per connection setup.
2. Client Request Chunk Received
on_client_data is called for every chunk of data sent by the client.
3. Client Request Complete
on_request_complete is called once client request has completed.
4. Server Response Chunk Received
on_response_chunk is called for every chunk received from the server.
5. Client Connection Closed
Add your logic within `on_client_connection_close` for any per connection teardown.
"""
def __init__(
self,
config: Flags,
client: TcpClientConnection,
request: HttpParser,
event_queue: EventQueue):
self.config: Flags = config
self.client: TcpClientConnection = client
self.request: HttpParser = request
self.event_queue = event_queue
super().__init__()
def name(self) -> str:
"""A unique name for your plugin.
Defaults to name of the class. This helps plugin developers to directly
access a specific plugin by its name."""
return self.__class__.__name__
@abstractmethod
def get_descriptors(
self) -> Tuple[List[socket.socket], List[socket.socket]]:
return [], [] # pragma: no cover
@abstractmethod
def write_to_descriptors(self, w: List[Union[int, HasFileno]]) -> bool:
pass # pragma: no cover
@abstractmethod
def read_from_descriptors(self, r: List[Union[int, HasFileno]]) -> bool:
pass # pragma: no cover
@abstractmethod
def on_client_data(self, raw: bytes) -> Optional[bytes]:
return raw # pragma: no cover
@abstractmethod
def on_request_complete(self) -> Union[socket.socket, bool]:
"""Called right after client request parser has reached COMPLETE state."""
pass # pragma: no cover
@abstractmethod
def on_response_chunk(self, chunk: bytes) -> bytes:
"""Handle data chunks as received from the server.
Return optionally modified chunk to return back to client."""
return chunk # pragma: no cover
@abstractmethod
def on_client_connection_close(self) -> None:
pass # pragma: no cover
class HttpProtocolHandler(ThreadlessWork):
"""HTTP, HTTPS, HTTP2, WebSockets protocol handler.
Accepts `Client` connection object and manages HttpProtocolHandlerPlugin invocations.
"""
def __init__(self, fileno: int, addr: Tuple[str, int],
flags: Optional[Flags] = None,
event_queue: Optional[EventQueue] = None,
uid: Optional[str] = None):
super().__init__(fileno, addr, flags, event_queue, uid)
self.start_time: float = time.time()
self.last_activity: float = self.start_time
self.request: HttpParser = HttpParser(httpParserTypes.REQUEST_PARSER)
self.response: HttpParser = HttpParser(httpParserTypes.RESPONSE_PARSER)
self.selector = selectors.DefaultSelector()
self.client: TcpClientConnection = TcpClientConnection(
self.fromfd(self.fileno), self.addr
)
self.plugins: Dict[str, HttpProtocolHandlerPlugin] = {}
def initialize(self) -> None:
"""Optionally upgrades connection to HTTPS, sets conn to non-blocking mode and initializes plugins."""
conn = self.optionally_wrap_socket(self.client.connection)
conn.setblocking(False)
if self.flags.encryption_enabled():
self.client = TcpClientConnection(conn=conn, addr=self.addr)
if b'HttpProtocolHandlerPlugin' in self.flags.plugins:
for klass in self.flags.plugins[b'HttpProtocolHandlerPlugin']:
instance = klass(
self.flags,
self.client,
self.request,
self.event_queue)
self.plugins[instance.name()] = instance
logger.debug('Handling connection %r' % self.client.connection)
def is_inactive(self) -> bool:
if not self.client.has_buffer() and \
self.connection_inactive_for() > self.flags.timeout:
return True
return False
def get_events(self) -> Dict[socket.socket, int]:
events: Dict[socket.socket, int] = {
self.client.connection: selectors.EVENT_READ
}
if self.client.has_buffer():
events[self.client.connection] |= selectors.EVENT_WRITE
# HttpProtocolHandlerPlugin.get_descriptors
for plugin in self.plugins.values():
plugin_read_desc, plugin_write_desc = plugin.get_descriptors()
for r in plugin_read_desc:
if r not in events:
events[r] = selectors.EVENT_READ
else:
events[r] |= selectors.EVENT_READ
for w in plugin_write_desc:
if w not in events:
events[w] = selectors.EVENT_WRITE
else:
events[w] |= selectors.EVENT_WRITE
return events
def handle_events(
self,
readables: List[Union[int, HasFileno]],
writables: List[Union[int, HasFileno]]) -> bool:
"""Returns True if proxy must teardown."""
# Flush buffer for ready to write sockets
teardown = self.handle_writables(writables)
if teardown:
return True
# Invoke plugin.write_to_descriptors
for plugin in self.plugins.values():
teardown = plugin.write_to_descriptors(writables)
if teardown:
return True
# Read from ready to read sockets
teardown = self.handle_readables(readables)
if teardown:
return True
# Invoke plugin.read_from_descriptors
for plugin in self.plugins.values():
teardown = plugin.read_from_descriptors(readables)
if teardown:
return True
return False
def shutdown(self) -> None:
try:
# Flush pending buffer if any
self.flush()
# Invoke plugin.on_client_connection_close
for plugin in self.plugins.values():
plugin.on_client_connection_close()
logger.debug(
'Closing client connection %r '
'at address %r with pending client buffer size %d bytes' %
(self.client.connection, self.client.addr, self.client.buffer_size()))
conn = self.client.connection
# Unwrap if wrapped before shutdown.
if self.flags.encryption_enabled() and \
isinstance(self.client.connection, ssl.SSLSocket):
conn = self.client.connection.unwrap()
conn.shutdown(socket.SHUT_WR)
logger.debug('Client connection shutdown successful')
except OSError:
pass
finally:
self.client.connection.close()
logger.debug('Client connection closed')
super().shutdown()
def fromfd(self, fileno: int) -> socket.socket:
conn = socket.fromfd(
fileno, family=socket.AF_INET if self.flags.hostname.version == 4 else socket.AF_INET6,
type=socket.SOCK_STREAM)
return conn
def optionally_wrap_socket(
self, conn: socket.socket) -> Union[ssl.SSLSocket, socket.socket]:
"""Attempts to wrap accepted client connection using provided certificates.
Shutdown and closes client connection upon error.
"""
if self.flags.encryption_enabled():
ctx = ssl.create_default_context(
ssl.Purpose.CLIENT_AUTH)
ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 | ssl.OP_NO_TLSv1_1
ctx.verify_mode = ssl.CERT_NONE
assert self.flags.keyfile and self.flags.certfile
ctx.load_cert_chain(
certfile=self.flags.certfile,
keyfile=self.flags.keyfile)
conn = ctx.wrap_socket(conn, server_side=True)
return conn
def connection_inactive_for(self) -> float:
return time.time() - self.last_activity
def flush(self) -> None:
if not self.client.has_buffer():
return
try:
self.selector.register(
self.client.connection,
selectors.EVENT_WRITE)
while self.client.has_buffer():
ev: List[Tuple[selectors.SelectorKey, int]
] = self.selector.select(timeout=1)
if len(ev) == 0:
continue
self.client.flush()
except BrokenPipeError:
pass
finally:
self.selector.unregister(self.client.connection)
def handle_writables(self, writables: List[Union[int, HasFileno]]) -> bool:
if self.client.buffer_size() > 0 and self.client.connection in writables:
logger.debug('Client is ready for writes, flushing buffer')
self.last_activity = time.time()
# Invoke plugin.on_response_chunk
chunk = self.client.buffer
for plugin in self.plugins.values():
chunk = plugin.on_response_chunk(chunk)
if chunk is None:
break
try:
self.client.flush()
except OSError:
logger.error('OSError when flushing buffer to client')
return True
except BrokenPipeError:
logger.error(
'BrokenPipeError when flushing buffer for client')
return True
return False
def handle_readables(self, readables: List[Union[int, HasFileno]]) -> bool:
if self.client.connection in readables:
logger.debug('Client is ready for reads, reading')
self.last_activity = time.time()
try:
client_data = self.client.recv(self.flags.client_recvbuf_size)
except ssl.SSLWantReadError: # Try again later
logger.warning(
'SSLWantReadError encountered while reading from client, will retry ...')
return False
except socket.error as e:
if e.errno == errno.ECONNRESET:
logger.warning('%r' % e)
else:
logger.exception(
'Exception while receiving from %s connection %r with reason %r' %
(self.client.tag, self.client.connection, e))
return True
if not client_data:
logger.debug('Client closed connection, tearing down...')
self.client.closed = True
return True
try:
# HttpProtocolHandlerPlugin.on_client_data
# Can raise HttpProtocolException to teardown the connection
plugin_index = 0
plugins = list(self.plugins.values())
while plugin_index < len(plugins) and client_data:
client_data = plugins[plugin_index].on_client_data(
client_data)
if client_data is None:
break
plugin_index += 1
# Don't parse the request any further once the 1st request has completed.
# This typically happens for pipelined requests.
# Plugins can utilize on_client_data for such cases and
# apply custom logic to handle request data sent after the 1st
# valid request.
if client_data and self.request.state != httpParserStates.COMPLETE:
# Parse http request
self.request.parse(client_data)
if self.request.state == httpParserStates.COMPLETE:
# Invoke plugin.on_request_complete
for plugin in self.plugins.values():
upgraded_sock = plugin.on_request_complete()
if isinstance(upgraded_sock, ssl.SSLSocket):
logger.debug(
'Updated client conn to %s', upgraded_sock)
self.client._conn = upgraded_sock
for plugin_ in self.plugins.values():
if plugin_ != plugin:
plugin_.client._conn = upgraded_sock
elif isinstance(upgraded_sock, bool) and upgraded_sock is True:
return True
except HttpProtocolException as e:
logger.debug(
'HttpProtocolException type raised')
response = e.response(self.request)
if response:
self.client.queue(response)
return True
return False
@contextlib.contextmanager
def selected_events(self) -> \
Generator[Tuple[List[Union[int, HasFileno]],
List[Union[int, HasFileno]]],
None, None]:
events = self.get_events()
for fd in events:
self.selector.register(fd, events[fd])
ev = self.selector.select(timeout=1)
readables = []
writables = []
for key, mask in ev:
if mask & selectors.EVENT_READ:
readables.append(key.fileobj)
if mask & selectors.EVENT_WRITE:
writables.append(key.fileobj)
yield (readables, writables)
for fd in events.keys():
self.selector.unregister(fd)
def run_once(self) -> bool:
with self.selected_events() as (readables, writables):
teardown = self.handle_events(readables, writables)
if teardown:
return True
return False
def run(self) -> None:
try:
self.initialize()
while True:
# Teardown if client buffer is empty and connection is inactive
if self.is_inactive():
logger.debug(
'Client buffer is empty and maximum inactivity has been reached '
'between client and server connection, tearing down...')
break
teardown = self.run_once()
if teardown:
break
except KeyboardInterrupt: # pragma: no cover
pass
except ssl.SSLError as e:
logger.exception('ssl.SSLError', exc_info=e)
except Exception as e:
logger.exception(
'Exception while handling connection %r' %
self.client.connection, exc_info=e)
finally:
self.shutdown()
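`selected_events()` above registers every descriptor the client and plugins expose, selects with a 1-second timeout, then splits the ready keys into readables and writables. That split can be sketched standalone with a socketpair (illustrative only; the real handler registers actual client and plugin descriptors):

```python
import selectors
import socket

# Stand-ins for a client connection: b writes, so a becomes readable.
a, b = socket.socketpair()
sel = selectors.DefaultSelector()
sel.register(a, selectors.EVENT_READ | selectors.EVENT_WRITE)
b.send(b'ping')

readables, writables = [], []
for key, mask in sel.select(timeout=1):
    # Mirror of the readables/writables split in selected_events().
    if mask & selectors.EVENT_READ:
        readables.append(key.fileobj)
    if mask & selectors.EVENT_WRITE:
        writables.append(key.fileobj)

sel.unregister(a)
a_ready_to_read = a in readables
a_ready_to_write = a in writables
a.close()
b.close()
```

Here `a` shows up in both lists: its peer has sent data (readable) and its own send buffer has room (writable), which is exactly the state that drives `handle_readables` and `handle_writables`.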

proxy/http/methods.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from typing import NamedTuple
HttpMethods = NamedTuple('HttpMethods', [
('GET', bytes),
('HEAD', bytes),
('POST', bytes),
('PUT', bytes),
('DELETE', bytes),
('CONNECT', bytes),
('OPTIONS', bytes),
('TRACE', bytes),
('PATCH', bytes),
])
httpMethods = HttpMethods(
b'GET',
b'HEAD',
b'POST',
b'PUT',
b'DELETE',
b'CONNECT',
b'OPTIONS',
b'TRACE',
b'PATCH',
)

proxy/http/parser.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from urllib import parse as urlparse
from typing import TypeVar, NamedTuple, Optional, Dict, Type, Tuple, List
from .methods import httpMethods
from .chunk_parser import ChunkParser, chunkParserStates
from ..common.constants import DEFAULT_DISABLE_HEADERS, COLON, CRLF, WHITESPACE, HTTP_1_1
from ..common.utils import build_http_request, find_http_line, text_
HttpParserStates = NamedTuple('HttpParserStates', [
('INITIALIZED', int),
('LINE_RCVD', int),
('RCVING_HEADERS', int),
('HEADERS_COMPLETE', int),
('RCVING_BODY', int),
('COMPLETE', int),
])
httpParserStates = HttpParserStates(1, 2, 3, 4, 5, 6)
HttpParserTypes = NamedTuple('HttpParserTypes', [
('REQUEST_PARSER', int),
('RESPONSE_PARSER', int),
])
httpParserTypes = HttpParserTypes(1, 2)
T = TypeVar('T', bound='HttpParser')
class HttpParser:
"""HTTP request/response parser."""
def __init__(self, parser_type: int) -> None:
self.type: int = parser_type
self.state: int = httpParserStates.INITIALIZED
# Raw bytes as passed to parse(raw) method and its total size
self.bytes: bytes = b''
self.total_size: int = 0
# Buffer to hold unprocessed bytes
self.buffer: bytes = b''
self.headers: Dict[bytes, Tuple[bytes, bytes]] = dict()
self.body: Optional[bytes] = None
self.method: Optional[bytes] = None
self.url: Optional[urlparse.SplitResultBytes] = None
self.code: Optional[bytes] = None
self.reason: Optional[bytes] = None
self.version: Optional[bytes] = None
self.chunk_parser: Optional[ChunkParser] = None
# These attributes provide a consistent developer API, because Python's
# urlparse.urlsplit behaves differently for incoming proxy requests and
# incoming web requests. The web request case is the broken one.
self.host: Optional[bytes] = None
self.port: Optional[int] = None
self.path: Optional[bytes] = None
@classmethod
def request(cls: Type[T], raw: bytes) -> T:
parser = cls(httpParserTypes.REQUEST_PARSER)
parser.parse(raw)
return parser
@classmethod
def response(cls: Type[T], raw: bytes) -> T:
parser = cls(httpParserTypes.RESPONSE_PARSER)
parser.parse(raw)
return parser
def header(self, key: bytes) -> bytes:
if key.lower() not in self.headers:
raise KeyError('%s not found in headers' % text_(key))
return self.headers[key.lower()][1]
def has_header(self, key: bytes) -> bool:
return key.lower() in self.headers
def add_header(self, key: bytes, value: bytes) -> None:
self.headers[key.lower()] = (key, value)
def add_headers(self, headers: List[Tuple[bytes, bytes]]) -> None:
for (key, value) in headers:
self.add_header(key, value)
def del_header(self, header: bytes) -> None:
if header.lower() in self.headers:
del self.headers[header.lower()]
def del_headers(self, headers: List[bytes]) -> None:
for key in headers:
self.del_header(key.lower())
def set_url(self, url: bytes) -> None:
self.url = urlparse.urlsplit(url)
self.set_line_attributes()
def set_line_attributes(self) -> None:
if self.type == httpParserTypes.REQUEST_PARSER:
if self.method == httpMethods.CONNECT and self.url:
u = urlparse.urlsplit(b'//' + self.url.path)
self.host, self.port = u.hostname, u.port
elif self.url:
self.host, self.port = self.url.hostname, self.url.port \
if self.url.port else 80
else:
raise KeyError('Invalid request\n%s' % self.bytes)
self.path = self.build_url()
def is_chunked_encoded(self) -> bool:
return b'transfer-encoding' in self.headers and \
self.headers[b'transfer-encoding'][1].lower() == b'chunked'
def parse(self, raw: bytes) -> None:
"""Parses Http request out of raw bytes.
Check HttpParser state after parse has successfully returned."""
self.bytes += raw
self.total_size += len(raw)
# Prepend past buffer
raw = self.buffer + raw
self.buffer = b''
more = len(raw) > 0
while more and self.state != httpParserStates.COMPLETE:
if self.state in (
httpParserStates.HEADERS_COMPLETE,
httpParserStates.RCVING_BODY):
if b'content-length' in self.headers:
self.state = httpParserStates.RCVING_BODY
if self.body is None:
self.body = b''
total_size = int(self.header(b'content-length'))
received_size = len(self.body)
self.body += raw[:total_size - received_size]
if self.body and \
len(self.body) == int(self.header(b'content-length')):
self.state = httpParserStates.COMPLETE
more, raw = len(raw) > 0, raw[total_size - received_size:]
elif self.is_chunked_encoded():
if not self.chunk_parser:
self.chunk_parser = ChunkParser()
raw = self.chunk_parser.parse(raw)
if self.chunk_parser.state == chunkParserStates.COMPLETE:
self.body = self.chunk_parser.body
self.state = httpParserStates.COMPLETE
more = False
else:
more, raw = self.process(raw)
self.buffer = raw
def process(self, raw: bytes) -> Tuple[bool, bytes]:
"""Returns False when no CRLF could be found in received bytes."""
line, raw = find_http_line(raw)
if line is None:
return False, raw
if self.state == httpParserStates.INITIALIZED:
self.process_line(line)
self.state = httpParserStates.LINE_RCVD
elif self.state in (httpParserStates.LINE_RCVD, httpParserStates.RCVING_HEADERS):
if self.state == httpParserStates.LINE_RCVD:
# LINE_RCVD state is equivalent to RCVING_HEADERS
self.state = httpParserStates.RCVING_HEADERS
if line.strip() == b'': # Blank line received.
self.state = httpParserStates.HEADERS_COMPLETE
else:
self.process_header(line)
# When connect request is received without a following host header
# See
# `TestHttpParser.test_connect_request_without_host_header_request_parse`
# for details
if self.state == httpParserStates.LINE_RCVD and \
self.type == httpParserTypes.RESPONSE_PARSER and \
raw == CRLF:
self.state = httpParserStates.COMPLETE
# When raw request has ended with \r\n\r\n and no more http headers are expected
# See `TestHttpParser.test_request_parse_without_content_length` and
# `TestHttpParser.test_response_parse_without_content_length` for details
elif self.state == httpParserStates.HEADERS_COMPLETE and \
self.type == httpParserTypes.REQUEST_PARSER and \
self.method != httpMethods.POST and \
self.bytes.endswith(CRLF * 2):
self.state = httpParserStates.COMPLETE
elif self.state == httpParserStates.HEADERS_COMPLETE and \
self.type == httpParserTypes.REQUEST_PARSER and \
self.method == httpMethods.POST and \
not self.is_chunked_encoded() and \
(b'content-length' not in self.headers or
(b'content-length' in self.headers and
int(self.headers[b'content-length'][1]) == 0)) and \
self.bytes.endswith(CRLF * 2):
self.state = httpParserStates.COMPLETE
return len(raw) > 0, raw
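`process()` relies on the project's `find_http_line` helper. Judging from the usage above, its assumed contract is to split one CRLF-terminated line off the buffer, returning `None` for the line when no full line has arrived yet; a minimal sketch under that assumption:

```python
from typing import Optional, Tuple

CRLF = b'\r\n'

def find_http_line(raw: bytes) -> Tuple[Optional[bytes], bytes]:
    """Split one CRLF-terminated line off raw; (None, raw) if incomplete."""
    pos = raw.find(CRLF)
    if pos == -1:
        return None, raw
    return raw[:pos], raw[pos + len(CRLF):]
```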
def process_line(self, raw: bytes) -> None:
line = raw.split(WHITESPACE)
if self.type == httpParserTypes.REQUEST_PARSER:
self.method = line[0].upper()
self.set_url(line[1])
self.version = line[2]
else:
self.version = line[0]
self.code = line[1]
self.reason = WHITESPACE.join(line[2:])
def process_header(self, raw: bytes) -> None:
parts = raw.split(COLON)
key = parts[0].strip()
value = COLON.join(parts[1:]).strip()
self.add_headers([(key, value)])
def build_url(self) -> bytes:
if not self.url:
return b'/None'
url = self.url.path
if url == b'':
url = b'/'
if self.url.query != b'':
url += b'?' + self.url.query
if self.url.fragment != b'':
url += b'#' + self.url.fragment
return url
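The same path + query + fragment reconstruction can be sketched standalone with the standard library's `urlsplit`, which also accepts bytes:

```python
from urllib.parse import urlsplit

def rebuild_path(url: bytes) -> bytes:
    # Reassemble path, query and fragment; default empty path to b'/'.
    u = urlsplit(url)
    path = u.path or b'/'
    if u.query:
        path += b'?' + u.query
    if u.fragment:
        path += b'#' + u.fragment
    return path
```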
def build(self, disable_headers: Optional[List[bytes]] = None) -> bytes:
assert self.method and self.version and self.path
if disable_headers is None:
disable_headers = DEFAULT_DISABLE_HEADERS
body: Optional[bytes] = ChunkParser.to_chunks(self.body) \
if self.is_chunked_encoded() and self.body else \
self.body
return build_http_request(
self.method, self.path, self.version,
headers={} if not self.headers else {self.headers[k][0]: self.headers[k][1] for k in self.headers if
k.lower() not in disable_headers},
body=body
)
def has_upstream_server(self) -> bool:
"""Host field SHOULD be None for incoming local WebServer requests."""
return self.host is not None
def is_http_1_1_keep_alive(self) -> bool:
return self.version == HTTP_1_1 and \
(not self.has_header(b'Connection') or
self.header(b'Connection').lower() == b'keep-alive')
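The keep-alive decision above can be condensed into a standalone predicate (a sketch, not the project's API): HTTP/1.1 connections are persistent by default unless the client's `Connection` header says otherwise.

```python
from typing import Optional

HTTP_1_1 = b'HTTP/1.1'

def is_keep_alive(version: bytes, connection: Optional[bytes] = None) -> bool:
    # HTTP/1.1 defaults to keep-alive; an explicit Connection header
    # must say keep-alive (case-insensitively) to preserve it.
    return version == HTTP_1_1 and (
        connection is None or connection.lower() == b'keep-alive')
```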

proxy/http/proxy.py (new file, 458 lines)
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import threading
import subprocess
import os
import ssl
import socket
import time
import errno
import logging
from abc import ABC, abstractmethod
from typing import Optional, List, Union, Dict, cast, Any, Tuple
from .handler import HttpProtocolHandlerPlugin
from .exception import HttpProtocolException, ProxyConnectionFailed, ProxyAuthenticationFailed
from .codes import httpStatusCodes
from .parser import HttpParser, httpParserStates, httpParserTypes
from .methods import httpMethods
from ..common.types import HasFileno
from ..common.flags import Flags
from ..common.constants import PROXY_AGENT_HEADER_VALUE
from ..common.utils import build_http_response, text_
from ..core.connection import TcpClientConnection, TcpServerConnection, TcpConnectionUninitializedException
logger = logging.getLogger(__name__)
class HttpProxyBasePlugin(ABC):
"""Base class for HttpProxyPlugin plugins.
Implement the various lifecycle event methods to customize behavior."""
def __init__(
self,
config: Flags,
client: TcpClientConnection):
self.config = config # pragma: no cover
self.client = client # pragma: no cover
def name(self) -> str:
"""A unique name for your plugin.
Defaults to name of the class. This helps plugin developers to directly
access a specific plugin by its name."""
return self.__class__.__name__ # pragma: no cover
@abstractmethod
def before_upstream_connection(
self, request: HttpParser) -> Optional[HttpParser]:
"""Handler called just before Proxy upstream connection is established.
Return optionally modified request object.
Raise HttpRequestRejected or HttpProtocolException directly to drop the connection."""
return request # pragma: no cover
@abstractmethod
def handle_client_request(
self, request: HttpParser) -> Optional[HttpParser]:
"""Handler called before dispatching client request to upstream.
Note: For pipelined (keep-alive) connections, this handler can be
called multiple times, for each request sent to upstream.
Note: If TLS interception is enabled, this handler can
be called multiple times if the client exchanges multiple
requests over the same TLS session.
Return optionally modified request object to dispatch to upstream.
Return None to drop the request data, e.g. in case a response has already been queued.
Raise HttpRequestRejected or HttpProtocolException directly to
teardown the connection with client.
"""
return request # pragma: no cover
@abstractmethod
def handle_upstream_chunk(self, chunk: bytes) -> bytes:
"""Handler called right after receiving raw response from upstream server.
For HTTPS connections, chunk will be encrypted unless
TLS interception is also enabled."""
return chunk # pragma: no cover
@abstractmethod
def on_upstream_connection_close(self) -> None:
"""Handler called right after upstream connection has been closed."""
pass # pragma: no cover
class HttpProxyPlugin(HttpProtocolHandlerPlugin):
"""HttpProtocolHandler plugin which implements HttpProxy specifications."""
PROXY_TUNNEL_ESTABLISHED_RESPONSE_PKT = build_http_response(
httpStatusCodes.OK,
reason=b'Connection established'
)
# Used to synchronize with other HttpProxyPlugin instances while
# generating certificates
lock = threading.Lock()
def __init__(
self,
*args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.start_time: float = time.time()
self.server: Optional[TcpServerConnection] = None
self.response: HttpParser = HttpParser(httpParserTypes.RESPONSE_PARSER)
self.pipeline_request: Optional[HttpParser] = None
self.pipeline_response: Optional[HttpParser] = None
self.plugins: Dict[str, HttpProxyBasePlugin] = {}
if b'HttpProxyBasePlugin' in self.config.plugins:
for klass in self.config.plugins[b'HttpProxyBasePlugin']:
instance = klass(self.config, self.client)
self.plugins[instance.name()] = instance
def get_descriptors(
self) -> Tuple[List[socket.socket], List[socket.socket]]:
if not self.request.has_upstream_server():
return [], []
r: List[socket.socket] = []
w: List[socket.socket] = []
if self.server and not self.server.closed and self.server.connection:
r.append(self.server.connection)
if self.server and not self.server.closed and \
self.server.has_buffer() and self.server.connection:
w.append(self.server.connection)
return r, w
def write_to_descriptors(self, w: List[Union[int, HasFileno]]) -> bool:
if self.request.has_upstream_server() and \
self.server and not self.server.closed and \
self.server.has_buffer() and \
self.server.connection in w:
logger.debug('Server is write ready, flushing buffer')
try:
self.server.flush()
except OSError:
logger.error('OSError when flushing buffer to server')
return True
except BrokenPipeError:
logger.error(
'BrokenPipeError when flushing buffer for server')
return True
return False
def read_from_descriptors(self, r: List[Union[int, HasFileno]]) -> bool:
if self.request.has_upstream_server() and \
self.server and not self.server.closed and \
self.server.connection in r:
logger.debug('Server is ready for reads, reading...')
try:
raw = self.server.recv(self.config.server_recvbuf_size)
except ssl.SSLWantReadError: # Try again later
# logger.warning('SSLWantReadError encountered while reading from server, will retry ...')
return False
except socket.error as e:
if e.errno == errno.ECONNRESET:
logger.warning('Connection reset by upstream: %r' % e)
else:
logger.exception(
'Exception while receiving from %s connection %r with reason %r' %
(self.server.tag, self.server.connection, e))
return True
if not raw:
logger.debug('Server closed connection, tearing down...')
return True
for plugin in self.plugins.values():
raw = plugin.handle_upstream_chunk(raw)
# Parse incoming response packets, but only
# for non-CONNECT requests
if self.request.method != httpMethods.CONNECT:
# See https://github.com/abhinavsingh/proxy.py/issues/127 for why
# currently response parsing is disabled when TLS interception is enabled.
#
# or self.config.tls_interception_enabled():
if self.response.state == httpParserStates.COMPLETE:
if self.pipeline_response is None:
self.pipeline_response = HttpParser(
httpParserTypes.RESPONSE_PARSER)
self.pipeline_response.parse(raw)
if self.pipeline_response.state == httpParserStates.COMPLETE:
self.pipeline_response = None
else:
self.response.parse(raw)
else:
self.response.total_size += len(raw)
# queue raw data for client
self.client.queue(raw)
return False
def access_log(self) -> None:
server_host, server_port = self.server.addr if self.server else (
None, None)
connection_time_ms = (time.time() - self.start_time) * 1000
if self.request.method == b'CONNECT':
logger.info(
'%s:%s - %s %s:%s - %s bytes - %.2f ms' %
(self.client.addr[0],
self.client.addr[1],
text_(self.request.method),
text_(server_host),
text_(server_port),
self.response.total_size,
connection_time_ms))
elif self.request.method:
logger.info(
'%s:%s - %s %s:%s%s - %s %s - %s bytes - %.2f ms' %
(self.client.addr[0], self.client.addr[1],
text_(self.request.method),
text_(server_host), server_port,
text_(self.request.path),
text_(self.response.code),
text_(self.response.reason),
self.response.total_size,
connection_time_ms))
def on_client_connection_close(self) -> None:
if not self.request.has_upstream_server():
return
self.access_log()
# If server was never initialized, return
if self.server is None:
return
# Note: the server instance was initialized,
# but its connection may never have been established.
# Invoke plugin.on_upstream_connection_close
for plugin in self.plugins.values():
plugin.on_upstream_connection_close()
try:
try:
self.server.connection.shutdown(socket.SHUT_WR)
except OSError:
pass
finally:
# TODO: Unwrap if wrapped before close?
self.server.connection.close()
except TcpConnectionUninitializedException:
pass
finally:
logger.debug(
'Closed server connection with pending server buffer size %d bytes' %
self.server.buffer_size())
def on_response_chunk(self, chunk: bytes) -> bytes:
# TODO: Allow to output multiple access_log lines
# for each request over a pipelined HTTP connection (not for HTTPS).
# However, this must also be accompanied by resetting both request
# and response objects.
#
# if not self.request.method == httpMethods.CONNECT and \
# self.response.state == httpParserStates.COMPLETE:
# self.access_log()
return chunk
def on_client_data(self, raw: bytes) -> Optional[bytes]:
if not self.request.has_upstream_server():
return raw
if self.server and not self.server.closed:
if self.request.state == httpParserStates.COMPLETE and (
self.request.method != httpMethods.CONNECT or
self.config.tls_interception_enabled()):
if self.pipeline_request is None:
self.pipeline_request = HttpParser(
httpParserTypes.REQUEST_PARSER)
self.pipeline_request.parse(raw)
if self.pipeline_request.state == httpParserStates.COMPLETE:
for plugin in self.plugins.values():
assert self.pipeline_request is not None
r = plugin.handle_client_request(self.pipeline_request)
if r is None:
return None
self.pipeline_request = r
assert self.pipeline_request is not None
self.server.queue(self.pipeline_request.build())
self.pipeline_request = None
else:
self.server.queue(raw)
return None
else:
return raw
@staticmethod
def generated_cert_file_path(ca_cert_dir: str, host: str) -> str:
return os.path.join(ca_cert_dir, '%s.pem' % host)
def generate_upstream_certificate(
self, _certificate: Optional[Dict[str, Any]]) -> str:
if not (self.config.ca_cert_dir and self.config.ca_signing_key_file and
self.config.ca_cert_file and self.config.ca_key_file):
raise HttpProtocolException(
f'For certificate generation all the following flags are mandatory: '
f'--ca-cert-file:{ self.config.ca_cert_file }, '
f'--ca-key-file:{ self.config.ca_key_file }, '
f'--ca-signing-key-file:{ self.config.ca_signing_key_file }')
cert_file_path = HttpProxyPlugin.generated_cert_file_path(
self.config.ca_cert_dir, text_(self.request.host))
with self.lock:
if not os.path.isfile(cert_file_path):
logger.debug('Generating certificates %s', cert_file_path)
# TODO: Parse subject from certificate
# Currently we only set CN= field for generated certificates.
gen_cert = subprocess.Popen(
['openssl', 'req', '-new', '-key', self.config.ca_signing_key_file, '-subj',
f'/C=/ST=/L=/O=/OU=/CN={ text_(self.request.host) }'],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
sign_cert = subprocess.Popen(
['openssl', 'x509', '-req', '-days', '365', '-CA', self.config.ca_cert_file, '-CAkey',
self.config.ca_key_file, '-set_serial', str(int(time.time())), '-out', cert_file_path],
stdin=gen_cert.stdout,
stderr=subprocess.PIPE)
# TODO: Ensure sign_cert success.
sign_cert.communicate(timeout=10)
return cert_file_path
def wrap_server(self) -> None:
assert self.server is not None
assert isinstance(self.server.connection, socket.socket)
ctx = ssl.create_default_context(
ssl.Purpose.SERVER_AUTH)
ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 | ssl.OP_NO_TLSv1_1
self.server.connection.setblocking(True)
self.server._conn = ctx.wrap_socket(
self.server.connection,
server_hostname=text_(self.request.host))
self.server.connection.setblocking(False)
def wrap_client(self) -> None:
assert self.server is not None
assert isinstance(self.server.connection, ssl.SSLSocket)
generated_cert = self.generate_upstream_certificate(
cast(Dict[str, Any], self.server.connection.getpeercert()))
self.client.connection.setblocking(True)
self.client.flush()
self.client._conn = ssl.wrap_socket(
self.client.connection,
server_side=True,
keyfile=self.config.ca_signing_key_file,
certfile=generated_cert)
self.client.connection.setblocking(False)
logger.debug(
'TLS interception using %s', generated_cert)
def on_request_complete(self) -> Union[socket.socket, bool]:
if not self.request.has_upstream_server():
return False
self.authenticate()
# Note: can raise HttpRequestRejected exception
# Invoke plugin.before_upstream_connection
do_connect = True
for plugin in self.plugins.values():
r = plugin.before_upstream_connection(self.request)
if r is None:
do_connect = False
break
self.request = r
if do_connect:
self.connect_upstream()
for plugin in self.plugins.values():
assert self.request is not None
r = plugin.handle_client_request(self.request)
if r is not None:
self.request = r
else:
return False
if self.request.method == httpMethods.CONNECT:
self.client.queue(
HttpProxyPlugin.PROXY_TUNNEL_ESTABLISHED_RESPONSE_PKT)
# If interception is enabled
if self.config.tls_interception_enabled():
# Perform SSL/TLS handshake with upstream
self.wrap_server()
# Generate certificate and perform handshake with client
try:
# wrap_client also flushes client data before wrapping
# sending to client can raise, handle expected exceptions
self.wrap_client()
except OSError:
logger.error('OSError when wrapping client')
return True
except BrokenPipeError:
logger.error(
'BrokenPipeError when wrapping client')
return True
# Update all plugin connection reference
for plugin in self.plugins.values():
plugin.client._conn = self.client.connection
return self.client.connection
elif self.server:
# - proxy-connection header is a mistake, it doesn't seem to be
# officially documented in any specification, drop it.
# - proxy-authorization is of no use for upstream, remove it.
self.request.del_headers(
[b'proxy-authorization', b'proxy-connection'])
# - For HTTP/1.0, connection header defaults to close
# - For HTTP/1.1, connection header defaults to keep-alive
# Respect headers sent by client instead of manipulating
# Connection or Keep-Alive header. However, note that per
# https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Connection
# connection headers are meant for communication between client and
# first intercepting proxy.
self.request.add_headers(
[(b'Via', b'1.1 %s' % PROXY_AGENT_HEADER_VALUE)])
# Strip headers listed in self.config.disable_headers before dispatching to upstream
self.server.queue(
self.request.build(
disable_headers=self.config.disable_headers))
return False
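`PROXY_TUNNEL_ESTABLISHED_RESPONSE_PKT` is built with the project's `build_http_response` helper; the raw bytes queued for a CONNECT client amount to a status line followed by an empty header block. A hedged sketch of an equivalent packet:

```python
CRLF = b'\r\n'

def tunnel_established_pkt() -> bytes:
    # Status line plus blank line; no headers or body are required
    # before the proxy starts blindly relaying tunnel bytes.
    return b'HTTP/1.1 200 Connection established' + CRLF * 2
```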
def authenticate(self) -> None:
if self.config.auth_code:
if b'proxy-authorization' not in self.request.headers or \
self.request.headers[b'proxy-authorization'][1] != self.config.auth_code:
raise ProxyAuthenticationFailed()
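`authenticate()` compares the `Proxy-Authorization` header value against `config.auth_code`. Assuming `auth_code` stores `base64(user:pass)` as in HTTP basic auth (an assumption about the Flags layout, not confirmed by this excerpt), a sketch of producing a matching credential:

```python
import base64

def basic_auth_code(username: bytes, password: bytes) -> bytes:
    # Base64-encode "user:pass", the value carried by a
    # "Proxy-Authorization: Basic <code>" header.
    return base64.b64encode(username + b':' + password)
```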
def connect_upstream(self) -> None:
host, port = self.request.host, self.request.port
if host and port:
self.server = TcpServerConnection(text_(host), port)
try:
logger.debug(
'Connecting to upstream %s:%s' %
(text_(host), port))
self.server.connect()
self.server.connection.setblocking(False)
logger.debug(
'Connected to upstream %s:%s' %
(text_(host), port))
except Exception as e: # TimeoutError, socket.gaierror
self.server.closed = True
raise ProxyConnectionFailed(text_(host), port, repr(e)) from e
else:
logger.exception('Both host and port must exist')
raise HttpProtocolException()

proxy/http/server.py (new file, 309 lines)
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import time
import logging
import os
import mimetypes
import socket
from abc import ABC, abstractmethod
from typing import List, Tuple, Optional, NamedTuple, Dict, Union, Any
from .exception import HttpProtocolException
from .websocket import WebsocketFrame, websocketOpcodes
from .codes import httpStatusCodes
from .parser import HttpParser, httpParserStates, httpParserTypes
from .handler import HttpProtocolHandlerPlugin
from ..common.utils import bytes_, text_, build_http_response, build_websocket_handshake_response
from ..common.flags import Flags
from ..common.constants import PROXY_AGENT_HEADER_VALUE
from ..common.types import HasFileno
from ..core.connection import TcpClientConnection
from ..core.event import EventQueue
logger = logging.getLogger(__name__)
HttpProtocolTypes = NamedTuple('HttpProtocolTypes', [
('HTTP', int),
('HTTPS', int),
('WEBSOCKET', int),
])
httpProtocolTypes = HttpProtocolTypes(1, 2, 3)
class HttpWebServerBasePlugin(ABC):
"""Web Server Plugin for routing of requests."""
def __init__(
self,
flags: Flags,
client: TcpClientConnection,
event_queue: EventQueue):
self.flags = flags
self.client = client
self.event_queue = event_queue
@abstractmethod
def routes(self) -> List[Tuple[int, bytes]]:
"""Return a list of (protocol, path) tuples that this plugin handles."""
raise NotImplementedError() # pragma: no cover
@abstractmethod
def handle_request(self, request: HttpParser) -> None:
"""Handle the request and serve response."""
raise NotImplementedError() # pragma: no cover
@abstractmethod
def on_websocket_open(self) -> None:
"""Called when websocket handshake has finished."""
raise NotImplementedError() # pragma: no cover
@abstractmethod
def on_websocket_message(self, frame: WebsocketFrame) -> None:
"""Handle websocket frame."""
raise NotImplementedError() # pragma: no cover
@abstractmethod
def on_websocket_close(self) -> None:
"""Called when websocket connection has been closed."""
raise NotImplementedError() # pragma: no cover
class HttpWebServerPacFilePlugin(HttpWebServerBasePlugin):
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.pac_file_response: Optional[bytes] = None
self.cache_pac_file_response()
def cache_pac_file_response(self) -> None:
if self.flags.pac_file:
try:
with open(self.flags.pac_file, 'rb') as f:
content = f.read()
except IOError:
content = bytes_(self.flags.pac_file)
self.pac_file_response = build_http_response(
200, reason=b'OK', headers={
b'Content-Type': b'application/x-ns-proxy-autoconfig',
}, body=content
)
def routes(self) -> List[Tuple[int, bytes]]:
if self.flags.pac_file_url_path:
return [
(httpProtocolTypes.HTTP, bytes_(self.flags.pac_file_url_path)),
(httpProtocolTypes.HTTPS, bytes_(self.flags.pac_file_url_path)),
]
return [] # pragma: no cover
def handle_request(self, request: HttpParser) -> None:
if self.flags.pac_file and self.pac_file_response:
self.client.queue(self.pac_file_response)
def on_websocket_open(self) -> None:
pass # pragma: no cover
def on_websocket_message(self, frame: WebsocketFrame) -> None:
pass # pragma: no cover
def on_websocket_close(self) -> None:
pass # pragma: no cover
class HttpWebServerPlugin(HttpProtocolHandlerPlugin):
"""HttpProtocolHandler plugin which handles incoming requests to local web server."""
DEFAULT_404_RESPONSE = build_http_response(
httpStatusCodes.NOT_FOUND,
reason=b'NOT FOUND',
headers={b'Server': PROXY_AGENT_HEADER_VALUE,
b'Connection': b'close'}
)
DEFAULT_501_RESPONSE = build_http_response(
httpStatusCodes.NOT_IMPLEMENTED,
reason=b'NOT IMPLEMENTED',
headers={b'Server': PROXY_AGENT_HEADER_VALUE,
b'Connection': b'close'}
)
def __init__(
self,
*args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.start_time: float = time.time()
self.pipeline_request: Optional[HttpParser] = None
self.switched_protocol: Optional[int] = None
self.routes: Dict[int, Dict[bytes, HttpWebServerBasePlugin]] = {
httpProtocolTypes.HTTP: {},
httpProtocolTypes.HTTPS: {},
httpProtocolTypes.WEBSOCKET: {},
}
self.route: Optional[HttpWebServerBasePlugin] = None
if b'HttpWebServerBasePlugin' in self.config.plugins:
for klass in self.config.plugins[b'HttpWebServerBasePlugin']:
instance = klass(self.config, self.client, self.event_queue)
for (protocol, path) in instance.routes():
self.routes[protocol][path] = instance
@staticmethod
def read_and_build_static_file_response(path: str) -> bytes:
with open(path, 'rb') as f:
content = f.read()
content_type = mimetypes.guess_type(path)[0]
if content_type is None:
content_type = 'text/plain'
return build_http_response(
httpStatusCodes.OK,
reason=b'OK',
headers={
b'Content-Type': bytes_(content_type),
b'Connection': b'close',
},
body=content)
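The content-type fallback above can be exercised on its own with the standard `mimetypes` module:

```python
import mimetypes

def content_type_for(path: str) -> str:
    # guess_type returns (type, encoding); fall back to text/plain
    # for unknown extensions, mirroring the logic above.
    return mimetypes.guess_type(path)[0] or 'text/plain'
```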
def serve_file_or_404(self, path: str) -> bool:
"""Reads and serves a file from disk.
Queues 404 Not Found for IOError.
Shouldn't this be a server error?
"""
try:
self.client.queue(
self.read_and_build_static_file_response(path))
except IOError:
self.client.queue(self.DEFAULT_404_RESPONSE)
return True
def try_upgrade(self) -> bool:
if self.request.has_header(b'connection') and \
self.request.header(b'connection').lower() == b'upgrade':
if self.request.has_header(b'upgrade') and \
self.request.header(b'upgrade').lower() == b'websocket':
self.client.queue(
build_websocket_handshake_response(
WebsocketFrame.key_to_accept(
self.request.header(b'Sec-WebSocket-Key'))))
self.switched_protocol = httpProtocolTypes.WEBSOCKET
else:
self.client.queue(self.DEFAULT_501_RESPONSE)
return True
return False
def on_request_complete(self) -> Union[socket.socket, bool]:
if self.request.has_upstream_server():
return False
# If a websocket route exists for the path, try upgrade
if self.request.path in self.routes[httpProtocolTypes.WEBSOCKET]:
self.route = self.routes[httpProtocolTypes.WEBSOCKET][self.request.path]
# Connection upgrade
teardown = self.try_upgrade()
if teardown:
return True
# For upgraded connections, nothing more to do
if self.switched_protocol:
# Invoke plugin.on_websocket_open
self.route.on_websocket_open()
return False
# Routing for Http(s) requests
protocol = httpProtocolTypes.HTTPS \
if self.config.encryption_enabled() else \
httpProtocolTypes.HTTP
for r in self.routes[protocol]:
if r == self.request.path:
self.route = self.routes[protocol][r]
self.route.handle_request(self.request)
return False
# No-route found, try static serving if enabled
if self.config.enable_static_server:
path = text_(self.request.path).split('?')[0]
if os.path.isfile(self.config.static_server_dir + path):
return self.serve_file_or_404(
self.config.static_server_dir + path)
# Catch all unhandled web server requests, return 404
self.client.queue(self.DEFAULT_404_RESPONSE)
return True
def write_to_descriptors(self, w: List[Union[int, HasFileno]]) -> bool:
# Web server plugin has no upstream descriptors to flush
return False
def read_from_descriptors(self, r: List[Union[int, HasFileno]]) -> bool:
# Web server plugin has no upstream descriptors to read from
return False
def on_client_data(self, raw: bytes) -> Optional[bytes]:
if self.switched_protocol == httpProtocolTypes.WEBSOCKET:
remaining = raw
frame = WebsocketFrame()
while remaining != b'':
# TODO: Teardown if invalid protocol exception
remaining = frame.parse(remaining)
for r in self.routes[httpProtocolTypes.WEBSOCKET]:
if r == self.request.path:
route = self.routes[httpProtocolTypes.WEBSOCKET][r]
if frame.opcode == websocketOpcodes.CONNECTION_CLOSE:
logger.warning(
'Client sent connection close packet')
raise HttpProtocolException()
else:
route.on_websocket_message(frame)
frame.reset()
return None
# If the first valid request has completed and it's an HTTP/1.1 keep-alive,
# and only if we have a route, parse pipelined requests
elif self.request.state == httpParserStates.COMPLETE and \
self.request.is_http_1_1_keep_alive() and \
self.route is not None:
if self.pipeline_request is None:
self.pipeline_request = HttpParser(
httpParserTypes.REQUEST_PARSER)
self.pipeline_request.parse(raw)
if self.pipeline_request.state == httpParserStates.COMPLETE:
self.route.handle_request(self.pipeline_request)
if not self.pipeline_request.is_http_1_1_keep_alive():
logger.error(
'Pipelined request is not keep-alive, will teardown request...')
raise HttpProtocolException()
self.pipeline_request = None
return raw
def on_response_chunk(self, chunk: bytes) -> bytes:
return chunk
def on_client_connection_close(self) -> None:
if self.request.has_upstream_server():
return
if self.switched_protocol:
# Invoke plugin.on_websocket_close
for r in self.routes[httpProtocolTypes.WEBSOCKET]:
if r == self.request.path:
self.routes[httpProtocolTypes.WEBSOCKET][r].on_websocket_close()
self.access_log()
def access_log(self) -> None:
logger.info(
'%s:%s - %s %s - %.2f ms' %
(self.client.addr[0],
self.client.addr[1],
text_(self.request.method),
text_(self.request.path),
(time.time() - self.start_time) * 1000))
def get_descriptors(
self) -> Tuple[List[socket.socket], List[socket.socket]]:
return [], []

proxy/http/websocket.py (new file, 265 lines)
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import hashlib
import base64
import selectors
import struct
import socket
import secrets
import ssl
import ipaddress
import logging
import io
from typing import TypeVar, Type, Optional, NamedTuple, Union, Callable
from .parser import httpParserTypes, HttpParser
from ..common.constants import DEFAULT_BUFFER_SIZE
from ..common.utils import new_socket_connection, build_websocket_handshake_request
from ..core.connection import tcpConnectionTypes, TcpConnection
WebsocketOpcodes = NamedTuple('WebsocketOpcodes', [
('CONTINUATION_FRAME', int),
('TEXT_FRAME', int),
('BINARY_FRAME', int),
('CONNECTION_CLOSE', int),
('PING', int),
('PONG', int),
])
websocketOpcodes = WebsocketOpcodes(0x0, 0x1, 0x2, 0x8, 0x9, 0xA)
V = TypeVar('V', bound='WebsocketFrame')
logger = logging.getLogger(__name__)
class WebsocketFrame:
"""Websocket frames parser and constructor."""
GUID = b'258EAFA5-E914-47DA-95CA-C5AB0DC85B11'
def __init__(self) -> None:
self.fin: bool = False
self.rsv1: bool = False
self.rsv2: bool = False
self.rsv3: bool = False
self.opcode: int = 0
self.masked: bool = False
self.payload_length: Optional[int] = None
self.mask: Optional[bytes] = None
self.data: Optional[bytes] = None
@classmethod
def text(cls: Type[V], data: bytes) -> bytes:
frame = cls()
frame.fin = True
frame.opcode = websocketOpcodes.TEXT_FRAME
frame.data = data
return frame.build()
def reset(self) -> None:
self.fin = False
self.rsv1 = False
self.rsv2 = False
self.rsv3 = False
self.opcode = 0
self.masked = False
self.payload_length = None
self.mask = None
self.data = None
def parse_fin_and_rsv(self, byte: int) -> None:
self.fin = bool(byte & 1 << 7)
self.rsv1 = bool(byte & 1 << 6)
self.rsv2 = bool(byte & 1 << 5)
self.rsv3 = bool(byte & 1 << 4)
self.opcode = byte & 0b00001111
def parse_mask_and_payload(self, byte: int) -> None:
self.masked = bool(byte & 0b10000000)
self.payload_length = byte & 0b01111111
def build(self) -> bytes:
if self.payload_length is None and self.data:
self.payload_length = len(self.data)
raw = io.BytesIO()
raw.write(
struct.pack(
'!B',
(1 << 7 if self.fin else 0) |
(1 << 6 if self.rsv1 else 0) |
(1 << 5 if self.rsv2 else 0) |
(1 << 4 if self.rsv3 else 0) |
self.opcode
))
assert self.payload_length is not None
if self.payload_length < 126:
raw.write(
struct.pack(
'!B',
(1 << 7 if self.masked else 0) | self.payload_length
)
)
elif self.payload_length < 1 << 16:
raw.write(
struct.pack(
'!BH',
(1 << 7 if self.masked else 0) | 126,
self.payload_length
)
)
elif self.payload_length < 1 << 64:
raw.write(
struct.pack(
'!BQ',
(1 << 7 if self.masked else 0) | 127,
self.payload_length
)
)
else:
raise ValueError(f'Invalid payload_length { self.payload_length }, '
f'maximum allowed { 1 << 64 }')
if self.masked and self.data:
mask = secrets.token_bytes(4) if self.mask is None else self.mask
raw.write(mask)
raw.write(self.apply_mask(self.data, mask))
elif self.data:
raw.write(self.data)
return raw.getvalue()
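For the common case of a short unmasked server-to-client text frame, the header produced by `build()` collapses to two bytes: FIN plus opcode, then the 7-bit payload length. A minimal sketch under that assumption:

```python
import struct

def small_text_frame(data: bytes) -> bytes:
    # FIN=1, RSV1-3=0, opcode=0x1 (text); unmasked payload < 126 bytes,
    # so the 7-bit length field holds the size directly.
    assert len(data) < 126
    return struct.pack('!BB', 0x80 | 0x1, len(data)) + data
```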
def parse(self, raw: bytes) -> bytes:
cur = 0
self.parse_fin_and_rsv(raw[cur])
cur += 1
self.parse_mask_and_payload(raw[cur])
cur += 1
if self.payload_length == 126:
data = raw[cur: cur + 2]
self.payload_length, = struct.unpack('!H', data)
cur += 2
elif self.payload_length == 127:
data = raw[cur: cur + 8]
self.payload_length, = struct.unpack('!Q', data)
cur += 8
if self.masked:
self.mask = raw[cur: cur + 4]
cur += 4
assert self.payload_length
self.data = raw[cur: cur + self.payload_length]
cur += self.payload_length
if self.masked:
assert self.mask is not None
self.data = self.apply_mask(self.data, self.mask)
return raw[cur:]
@staticmethod
def apply_mask(data: bytes, mask: bytes) -> bytes:
raw = bytearray(data)
for i in range(len(raw)):
raw[i] = raw[i] ^ mask[i % 4]
return bytes(raw)
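Masking is a self-inverse XOR with a cycling 4-byte key, so applying the same mask twice recovers the payload:

```python
def apply_mask(data: bytes, mask: bytes) -> bytes:
    # XOR each payload byte with mask[i % 4]; XOR-ing again unmasks.
    return bytes(b ^ mask[i % 4] for i, b in enumerate(data))
```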
@staticmethod
def key_to_accept(key: bytes) -> bytes:
sha1 = hashlib.sha1()
sha1.update(key + WebsocketFrame.GUID)
return base64.b64encode(sha1.digest())
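`key_to_accept` implements the RFC 6455 handshake: SHA-1 of the client key concatenated with the fixed GUID, then base64. The RFC's own example key serves as a check:

```python
import base64
import hashlib

GUID = b'258EAFA5-E914-47DA-95CA-C5AB0DC85B11'

def key_to_accept(key: bytes) -> bytes:
    # Sec-WebSocket-Accept = base64(sha1(Sec-WebSocket-Key + GUID))
    return base64.b64encode(hashlib.sha1(key + GUID).digest())
```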
class WebsocketClient(TcpConnection):
def __init__(self,
hostname: Union[ipaddress.IPv4Address, ipaddress.IPv6Address],
port: int,
path: bytes = b'/',
on_message: Optional[Callable[[WebsocketFrame], None]] = None) -> None:
super().__init__(tcpConnectionTypes.CLIENT)
self.hostname: Union[ipaddress.IPv4Address,
ipaddress.IPv6Address] = hostname
self.port: int = port
self.path: bytes = path
self.sock: socket.socket = new_socket_connection(
(str(self.hostname), self.port))
self.on_message: Optional[Callable[[
WebsocketFrame], None]] = on_message
self.upgrade()
self.sock.setblocking(False)
self.selector: selectors.DefaultSelector = selectors.DefaultSelector()
@property
def connection(self) -> Union[ssl.SSLSocket, socket.socket]:
return self.sock
def upgrade(self) -> None:
key = base64.b64encode(secrets.token_bytes(16))
self.sock.send(build_websocket_handshake_request(key, url=self.path))
response = HttpParser(httpParserTypes.RESPONSE_PARSER)
response.parse(self.sock.recv(DEFAULT_BUFFER_SIZE))
accept = response.header(b'Sec-Websocket-Accept')
assert WebsocketFrame.key_to_accept(key) == accept
def ping(self, data: Optional[bytes] = None) -> None:
pass
def pong(self, data: Optional[bytes] = None) -> None:
pass
def shutdown(self, _data: Optional[bytes] = None) -> None:
"""Closes connection with the server."""
super().close()
def run_once(self) -> bool:
ev = selectors.EVENT_READ
if self.has_buffer():
ev |= selectors.EVENT_WRITE
self.selector.register(self.sock.fileno(), ev)
events = self.selector.select(timeout=1)
self.selector.unregister(self.sock)
for key, mask in events:
if mask & selectors.EVENT_READ and self.on_message:
raw = self.recv()
if raw is None or raw == b'':
self.closed = True
logger.debug('Websocket connection closed by server')
return True
frame = WebsocketFrame()
frame.parse(raw)
self.on_message(frame)
elif mask & selectors.EVENT_WRITE:
logger.debug(self.buffer)
self.flush()
return False
def run(self) -> None:
logger.debug('running')
try:
while not self.closed:
teardown = self.run_once()
if teardown:
break
except KeyboardInterrupt:
pass
finally:
try:
self.selector.unregister(self.sock)
self.sock.shutdown(socket.SHUT_WR)
except Exception as e:
logger.exception(
'Exception while shutdown of websocket client', exc_info=e)
self.sock.close()
logger.info('done')
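The frame masking and handshake-key logic above can be exercised standalone. A minimal sketch (reimplementing `apply_mask` and `key_to_accept` outside the class, using the RFC 6455 GUID and the known test vector from RFC 6455 section 1.3):

```python
import base64
import hashlib
import secrets

# RFC 6455 GUID appended to the client key before hashing.
GUID = b'258EAFA5-E914-47DA-95CA-C5AB0DC85B11'


def apply_mask(data: bytes, mask: bytes) -> bytes:
    # XOR each payload byte with the 4-byte mask, cycling the mask.
    return bytes(b ^ mask[i % 4] for i, b in enumerate(data))


def key_to_accept(key: bytes) -> bytes:
    # Sec-WebSocket-Accept = base64(sha1(key + GUID))
    return base64.b64encode(hashlib.sha1(key + GUID).digest())


# Masking is its own inverse: applying the same mask twice round-trips.
mask = secrets.token_bytes(4)
payload = b'hello websocket'
assert apply_mask(apply_mask(payload, mask), mask) == payload

# Known handshake test vector from RFC 6455 section 1.3.
assert key_to_accept(b'dGhlIHNhbXBsZSBub25jZQ==') == b's3pPLMBiTxaQ9kYGzzhZRbK+xOo='
```

This is the same computation `WebsocketClient.upgrade()` relies on when it compares the server's `Sec-Websocket-Accept` header against the key it sent.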

proxy/main.py (new executable file, 212 lines)
@ -0,0 +1,212 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import base64
import importlib
import inspect
import ipaddress
import logging
import multiprocessing
import os
import sys
import time
from typing import Dict, List, Optional
from .common.flags import Flags, init_parser
from .common.utils import text_, bytes_
from .common.types import DictQueueType
from .common.constants import DOT, COMMA
from .common.constants import DEFAULT_LOG_FORMAT, DEFAULT_LOG_FILE, DEFAULT_LOG_LEVEL
from .common.version import __version__
from .core.acceptor import AcceptorPool
from .http.handler import HttpProtocolHandler
if os.name != 'nt':
import resource
logger = logging.getLogger(__name__)
def is_py3() -> bool:
"""Exists only to avoid mocking sys.version_info in tests."""
return sys.version_info[0] == 3
def set_open_file_limit(soft_limit: int) -> None:
"""Configure open file description soft limit on supported OS."""
if os.name != 'nt': # resource module not available on Windows OS
curr_soft_limit, curr_hard_limit = resource.getrlimit(
resource.RLIMIT_NOFILE)
if curr_soft_limit < soft_limit < curr_hard_limit:
resource.setrlimit(
resource.RLIMIT_NOFILE, (soft_limit, curr_hard_limit))
logger.debug(
'Open file soft limit set to %d', soft_limit)
def load_plugins(plugins: bytes) -> Dict[bytes, List[type]]:
"""Accepts a comma separated list of Python modules and returns
a list of respective Python classes."""
p: Dict[bytes, List[type]] = {
b'HttpProtocolHandlerPlugin': [],
b'HttpProxyBasePlugin': [],
b'HttpWebServerBasePlugin': [],
}
for plugin_ in plugins.split(COMMA):
plugin = text_(plugin_.strip())
if plugin == '':
continue
module_name, klass_name = plugin.rsplit(text_(DOT), 1)
klass = getattr(
importlib.import_module(
module_name.replace(
os.path.sep, text_(DOT))),
klass_name)
base_klass = inspect.getmro(klass)[1]
p[bytes_(base_klass.__name__)].append(klass)
logger.info(
'Loaded %s %s.%s',
'plugin' if klass.__name__ != 'HttpWebServerRouteHandler' else 'route',
module_name,
# HttpWebServerRouteHandler route decorator adds a special
# staticmethod to return decorated function name
klass.__name__ if klass.__name__ != 'HttpWebServerRouteHandler' else klass.name())
return p
def setup_logger(
log_file: Optional[str] = DEFAULT_LOG_FILE,
log_level: str = DEFAULT_LOG_LEVEL,
log_format: str = DEFAULT_LOG_FORMAT) -> None:
ll = getattr(
logging,
{'D': 'DEBUG',
'I': 'INFO',
'W': 'WARNING',
'E': 'ERROR',
'C': 'CRITICAL'}[log_level.upper()[0]])
if log_file:
logging.basicConfig(
filename=log_file,
filemode='a',
level=ll,
format=log_format)
else:
logging.basicConfig(level=ll, format=log_format)
def main(input_args: List[str]) -> None:
if not is_py3():
print(
'DEPRECATION: "develop" branch no longer supports Python 2.7. Kindly upgrade to Python 3+. '
'If for some reasons you cannot upgrade, consider using "master" branch or simply '
'"pip install proxy.py==0.3".'
'\n\n'
'DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. '
'Please upgrade your Python as Python 2.7 won\'t be maintained after that date. '
'A future version of pip will drop support for Python 2.7.')
sys.exit(1)
args = init_parser().parse_args(input_args)
if args.version:
print(__version__)
sys.exit(0)
if (args.cert_file and args.key_file) and \
(args.ca_key_file and args.ca_cert_file and args.ca_signing_key_file):
print('You can either enable end-to-end encryption OR TLS interception, '
'not both together.')
sys.exit(1)
try:
setup_logger(args.log_file, args.log_level, args.log_format)
set_open_file_limit(args.open_file_limit)
auth_code = None
if args.basic_auth:
auth_code = b'Basic %s' % base64.b64encode(bytes_(args.basic_auth))
default_plugins = ''
devtools_event_queue: Optional[DictQueueType] = None
if args.enable_devtools:
default_plugins += 'proxy.http.devtools.DevtoolsProtocolPlugin,'
default_plugins += 'proxy.http.server.HttpWebServerPlugin,'
if not args.disable_http_proxy:
default_plugins += 'proxy.http.proxy.HttpProxyPlugin,'
if args.enable_web_server or \
args.pac_file is not None or \
args.enable_static_server:
if 'proxy.http.server.HttpWebServerPlugin' not in default_plugins:
default_plugins += 'proxy.http.server.HttpWebServerPlugin,'
if args.enable_devtools:
default_plugins += 'proxy.http.devtools.DevtoolsWebsocketPlugin,'
devtools_event_queue = multiprocessing.Manager().Queue()
if args.pac_file is not None:
default_plugins += 'proxy.http.server.HttpWebServerPacFilePlugin,'
flags = Flags(
auth_code=auth_code,
server_recvbuf_size=args.server_recvbuf_size,
client_recvbuf_size=args.client_recvbuf_size,
pac_file=bytes_(args.pac_file),
pac_file_url_path=bytes_(args.pac_file_url_path),
disable_headers=[
header.lower() for header in bytes_(
args.disable_headers).split(COMMA) if header.strip() != b''],
certfile=args.cert_file,
keyfile=args.key_file,
ca_cert_dir=args.ca_cert_dir,
ca_key_file=args.ca_key_file,
ca_cert_file=args.ca_cert_file,
ca_signing_key_file=args.ca_signing_key_file,
hostname=ipaddress.ip_address(args.hostname),
port=args.port,
backlog=args.backlog,
num_workers=args.num_workers,
static_server_dir=args.static_server_dir,
enable_static_server=args.enable_static_server,
devtools_event_queue=devtools_event_queue,
devtools_ws_path=args.devtools_ws_path,
timeout=args.timeout,
threadless=args.threadless,
enable_events=args.enable_events)
flags.plugins = load_plugins(
bytes_(
'%s%s' %
(default_plugins, args.plugins)))
acceptor_pool = AcceptorPool(
flags=flags,
work_klass=HttpProtocolHandler
)
if args.pid_file:
with open(args.pid_file, 'wb') as pid_file:
pid_file.write(bytes_(os.getpid()))
try:
acceptor_pool.setup()
# TODO: Introduce cron feature instead of mindless sleep
while True:
time.sleep(2**10)
except Exception as e:
logger.exception('exception', exc_info=e)
finally:
acceptor_pool.shutdown()
except KeyboardInterrupt: # pragma: no cover
pass
finally:
if args.pid_file and os.path.exists(args.pid_file):
os.remove(args.pid_file)
def entry_point() -> None:
main(sys.argv[1:])
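The `load_plugins()` function above resolves each dotted `path.to.module.ClassName` entry with `importlib` and buckets the class under its immediate base class via `inspect.getmro()`. A minimal standalone sketch of the same resolution, using stdlib classes in place of plugin classes:

```python
import importlib
import inspect
from typing import Dict, List


def load_class(dotted: str) -> type:
    # Split 'package.module.ClassName' into module path and class name,
    # mirroring how load_plugins() resolves each --plugins entry.
    module_name, klass_name = dotted.rsplit('.', 1)
    return getattr(importlib.import_module(module_name), klass_name)


def group_by_base(dotted_paths: List[str]) -> Dict[str, List[type]]:
    # inspect.getmro(klass) returns (klass, base, ..., object), so
    # index 1 is the immediate base class used as the bucket key.
    grouped: Dict[str, List[type]] = {}
    for path in dotted_paths:
        klass = load_class(path)
        base = inspect.getmro(klass)[1]
        grouped.setdefault(base.__name__, []).append(klass)
    return grouped


# Both stdlib classes subclass dict directly, so they land in one bucket.
grouped = group_by_base(['collections.OrderedDict', 'collections.Counter'])
assert len(grouped['dict']) == 2
```

In `load_plugins()` the bucket keys are the three plugin base classes (`HttpProtocolHandlerPlugin`, `HttpProxyBasePlugin`, `HttpWebServerBasePlugin`) rather than arbitrary base names.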

public/.gitkeep (new empty file)

@ -2,6 +2,8 @@ python-coveralls==2.9.3
coverage==4.5.4
flake8==3.7.9
pytest==5.2.2
pytest-cov==2.8.1
autopep8==1.4.4
mypy==0.730
py-spy==0.3.0
codecov==2.0.15

setup.py (132 lines changed)
@ -7,75 +7,79 @@
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
from setuptools import setup
import proxy
from setuptools import setup, find_packages
classifiers = [
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Environment :: No Input/Output (Daemon)',
'Environment :: Web Environment',
'Environment :: MacOS X',
'Environment :: Plugins',
'Environment :: Win32 (MS Windows)',
'Framework :: Robot Framework',
'Framework :: Robot Framework :: Library',
'Intended Audience :: Developers',
'Intended Audience :: Education',
'Intended Audience :: End Users/Desktop',
'Intended Audience :: System Administrators',
'Intended Audience :: Science/Research',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
'Operating System :: MacOS',
'Operating System :: MacOS :: MacOS 9',
'Operating System :: MacOS :: MacOS X',
'Operating System :: POSIX',
'Operating System :: POSIX :: Linux',
'Operating System :: Unix',
'Operating System :: Microsoft',
'Operating System :: Microsoft :: Windows',
'Operating System :: Microsoft :: Windows :: Windows 10',
'Operating System :: Android',
'Operating System :: OS Independent',
'Programming Language :: Python :: Implementation',
'Programming Language :: Python :: 3 :: Only',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Topic :: Internet',
'Topic :: Internet :: Proxy Servers',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Internet :: WWW/HTTP :: Browsers',
'Topic :: Internet :: WWW/HTTP :: Dynamic Content',
'Topic :: Internet :: WWW/HTTP :: Dynamic Content :: CGI Tools/Libraries',
'Topic :: Internet :: WWW/HTTP :: HTTP Servers',
'Topic :: Scientific/Engineering :: Information Analysis',
'Topic :: Software Development :: Debuggers',
'Topic :: Software Development :: Libraries :: Python Modules',
'Topic :: System :: Monitoring',
'Topic :: System :: Networking',
'Topic :: System :: Networking :: Firewalls',
'Topic :: System :: Networking :: Monitoring',
'Topic :: Utilities',
'Typing :: Typed',
]
from proxy.common.version import __version__
from proxy.common.constants import __author__, __author_email__, __homepage__, __description__, __download_url__, __license__
setup(
name='proxy.py',
version=proxy.__version__,
author=proxy.__author__,
author_email=proxy.__author_email__,
url=proxy.__homepage__,
description=proxy.__description__,
version=__version__,
author=__author__,
author_email=__author_email__,
url=__homepage__,
description=__description__,
long_description=open('README.md').read().strip(),
long_description_content_type='text/markdown',
download_url=proxy.__download_url__,
classifiers=classifiers,
license=proxy.__license__,
py_modules=['proxy'],
scripts=['proxy.py'],
download_url=__download_url__,
license=__license__,
packages=find_packages(exclude=['benchmark', 'tests', 'plugin_examples']),
install_requires=open('requirements.txt', 'r').read().strip().split(),
entry_points={
'console_scripts': [
'proxy = proxy.main:entry_point'
]
},
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Environment :: No Input/Output (Daemon)',
'Environment :: Web Environment',
'Environment :: MacOS X',
'Environment :: Plugins',
'Environment :: Win32 (MS Windows)',
'Framework :: Robot Framework',
'Framework :: Robot Framework :: Library',
'Intended Audience :: Developers',
'Intended Audience :: Education',
'Intended Audience :: End Users/Desktop',
'Intended Audience :: System Administrators',
'Intended Audience :: Science/Research',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
'Operating System :: MacOS',
'Operating System :: MacOS :: MacOS 9',
'Operating System :: MacOS :: MacOS X',
'Operating System :: POSIX',
'Operating System :: POSIX :: Linux',
'Operating System :: Unix',
'Operating System :: Microsoft',
'Operating System :: Microsoft :: Windows',
'Operating System :: Microsoft :: Windows :: Windows 10',
'Operating System :: Android',
'Operating System :: OS Independent',
'Programming Language :: Python :: Implementation',
'Programming Language :: Python :: 3 :: Only',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Topic :: Internet',
'Topic :: Internet :: Proxy Servers',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Internet :: WWW/HTTP :: Browsers',
'Topic :: Internet :: WWW/HTTP :: Dynamic Content',
'Topic :: Internet :: WWW/HTTP :: Dynamic Content :: CGI Tools/Libraries',
'Topic :: Internet :: WWW/HTTP :: HTTP Servers',
'Topic :: Scientific/Engineering :: Information Analysis',
'Topic :: Software Development :: Debuggers',
'Topic :: Software Development :: Libraries :: Python Modules',
'Topic :: System :: Monitoring',
'Topic :: System :: Networking',
'Topic :: System :: Networking :: Firewalls',
'Topic :: System :: Networking :: Monitoring',
'Topic :: Utilities',
'Typing :: Typed',
],
)

tests.py (2387 lines changed; file diff suppressed because it is too large)

tests/__init__.py (new file, 14 lines)
@ -0,0 +1,14 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import logging
from proxy.common.constants import DEFAULT_LOG_FORMAT
logging.basicConfig(level=logging.DEBUG, format=DEFAULT_LOG_FORMAT)

tests/test_acceptor.py (new file, 147 lines)
@ -0,0 +1,147 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
import socket
import selectors
import multiprocessing
from unittest import mock
from proxy.common.flags import Flags
from proxy.core.acceptor import Acceptor, AcceptorPool
class TestAcceptor(unittest.TestCase):
def setUp(self) -> None:
self.acceptor_id = 1
self.mock_protocol_handler = mock.MagicMock()
self.pipe = multiprocessing.Pipe()
self.flags = Flags()
self.acceptor = Acceptor(
idd=self.acceptor_id,
work_queue=self.pipe[1],
flags=self.flags,
work_klass=self.mock_protocol_handler)
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
@mock.patch('proxy.core.acceptor.recv_handle')
def test_continues_when_no_events(
self,
mock_recv_handle: mock.Mock,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
fileno = 10
conn = mock.MagicMock()
addr = mock.MagicMock()
sock = mock_fromfd.return_value
mock_fromfd.return_value.accept.return_value = (conn, addr)
mock_recv_handle.return_value = fileno
selector = mock_selector.return_value
selector.select.side_effect = [[], KeyboardInterrupt()]
self.acceptor.run()
sock.accept.assert_not_called()
self.mock_protocol_handler.assert_not_called()
@mock.patch('threading.Thread')
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
@mock.patch('proxy.core.acceptor.recv_handle')
def test_accepts_client_from_server_socket(
self,
mock_recv_handle: mock.Mock,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock,
mock_thread: mock.Mock) -> None:
fileno = 10
conn = mock.MagicMock()
addr = mock.MagicMock()
sock = mock_fromfd.return_value
mock_fromfd.return_value.accept.return_value = (conn, addr)
mock_recv_handle.return_value = fileno
mock_thread.return_value.start.side_effect = KeyboardInterrupt()
selector = mock_selector.return_value
selector.select.return_value = [(None, None)]
self.acceptor.run()
selector.register.assert_called_with(sock, selectors.EVENT_READ)
selector.unregister.assert_called_with(sock)
mock_recv_handle.assert_called_with(self.pipe[1])
mock_fromfd.assert_called_with(
fileno,
family=socket.AF_INET6,
type=socket.SOCK_STREAM
)
self.mock_protocol_handler.assert_called_with(
fileno=conn.fileno(),
addr=addr,
flags=self.flags,
event_queue=None,
)
mock_thread.assert_called_with(
target=self.mock_protocol_handler.return_value.run)
mock_thread.return_value.start.assert_called()
sock.close.assert_called()
class TestAcceptorPool(unittest.TestCase):
@mock.patch('proxy.core.acceptor.send_handle')
@mock.patch('multiprocessing.Pipe')
@mock.patch('socket.socket')
@mock.patch('proxy.core.acceptor.Acceptor')
def test_setup_and_shutdown(
self,
mock_worker: mock.Mock,
mock_socket: mock.Mock,
mock_pipe: mock.Mock,
_mock_send_handle: mock.Mock) -> None:
mock_worker1 = mock.MagicMock()
mock_worker2 = mock.MagicMock()
mock_worker.side_effect = [mock_worker1, mock_worker2]
num_workers = 2
sock = mock_socket.return_value
work_klass = mock.MagicMock()
flags = Flags(num_workers=2)
acceptor = AcceptorPool(flags=flags, work_klass=work_klass)
acceptor.setup()
work_klass.assert_not_called()
mock_socket.assert_called_with(
socket.AF_INET6 if acceptor.flags.hostname.version == 6 else socket.AF_INET,
socket.SOCK_STREAM
)
sock.setsockopt.assert_called_with(
socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind.assert_called_with(
(str(acceptor.flags.hostname), acceptor.flags.port))
sock.listen.assert_called_with(acceptor.flags.backlog)
sock.setblocking.assert_called_with(False)
self.assertEqual(mock_pipe.call_count, num_workers)
self.assertEqual(mock_worker.call_count, num_workers)
mock_worker1.start.assert_called()
mock_worker1.join.assert_not_called()
mock_worker2.start.assert_called()
mock_worker2.join.assert_not_called()
sock.close.assert_called()
acceptor.shutdown()
mock_worker1.join.assert_called()
mock_worker2.join.assert_called()

@ -0,0 +1,89 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
from proxy.http.chunk_parser import chunkParserStates, ChunkParser
class TestChunkParser(unittest.TestCase):
def setUp(self) -> None:
self.parser = ChunkParser()
def test_chunk_parse_basic(self) -> None:
self.parser.parse(b''.join([
b'4\r\n',
b'Wiki\r\n',
b'5\r\n',
b'pedia\r\n',
b'E\r\n',
b' in\r\n\r\nchunks.\r\n',
b'0\r\n',
b'\r\n'
]))
self.assertEqual(self.parser.chunk, b'')
self.assertEqual(self.parser.size, None)
self.assertEqual(self.parser.body, b'Wikipedia in\r\n\r\nchunks.')
self.assertEqual(self.parser.state, chunkParserStates.COMPLETE)
def test_chunk_parse_issue_27(self) -> None:
"""Case when data ends with the chunk size but without ending CRLF."""
self.parser.parse(b'3')
self.assertEqual(self.parser.chunk, b'3')
self.assertEqual(self.parser.size, None)
self.assertEqual(self.parser.body, b'')
self.assertEqual(
self.parser.state,
chunkParserStates.WAITING_FOR_SIZE)
self.parser.parse(b'\r\n')
self.assertEqual(self.parser.chunk, b'')
self.assertEqual(self.parser.size, 3)
self.assertEqual(self.parser.body, b'')
self.assertEqual(
self.parser.state,
chunkParserStates.WAITING_FOR_DATA)
self.parser.parse(b'abc')
self.assertEqual(self.parser.chunk, b'')
self.assertEqual(self.parser.size, None)
self.assertEqual(self.parser.body, b'abc')
self.assertEqual(
self.parser.state,
chunkParserStates.WAITING_FOR_SIZE)
self.parser.parse(b'\r\n')
self.assertEqual(self.parser.chunk, b'')
self.assertEqual(self.parser.size, None)
self.assertEqual(self.parser.body, b'abc')
self.assertEqual(
self.parser.state,
chunkParserStates.WAITING_FOR_SIZE)
self.parser.parse(b'4\r\n')
self.assertEqual(self.parser.chunk, b'')
self.assertEqual(self.parser.size, 4)
self.assertEqual(self.parser.body, b'abc')
self.assertEqual(
self.parser.state,
chunkParserStates.WAITING_FOR_DATA)
self.parser.parse(b'defg\r\n0')
self.assertEqual(self.parser.chunk, b'0')
self.assertEqual(self.parser.size, None)
self.assertEqual(self.parser.body, b'abcdefg')
self.assertEqual(
self.parser.state,
chunkParserStates.WAITING_FOR_SIZE)
self.parser.parse(b'\r\n\r\n')
self.assertEqual(self.parser.chunk, b'')
self.assertEqual(self.parser.size, None)
self.assertEqual(self.parser.body, b'abcdefg')
self.assertEqual(self.parser.state, chunkParserStates.COMPLETE)
def test_to_chunks(self) -> None:
self.assertEqual(
b'f\r\n{"key":"value"}\r\n0\r\n\r\n',
ChunkParser.to_chunks(b'{"key":"value"}'))
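The `test_to_chunks` expectation above follows directly from HTTP/1.1 chunked transfer encoding: each chunk is a hexadecimal size line, CRLF, the chunk bytes, CRLF, and a zero-size chunk terminates the body. A minimal re-implementation reproducing that expected output (the `chunk_size` default here is illustrative, not necessarily the library's):

```python
def to_chunks(data: bytes, chunk_size: int = 512) -> bytes:
    # Encode data as HTTP/1.1 chunked transfer encoding: hex size line,
    # CRLF, chunk bytes, CRLF; a zero-size chunk ends the stream.
    out = b''
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        out += b'%x\r\n%s\r\n' % (len(chunk), chunk)
    return out + b'0\r\n\r\n'


# 15 bytes -> hex size 'f', matching the assertion in test_to_chunks.
assert to_chunks(b'{"key":"value"}') == b'f\r\n{"key":"value"}\r\n0\r\n\r\n'
```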

tests/test_connection.py (new file, 118 lines)
@ -0,0 +1,118 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
import socket
import ssl
from unittest import mock
from typing import Optional, Union
from proxy.core.connection import tcpConnectionTypes, TcpConnectionUninitializedException
from proxy.core.connection import TcpServerConnection, TcpConnection, TcpClientConnection
from proxy.common.constants import DEFAULT_IPV6_HOSTNAME, DEFAULT_PORT, DEFAULT_IPV4_HOSTNAME
class TestTcpConnection(unittest.TestCase):
class TcpConnectionToTest(TcpConnection):
def __init__(self, conn: Optional[Union[ssl.SSLSocket, socket.socket]] = None,
tag: int = tcpConnectionTypes.CLIENT) -> None:
super().__init__(tag)
self._conn = conn
@property
def connection(self) -> Union[ssl.SSLSocket, socket.socket]:
if self._conn is None:
raise TcpConnectionUninitializedException()
return self._conn
def testThrowsKeyErrorIfNoConn(self) -> None:
self.conn = TestTcpConnection.TcpConnectionToTest()
with self.assertRaises(TcpConnectionUninitializedException):
self.conn.send(b'dummy')
with self.assertRaises(TcpConnectionUninitializedException):
self.conn.recv()
with self.assertRaises(TcpConnectionUninitializedException):
self.conn.close()
def testClosesIfNotClosed(self) -> None:
_conn = mock.MagicMock()
self.conn = TestTcpConnection.TcpConnectionToTest(_conn)
self.conn.close()
_conn.close.assert_called()
self.assertTrue(self.conn.closed)
def testNoOpIfAlreadyClosed(self) -> None:
_conn = mock.MagicMock()
self.conn = TestTcpConnection.TcpConnectionToTest(_conn)
self.conn.closed = True
self.conn.close()
_conn.close.assert_not_called()
self.assertTrue(self.conn.closed)
def testFlushReturnsIfNoBuffer(self) -> None:
_conn = mock.MagicMock()
self.conn = TestTcpConnection.TcpConnectionToTest(_conn)
self.conn.flush()
self.assertTrue(not _conn.send.called)
@mock.patch('socket.socket')
def testTcpServerEstablishesIPv6Connection(
self, mock_socket: mock.Mock) -> None:
conn = TcpServerConnection(
str(DEFAULT_IPV6_HOSTNAME), DEFAULT_PORT)
conn.connect()
mock_socket.assert_called()
mock_socket.return_value.connect.assert_called_with(
(str(DEFAULT_IPV6_HOSTNAME), DEFAULT_PORT, 0, 0))
@mock.patch('proxy.core.connection.new_socket_connection')
def testTcpServerIgnoresDoubleConnectSilently(
self,
mock_new_socket_connection: mock.Mock) -> None:
conn = TcpServerConnection(
str(DEFAULT_IPV6_HOSTNAME), DEFAULT_PORT)
conn.connect()
conn.connect()
mock_new_socket_connection.assert_called_once()
@mock.patch('socket.socket')
def testTcpServerEstablishesIPv4Connection(
self, mock_socket: mock.Mock) -> None:
conn = TcpServerConnection(
str(DEFAULT_IPV4_HOSTNAME), DEFAULT_PORT)
conn.connect()
mock_socket.assert_called()
mock_socket.return_value.connect.assert_called_with(
(str(DEFAULT_IPV4_HOSTNAME), DEFAULT_PORT))
@mock.patch('proxy.core.connection.new_socket_connection')
def testTcpServerConnectionProperty(
self,
mock_new_socket_connection: mock.Mock) -> None:
conn = TcpServerConnection(
str(DEFAULT_IPV6_HOSTNAME), DEFAULT_PORT)
conn.connect()
self.assertEqual(
conn.connection,
mock_new_socket_connection.return_value)
def testTcpServerRaisesTcpConnectionUninitializedException(self) -> None:
conn = TcpServerConnection(
str(DEFAULT_IPV6_HOSTNAME), DEFAULT_PORT)
with self.assertRaises(TcpConnectionUninitializedException):
_ = conn.connection
def testTcpClientRaisesTcpConnectionUninitializedException(self) -> None:
_conn = mock.MagicMock()
_addr = mock.MagicMock()
conn = TcpClientConnection(_conn, _addr)
conn._conn = None
with self.assertRaises(TcpConnectionUninitializedException):
_ = conn.connection

tests/test_http_parser.py (new file, 520 lines)
@ -0,0 +1,520 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
from typing import Dict, Tuple
from proxy.common.constants import CRLF
from proxy.common.utils import build_http_request, find_http_line, build_http_response, build_http_header, bytes_
from proxy.http.methods import httpMethods
from proxy.http.codes import httpStatusCodes
from proxy.http.parser import HttpParser, httpParserTypes, httpParserStates
class TestHttpParser(unittest.TestCase):
def setUp(self) -> None:
self.parser = HttpParser(httpParserTypes.REQUEST_PARSER)
def test_build_request(self) -> None:
self.assertEqual(
build_http_request(
b'GET', b'http://localhost:12345', b'HTTP/1.1'),
CRLF.join([
b'GET http://localhost:12345 HTTP/1.1',
CRLF
]))
self.assertEqual(
build_http_request(b'GET', b'http://localhost:12345', b'HTTP/1.1',
headers={b'key': b'value'}),
CRLF.join([
b'GET http://localhost:12345 HTTP/1.1',
b'key: value',
CRLF
]))
self.assertEqual(
build_http_request(b'GET', b'http://localhost:12345', b'HTTP/1.1',
headers={b'key': b'value'},
body=b'Hello from py'),
CRLF.join([
b'GET http://localhost:12345 HTTP/1.1',
b'key: value',
CRLF
]) + b'Hello from py')
def test_build_response(self) -> None:
self.assertEqual(
build_http_response(
200, reason=b'OK', protocol_version=b'HTTP/1.1'),
CRLF.join([
b'HTTP/1.1 200 OK',
CRLF
]))
self.assertEqual(
build_http_response(200, reason=b'OK', protocol_version=b'HTTP/1.1',
headers={b'key': b'value'}),
CRLF.join([
b'HTTP/1.1 200 OK',
b'key: value',
CRLF
]))
def test_build_response_adds_content_length_header(self) -> None:
body = b'Hello world!!!'
self.assertEqual(
build_http_response(200, reason=b'OK', protocol_version=b'HTTP/1.1',
headers={b'key': b'value'},
body=body),
CRLF.join([
b'HTTP/1.1 200 OK',
b'key: value',
b'Content-Length: ' + bytes_(len(body)),
CRLF
]) + body)
def test_build_header(self) -> None:
self.assertEqual(
build_http_header(
b'key', b'value'), b'key: value')
def test_header_raises(self) -> None:
with self.assertRaises(KeyError):
self.parser.header(b'not-found')
def test_has_header(self) -> None:
self.parser.add_header(b'key', b'value')
self.assertFalse(self.parser.has_header(b'not-found'))
self.assertTrue(self.parser.has_header(b'key'))
def test_set_host_port_raises(self) -> None:
with self.assertRaises(KeyError):
self.parser.set_line_attributes()
def test_find_line(self) -> None:
self.assertEqual(
find_http_line(
b'CONNECT python.org:443 HTTP/1.0\r\n\r\n'),
(b'CONNECT python.org:443 HTTP/1.0',
CRLF))
def test_find_line_returns_None(self) -> None:
self.assertEqual(
find_http_line(b'CONNECT python.org:443 HTTP/1.0'),
(None,
b'CONNECT python.org:443 HTTP/1.0'))
def test_connect_request_with_crlf_as_separate_chunk(self) -> None:
"""See https://github.com/abhinavsingh/py/issues/70 for background."""
raw = b'CONNECT pypi.org:443 HTTP/1.0\r\n'
self.parser.parse(raw)
self.assertEqual(self.parser.state, httpParserStates.LINE_RCVD)
self.parser.parse(CRLF)
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
def test_get_full_parse(self) -> None:
raw = CRLF.join([
b'GET %s HTTP/1.1',
b'Host: %s',
CRLF
])
pkt = raw % (b'https://example.com/path/dir/?a=b&c=d#p=q',
b'example.com')
self.parser.parse(pkt)
self.assertEqual(self.parser.total_size, len(pkt))
self.assertEqual(self.parser.build_url(), b'/path/dir/?a=b&c=d#p=q')
self.assertEqual(self.parser.method, b'GET')
assert self.parser.url
self.assertEqual(self.parser.url.hostname, b'example.com')
self.assertEqual(self.parser.url.port, None)
self.assertEqual(self.parser.version, b'HTTP/1.1')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
self.assertDictContainsSubset(
{b'host': (b'Host', b'example.com')}, self.parser.headers)
self.parser.del_headers([b'host'])
self.parser.add_headers([(b'Host', b'example.com')])
self.assertEqual(
raw %
(b'/path/dir/?a=b&c=d#p=q',
b'example.com'),
self.parser.build())
def test_build_url_none(self) -> None:
self.assertEqual(self.parser.build_url(), b'/None')
def test_line_rcvd_to_rcving_headers_state_change(self) -> None:
pkt = b'GET http://localhost HTTP/1.1'
self.parser.parse(pkt)
self.assertEqual(self.parser.total_size, len(pkt))
self.assert_state_change_with_crlf(
httpParserStates.INITIALIZED,
httpParserStates.LINE_RCVD,
httpParserStates.COMPLETE)
def test_get_partial_parse1(self) -> None:
pkt = CRLF.join([
b'GET http://localhost:8080 HTTP/1.1'
])
self.parser.parse(pkt)
self.assertEqual(self.parser.total_size, len(pkt))
self.assertEqual(self.parser.method, None)
self.assertEqual(self.parser.url, None)
self.assertEqual(self.parser.version, None)
self.assertEqual(
self.parser.state,
httpParserStates.INITIALIZED)
self.parser.parse(CRLF)
self.assertEqual(self.parser.total_size, len(pkt) + len(CRLF))
self.assertEqual(self.parser.method, b'GET')
assert self.parser.url
self.assertEqual(self.parser.url.hostname, b'localhost')
self.assertEqual(self.parser.url.port, 8080)
self.assertEqual(self.parser.version, b'HTTP/1.1')
self.assertEqual(self.parser.state, httpParserStates.LINE_RCVD)
host_hdr = b'Host: localhost:8080'
self.parser.parse(host_hdr)
self.assertEqual(self.parser.total_size,
len(pkt) + len(CRLF) + len(host_hdr))
self.assertDictEqual(self.parser.headers, dict())
self.assertEqual(self.parser.buffer, b'Host: localhost:8080')
self.assertEqual(self.parser.state, httpParserStates.LINE_RCVD)
self.parser.parse(CRLF * 2)
self.assertEqual(self.parser.total_size, len(pkt) +
(3 * len(CRLF)) + len(host_hdr))
self.assertDictContainsSubset(
{b'host': (b'Host', b'localhost:8080')}, self.parser.headers)
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
def test_get_partial_parse2(self) -> None:
self.parser.parse(CRLF.join([
b'GET http://localhost:8080 HTTP/1.1',
b'Host: '
]))
self.assertEqual(self.parser.method, b'GET')
assert self.parser.url
self.assertEqual(self.parser.url.hostname, b'localhost')
self.assertEqual(self.parser.url.port, 8080)
self.assertEqual(self.parser.version, b'HTTP/1.1')
self.assertEqual(self.parser.buffer, b'Host: ')
self.assertEqual(self.parser.state, httpParserStates.LINE_RCVD)
self.parser.parse(b'localhost:8080' + CRLF)
self.assertDictContainsSubset(
{b'host': (b'Host', b'localhost:8080')}, self.parser.headers)
self.assertEqual(self.parser.buffer, b'')
self.assertEqual(
self.parser.state,
httpParserStates.RCVING_HEADERS)
self.parser.parse(b'Content-Type: text/plain' + CRLF)
self.assertEqual(self.parser.buffer, b'')
self.assertDictContainsSubset(
{b'content-type': (b'Content-Type', b'text/plain')}, self.parser.headers)
self.assertEqual(
self.parser.state,
httpParserStates.RCVING_HEADERS)
self.parser.parse(CRLF)
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
def test_post_full_parse(self) -> None:
raw = CRLF.join([
b'POST %s HTTP/1.1',
b'Host: localhost',
b'Content-Length: 7',
b'Content-Type: application/x-www-form-urlencoded' + CRLF,
b'a=b&c=d'
])
self.parser.parse(raw % b'http://localhost')
self.assertEqual(self.parser.method, b'POST')
assert self.parser.url
self.assertEqual(self.parser.url.hostname, b'localhost')
self.assertEqual(self.parser.url.port, None)
self.assertEqual(self.parser.version, b'HTTP/1.1')
self.assertDictContainsSubset(
{b'content-type': (b'Content-Type', b'application/x-www-form-urlencoded')}, self.parser.headers)
self.assertDictContainsSubset(
{b'content-length': (b'Content-Length', b'7')}, self.parser.headers)
self.assertEqual(self.parser.body, b'a=b&c=d')
self.assertEqual(self.parser.buffer, b'')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
self.assertEqual(len(self.parser.build()), len(raw % b'/'))
def assert_state_change_with_crlf(self,
initial_state: int,
next_state: int,
final_state: int) -> None:
self.assertEqual(self.parser.state, initial_state)
self.parser.parse(CRLF)
self.assertEqual(self.parser.state, next_state)
self.parser.parse(CRLF)
self.assertEqual(self.parser.state, final_state)
def test_post_partial_parse(self) -> None:
self.parser.parse(CRLF.join([
b'POST http://localhost HTTP/1.1',
b'Host: localhost',
b'Content-Length: 7',
b'Content-Type: application/x-www-form-urlencoded'
]))
self.assertEqual(self.parser.method, b'POST')
assert self.parser.url
self.assertEqual(self.parser.url.hostname, b'localhost')
self.assertEqual(self.parser.url.port, None)
self.assertEqual(self.parser.version, b'HTTP/1.1')
self.assert_state_change_with_crlf(
httpParserStates.RCVING_HEADERS,
httpParserStates.RCVING_HEADERS,
httpParserStates.HEADERS_COMPLETE)
self.parser.parse(b'a=b')
self.assertEqual(
self.parser.state,
httpParserStates.RCVING_BODY)
self.assertEqual(self.parser.body, b'a=b')
self.assertEqual(self.parser.buffer, b'')
self.parser.parse(b'&c=d')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
self.assertEqual(self.parser.body, b'a=b&c=d')
self.assertEqual(self.parser.buffer, b'')
def test_connect_request_without_host_header_request_parse(self) -> None:
"""Case where clients can send CONNECT request without a Host header field.
Example:
1. pip3 --proxy http://localhost:8899 install <package name>
Uses HTTP/1.0, Host header missing with CONNECT requests
2. Android Emulator
Uses HTTP/1.1, Host header missing with CONNECT requests
See https://github.com/abhinavsingh/py/issues/5 for details.
"""
self.parser.parse(b'CONNECT pypi.org:443 HTTP/1.0\r\n\r\n')
self.assertEqual(self.parser.method, b'CONNECT')
self.assertEqual(self.parser.version, b'HTTP/1.0')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
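A CONNECT request target is in authority form (`host:port`), so the destination is recoverable even when the Host header is absent. A minimal standalone sketch of that extraction (a hypothetical helper for illustration, not proxy.py's API):

```python
from typing import Tuple

def connect_target_to_host_port(target: bytes) -> Tuple[bytes, int]:
    # Authority form is simply host:port; split on the last colon so any
    # colons earlier in the host part are left intact.
    host, _, port = target.rpartition(b':')
    return host, int(port)

print(connect_target_to_host_port(b'pypi.org:443'))
```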
def test_request_parse_without_content_length(self) -> None:
"""Case when incoming request doesn't contain a content-length header.
From http://w3-org.9356.n7.nabble.com/POST-with-empty-body-td103965.html
'A POST with no content-length and no body is equivalent to a POST with Content-Length: 0
and nothing following, as could perfectly happen when you upload an empty file for instance.'
See https://github.com/abhinavsingh/py/issues/20 for details.
"""
self.parser.parse(CRLF.join([
b'POST http://localhost HTTP/1.1',
b'Host: localhost',
b'Content-Type: application/x-www-form-urlencoded',
CRLF
]))
self.assertEqual(self.parser.method, b'POST')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
def test_response_parse_without_content_length(self) -> None:
"""Case when server response doesn't contain a content-length header for non-chunk response types.
HttpParser by itself has no way to know if more data should be expected.
In example below, parser reaches state httpParserStates.HEADERS_COMPLETE
and it is responsibility of callee to change state to httpParserStates.COMPLETE
when server stream closes.
See https://github.com/abhinavsingh/py/issues/20 for details.
"""
self.parser.type = httpParserTypes.RESPONSE_PARSER
self.parser.parse(b'HTTP/1.0 200 OK' + CRLF)
self.assertEqual(self.parser.code, b'200')
self.assertEqual(self.parser.version, b'HTTP/1.0')
self.assertEqual(self.parser.state, httpParserStates.LINE_RCVD)
self.parser.parse(CRLF.join([
b'Server: BaseHTTP/0.3 Python/2.7.10',
b'Date: Thu, 13 Dec 2018 16:24:09 GMT',
CRLF
]))
self.assertEqual(
self.parser.state,
httpParserStates.HEADERS_COMPLETE)
def test_response_parse(self) -> None:
self.parser.type = httpParserTypes.RESPONSE_PARSER
self.parser.parse(b''.join([
b'HTTP/1.1 301 Moved Permanently\r\n',
b'Location: http://www.google.com/\r\n',
b'Content-Type: text/html; charset=UTF-8\r\n',
b'Date: Wed, 22 May 2013 14:07:29 GMT\r\n',
b'Expires: Fri, 21 Jun 2013 14:07:29 GMT\r\n',
b'Cache-Control: public, max-age=2592000\r\n',
b'Server: gws\r\n',
b'Content-Length: 219\r\n',
b'X-XSS-Protection: 1; mode=block\r\n',
b'X-Frame-Options: SAMEORIGIN\r\n\r\n',
b'<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">\n' +
b'<TITLE>301 Moved</TITLE></HEAD>',
b'<BODY>\n<H1>301 Moved</H1>\nThe document has moved\n' +
b'<A HREF="http://www.google.com/">here</A>.\r\n</BODY></HTML>\r\n'
]))
self.assertEqual(self.parser.code, b'301')
self.assertEqual(self.parser.reason, b'Moved Permanently')
self.assertEqual(self.parser.version, b'HTTP/1.1')
self.assertEqual(
self.parser.body,
b'<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">\n' +
b'<TITLE>301 Moved</TITLE></HEAD><BODY>\n<H1>301 Moved</H1>\nThe document has moved\n' +
b'<A HREF="http://www.google.com/">here</A>.\r\n</BODY></HTML>\r\n')
self.assertDictContainsSubset(
{b'content-length': (b'Content-Length', b'219')}, self.parser.headers)
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
def test_response_partial_parse(self) -> None:
self.parser.type = httpParserTypes.RESPONSE_PARSER
self.parser.parse(b''.join([
b'HTTP/1.1 301 Moved Permanently\r\n',
b'Location: http://www.google.com/\r\n',
b'Content-Type: text/html; charset=UTF-8\r\n',
b'Date: Wed, 22 May 2013 14:07:29 GMT\r\n',
b'Expires: Fri, 21 Jun 2013 14:07:29 GMT\r\n',
b'Cache-Control: public, max-age=2592000\r\n',
b'Server: gws\r\n',
b'Content-Length: 219\r\n',
b'X-XSS-Protection: 1; mode=block\r\n',
b'X-Frame-Options: SAMEORIGIN\r\n'
]))
self.assertDictContainsSubset(
{b'x-frame-options': (b'X-Frame-Options', b'SAMEORIGIN')}, self.parser.headers)
self.assertEqual(
self.parser.state,
httpParserStates.RCVING_HEADERS)
self.parser.parse(b'\r\n')
self.assertEqual(
self.parser.state,
httpParserStates.HEADERS_COMPLETE)
self.parser.parse(
b'<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">\n' +
b'<TITLE>301 Moved</TITLE></HEAD>')
self.assertEqual(
self.parser.state,
httpParserStates.RCVING_BODY)
self.parser.parse(
b'<BODY>\n<H1>301 Moved</H1>\nThe document has moved\n' +
b'<A HREF="http://www.google.com/">here</A>.\r\n</BODY></HTML>\r\n')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
def test_chunked_response_parse(self) -> None:
self.parser.type = httpParserTypes.RESPONSE_PARSER
self.parser.parse(b''.join([
b'HTTP/1.1 200 OK\r\n',
b'Content-Type: application/json\r\n',
b'Date: Wed, 22 May 2013 15:08:15 GMT\r\n',
b'Server: gunicorn/0.16.1\r\n',
b'transfer-encoding: chunked\r\n',
b'Connection: keep-alive\r\n\r\n',
b'4\r\n',
b'Wiki\r\n',
b'5\r\n',
b'pedia\r\n',
b'E\r\n',
b' in\r\n\r\nchunks.\r\n',
b'0\r\n',
b'\r\n'
]))
self.assertEqual(self.parser.body, b'Wikipedia in\r\n\r\nchunks.')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
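The chunked framing exercised above (hex size line, chunk data, terminating zero-size chunk) can be decoded by a short standalone sketch; this is illustrative only, not proxy.py's implementation:

```python
def decode_chunked(raw: bytes) -> bytes:
    """Decode an HTTP/1.1 chunked transfer-encoded body (minimal sketch)."""
    body, pos = b'', 0
    while True:
        # Each chunk starts with its size in hex, terminated by CRLF.
        line_end = raw.index(b'\r\n', pos)
        size = int(raw[pos:line_end], 16)
        if size == 0:
            break  # zero-size chunk marks the end of the body
        start = line_end + 2
        body += raw[start:start + size]
        pos = start + size + 2  # skip chunk data plus its trailing CRLF
    return body

print(decode_chunked(
    b'4\r\nWiki\r\n5\r\npedia\r\nE\r\n in\r\n\r\nchunks.\r\n0\r\n\r\n'))
```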
def test_pipelined_response_parse(self) -> None:
response = build_http_response(
httpStatusCodes.OK, reason=b'OK',
headers={
b'Content-Length': b'15'
},
body=b'{"key":"value"}',
)
self.assert_pipeline_response(response)
def test_pipelined_chunked_response_parse(self) -> None:
response = build_http_response(
httpStatusCodes.OK, reason=b'OK',
headers={
b'Transfer-Encoding': b'chunked',
b'Content-Type': b'application/json',
},
body=b'f\r\n{"key":"value"}\r\n0\r\n\r\n'
)
self.assert_pipeline_response(response)
def assert_pipeline_response(self, response: bytes) -> None:
self.parser = HttpParser(httpParserTypes.RESPONSE_PARSER)
self.parser.parse(response + response)
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
self.assertEqual(self.parser.body, b'{"key":"value"}')
self.assertEqual(self.parser.buffer, response)
# parse buffer
parser = HttpParser(httpParserTypes.RESPONSE_PARSER)
parser.parse(self.parser.buffer)
self.assertEqual(parser.state, httpParserStates.COMPLETE)
self.assertEqual(parser.body, b'{"key":"value"}')
self.assertEqual(parser.buffer, b'')
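The pipelining behaviour asserted above (parse one complete message, keep the remainder in `buffer`) can be mimicked with a toy splitter for Content-Length framed messages; a sketch assuming no chunked encoding, not proxy.py code:

```python
def split_first_message(raw: bytes):
    """Return (first_message, leftover) for Content-Length framed data."""
    # Headers end at the first blank line (CRLF CRLF).
    head_end = raw.index(b'\r\n\r\n') + 4
    length = 0
    for line in raw[:head_end].split(b'\r\n'):
        if line.lower().startswith(b'content-length:'):
            length = int(line.split(b':', 1)[1])
    end = head_end + length
    return raw[:end], raw[end:]

one = b'HTTP/1.1 200 OK\r\nContent-Length: 15\r\n\r\n{"key":"value"}'
first, rest = split_first_message(one + one)
```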
def test_chunked_request_parse(self) -> None:
self.parser.parse(build_http_request(
httpMethods.POST, b'http://example.org/',
headers={
b'Transfer-Encoding': b'chunked',
b'Content-Type': b'application/json',
},
body=b'f\r\n{"key":"value"}\r\n0\r\n\r\n'))
self.assertEqual(self.parser.body, b'{"key":"value"}')
self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
self.assertEqual(self.parser.build(), build_http_request(
httpMethods.POST, b'/',
headers={
b'Transfer-Encoding': b'chunked',
b'Content-Type': b'application/json',
},
body=b'f\r\n{"key":"value"}\r\n0\r\n\r\n'))
def test_is_http_1_1_keep_alive(self) -> None:
self.parser.parse(build_http_request(
httpMethods.GET, b'/'
))
self.assertTrue(self.parser.is_http_1_1_keep_alive())
def test_is_http_1_1_keep_alive_with_non_close_connection_header(
self) -> None:
self.parser.parse(build_http_request(
httpMethods.GET, b'/',
headers={
b'Connection': b'keep-alive',
}
))
self.assertTrue(self.parser.is_http_1_1_keep_alive())
def test_is_not_http_1_1_keep_alive_with_close_header(self) -> None:
self.parser.parse(build_http_request(
httpMethods.GET, b'/',
headers={
b'Connection': b'close',
}
))
self.assertFalse(self.parser.is_http_1_1_keep_alive())
def test_is_not_http_1_1_keep_alive_for_http_1_0(self) -> None:
self.parser.parse(build_http_request(
httpMethods.GET, b'/', protocol_version=b'HTTP/1.0',
))
self.assertFalse(self.parser.is_http_1_1_keep_alive())
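The keep-alive rules these four tests encode follow the HTTP defaults: HTTP/1.1 connections persist unless `Connection: close` is sent, while HTTP/1.0 defaults to close. A standalone sketch of that decision (illustrative, not proxy.py's internal logic):

```python
from typing import Optional

def http_1_1_keep_alive(version: bytes,
                        connection: Optional[bytes] = None) -> bool:
    # HTTP/1.0 defaults to close (explicit keep-alive ignored in this sketch).
    if version != b'HTTP/1.1':
        return False
    # HTTP/1.1 persists unless the client asks to close.
    return connection is None or connection.lower() != b'close'

print(http_1_1_keep_alive(b'HTTP/1.1'))  # persistent by default
```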
def assertDictContainsSubset(self, subset: Dict[bytes, Tuple[bytes, bytes]],
                             dictionary: Dict[bytes, Tuple[bytes, bytes]]) -> None:
    for k, v in subset.items():
        self.assertIn(k, dictionary)
        self.assertEqual(dictionary[k], v)

tests/test_http_proxy.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
import selectors
from unittest import mock
from proxy.common.flags import Flags
from proxy.http.proxy import HttpProxyPlugin
from proxy.http.handler import HttpProtocolHandler
from proxy.http.exception import HttpProtocolException
from proxy.common.utils import build_http_request
class TestHttpProxyPlugin(unittest.TestCase):
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def setUp(self,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
self.mock_fromfd = mock_fromfd
self.mock_selector = mock_selector
self.fileno = 10
self._addr = ('127.0.0.1', 54382)
self.flags = Flags()
self.plugin = mock.MagicMock()
self.flags.plugins = {
b'HttpProtocolHandlerPlugin': [HttpProxyPlugin],
b'HttpProxyBasePlugin': [self.plugin]
}
self._conn = mock_fromfd.return_value
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=self.flags)
self.protocol_handler.initialize()
def test_proxy_plugin_initialized(self) -> None:
self.plugin.assert_called()
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_proxy_plugin_on_and_before_upstream_connection(
self,
mock_server_conn: mock.Mock) -> None:
self.plugin.return_value.before_upstream_connection.side_effect = lambda r: r
self.plugin.return_value.handle_client_request.side_effect = lambda r: r
self._conn.recv.return_value = build_http_request(
b'GET', b'http://upstream.host/not-found.html',
headers={
b'Host': b'upstream.host'
})
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
self.protocol_handler.run_once()
mock_server_conn.assert_called_with('upstream.host', 80)
self.plugin.return_value.before_upstream_connection.assert_called()
self.plugin.return_value.handle_client_request.assert_called()
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_proxy_plugin_before_upstream_connection_can_teardown(
self,
mock_server_conn: mock.Mock) -> None:
self.plugin.return_value.before_upstream_connection.side_effect = HttpProtocolException()
self._conn.recv.return_value = build_http_request(
b'GET', b'http://upstream.host/not-found.html',
headers={
b'Host': b'upstream.host'
})
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
self.protocol_handler.run_once()
self.plugin.return_value.before_upstream_connection.assert_called()
mock_server_conn.assert_not_called()

# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
import selectors
import ssl
import socket
import json
from urllib import parse as urlparse
from unittest import mock
from typing import Type, cast, Any
from proxy.common.flags import Flags
from proxy.http.handler import HttpProtocolHandler
from proxy.http.proxy import HttpProxyBasePlugin, HttpProxyPlugin
from proxy.common.utils import build_http_request, bytes_, build_http_response
from proxy.common.constants import PROXY_AGENT_HEADER_VALUE
from proxy.http.codes import httpStatusCodes
from proxy.http.methods import httpMethods
from plugin_examples import modify_post_data
from plugin_examples import mock_rest_api
from plugin_examples import redirect_to_custom_server
from plugin_examples import filter_by_upstream
from plugin_examples import cache_responses
from plugin_examples import man_in_the_middle
def get_plugin_by_test_name(test_name: str) -> Type[HttpProxyBasePlugin]:
plugin: Type[HttpProxyBasePlugin] = modify_post_data.ModifyPostDataPlugin
if test_name == 'test_modify_post_data_plugin':
plugin = modify_post_data.ModifyPostDataPlugin
elif test_name == 'test_proposed_rest_api_plugin':
plugin = mock_rest_api.ProposedRestApiPlugin
elif test_name == 'test_redirect_to_custom_server_plugin':
plugin = redirect_to_custom_server.RedirectToCustomServerPlugin
elif test_name == 'test_filter_by_upstream_host_plugin':
plugin = filter_by_upstream.FilterByUpstreamHostPlugin
elif test_name == 'test_cache_responses_plugin':
plugin = cache_responses.CacheResponsesPlugin
elif test_name == 'test_man_in_the_middle_plugin':
plugin = man_in_the_middle.ManInTheMiddlePlugin
return plugin
class TestHttpProxyPluginExamples(unittest.TestCase):
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def setUp(self,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
self.fileno = 10
self._addr = ('127.0.0.1', 54382)
self.flags = Flags()
self.plugin = mock.MagicMock()
self.mock_fromfd = mock_fromfd
self.mock_selector = mock_selector
plugin = get_plugin_by_test_name(self._testMethodName)
self.flags.plugins = {
b'HttpProtocolHandlerPlugin': [HttpProxyPlugin],
b'HttpProxyBasePlugin': [plugin],
}
self._conn = mock_fromfd.return_value
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=self.flags)
self.protocol_handler.initialize()
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_modify_post_data_plugin(
self, mock_server_conn: mock.Mock) -> None:
original = b'{"key": "value"}'
modified = b'{"key": "modified"}'
self._conn.recv.return_value = build_http_request(
b'POST', b'http://httpbin.org/post',
headers={
b'Host': b'httpbin.org',
b'Content-Type': b'application/x-www-form-urlencoded',
b'Content-Length': bytes_(len(original)),
},
body=original
)
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
self.protocol_handler.run_once()
mock_server_conn.assert_called_with('httpbin.org', 80)
mock_server_conn.return_value.queue.assert_called_with(
build_http_request(
b'POST', b'/post',
headers={
b'Host': b'httpbin.org',
b'Content-Length': bytes_(len(modified)),
b'Content-Type': b'application/json',
b'Via': b'1.1 %s' % PROXY_AGENT_HEADER_VALUE,
},
body=modified
)
)
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_proposed_rest_api_plugin(
self, mock_server_conn: mock.Mock) -> None:
path = b'/v1/users/'
self._conn.recv.return_value = build_http_request(
b'GET', b'http://%s%s' % (
mock_rest_api.ProposedRestApiPlugin.API_SERVER, path),
headers={
b'Host': mock_rest_api.ProposedRestApiPlugin.API_SERVER,
}
)
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
self.protocol_handler.run_once()
mock_server_conn.assert_not_called()
self.assertEqual(
self.protocol_handler.client.buffer,
build_http_response(
httpStatusCodes.OK, reason=b'OK',
headers={b'Content-Type': b'application/json'},
body=bytes_(
json.dumps(
mock_rest_api.ProposedRestApiPlugin.REST_API_SPEC[path]))
))
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_redirect_to_custom_server_plugin(
self, mock_server_conn: mock.Mock) -> None:
request = build_http_request(
b'GET', b'http://example.org/get',
headers={
b'Host': b'example.org',
}
)
self._conn.recv.return_value = request
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
self.protocol_handler.run_once()
upstream = urlparse.urlsplit(
redirect_to_custom_server.RedirectToCustomServerPlugin.UPSTREAM_SERVER)
mock_server_conn.assert_called_with('localhost', 8899)
mock_server_conn.return_value.queue.assert_called_with(
build_http_request(
b'GET', upstream.path,
headers={
b'Host': upstream.netloc,
b'Via': b'1.1 %s' % PROXY_AGENT_HEADER_VALUE,
}
)
)
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_filter_by_upstream_host_plugin(
self, mock_server_conn: mock.Mock) -> None:
request = build_http_request(
b'GET', b'http://google.com/',
headers={
b'Host': b'google.com',
}
)
self._conn.recv.return_value = request
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
self.protocol_handler.run_once()
mock_server_conn.assert_not_called()
self.assertEqual(
self.protocol_handler.client.buffer,
build_http_response(
status_code=httpStatusCodes.I_AM_A_TEAPOT,
reason=b'I\'m a tea pot',
headers={
b'Connection': b'close'
},
)
)
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_man_in_the_middle_plugin(
self, mock_server_conn: mock.Mock) -> None:
request = build_http_request(
b'GET', b'http://super.secure/',
headers={
b'Host': b'super.secure',
}
)
self._conn.recv.return_value = request
server = mock_server_conn.return_value
server.connect.return_value = True
def has_buffer() -> bool:
return cast(bool, server.queue.called)
def closed() -> bool:
return not server.connect.called
server.has_buffer.side_effect = has_buffer
type(server).closed = mock.PropertyMock(side_effect=closed)
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)],
[(selectors.SelectorKey(
fileobj=server.connection,
fd=server.connection.fileno,
events=selectors.EVENT_WRITE,
data=None), selectors.EVENT_WRITE)],
[(selectors.SelectorKey(
fileobj=server.connection,
fd=server.connection.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
# Client read
self.protocol_handler.run_once()
mock_server_conn.assert_called_with('super.secure', 80)
server.connect.assert_called_once()
queued_request = \
build_http_request(
b'GET', b'/',
headers={
b'Host': b'super.secure',
b'Via': b'1.1 %s' % PROXY_AGENT_HEADER_VALUE
}
)
server.queue.assert_called_once_with(queued_request)
# Server write
self.protocol_handler.run_once()
server.flush.assert_called_once()
# Server read
server.recv.return_value = \
build_http_response(
httpStatusCodes.OK,
reason=b'OK', body=b'Original Response From Upstream')
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.client.buffer,
build_http_response(
httpStatusCodes.OK,
reason=b'OK', body=b'Hello from man in the middle')
)
class TestHttpProxyPluginExamplesWithTlsInterception(unittest.TestCase):
@mock.patch('ssl.wrap_socket')
@mock.patch('ssl.create_default_context')
@mock.patch('proxy.http.proxy.TcpServerConnection')
@mock.patch('subprocess.Popen')
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def setUp(self,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock,
mock_popen: mock.Mock,
mock_server_conn: mock.Mock,
mock_ssl_context: mock.Mock,
mock_ssl_wrap: mock.Mock) -> None:
self.mock_fromfd = mock_fromfd
self.mock_selector = mock_selector
self.mock_popen = mock_popen
self.mock_server_conn = mock_server_conn
self.mock_ssl_context = mock_ssl_context
self.mock_ssl_wrap = mock_ssl_wrap
self.fileno = 10
self._addr = ('127.0.0.1', 54382)
self.flags = Flags(
ca_cert_file='ca-cert.pem',
ca_key_file='ca-key.pem',
ca_signing_key_file='ca-signing-key.pem',)
self.plugin = mock.MagicMock()
plugin = get_plugin_by_test_name(self._testMethodName)
self.flags.plugins = {
b'HttpProtocolHandlerPlugin': [HttpProxyPlugin],
b'HttpProxyBasePlugin': [plugin],
}
self._conn = mock.MagicMock(spec=socket.socket)
mock_fromfd.return_value = self._conn
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=self.flags)
self.protocol_handler.initialize()
self.server = self.mock_server_conn.return_value
self.server_ssl_connection = mock.MagicMock(spec=ssl.SSLSocket)
self.mock_ssl_context.return_value.wrap_socket.return_value = self.server_ssl_connection
self.client_ssl_connection = mock.MagicMock(spec=ssl.SSLSocket)
self.mock_ssl_wrap.return_value = self.client_ssl_connection
def has_buffer() -> bool:
return cast(bool, self.server.queue.called)
def closed() -> bool:
return not self.server.connect.called
def mock_connection() -> Any:
if self.mock_ssl_context.return_value.wrap_socket.called:
return self.server_ssl_connection
return self._conn
self.server.has_buffer.side_effect = has_buffer
type(self.server).closed = mock.PropertyMock(side_effect=closed)
type(
self.server).connection = mock.PropertyMock(
side_effect=mock_connection)
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)],
[(selectors.SelectorKey(
fileobj=self.client_ssl_connection,
fd=self.client_ssl_connection.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)],
[(selectors.SelectorKey(
fileobj=self.server_ssl_connection,
fd=self.server_ssl_connection.fileno,
events=selectors.EVENT_WRITE,
data=None), selectors.EVENT_WRITE)],
[(selectors.SelectorKey(
fileobj=self.server_ssl_connection,
fd=self.server_ssl_connection.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
# Connect
def send(raw: bytes) -> int:
return len(raw)
self._conn.send.side_effect = send
self._conn.recv.return_value = build_http_request(
httpMethods.CONNECT, b'uni.corn:443'
)
self.protocol_handler.run_once()
self.mock_popen.assert_called()
self.mock_server_conn.assert_called_once_with('uni.corn', 443)
self.server.connect.assert_called()
self.assertEqual(
self.protocol_handler.client.connection,
self.client_ssl_connection)
self.assertEqual(self.server.connection, self.server_ssl_connection)
self._conn.send.assert_called_with(
HttpProxyPlugin.PROXY_TUNNEL_ESTABLISHED_RESPONSE_PKT
)
self.assertEqual(self.protocol_handler.client.buffer, b'')
def test_modify_post_data_plugin(self) -> None:
original = b'{"key": "value"}'
modified = b'{"key": "modified"}'
self.client_ssl_connection.recv.return_value = build_http_request(
b'POST', b'/',
headers={
b'Host': b'uni.corn',
b'Content-Type': b'application/x-www-form-urlencoded',
b'Content-Length': bytes_(len(original)),
},
body=original
)
self.protocol_handler.run_once()
self.server.queue.assert_called_with(
build_http_request(
b'POST', b'/',
headers={
b'Host': b'uni.corn',
b'Content-Length': bytes_(len(modified)),
b'Content-Type': b'application/json',
},
body=modified
)
)
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_man_in_the_middle_plugin(
self, mock_server_conn: mock.Mock) -> None:
request = build_http_request(
b'GET', b'/',
headers={
b'Host': b'uni.corn',
}
)
self.client_ssl_connection.recv.return_value = request
# Client read
self.protocol_handler.run_once()
self.server.queue.assert_called_once_with(request)
# Server write
self.protocol_handler.run_once()
self.server.flush.assert_called_once()
# Server read
self.server.recv.return_value = \
build_http_response(
httpStatusCodes.OK,
reason=b'OK', body=b'Original Response From Upstream')
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.client.buffer,
build_http_response(
httpStatusCodes.OK,
reason=b'OK', body=b'Hello from man in the middle')
)

# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import uuid
import unittest
import socket
import ssl
import selectors
from typing import Any
from unittest import mock
from proxy.http.handler import HttpProtocolHandler
from proxy.http.proxy import HttpProxyPlugin
from proxy.http.methods import httpMethods
from proxy.common.utils import build_http_request, bytes_
from proxy.common.flags import Flags
class TestHttpProxyTlsInterception(unittest.TestCase):
@mock.patch('ssl.wrap_socket')
@mock.patch('ssl.create_default_context')
@mock.patch('proxy.http.proxy.TcpServerConnection')
@mock.patch('subprocess.Popen')
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def test_e2e(
self,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock,
mock_popen: mock.Mock,
mock_server_conn: mock.Mock,
mock_ssl_context: mock.Mock,
mock_ssl_wrap: mock.Mock) -> None:
host, port = uuid.uuid4().hex, 443
netloc = '{0}:{1}'.format(host, port)
self.mock_fromfd = mock_fromfd
self.mock_selector = mock_selector
self.mock_popen = mock_popen
self.mock_server_conn = mock_server_conn
self.mock_ssl_context = mock_ssl_context
self.mock_ssl_wrap = mock_ssl_wrap
ssl_connection = mock.MagicMock(spec=ssl.SSLSocket)
self.mock_ssl_context.return_value.wrap_socket.return_value = ssl_connection
self.mock_ssl_wrap.return_value = mock.MagicMock(spec=ssl.SSLSocket)
plain_connection = mock.MagicMock(spec=socket.socket)
def mock_connection() -> Any:
if self.mock_ssl_context.return_value.wrap_socket.called:
return ssl_connection
return plain_connection
type(self.mock_server_conn.return_value).connection = \
mock.PropertyMock(side_effect=mock_connection)
self.fileno = 10
self._addr = ('127.0.0.1', 54382)
self.flags = Flags(
ca_cert_file='ca-cert.pem',
ca_key_file='ca-key.pem',
ca_signing_key_file='ca-signing-key.pem',
)
self.plugin = mock.MagicMock()
self.proxy_plugin = mock.MagicMock()
self.flags.plugins = {
b'HttpProtocolHandlerPlugin': [self.plugin, HttpProxyPlugin],
b'HttpProxyBasePlugin': [self.proxy_plugin],
}
self._conn = mock_fromfd.return_value
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=self.flags)
self.protocol_handler.initialize()
self.plugin.assert_called()
self.assertEqual(self.plugin.call_args[0][0], self.flags)
self.assertEqual(self.plugin.call_args[0][1].connection, self._conn)
self.proxy_plugin.assert_called()
self.assertEqual(self.proxy_plugin.call_args[0][0], self.flags)
self.assertEqual(
self.proxy_plugin.call_args[0][1].connection,
self._conn)
connect_request = build_http_request(
httpMethods.CONNECT, bytes_(netloc),
headers={
b'Host': bytes_(netloc),
})
self._conn.recv.return_value = connect_request
# Prepare mocked HttpProtocolHandlerPlugin
self.plugin.return_value.get_descriptors.return_value = ([], [])
self.plugin.return_value.write_to_descriptors.return_value = False
self.plugin.return_value.read_from_descriptors.return_value = False
self.plugin.return_value.on_client_data.side_effect = lambda raw: raw
self.plugin.return_value.on_request_complete.return_value = False
self.plugin.return_value.on_response_chunk.side_effect = lambda chunk: chunk
self.plugin.return_value.on_client_connection_close.return_value = None
# Prepare mocked HttpProxyBasePlugin
self.proxy_plugin.return_value.before_upstream_connection.side_effect = lambda r: r
self.proxy_plugin.return_value.handle_client_request.side_effect = lambda r: r
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)], ]
self.protocol_handler.run_once()
# Assert our mocked plugins invocations
self.plugin.return_value.get_descriptors.assert_called()
self.plugin.return_value.write_to_descriptors.assert_called_with([])
self.plugin.return_value.on_client_data.assert_called_with(
connect_request)
self.plugin.return_value.on_request_complete.assert_called()
self.plugin.return_value.read_from_descriptors.assert_called_with([
self._conn])
self.proxy_plugin.return_value.before_upstream_connection.assert_called()
self.proxy_plugin.return_value.handle_client_request.assert_called()
self.mock_server_conn.assert_called_with(host, port)
self.mock_server_conn.return_value.connection.setblocking.assert_called_with(
False)
self.mock_ssl_context.assert_called_with(ssl.Purpose.SERVER_AUTH)
# self.assertEqual(self.mock_ssl_context.return_value.options,
# ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 |
# ssl.OP_NO_TLSv1_1)
self.assertEqual(plain_connection.setblocking.call_count, 2)
self.mock_ssl_context.return_value.wrap_socket.assert_called_with(
plain_connection, server_hostname=host)
# TODO: Assert Popen arguments, piping, success condition
self.assertEqual(self.mock_popen.call_count, 2)
self.assertEqual(ssl_connection.setblocking.call_count, 1)
self.assertEqual(
self.mock_server_conn.return_value._conn,
ssl_connection)
self._conn.send.assert_called_with(
HttpProxyPlugin.PROXY_TUNNEL_ESTABLISHED_RESPONSE_PKT)
assert self.flags.ca_cert_dir is not None
self.mock_ssl_wrap.assert_called_with(
self._conn,
server_side=True,
keyfile=self.flags.ca_signing_key_file,
certfile=HttpProxyPlugin.generated_cert_file_path(
self.flags.ca_cert_dir, host)
)
self.assertEqual(self._conn.setblocking.call_count, 2)
self.assertEqual(
self.protocol_handler.client.connection,
self.mock_ssl_wrap.return_value)
# Assert connection references for all other plugins is updated
self.assertEqual(
self.plugin.return_value.client._conn,
self.mock_ssl_wrap.return_value)
self.assertEqual(
self.proxy_plugin.return_value.client._conn,
self.mock_ssl_wrap.return_value)

# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
from proxy.http.parser import HttpParser, httpParserTypes
from proxy.http.exception import HttpRequestRejected
from proxy.common.constants import CRLF
from proxy.common.utils import build_http_response
from proxy.http.codes import httpStatusCodes
class TestHttpRequestRejected(unittest.TestCase):
def setUp(self) -> None:
self.request = HttpParser(httpParserTypes.REQUEST_PARSER)
def test_empty_response(self) -> None:
e = HttpRequestRejected()
self.assertEqual(e.response(self.request), None)
def test_status_code_response(self) -> None:
e = HttpRequestRejected(status_code=200, reason=b'OK')
self.assertEqual(e.response(self.request), CRLF.join([
b'HTTP/1.1 200 OK',
CRLF
]))
def test_body_response(self) -> None:
e = HttpRequestRejected(
status_code=httpStatusCodes.NOT_FOUND, reason=b'NOT FOUND',
body=b'Nothing here')
self.assertEqual(
e.response(self.request),
build_http_response(httpStatusCodes.NOT_FOUND, reason=b'NOT FOUND', body=b'Nothing here'))

tests/test_main.py
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
import logging
import tempfile
import os
from unittest import mock
from proxy.main import main
from proxy.common.utils import bytes_
from proxy.http.handler import HttpProtocolHandler
from proxy.common.constants import DEFAULT_LOG_LEVEL, DEFAULT_LOG_FILE, DEFAULT_LOG_FORMAT, DEFAULT_BASIC_AUTH
from proxy.common.constants import DEFAULT_TIMEOUT, DEFAULT_DEVTOOLS_WS_PATH, DEFAULT_DISABLE_HTTP_PROXY
from proxy.common.constants import DEFAULT_ENABLE_STATIC_SERVER, DEFAULT_ENABLE_EVENTS, DEFAULT_ENABLE_DEVTOOLS
from proxy.common.constants import DEFAULT_ENABLE_WEB_SERVER, DEFAULT_THREADLESS, DEFAULT_CERT_FILE, DEFAULT_KEY_FILE
from proxy.common.constants import DEFAULT_CA_CERT_FILE, DEFAULT_CA_KEY_FILE, DEFAULT_CA_SIGNING_KEY_FILE
from proxy.common.constants import DEFAULT_PAC_FILE, DEFAULT_PLUGINS, DEFAULT_PID_FILE, DEFAULT_PORT
from proxy.common.constants import DEFAULT_NUM_WORKERS, DEFAULT_OPEN_FILE_LIMIT, DEFAULT_IPV6_HOSTNAME
from proxy.common.constants import DEFAULT_SERVER_RECVBUF_SIZE, DEFAULT_CLIENT_RECVBUF_SIZE
from proxy.common.constants import COMMA
from proxy.common.version import __version__
def get_temp_file(name: str) -> str:
return os.path.join(tempfile.gettempdir(), name)
class TestMain(unittest.TestCase):
@staticmethod
def mock_default_args(mock_args: mock.Mock) -> None:
mock_args.version = False
mock_args.cert_file = DEFAULT_CERT_FILE
mock_args.key_file = DEFAULT_KEY_FILE
mock_args.ca_key_file = DEFAULT_CA_KEY_FILE
mock_args.ca_cert_file = DEFAULT_CA_CERT_FILE
mock_args.ca_signing_key_file = DEFAULT_CA_SIGNING_KEY_FILE
mock_args.pid_file = DEFAULT_PID_FILE
mock_args.log_file = DEFAULT_LOG_FILE
mock_args.log_level = DEFAULT_LOG_LEVEL
mock_args.log_format = DEFAULT_LOG_FORMAT
mock_args.basic_auth = DEFAULT_BASIC_AUTH
mock_args.hostname = DEFAULT_IPV6_HOSTNAME
mock_args.port = DEFAULT_PORT
mock_args.num_workers = DEFAULT_NUM_WORKERS
mock_args.disable_http_proxy = DEFAULT_DISABLE_HTTP_PROXY
mock_args.enable_web_server = DEFAULT_ENABLE_WEB_SERVER
mock_args.pac_file = DEFAULT_PAC_FILE
mock_args.plugins = DEFAULT_PLUGINS
mock_args.server_recvbuf_size = DEFAULT_SERVER_RECVBUF_SIZE
mock_args.client_recvbuf_size = DEFAULT_CLIENT_RECVBUF_SIZE
mock_args.open_file_limit = DEFAULT_OPEN_FILE_LIMIT
mock_args.enable_static_server = DEFAULT_ENABLE_STATIC_SERVER
mock_args.enable_devtools = DEFAULT_ENABLE_DEVTOOLS
mock_args.devtools_event_queue = None
mock_args.devtools_ws_path = DEFAULT_DEVTOOLS_WS_PATH
mock_args.timeout = DEFAULT_TIMEOUT
mock_args.threadless = DEFAULT_THREADLESS
mock_args.enable_events = DEFAULT_ENABLE_EVENTS
@mock.patch('time.sleep')
@mock.patch('proxy.main.load_plugins')
@mock.patch('proxy.main.init_parser')
@mock.patch('proxy.main.set_open_file_limit')
@mock.patch('proxy.main.Flags')
@mock.patch('proxy.main.AcceptorPool')
@mock.patch('logging.basicConfig')
def test_init_with_no_arguments(
self,
mock_logging_config: mock.Mock,
mock_acceptor_pool: mock.Mock,
mock_protocol_config: mock.Mock,
mock_set_open_file_limit: mock.Mock,
mock_init_parser: mock.Mock,
mock_load_plugins: mock.Mock,
mock_sleep: mock.Mock) -> None:
mock_sleep.side_effect = KeyboardInterrupt()
mock_args = mock_init_parser.return_value.parse_args.return_value
self.mock_default_args(mock_args)
main([])
mock_init_parser.assert_called()
mock_init_parser.return_value.parse_args.assert_called_with([])
mock_load_plugins.assert_called_with(
b'proxy.http.proxy.HttpProxyPlugin,')
mock_logging_config.assert_called_with(
level=logging.INFO,
format=DEFAULT_LOG_FORMAT
)
mock_set_open_file_limit.assert_called_with(mock_args.open_file_limit)
mock_protocol_config.assert_called_with(
auth_code=mock_args.basic_auth,
backlog=mock_args.backlog,
ca_cert_dir=mock_args.ca_cert_dir,
ca_cert_file=mock_args.ca_cert_file,
ca_key_file=mock_args.ca_key_file,
ca_signing_key_file=mock_args.ca_signing_key_file,
certfile=mock_args.cert_file,
client_recvbuf_size=mock_args.client_recvbuf_size,
hostname=mock_args.hostname,
keyfile=mock_args.key_file,
num_workers=0,
pac_file=mock_args.pac_file,
pac_file_url_path=mock_args.pac_file_url_path,
port=mock_args.port,
server_recvbuf_size=mock_args.server_recvbuf_size,
disable_headers=[
header.lower() for header in bytes_(
mock_args.disable_headers).split(COMMA) if header.strip() != b''],
static_server_dir=mock_args.static_server_dir,
enable_static_server=mock_args.enable_static_server,
devtools_event_queue=None,
devtools_ws_path=DEFAULT_DEVTOOLS_WS_PATH,
timeout=DEFAULT_TIMEOUT,
threadless=DEFAULT_THREADLESS,
enable_events=DEFAULT_ENABLE_EVENTS,
)
mock_acceptor_pool.assert_called_with(
flags=mock_protocol_config.return_value,
work_klass=HttpProtocolHandler,
)
mock_acceptor_pool.return_value.setup.assert_called()
mock_acceptor_pool.return_value.shutdown.assert_called()
mock_sleep.assert_called()
@mock.patch('time.sleep')
@mock.patch('os.remove')
@mock.patch('os.path.exists')
@mock.patch('builtins.open')
@mock.patch('proxy.main.init_parser')
@mock.patch('proxy.main.AcceptorPool')
def test_pid_file_is_written_and_removed(
self,
mock_acceptor_pool: mock.Mock,
mock_init_parser: mock.Mock,
mock_open: mock.Mock,
mock_exists: mock.Mock,
mock_remove: mock.Mock,
mock_sleep: mock.Mock) -> None:
pid_file = get_temp_file('pid')
mock_sleep.side_effect = KeyboardInterrupt()
mock_args = mock_init_parser.return_value.parse_args.return_value
self.mock_default_args(mock_args)
mock_args.pid_file = pid_file
main(['--pid-file', pid_file])
mock_init_parser.assert_called()
mock_acceptor_pool.assert_called()
mock_acceptor_pool.return_value.setup.assert_called()
mock_open.assert_called_with(pid_file, 'wb')
mock_open.return_value.__enter__.return_value.write.assert_called_with(
bytes_(os.getpid()))
mock_exists.assert_called_with(pid_file)
mock_remove.assert_called_with(pid_file)
@mock.patch('time.sleep')
@mock.patch('proxy.main.Flags')
@mock.patch('proxy.main.AcceptorPool')
def test_basic_auth(
self,
mock_acceptor_pool: mock.Mock,
mock_protocol_config: mock.Mock,
mock_sleep: mock.Mock) -> None:
mock_sleep.side_effect = KeyboardInterrupt()
main(['--basic-auth', 'user:pass'])
flags = mock_protocol_config.return_value
mock_acceptor_pool.assert_called_with(
flags=flags,
work_klass=HttpProtocolHandler)
self.assertEqual(
mock_protocol_config.call_args[1]['auth_code'],
b'Basic dXNlcjpwYXNz')
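For reference, the expected `auth_code` asserted above is simply the HTTP Basic scheme applied to `user:pass` (a standalone sketch of what `--basic-auth` translates to):

```python
import base64

# 'Basic ' + base64(user:pass) -> the Proxy-Authorization credential
auth_code = b'Basic ' + base64.b64encode(b'user:pass')
# auth_code == b'Basic dXNlcjpwYXNz'
```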
@mock.patch('builtins.print')
def test_main_version(
self,
mock_print: mock.Mock) -> None:
with self.assertRaises(SystemExit):
main(['--version'])
mock_print.assert_called_with(__version__)
@mock.patch('time.sleep')
@mock.patch('builtins.print')
@mock.patch('proxy.main.AcceptorPool')
@mock.patch('proxy.main.is_py3')
def test_main_py3_runs(
self,
mock_is_py3: mock.Mock,
mock_acceptor_pool: mock.Mock,
mock_print: mock.Mock,
mock_sleep: mock.Mock) -> None:
mock_sleep.side_effect = KeyboardInterrupt()
mock_is_py3.return_value = True
main([])
mock_is_py3.assert_called()
mock_print.assert_not_called()
mock_acceptor_pool.assert_called()
mock_acceptor_pool.return_value.setup.assert_called()
@mock.patch('builtins.print')
@mock.patch('proxy.main.is_py3')
def test_main_py2_exit(
self,
mock_is_py3: mock.Mock,
mock_print: mock.Mock) -> None:
mock_is_py3.return_value = False
with self.assertRaises(SystemExit) as e:
main([])
mock_print.assert_called_with(
'DEPRECATION: "develop" branch no longer supports Python 2.7. Kindly upgrade to Python 3+. '
'If for some reasons you cannot upgrade, consider using "master" branch or simply '
'"pip install proxy.py==0.3".'
'\n\n'
'DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. '
'Please upgrade your Python as Python 2.7 won\'t be maintained after that date. '
'A future version of pip will drop support for Python 2.7.'
)
self.assertEqual(e.exception.code, 1)
mock_is_py3.assert_called()


@@ -0,0 +1,349 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
import selectors
import base64
from typing import cast
from unittest import mock
from proxy.common.flags import Flags
from proxy.common.utils import bytes_
from proxy.common.constants import CRLF
from proxy.http.parser import HttpParser
from proxy.http.proxy import HttpProxyPlugin
from proxy.http.parser import httpParserStates, httpParserTypes
from proxy.http.exception import ProxyAuthenticationFailed, ProxyConnectionFailed
from proxy.http.handler import HttpProtocolHandler
from proxy.main import load_plugins
from proxy.common.version import __version__
class TestHttpProtocolHandler(unittest.TestCase):
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def setUp(self,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
self.fileno = 10
self._addr = ('127.0.0.1', 54382)
self._conn = mock_fromfd.return_value
self.http_server_port = 65535
self.flags = Flags()
self.flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.mock_selector = mock_selector
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=self.flags)
self.protocol_handler.initialize()
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_http_get(self, mock_server_connection: mock.Mock) -> None:
server = mock_server_connection.return_value
server.connect.return_value = True
server.buffer_size.return_value = 0
self.mock_selector_for_client_read_read_server_write(
self.mock_selector, server)
# Send request line
assert self.http_server_port is not None
self._conn.recv.return_value = (b'GET http://localhost:%d HTTP/1.1' %
self.http_server_port) + CRLF
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.request.state,
httpParserStates.LINE_RCVD)
self.assertNotEqual(
self.protocol_handler.request.state,
httpParserStates.COMPLETE)
# Send headers and blank line, thus completing HTTP request
assert self.http_server_port is not None
self._conn.recv.return_value = CRLF.join([
b'User-Agent: proxy.py/%s' % bytes_(__version__),
b'Host: localhost:%d' % self.http_server_port,
b'Accept: */*',
b'Proxy-Connection: Keep-Alive',
CRLF
])
self.assert_data_queued(mock_server_connection, server)
self.protocol_handler.run_once()
server.flush.assert_called_once()
def assert_tunnel_response(
self, mock_server_connection: mock.Mock, server: mock.Mock) -> None:
self.protocol_handler.run_once()
self.assertTrue(
cast(HttpProxyPlugin, self.protocol_handler.plugins['HttpProxyPlugin']).server is not None)
self.assertEqual(
self.protocol_handler.client.buffer,
HttpProxyPlugin.PROXY_TUNNEL_ESTABLISHED_RESPONSE_PKT)
mock_server_connection.assert_called_once()
server.connect.assert_called_once()
server.queue.assert_not_called()
server.closed = False
parser = HttpParser(httpParserTypes.RESPONSE_PARSER)
parser.parse(self.protocol_handler.client.buffer)
self.assertEqual(parser.state, httpParserStates.COMPLETE)
assert parser.code is not None
self.assertEqual(int(parser.code), 200)
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_http_tunnel(self, mock_server_connection: mock.Mock) -> None:
server = mock_server_connection.return_value
server.connect.return_value = True
def has_buffer() -> bool:
return cast(bool, server.queue.called)
server.has_buffer.side_effect = has_buffer
self.mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ), ],
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=0,
data=None), selectors.EVENT_WRITE), ],
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ), ],
[(selectors.SelectorKey(
fileobj=server.connection,
fd=server.connection.fileno,
events=0,
data=None), selectors.EVENT_WRITE), ],
]
assert self.http_server_port is not None
self._conn.recv.return_value = CRLF.join([
b'CONNECT localhost:%d HTTP/1.1' % self.http_server_port,
b'Host: localhost:%d' % self.http_server_port,
b'User-Agent: proxy.py/%s' % bytes_(__version__),
b'Proxy-Connection: Keep-Alive',
CRLF
])
self.assert_tunnel_response(mock_server_connection, server)
# Dispatch tunnel established response to client
self.protocol_handler.run_once()
self.assert_data_queued_to_server(server)
self.protocol_handler.run_once()
self.assertEqual(server.queue.call_count, 1)
server.flush.assert_called_once()
def test_proxy_connection_failed(self) -> None:
self.mock_selector_for_client_read(self.mock_selector)
self._conn.recv.return_value = CRLF.join([
b'GET http://unknown.domain HTTP/1.1',
b'Host: unknown.domain',
CRLF
])
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.client.buffer,
ProxyConnectionFailed.RESPONSE_PKT)
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def test_proxy_authentication_failed(
self,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
self._conn = mock_fromfd.return_value
self.mock_selector_for_client_read(mock_selector)
flags = Flags(
auth_code=b'Basic %s' %
base64.b64encode(b'user:pass'))
flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=flags)
self.protocol_handler.initialize()
self._conn.recv.return_value = CRLF.join([
b'GET http://abhinavsingh.com HTTP/1.1',
b'Host: abhinavsingh.com',
CRLF
])
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.client.buffer,
ProxyAuthenticationFailed.RESPONSE_PKT)
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_authenticated_proxy_http_get(
self, mock_server_connection: mock.Mock,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
self._conn = mock_fromfd.return_value
self.mock_selector_for_client_read(mock_selector)
server = mock_server_connection.return_value
server.connect.return_value = True
server.buffer_size.return_value = 0
flags = Flags(
auth_code=b'Basic %s' %
base64.b64encode(b'user:pass'))
flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, addr=self._addr, flags=flags)
self.protocol_handler.initialize()
assert self.http_server_port is not None
self._conn.recv.return_value = b'GET http://localhost:%d HTTP/1.1' % self.http_server_port
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.request.state,
httpParserStates.INITIALIZED)
self._conn.recv.return_value = CRLF
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.request.state,
httpParserStates.LINE_RCVD)
assert self.http_server_port is not None
self._conn.recv.return_value = CRLF.join([
b'User-Agent: proxy.py/%s' % bytes_(__version__),
b'Host: localhost:%d' % self.http_server_port,
b'Accept: */*',
b'Proxy-Connection: Keep-Alive',
b'Proxy-Authorization: Basic dXNlcjpwYXNz',
CRLF
])
self.assert_data_queued(mock_server_connection, server)
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
@mock.patch('proxy.http.proxy.TcpServerConnection')
def test_authenticated_proxy_http_tunnel(
self, mock_server_connection: mock.Mock,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
server = mock_server_connection.return_value
server.connect.return_value = True
server.buffer_size.return_value = 0
self._conn = mock_fromfd.return_value
self.mock_selector_for_client_read_read_server_write(
mock_selector, server)
flags = Flags(
auth_code=b'Basic %s' %
base64.b64encode(b'user:pass'))
flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=flags)
self.protocol_handler.initialize()
assert self.http_server_port is not None
self._conn.recv.return_value = CRLF.join([
b'CONNECT localhost:%d HTTP/1.1' % self.http_server_port,
b'Host: localhost:%d' % self.http_server_port,
b'User-Agent: proxy.py/%s' % bytes_(__version__),
b'Proxy-Connection: Keep-Alive',
b'Proxy-Authorization: Basic dXNlcjpwYXNz',
CRLF
])
self.assert_tunnel_response(mock_server_connection, server)
self.protocol_handler.client.flush()
self.assert_data_queued_to_server(server)
self.protocol_handler.run_once()
server.flush.assert_called_once()
def mock_selector_for_client_read_read_server_write(
self, mock_selector: mock.Mock, server: mock.Mock) -> None:
mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ), ],
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=0,
data=None), selectors.EVENT_READ), ],
[(selectors.SelectorKey(
fileobj=server.connection,
fd=server.connection.fileno,
events=0,
data=None), selectors.EVENT_WRITE), ],
]
def assert_data_queued(
self, mock_server_connection: mock.Mock, server: mock.Mock) -> None:
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.request.state,
httpParserStates.COMPLETE)
mock_server_connection.assert_called_once()
server.connect.assert_called_once()
server.closed = False
assert self.http_server_port is not None
pkt = CRLF.join([
b'GET / HTTP/1.1',
b'User-Agent: proxy.py/%s' % bytes_(__version__),
b'Host: localhost:%d' % self.http_server_port,
b'Accept: */*',
b'Via: 1.1 proxy.py v%s' % bytes_(__version__),
CRLF
])
server.queue.assert_called_once_with(pkt)
server.buffer_size.return_value = len(pkt)
def assert_data_queued_to_server(self, server: mock.Mock) -> None:
assert self.http_server_port is not None
self.assertEqual(
self._conn.send.call_args[0][0],
HttpProxyPlugin.PROXY_TUNNEL_ESTABLISHED_RESPONSE_PKT)
self._conn.recv.return_value = CRLF.join([
b'GET / HTTP/1.1',
b'Host: localhost:%d' % self.http_server_port,
b'User-Agent: proxy.py/%s' % bytes_(__version__),
CRLF
])
self.protocol_handler.run_once()
pkt = CRLF.join([
b'GET / HTTP/1.1',
b'Host: localhost:%d' % self.http_server_port,
b'User-Agent: proxy.py/%s' % bytes_(__version__),
CRLF
])
server.queue.assert_called_once_with(pkt)
server.buffer_size.return_value = len(pkt)
server.flush.assert_not_called()
def mock_selector_for_client_read(self, mock_selector: mock.Mock) -> None:
mock_selector.return_value.select.return_value = [(
selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ), ]


@@ -0,0 +1,53 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import os
import unittest
from unittest import mock
from proxy.main import set_open_file_limit
if os.name != 'nt':
import resource
@unittest.skipIf(
os.name == 'nt',
'Open file limit tests disabled for Windows')
class TestSetOpenFileLimit(unittest.TestCase):
@mock.patch('resource.getrlimit', return_value=(128, 1024))
@mock.patch('resource.setrlimit', return_value=None)
def test_set_open_file_limit(
self,
mock_set_rlimit: mock.Mock,
mock_get_rlimit: mock.Mock) -> None:
set_open_file_limit(256)
mock_get_rlimit.assert_called_with(resource.RLIMIT_NOFILE)
mock_set_rlimit.assert_called_with(resource.RLIMIT_NOFILE, (256, 1024))
@mock.patch('resource.getrlimit', return_value=(256, 1024))
@mock.patch('resource.setrlimit', return_value=None)
def test_set_open_file_limit_not_called(
self,
mock_set_rlimit: mock.Mock,
mock_get_rlimit: mock.Mock) -> None:
set_open_file_limit(256)
mock_get_rlimit.assert_called_with(resource.RLIMIT_NOFILE)
mock_set_rlimit.assert_not_called()
@mock.patch('resource.getrlimit', return_value=(256, 1024))
@mock.patch('resource.setrlimit', return_value=None)
def test_set_open_file_limit_not_called_coz_upper_bound_check(
self,
mock_set_rlimit: mock.Mock,
mock_get_rlimit: mock.Mock) -> None:
set_open_file_limit(1024)
mock_get_rlimit.assert_called_with(resource.RLIMIT_NOFILE)
mock_set_rlimit.assert_not_called()
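The three cases above pin down when `set_open_file_limit` actually calls `setrlimit`. The decision logic they imply can be sketched as a pure function (a reconstruction from the assertions; `next_nofile_limit` is a hypothetical name, not the actual implementation):

```python
from typing import Optional, Tuple

def next_nofile_limit(limit: int, soft: int,
                      hard: int) -> Optional[Tuple[int, int]]:
    # Raise the soft limit only when it is currently below `limit`
    # and `limit` stays under the hard cap; otherwise do nothing.
    if soft < limit < hard:
        return (limit, hard)
    return None
```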

tests/test_text_bytes.py (new file, 33 lines)

@@ -0,0 +1,33 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
from proxy.common.utils import text_, bytes_
class TestTextBytes(unittest.TestCase):
def test_text(self) -> None:
self.assertEqual(text_(b'hello'), 'hello')
def test_text_int(self) -> None:
self.assertEqual(text_(1), '1')
def test_text_nochange(self) -> None:
self.assertEqual(text_('hello'), 'hello')
def test_bytes(self) -> None:
self.assertEqual(bytes_('hello'), b'hello')
def test_bytes_int(self) -> None:
self.assertEqual(bytes_(1), b'1')
def test_bytes_nochange(self) -> None:
self.assertEqual(bytes_(b'hello'), b'hello')
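The behavior these assertions pin down for `text_` and `bytes_` can be sketched as below (hypothetical `*_sketch` names; the real helpers live in `proxy.common.utils`):

```python
from typing import Any

def bytes_sketch(v: Any) -> bytes:
    # bytes pass through; everything else is stringified, then utf-8 encoded
    return v if isinstance(v, bytes) else str(v).encode('utf-8')

def text_sketch(v: Any) -> str:
    # bytes are utf-8 decoded; everything else is stringified
    return v.decode('utf-8') if isinstance(v, bytes) else str(v)
```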

tests/test_utils.py (new file, 57 lines)

@@ -0,0 +1,57 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import socket
import unittest
from unittest import mock
from proxy.common.constants import DEFAULT_IPV6_HOSTNAME, DEFAULT_IPV4_HOSTNAME, DEFAULT_PORT
from proxy.common.utils import new_socket_connection, socket_connection
class TestSocketConnectionUtils(unittest.TestCase):
def setUp(self) -> None:
self.addr_ipv4 = (str(DEFAULT_IPV4_HOSTNAME), DEFAULT_PORT)
self.addr_ipv6 = (str(DEFAULT_IPV6_HOSTNAME), DEFAULT_PORT)
self.addr_dual = ('httpbin.org', 80)
@mock.patch('socket.socket')
def test_new_socket_connection_ipv4(self, mock_socket: mock.Mock) -> None:
conn = new_socket_connection(self.addr_ipv4)
mock_socket.assert_called_with(socket.AF_INET, socket.SOCK_STREAM, 0)
self.assertEqual(conn, mock_socket.return_value)
mock_socket.return_value.connect.assert_called_with(self.addr_ipv4)
@mock.patch('socket.socket')
def test_new_socket_connection_ipv6(self, mock_socket: mock.Mock) -> None:
conn = new_socket_connection(self.addr_ipv6)
mock_socket.assert_called_with(socket.AF_INET6, socket.SOCK_STREAM, 0)
self.assertEqual(conn, mock_socket.return_value)
mock_socket.return_value.connect.assert_called_with(
(self.addr_ipv6[0], self.addr_ipv6[1], 0, 0))
@mock.patch('socket.create_connection')
def test_new_socket_connection_dual(self, mock_socket: mock.Mock) -> None:
conn = new_socket_connection(self.addr_dual)
mock_socket.assert_called_with(self.addr_dual)
self.assertEqual(conn, mock_socket.return_value)
@mock.patch('proxy.common.utils.new_socket_connection')
def test_decorator(self, mock_new_socket_connection: mock.Mock) -> None:
@socket_connection(self.addr_ipv4)
def dummy(conn: socket.socket) -> None:
self.assertEqual(conn, mock_new_socket_connection.return_value)
dummy() # type: ignore
@mock.patch('proxy.common.utils.new_socket_connection')
def test_context_manager(
self, mock_new_socket_connection: mock.Mock) -> None:
with socket_connection(self.addr_ipv4) as conn:
self.assertEqual(conn, mock_new_socket_connection.return_value)
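The three `new_socket_connection` tests above imply a family-selection step: IPv6 literals get `AF_INET6`, IPv4 literals get `AF_INET`, and hostnames fall back to `socket.create_connection` (dual-stack). A sketch of that classification (hypothetical `guess_family` name):

```python
import socket

def guess_family(host: str) -> int:
    # Try to parse the host as an IPv6 then an IPv4 literal;
    # AF_UNSPEC signals "hostname", where the real code defers
    # to socket.create_connection() for dual-stack resolution.
    for family in (socket.AF_INET6, socket.AF_INET):
        try:
            socket.inet_pton(family, host)
            return family
        except OSError:
            continue
    return socket.AF_UNSPEC
```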

tests/test_web_server.py (new file, 243 lines)

@@ -0,0 +1,243 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import os
import tempfile
import unittest
import selectors
from unittest import mock
from proxy.main import load_plugins
from proxy.common.flags import Flags
from proxy.http.handler import HttpProtocolHandler
from proxy.http.parser import httpParserStates
from proxy.common.utils import build_http_response, build_http_request, bytes_, text_
from proxy.common.constants import CRLF, PROXY_PY_DIR
from proxy.http.server import HttpWebServerPlugin
class TestWebServerPlugin(unittest.TestCase):
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def setUp(self, mock_fromfd: mock.Mock, mock_selector: mock.Mock) -> None:
self.fileno = 10
self._addr = ('127.0.0.1', 54382)
self._conn = mock_fromfd.return_value
self.mock_selector = mock_selector
self.flags = Flags()
self.flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=self.flags)
self.protocol_handler.initialize()
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def test_pac_file_served_from_disk(
self, mock_fromfd: mock.Mock, mock_selector: mock.Mock) -> None:
pac_file = os.path.join(
os.path.dirname(PROXY_PY_DIR),
'helper',
'proxy.pac')
self._conn = mock_fromfd.return_value
self.mock_selector_for_client_read(mock_selector)
self.init_and_make_pac_file_request(pac_file)
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.request.state,
httpParserStates.COMPLETE)
with open(pac_file, 'rb') as f:
self._conn.send.assert_called_once_with(build_http_response(
200, reason=b'OK', headers={
b'Content-Type': b'application/x-ns-proxy-autoconfig',
b'Connection': b'close'
}, body=f.read()
))
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def test_pac_file_served_from_buffer(
self, mock_fromfd: mock.Mock, mock_selector: mock.Mock) -> None:
self._conn = mock_fromfd.return_value
self.mock_selector_for_client_read(mock_selector)
pac_file_content = b'function FindProxyForURL(url, host) { return "PROXY localhost:8899; DIRECT"; }'
self.init_and_make_pac_file_request(text_(pac_file_content))
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.request.state,
httpParserStates.COMPLETE)
self._conn.send.assert_called_once_with(build_http_response(
200, reason=b'OK', headers={
b'Content-Type': b'application/x-ns-proxy-autoconfig',
b'Connection': b'close'
}, body=pac_file_content
))
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def test_default_web_server_returns_404(
self, mock_fromfd: mock.Mock, mock_selector: mock.Mock) -> None:
self._conn = mock_fromfd.return_value
mock_selector.return_value.select.return_value = [(
selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ), ]
flags = Flags()
flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=flags)
self.protocol_handler.initialize()
self._conn.recv.return_value = CRLF.join([
b'GET /hello HTTP/1.1',
CRLF,
])
self.protocol_handler.run_once()
self.assertEqual(
self.protocol_handler.request.state,
httpParserStates.COMPLETE)
self.assertEqual(
self.protocol_handler.client.buffer,
HttpWebServerPlugin.DEFAULT_404_RESPONSE)
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def test_static_web_server_serves(
self, mock_fromfd: mock.Mock, mock_selector: mock.Mock) -> None:
# Setup a static directory
static_server_dir = os.path.join(tempfile.gettempdir(), 'static')
index_file_path = os.path.join(static_server_dir, 'index.html')
html_file_content = b'''
<html>
<head></head>
<body></body>
</html>
'''
os.makedirs(static_server_dir, exist_ok=True)
with open(index_file_path, 'wb') as f:
f.write(html_file_content)
self._conn = mock_fromfd.return_value
self._conn.recv.return_value = build_http_request(
b'GET', b'/index.html')
mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)],
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_WRITE,
data=None), selectors.EVENT_WRITE)], ]
flags = Flags(
enable_static_server=True,
static_server_dir=static_server_dir)
flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=flags)
self.protocol_handler.initialize()
self.protocol_handler.run_once()
self.protocol_handler.run_once()
self.assertEqual(mock_selector.return_value.select.call_count, 2)
self.assertEqual(self._conn.send.call_count, 1)
self.assertEqual(self._conn.send.call_args[0][0], build_http_response(
200, reason=b'OK', headers={
b'Content-Type': b'text/html',
b'Connection': b'close',
b'Content-Length': bytes_(len(html_file_content)),
},
body=html_file_content
))
@mock.patch('selectors.DefaultSelector')
@mock.patch('socket.fromfd')
def test_static_web_server_serves_404(
self,
mock_fromfd: mock.Mock,
mock_selector: mock.Mock) -> None:
self._conn = mock_fromfd.return_value
self._conn.recv.return_value = build_http_request(
b'GET', b'/not-found.html')
mock_selector.return_value.select.side_effect = [
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ)],
[(selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_WRITE,
data=None), selectors.EVENT_WRITE)], ]
flags = Flags(enable_static_server=True)
flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=flags)
self.protocol_handler.initialize()
self.protocol_handler.run_once()
self.protocol_handler.run_once()
self.assertEqual(mock_selector.return_value.select.call_count, 2)
self.assertEqual(self._conn.send.call_count, 1)
self.assertEqual(self._conn.send.call_args[0][0],
HttpWebServerPlugin.DEFAULT_404_RESPONSE)
@mock.patch('socket.fromfd')
def test_on_client_connection_called_on_teardown(
self, mock_fromfd: mock.Mock) -> None:
flags = Flags()
plugin = mock.MagicMock()
flags.plugins = {b'HttpProtocolHandlerPlugin': [plugin]}
self._conn = mock_fromfd.return_value
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=flags)
self.protocol_handler.initialize()
plugin.assert_called()
with mock.patch.object(self.protocol_handler, 'run_once') as mock_run_once:
mock_run_once.return_value = True
self.protocol_handler.run()
self.assertTrue(self._conn.closed)
plugin.return_value.on_client_connection_close.assert_called()
def init_and_make_pac_file_request(self, pac_file: str) -> None:
flags = Flags(pac_file=pac_file)
flags.plugins = load_plugins(
b'proxy.http.proxy.HttpProxyPlugin,proxy.http.server.HttpWebServerPlugin,'
b'proxy.http.server.HttpWebServerPacFilePlugin')
self.protocol_handler = HttpProtocolHandler(
self.fileno, self._addr, flags=flags)
self.protocol_handler.initialize()
self._conn.recv.return_value = CRLF.join([
b'GET / HTTP/1.1',
CRLF,
])
def mock_selector_for_client_read(self, mock_selector: mock.Mock) -> None:
mock_selector.return_value.select.return_value = [(
selectors.SelectorKey(
fileobj=self._conn,
fd=self._conn.fileno,
events=selectors.EVENT_READ,
data=None), selectors.EVENT_READ), ]


@@ -0,0 +1,32 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
from unittest import mock
from proxy.common.utils import build_websocket_handshake_response, build_websocket_handshake_request
from proxy.http.websocket import WebsocketClient, WebsocketFrame
from proxy.common.constants import DEFAULT_IPV4_HOSTNAME, DEFAULT_PORT
class TestWebsocketClient(unittest.TestCase):
@mock.patch('base64.b64encode')
@mock.patch('proxy.http.websocket.new_socket_connection')
def test_handshake(self, mock_connect: mock.Mock,
mock_b64encode: mock.Mock) -> None:
key = b'MySecretKey'
mock_b64encode.return_value = key
mock_connect.return_value.recv.return_value = \
build_websocket_handshake_response(
WebsocketFrame.key_to_accept(key))
_ = WebsocketClient(DEFAULT_IPV4_HOSTNAME, DEFAULT_PORT)
mock_connect.return_value.send.assert_called_with(
build_websocket_handshake_request(key)
)


@@ -0,0 +1,40 @@
# -*- coding: utf-8 -*-
"""
proxy.py
~~~~~~~~
Fast, Lightweight, Programmable Proxy Server in a single Python file.
:copyright: (c) 2013-present by Abhinav Singh and contributors.
:license: BSD, see LICENSE for more details.
"""
import unittest
from proxy.http.websocket import WebsocketFrame, websocketOpcodes
class TestWebsocketFrame(unittest.TestCase):
def test_build_with_mask(self) -> None:
raw = b'\x81\x85\xc6\ti\x8d\xael\x05\xe1\xa9'
frame = WebsocketFrame()
frame.fin = True
frame.opcode = websocketOpcodes.TEXT_FRAME
frame.masked = True
frame.mask = b'\xc6\ti\x8d'
frame.data = b'hello'
self.assertEqual(frame.build(), raw)
def test_parse_with_mask(self) -> None:
raw = b'\x81\x85\xc6\ti\x8d\xael\x05\xe1\xa9'
frame = WebsocketFrame()
frame.parse(raw)
self.assertEqual(frame.fin, True)
self.assertEqual(frame.rsv1, False)
self.assertEqual(frame.rsv2, False)
self.assertEqual(frame.rsv3, False)
self.assertEqual(frame.opcode, 0x1)
self.assertEqual(frame.masked, True)
assert frame.mask is not None
self.assertEqual(frame.mask, b'\xc6\ti\x8d')
self.assertEqual(frame.payload_length, 5)
self.assertEqual(frame.data, b'hello')
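The build/parse round-trip above hinges on RFC 6455 masking: each payload byte is XORed with the 4-byte mask key, cycling through it, and applying the same mask twice recovers the original bytes. A standalone sketch that reproduces the masked payload in `raw`:

```python
def mask_payload(data: bytes, mask: bytes) -> bytes:
    # RFC 6455 section 5.3: byte i is XORed with mask[i % 4].
    # Masking is its own inverse, so the same call unmasks.
    return bytes(b ^ mask[i % 4] for i, b in enumerate(data))

masked = mask_payload(b'hello', b'\xc6\x09\x69\x8d')
# masked == b'\xae\x6c\x05\xe1\xa9', the tail of the raw frame above
```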