Document new testing approach (#1048)

* document new testing approach

* remove unused workflow branches

* remove stale folder

* fix link

* document new testing approach

* remove unused workflow branches

* fix link

* Extend tests details in contributing guide

* Typos

Co-authored-by: Julio Castillo <jccb@google.com>
Ludovico Magnocavallo 2022-12-12 20:59:46 +01:00 committed by GitHub
parent 031b9f3476
commit cb87f3efa5
3 changed files with 117 additions and 64 deletions


@@ -16,8 +16,6 @@ name: "Linting"
 on:
   pull_request:
     branches:
-      - fast-dev
-      - fast-dev-gke
       - master
     tags:
       - ci

@@ -14,12 +14,10 @@
 name: "Tests"
 on:
-  schedule:
-    - cron: "45 2 * * *"
+  # schedule:
+  #   - cron: "45 2 * * *"
   pull_request:
     branches:
-      - fast-dev
-      - fast-dev-gke
       - master
     tags:
       - ci


@@ -72,11 +72,13 @@ pytest tests/examples
Once everything looks good, add/commit any pending changes, then push and open a PR on GitHub. We typically enforce a set of design and style conventions, so please make sure you have familiarized yourself with the following sections and implemented them in your code, to avoid lengthy review cycles.
HINT: if you work on a high-latency or low-bandwidth network, use the `TF_PLUGIN_CACHE_DIR` environment variable to dramatically speed up the tests, for example:
```bash
TF_PLUGIN_CACHE_DIR=/tmp/tfcache pytest tests
```
Or just add this to your [terraformrc](https://developer.hashicorp.com/terraform/cli/config/config-file):
```
plugin_cache_dir = "$HOME/.terraform.d/plugin-cache"
```
@@ -544,6 +546,7 @@ locals {
#### The `prefix` variable
If you would like to use a "prefix" variable for resource names, please keep its definition consistent across all modules:
```hcl
# variables.tf
variable "prefix" {
@@ -563,6 +566,7 @@ locals {
```
For blueprints the prefix is mandatory:
```hcl
variable "prefix" {
description = "Prefix used for resource names."
@@ -608,6 +612,7 @@ The linting workflow tests:
- that the correct copyright boilerplate is present in all files, using `tools/check_boilerplate.py`
- that all Terraform code is linted via `terraform fmt`
- that Terraform variables and outputs are sorted alphabetically
- that all README files have up-to-date outputs, variables, and files (where relevant) tables, via `tools/check_documentation.py`
- that all links in README files are syntactically correct and valid if internal, via `tools/check_links.py`
- that resource names used in FAST stages stay within a length limit, via `tools/check_names.py`
@@ -639,90 +644,123 @@ The test workflow runs test suites in parallel. Refer to the next section for mo
#### Using and writing tests
Our testing approach follows a simple philosophy: we mainly test to ensure code works, and that it does not break due to changes to dependencies (modules) or provider resources.
This makes testing very simple, as a successful `terraform plan` run in a test case is often enough. We only write more specialized tests when we need to check the output of complex transformations in `for` loops.
As our testing needs are very simple, we also wanted to reduce the friction required to write new tests as much as possible: our tests are written in Python and use `pytest`, which is the standard for the language, leveraging our [`tftest`](https://pypi.org/project/tftest/) library, which wraps the Terraform executable and returns familiar data structures for most commands.
Writing `pytest` unit tests to check plan results is really easy, but since wrapping modules and examples in dedicated fixtures and hand-coding checks gets annoying after a while, we developed a thin layer that allows us to use `tfvars` files to run tests and `yaml` files to check results. In some specific situations you might still want to interact directly with `tftest` via Python; if that's the case, skip to the legacy approach below.
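For reference, this is roughly what driving `tftest` directly looks like (a minimal sketch, not one of our fixtures; the module path is illustrative, and a real module would also need its required variables supplied):
```python
import tftest

# point tftest at a module; setup() runs terraform init behind the scenes
tf = tftest.TerraformTest('modules/net-glb')  # illustrative path
tf.setup()
plan = tf.plan(output=True)  # returns a wrapper over the JSON plan
print(plan.outputs)  # plan.resources and plan.modules are also available
```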
##### Testing end-to-end examples via `tfvars` and `yaml`
Our new approach to testing requires you to:
- create a folder in the right `tests` hierarchy where specific test files will be hosted
- define `tfvars` files, each with a specific variable configuration to test
- define `yaml` "inventory" files with the plan and output results you want to test
- declare which of these files need to be run as tests in a `tftest.yaml` file
Let's go through each step in succession, assuming you are testing the new `net-glb` module.
First create a new folder under `tests/modules`, replacing any dash in the module name with underscores. You also need to create an empty `__init__.py` file in it, since the folder represents a package from the point of view of `pytest`. Note that if you were testing a blueprint, the folder would go in `tests/blueprints` instead.
```bash
mkdir tests/modules/net_glb
touch tests/modules/net_glb/__init__.py
```
Then define a `tfvars` file with one of the module configurations you want to test. If you have a lot of variables which are shared across different tests, you can group all the common variables in a single `tfvars` file and associate it with each test's specific `tfvars` file (check the [organization module test](./tests/modules/organization/tftest.yaml) for an example).
```hcl
module "test" {
source = "../../../../../blueprints/cloud-operations/asset-inventory-feed-remediation"
project_create = var.project_create
project_id = var.project_id
# file: tests/modules/net_glb/test-plan.tfvars
name       = "glb-test-0"
project_id = "my-project"
backend_buckets_config = {
  default = {
    bucket_name = "my-bucket"
  }
}
```
Next define the corresponding inventory `yaml` file, which will be used to assert values from the plan that uses the `tfvars` file above. In the inventory file you have three sections available:
- `values` is a map of resource indexes (the same ones used by Terraform state) and their attribute names and values; you can define just the attributes you are interested in and the others will be ignored
- `counts` is a map of resource types (e.g. `google_compute_engine`) and the number of times each type occurs in the plan; here too, just define the ones that need checking
- `outputs` is a map of outputs and their values; where a value is unknown at plan time use the special `__missing__` token
```yaml
# file: tests/modules/net_glb/test-plan.yaml
values:
  google_compute_global_forwarding_rule.default:
    description: Terraform managed.
    load_balancing_scheme: EXTERNAL
  google_compute_target_http_proxy.default[0]:
    name: glb-test-0
counts:
  google_compute_backend_bucket: 1
  google_compute_global_forwarding_rule: 1
  google_compute_health_check: 1
  google_compute_target_http_proxy: 1
  google_compute_url_map: 1
outputs:
  address: __missing__
  backend_service_ids: __missing__
  forwarding_rule: __missing__
  group_ids: __missing__
  health_check_ids: __missing__
  neg_ids: __missing__
```
Create as many pairs of `tfvars`/`yaml` files as you need to test every scenario and feature, then create the file that triggers our fixture and converts them into `pytest` tests.
```yaml
# file: tests/modules/net_glb/tftest.yaml
module: modules/net-glb
# if there are variables shared among all tests you can define a common file
# common_tfvars:
#   - defaults.tfvars
tests:
  test-plan:
    tfvars:
      - test-plan.tfvars
      - test-plan-extra.tfvars
```
A good example of tests showing different ways of leveraging our framework is in the [`tests/modules/organization`](./tests/modules/organization) folder.
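Once those files are in place, the fixture turns each entry into a regular `pytest` unit, so the usual `pytest` selection mechanisms apply; for example (paths assume the layout above):
```bash
# run only this module's tests
pytest tests/modules/net_glb
# list the generated tests without running them
pytest --collect-only tests/modules/net_glb
```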
##### Writing tests in Python (legacy approach)
Where possible, we recommend using the testing framework described in the previous section. However, if you need it, you can still write tests using Python directly.
In general, you should try to use the `plan_summary` fixture, which runs a Terraform plan and returns a `PlanSummary` object. The most important arguments to `plan_summary` are:
- the path of the Terraform module you want to test, relative to the root of the repository
- a list of paths for the `tfvars` files to pass to Terraform; these paths are relative to the Python file defining the test
If successful, `plan_summary` will return a `PlanSummary` object with the `values`, `counts` and `outputs` attributes following the same semantics described in the previous section. You can use these fields to write your custom tests.
Like before, let's imagine we're writing a (Python) test for the `net-glb` module. First create a new folder under `tests/modules`, replacing any dash in the module name with underscores. You also need to create an empty `__init__.py` file in it, to ensure `pytest` discovers your new tests automatically.
```bash
mkdir tests/modules/net_glb
touch tests/modules/net_glb/__init__.py
```
Now create a file containing your tests, e.g. `test_plan.py`:
```python
def test_name(plan_summary, tfvars_to_yaml, tmp_path):
  s = plan_summary('modules/net-glb', tf_var_files=['test-plan.tfvars'])
  address = 'google_compute_url_map.default'
  assert s.values[address]['project'] == 'my-project'
```
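The `counts` and `outputs` attributes can be asserted on in the same way; a minimal sketch, assuming the same module and `tfvars` file as the test above:
```python
def test_counts(plan_summary):
  # counts follows the same semantics as the yaml inventories above
  s = plan_summary('modules/net-glb', tf_var_files=['test-plan.tfvars'])
  assert s.counts['google_compute_url_map'] == 1
```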
For more examples of how to write Python tests, see the tests for the [`organization`](./tests/modules/organization/test_plan_org_policies.py) and [`net-vpc`](./tests/modules/net_vpc/test_routes.py) modules.
#### Testing documentation examples
Most of our documentation examples are also tested via the `examples` test suite. To enable an example for testing just use the special `tftest` comment as the last line in the example, listing the number of modules and resources expected.
A [few preset variables](./tests/examples/variables.tf) are available for use, as shown in this example from the `dns` module documentation.
```hcl
module "private-dns" {
@@ -739,6 +777,25 @@ module "private-dns" {
# tftest modules=1 resources=2
```
Note that all HCL code examples in READMEs are automatically tested. To prevent this behavior, include `tftest skip` somewhere in the code.
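For example, a README snippet that the suite should ignore might end like this (the resource shown is purely illustrative):
```hcl
resource "google_storage_bucket" "example" {
  name     = "my-unique-bucket-name"
  location = "EU"
}
# tftest skip
```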
#### Running tests from a temporary directory
Most of the time you can run tests using the `pytest` command as described in the previous sections. However, the `plan_summary` fixture also allows copying the root module and running the test from a temporary directory.
To enable this option, just define the environment variable `TFTEST_COPY` and any tests using the `plan_summary` fixture will automatically run from a temporary directory.
Running tests from temporary directories is useful if:
- you're running tests in parallel using `pytest-xdist`. In this case, just run your tests as follows:
```bash
TFTEST_COPY=1 pytest -n 4
```
- you're running tests for the `fast/` directory, which contains `tfvars` and `auto.tfvars` files (which are read by Terraform automatically), making your tests fail. In this case, you can run:
```bash
TFTEST_COPY=1 pytest fast/
```
#### Fabric tools
The main tool you will interact with in development is `tfdoc`, used to generate file, output and variable tables in README documents.
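For instance, regenerating the tables for a single module looks roughly like this (a sketch; the exact invocation is an assumption, check the tool's help):
```bash
# rebuild the variables/outputs/files tables in the module's README
./tools/tfdoc.py modules/net-glb
```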