Fix composer roles and README.

Lorenzo Caggioni 2022-02-07 08:20:20 +01:00
parent e0f83569f1
commit 78fcdc5374
2 changed files with 23 additions and 16 deletions

View File

@@ -84,19 +84,24 @@ module "orc-prj" {
  group_iam = local.group_iam_orc
  oslogin = false
  policy_boolean = var.composer_config.project_policy_boolean
- services = concat(var.project_services, [
-   "artifactregistry.googleapis.com",
-   "bigquery.googleapis.com",
-   "bigqueryreservation.googleapis.com",
-   "bigquerystorage.googleapis.com",
-   "cloudkms.googleapis.com",
-   "composer.googleapis.com",
-   "container.googleapis.com",
-   "dataflow.googleapis.com",
-   "pubsub.googleapis.com",
-   "servicenetworking.googleapis.com",
-   "storage.googleapis.com",
-   "storage-component.googleapis.com"
+ services = concat(
+   var.project_services,
+   [
+     "artifactregistry.googleapis.com",
+     "bigquery.googleapis.com",
+     "bigqueryreservation.googleapis.com",
+     "bigquerystorage.googleapis.com",
+     "cloudbuild.googleapis.com",
+     "cloudkms.googleapis.com",
+     "composer.googleapis.com",
+     "compute.googleapis.com",
+     "container.googleapis.com",
+     "containerregistry.googleapis.com",
+     "dataflow.googleapis.com",
+     "pubsub.googleapis.com",
+     "servicenetworking.googleapis.com",
+     "storage.googleapis.com",
+     "storage-component.googleapis.com"
  ])
  service_encryption_key_ids = {
    composer = [try(local.service_encryption_keys.composer, null)]

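For context on the hunk above: `var.project_services` carries the baseline APIs shared by all projects in this example, and each project module appends its own list with `concat()`. A minimal sketch of how such a variable could be declared follows; the default list here is illustrative, not necessarily the module's actual default.

```hcl
# Illustrative sketch only: a baseline API list that per-project modules
# extend with concat(), as the orchestration project does in the diff above.
variable "project_services" {
  description = "Service APIs enabled by default in all projects."
  type        = list(string)
  default = [
    "cloudresourcemanager.googleapis.com",
    "iam.googleapis.com",
    "serviceusage.googleapis.com"
  ]
}
```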
View File

@@ -64,11 +64,13 @@ Using of service account keys within a data pipeline exposes to several security
 We use three groups to control access to resources:
 - *Data Engineers*. They handle and run the Data Hub, with read access to all resources in order to troubleshoot possible issues with pipelines. This team can also impersonate any service account.
-- *Data Analyst*. They perform analysis on datasets, with read access to the data lake L2 project, and BigQuery READ/WRITE access to the playground project. - *Data Security*:. They handle security configurations related to the Data Hub.
+- *Data Analyst*. They perform analysis on datasets, with read access to the data lake L2 project, and BigQuery READ/WRITE access to the playground project.
+- *Data Security*. They handle security configurations related to the Data Hub.
 
 You can configure groups via the `groups` variable.
 
 ### Virtual Private Cloud (VPC) design
 
-As is often the case in real-world configurations, this example accepts as input an existing [Shared-VPC](https://cloud.google.com/vpc/docs/shared-vpc) via the `network_config` variable.
+As is often the case in real-world configurations, this example accepts an existing [Shared VPC](https://cloud.google.com/vpc/docs/shared-vpc) as input via the `network_config` variable. Make sure that the GKE API (`container.googleapis.com`) is enabled in the VPC host project.
 
 If the `network_config` variable is not provided, one VPC will be created in each project that supports network resources (load, transformation and orchestration).
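As a rough illustration of the two variables mentioned in this hunk, a `terraform.tfvars` entry might look like the sketch below. The group names and the `network_config` attributes are assumptions for illustration only; the actual schema is defined in `variables.tf`.

```hcl
# Illustrative tfvars sketch; group names and network_config attributes are
# assumptions, so verify the exact schema in variables.tf before applying.
groups = {
  data-engineers = "gcp-data-engineers"
  data-analysts  = "gcp-data-analysts"
  data-security  = "gcp-data-security"
}

network_config = {
  host_project      = "my-shared-vpc-host" # hypothetical Shared VPC host project ID
  network_self_link = "projects/my-shared-vpc-host/global/networks/prod-vpc"
}
```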
@@ -179,7 +181,7 @@ organization = {
 }
 ```
 
-For more fine details check variables on [`variables.tf`](./variables.tf) and update according to the desired configuration.
+For more details, check the variables in [`variables.tf`](./variables.tf) and update them according to the desired configuration. Remember to create the team groups described [below](#groups).
 
 Once the configuration is complete, run the project factory by running