Merge branch 'master' into feature/apigee-add-retention
This commit is contained in:
commit a0bd0d4414

@@ -32,7 +32,7 @@ Currently available modules:
 - **foundational** - [billing budget](./modules/billing-budget), [Cloud Identity group](./modules/cloud-identity-group/), [folder](./modules/folder), [service accounts](./modules/iam-service-account), [logging bucket](./modules/logging-bucket), [organization](./modules/organization), [project](./modules/project), [projects-data-source](./modules/projects-data-source)
 - **networking** - [DNS](./modules/dns), [DNS Response Policy](./modules/dns-response-policy/), [Cloud Endpoints](./modules/endpoints), [address reservation](./modules/net-address), [NAT](./modules/net-cloudnat), [VLAN Attachment](./modules/net-vlan-attachment/), [External Application LB](./modules/net-lb-app-ext/), [External Passthrough Network LB](./modules/net-lb-ext), [Internal Application LB](./modules/net-lb-app-int), [Internal Passthrough Network LB](./modules/net-lb-int), [Internal Proxy Network LB](./modules/net-lb-proxy-int), [IPSec over Interconnect](./modules/net-ipsec-over-interconnect), [VPC](./modules/net-vpc), [VPC firewall](./modules/net-vpc-firewall), [VPC firewall policy](./modules/net-vpc-firewall-policy), [VPC peering](./modules/net-vpc-peering), [VPN dynamic](./modules/net-vpn-dynamic), [HA VPN](./modules/net-vpn-ha), [VPN static](./modules/net-vpn-static), [Service Directory](./modules/service-directory), [Secure Web Proxy](./modules/net-swp)
 - **compute** - [VM/VM group](./modules/compute-vm), [MIG](./modules/compute-mig), [COS container](./modules/cloud-config-container/cos-generic-metadata/) (coredns, mysql, onprem, squid), [GKE cluster](./modules/gke-cluster-standard), [GKE hub](./modules/gke-hub), [GKE nodepool](./modules/gke-nodepool)
-- **data** - [AlloyDB instance](./modules/alloydb-instance), [BigQuery dataset](./modules/bigquery-dataset), [Bigtable instance](./modules/bigtable-instance), [Dataplex](./modules/dataplex), [Cloud SQL instance](./modules/cloudsql-instance), [Data Catalog Policy Tag](./modules/data-catalog-policy-tag), [Datafusion](./modules/datafusion), [Dataproc](./modules/dataproc), [GCS](./modules/gcs), [Pub/Sub](./modules/pubsub)
+- **data** - [AlloyDB instance](./modules/alloydb-instance), [BigQuery dataset](./modules/bigquery-dataset), [Bigtable instance](./modules/bigtable-instance), [Dataplex](./modules/dataplex), [Dataplex DataScan](./modules/dataplex-datascan/), [Cloud SQL instance](./modules/cloudsql-instance), [Data Catalog Policy Tag](./modules/data-catalog-policy-tag), [Datafusion](./modules/datafusion), [Dataproc](./modules/dataproc), [GCS](./modules/gcs), [Pub/Sub](./modules/pubsub)
 - **development** - [API Gateway](./modules/api-gateway), [Apigee](./modules/apigee), [Artifact Registry](./modules/artifact-registry), [Container Registry](./modules/container-registry), [Cloud Source Repository](./modules/source-repository)
 - **security** - [Binauthz](./modules/binauthz/), [KMS](./modules/kms), [SecretManager](./modules/secret-manager), [VPC Service Control](./modules/vpc-sc)
 - **serverless** - [Cloud Function v1](./modules/cloud-function-v1), [Cloud Function v2](./modules/cloud-function-v2), [Cloud Run](./modules/cloud-run)

@@ -24,26 +24,22 @@ A single pre-existing project and a VPC is used in this blueprint to keep variab
 The provided project needs a valid billing account and the Compute APIs enabled.

 The two Dedicated Interconnect connections should already exist, either in the same project or in any other project belonging to the same GCP Organization.

 <!-- BEGIN TFDOC -->

 ## Variables

 | name | description | type | required | default |
 |---|---|:---:|:---:|:---:|
 | [network](variables.tf#L18) | The VPC name to which resources are associated to. | <code>string</code> | ✓ | |
-| [overlay_config](variables.tf#L24) | Configuration for the overlay resources. | <code title="object({ gcp_bgp = object({ asn = number name = optional(string) keepalive = optional(number) custom_advertise = optional(object({ all_subnets = bool ip_ranges = map(string) })) }) onprem_vpn_gateway = object({ redundancy_type = optional(string, "TWO_IPS_REDUNDANCY") interfaces = list(string) }) gateways = map(map(object({ bgp_peer = object({ address = string asn = number route_priority = optional(number, 1000) custom_advertise = optional(object({ all_subnets = bool all_vpc_subnets = bool all_peer_vpc_subnets = bool ip_ranges = map(string) })) }) bgp_session_range = string ike_version = optional(number, 2) peer_external_gateway_interface = optional(number) peer_gateway = optional(string, "default") router = optional(string) shared_secret = optional(string) vpn_gateway_interface = number })) ) })">object({…})</code> | ✓ | |
-| [project_id](variables.tf#L66) | The project id. | <code>string</code> | ✓ | |
-| [region](variables.tf#L71) | GCP Region. | <code>string</code> | ✓ | |
-| [underlay_config](variables.tf#L76) | Configuration for the underlay resources. | <code title="object({ attachments = map(object({ bandwidth = optional(string, "BPS_10G") base_name = optional(string, "encrypted-vlan-attachment") bgp_range = string interconnect_self_link = string onprem_asn = number vlan_tag = number vpn_gateways_ip_range = string })) gcp_bgp = object({ asn = number }) interconnect_type = optional(string, "DEDICATED") })">object({…})</code> | ✓ | |
+| [overlay_config](variables.tf#L24) | Configuration for the overlay resources. | <code title="object({ gcp_bgp = object({ asn = number name = optional(string) keepalive = optional(number) custom_advertise = optional(object({ all_subnets = bool ip_ranges = map(string) })) }) onprem_vpn_gateway_interfaces = list(string) gateways = map(map(object({ bgp_peer = object({ address = string asn = number route_priority = optional(number, 1000) custom_advertise = optional(object({ all_subnets = bool all_vpc_subnets = bool all_peer_vpc_subnets = bool ip_ranges = map(string) })) }) bgp_session_range = string ike_version = optional(number, 2) peer_external_gateway_interface = optional(number) peer_gateway = optional(string, "default") router = optional(string) shared_secret = optional(string) vpn_gateway_interface = number })) ) })">object({…})</code> | ✓ | |
+| [project_id](variables.tf#L63) | The project id. | <code>string</code> | ✓ | |
+| [region](variables.tf#L68) | GCP Region. | <code>string</code> | ✓ | |
+| [underlay_config](variables.tf#L73) | Configuration for the underlay resources. | <code title="object({ attachments = map(object({ bandwidth = optional(string, "BPS_10G") base_name = optional(string, "encrypted-vlan-attachment") bgp_range = string interconnect_self_link = string onprem_asn = number vlan_tag = number vpn_gateways_ip_range = string })) gcp_bgp = object({ asn = number }) interconnect_type = optional(string, "DEDICATED") })">object({…})</code> | ✓ | |

 ## Outputs

 | name | description | sensitive |
 |---|---|:---:|
 | [underlay](outputs.tf#L17) | Setup for the underlay connection. | |

 <!-- END TFDOC -->
 ## Test
@@ -64,9 +60,7 @@ module "test" {
       }
     }
   }
-  onprem_vpn_gateway = {
-    interfaces = ["172.16.0.1", "172.16.0.2"]
-  }
+  onprem_vpn_gateway_interfaces = ["172.16.0.1", "172.16.0.2"]
   gateways = {
     a = {
       remote-0 = {

@@ -47,9 +47,9 @@ resource "google_compute_external_vpn_gateway" "default" {
   name            = "peer-vpn-gateway"
   project         = var.project_id
   description     = "Peer IPSec over Interconnect VPN gateway"
-  redundancy_type = length(var.overlay_config.onprem_vpn_gateway) == 2 ? "TWO_IPS_REDUNDANCY" : "SINGLE_IP_INTERNALLY_REDUNDANT"
+  redundancy_type = length(var.overlay_config.onprem_vpn_gateway_interfaces) == 2 ? "TWO_IPS_REDUNDANCY" : "SINGLE_IP_INTERNALLY_REDUNDANT"
   dynamic "interface" {
-    for_each = var.overlay_config.onprem_vpn_gateway.interfaces
+    for_each = var.overlay_config.onprem_vpn_gateway_interfaces
     content {
       id         = interface.key
       ip_address = interface.value
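The changed `redundancy_type` expression above derives the external VPN gateway's redundancy mode from the number of on-prem interface addresses. A minimal Python sketch of the same rule (the function name is illustrative, not part of the module):

```python
def redundancy_type(interfaces):
    # Exactly two interface addresses -> fully redundant external gateway;
    # any other count falls back to the single-IP mode.
    if len(interfaces) == 2:
        return "TWO_IPS_REDUNDANCY"
    return "SINGLE_IP_INTERNALLY_REDUNDANT"

print(redundancy_type(["172.16.0.1", "172.16.0.2"]))  # TWO_IPS_REDUNDANCY
```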

@@ -33,10 +33,7 @@ variable "overlay_config" {
         ip_ranges = map(string)
       }))
     })
-    onprem_vpn_gateway = object({
-      redundancy_type = optional(string, "TWO_IPS_REDUNDANCY")
-      interfaces      = list(string)
-    })
+    onprem_vpn_gateway_interfaces = list(string)
     gateways = map(map(object({
       bgp_peer = object({
         address = string

@@ -77,6 +77,7 @@ These modules are used in the examples included in this repository. If you are u
 - [BigQuery dataset](./bigquery-dataset)
 - [Bigtable instance](./bigtable-instance)
 - [Dataplex](./dataplex)
+- [Dataplex DataScan](./dataplex-datascan/)
 - [Cloud SQL instance](./cloudsql-instance)
 - [Data Catalog Policy Tag](./data-catalog-policy-tag)
 - [Datafusion](./datafusion)

@@ -0,0 +1,443 @@
# Dataplex DataScan

This module manages the creation of Dataplex DataScan resources.

## Data Profiling

To create a Data Profiling scan, provide the `data_profile_spec` input arguments as documented in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataProfileSpec. The example below shows how.

```hcl
module "dataplex-datascan" {
  source     = "./fabric/modules/dataplex-datascan"
  name       = "datascan"
  prefix     = "test"
  project_id = "my-project-name"
  region     = "us-central1"
  labels = {
    billing_id = "a"
  }
  data = {
    resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
  }
  data_profile_spec = {
    sampling_percent = 100
    row_filter       = "station_id > 1000"
  }
  incremental_field = "modified_date"
}
# tftest modules=1 resources=1 inventory=datascan_profiling.yaml
```

## Data Quality

To create a Data Quality scan, provide the `data_quality_spec` input arguments as documented in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataQualitySpec.

Documentation for the supported rule types and rule specifications can be found in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataQualityRule.

This example shows how to create a Data Quality scan.

```hcl
module "dataplex-datascan" {
  source     = "./fabric/modules/dataplex-datascan"
  name       = "datascan"
  prefix     = "test"
  project_id = "my-project-name"
  region     = "us-central1"
  labels = {
    billing_id = "a"
  }
  execution_schedule = "TZ=America/New_York 0 1 * * *"
  data = {
    resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
  }
  incremental_field = "modified_date"
  data_quality_spec = {
    sampling_percent = 100
    row_filter       = "station_id > 1000"
    rules = [
      {
        dimension            = "VALIDITY"
        non_null_expectation = {}
        column               = "address"
        threshold            = 0.99
      },
      {
        column      = "council_district"
        dimension   = "VALIDITY"
        ignore_null = true
        threshold   = 0.9
        range_expectation = {
          min_value          = 1
          max_value          = 10
          strict_min_enabled = true
          strict_max_enabled = false
        }
      },
      {
        column    = "council_district"
        dimension = "VALIDITY"
        threshold = 0.8
        range_expectation = {
          min_value = 3
          max_value = 9
        }
      },
      {
        column      = "power_type"
        dimension   = "VALIDITY"
        ignore_null = false
        regex_expectation = {
          regex = ".*solar.*"
        }
      },
      {
        column      = "property_type"
        dimension   = "VALIDITY"
        ignore_null = false
        set_expectation = {
          values = ["sidewalk", "parkland"]
        }
      },
      {
        column                 = "address"
        dimension              = "UNIQUENESS"
        uniqueness_expectation = {}
      },
      {
        column    = "number_of_docks"
        dimension = "VALIDITY"
        statistic_range_expectation = {
          statistic          = "MEAN"
          min_value          = 5
          max_value          = 15
          strict_min_enabled = true
          strict_max_enabled = true
        }
      },
      {
        column    = "footprint_length"
        dimension = "VALIDITY"
        row_condition_expectation = {
          sql_expression = "footprint_length > 0 AND footprint_length <= 10"
        }
      },
      {
        dimension = "VALIDITY"
        table_condition_expectation = {
          sql_expression = "COUNT(*) > 0"
        }
      }
    ]
  }
}
# tftest modules=1 resources=1 inventory=datascan_dq.yaml
```

This example shows how to pass the rule configuration to the module as a separate YAML file. It should produce the same DataScan configuration as the previous example.

```hcl
module "dataplex-datascan" {
  source             = "./fabric/modules/dataplex-datascan"
  name               = "datascan"
  prefix             = "test"
  project_id         = "my-project-name"
  region             = "us-central1"
  labels = {
    billing_id = "a"
  }
  execution_schedule = "TZ=America/New_York 0 1 * * *"
  data = {
    resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
  }
  incremental_field = "modified_date"
  data_quality_spec_file = {
    path = "config/data_quality_spec.yaml"
  }
}
# tftest modules=1 resources=1 files=data_quality_spec inventory=datascan_dq.yaml
```

The content of the `config/data_quality_spec.yaml` file is as follows:

```yaml
# tftest-file id=data_quality_spec path=config/data_quality_spec.yaml
sampling_percent: 100
row_filter: "station_id > 1000"
rules:
  - column: address
    dimension: VALIDITY
    ignore_null: null
    non_null_expectation: {}
    threshold: 0.99
  - column: council_district
    dimension: VALIDITY
    ignore_null: true
    threshold: 0.9
    range_expectation:
      max_value: '10'
      min_value: '1'
      strict_max_enabled: false
      strict_min_enabled: true
  - column: council_district
    dimension: VALIDITY
    range_expectation:
      max_value: '9'
      min_value: '3'
    threshold: 0.8
  - column: power_type
    dimension: VALIDITY
    ignore_null: false
    regex_expectation:
      regex: .*solar.*
  - column: property_type
    dimension: VALIDITY
    ignore_null: false
    set_expectation:
      values:
        - sidewalk
        - parkland
  - column: address
    dimension: UNIQUENESS
    uniqueness_expectation: {}
  - column: number_of_docks
    dimension: VALIDITY
    statistic_range_expectation:
      max_value: '15'
      min_value: '5'
      statistic: MEAN
      strict_max_enabled: true
      strict_min_enabled: true
  - column: footprint_length
    dimension: VALIDITY
    row_condition_expectation:
      sql_expression: footprint_length > 0 AND footprint_length <= 10
  - dimension: VALIDITY
    table_condition_expectation:
      sql_expression: COUNT(*) > 0
```

While the module only accepts input in snake_case, the YAML file provided to the `data_quality_spec_file` variable can use either camelCase or snake_case. The example below should also produce the same DataScan configuration as the previous examples.

```hcl
module "dataplex-datascan" {
  source             = "./fabric/modules/dataplex-datascan"
  name               = "datascan"
  prefix             = "test"
  project_id         = "my-project-name"
  region             = "us-central1"
  labels = {
    billing_id = "a"
  }
  execution_schedule = "TZ=America/New_York 0 1 * * *"
  data = {
    resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
  }
  incremental_field = "modified_date"
  data_quality_spec_file = {
    path = "config/data_quality_spec_camel_case.yaml"
  }
}
# tftest modules=1 resources=1 files=data_quality_spec_camel_case inventory=datascan_dq.yaml
```

The content of the `config/data_quality_spec_camel_case.yaml` file is as follows:

```yaml
# tftest-file id=data_quality_spec_camel_case path=config/data_quality_spec_camel_case.yaml
samplingPercent: 100
rowFilter: "station_id > 1000"
rules:
  - column: address
    dimension: VALIDITY
    ignoreNull: null
    nonNullExpectation: {}
    threshold: 0.99
  - column: council_district
    dimension: VALIDITY
    ignoreNull: true
    threshold: 0.9
    rangeExpectation:
      maxValue: '10'
      minValue: '1'
      strictMaxEnabled: false
      strictMinEnabled: true
  - column: council_district
    dimension: VALIDITY
    rangeExpectation:
      maxValue: '9'
      minValue: '3'
    threshold: 0.8
  - column: power_type
    dimension: VALIDITY
    ignoreNull: false
    regexExpectation:
      regex: .*solar.*
  - column: property_type
    dimension: VALIDITY
    ignoreNull: false
    setExpectation:
      values:
        - sidewalk
        - parkland
  - column: address
    dimension: UNIQUENESS
    uniquenessExpectation: {}
  - column: number_of_docks
    dimension: VALIDITY
    statisticRangeExpectation:
      maxValue: '15'
      minValue: '5'
      statistic: MEAN
      strictMaxEnabled: true
      strictMinEnabled: true
  - column: footprint_length
    dimension: VALIDITY
    rowConditionExpectation:
      sqlExpression: footprint_length > 0 AND footprint_length <= 10
  - dimension: VALIDITY
    tableConditionExpectation:
      sqlExpression: COUNT(*) > 0
```
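
For both YAML variants above to yield the same spec, the camelCase keys must be mapped to snake_case before use. A minimal illustrative sketch of such a normalization (this is not the module's actual implementation):

```python
import re

def camel_to_snake(name):
    # Insert an underscore before each uppercase letter, then lowercase.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def normalize_keys(value):
    # Recursively rewrite dict keys from camelCase to snake_case.
    if isinstance(value, dict):
        return {camel_to_snake(k): normalize_keys(v) for k, v in value.items()}
    if isinstance(value, list):
        return [normalize_keys(v) for v in value]
    return value

camel = {"samplingPercent": 100, "rules": [{"nonNullExpectation": {}, "ignoreNull": None}]}
snake = {"sampling_percent": 100, "rules": [{"non_null_expectation": {}, "ignore_null": None}]}
print(normalize_keys(camel) == snake)  # True
```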

## Data Source

The input variable `data` is required to create a DataScan. This value is immutable: once set, the DataScan cannot be changed to another source.

The input variable `data` should be an object containing a single key-value pair, one of:
* `entity`: The Dataplex entity that represents the data source (e.g. a BigQuery table) for the DataScan, of the form `projects/{project_number}/locations/{locationId}/lakes/{lakeId}/zones/{zoneId}/entities/{entityId}`.
* `resource`: The service-qualified full resource name of the cloud resource for a DataScan job to scan against. For DataProfileScan/DataQualityScan this can be a BigQuery table of type "TABLE", e.g. `//bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID`.

The example below shows how to specify a data source of type `resource`:

```hcl
module "dataplex-datascan" {
  source     = "./fabric/modules/dataplex-datascan"
  name       = "datascan"
  prefix     = "test"
  project_id = "my-project-name"
  region     = "us-central1"
  data = {
    resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
  }
  data_profile_spec = {}
}
# tftest modules=1 resources=1
```

The example below shows how to specify a data source of type `entity`:

```hcl
module "dataplex-datascan" {
  source     = "./fabric/modules/dataplex-datascan"
  name       = "datascan"
  prefix     = "test"
  project_id = "my-project-name"
  region     = "us-central1"
  data = {
    entity = "projects/<project_number>/locations/<locationId>/lakes/<lakeId>/zones/<zoneId>/entities/<entityId>"
  }
  data_profile_spec = {}
}
# tftest modules=1 resources=1 inventory=datascan_entity.yaml
```

## Execution Schedule

The input variable `execution_schedule` specifies when a scan should be triggered, based on a cron schedule expression.

If not specified, the default is `on_demand`, which means the scan will not run until the user calls the `dataScans.run` API.

The following example schedules the DataScan at 1 AM every day in the 'America/New_York' timezone.

```hcl
module "dataplex-datascan" {
  source             = "./fabric/modules/dataplex-datascan"
  name               = "datascan"
  prefix             = "test"
  project_id         = "my-project-name"
  region             = "us-central1"
  execution_schedule = "TZ=America/New_York 0 1 * * *"
  data = {
    resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
  }
  data_profile_spec = {}
}
# tftest modules=1 resources=1 inventory=datascan_cron.yaml
```

## IAM

There are three mutually exclusive ways of managing IAM in this module:

- non-authoritative, via the `iam_additive` and `iam_additive_members` variables, where bindings created outside this module will coexist with those managed here
- authoritative, via the `group_iam` and `iam` variables, where bindings created outside this module (e.g. in the console) will be removed at each `terraform apply` cycle if the same role is also managed here
- authoritative policy, via the `iam_policy` variable, where any binding created outside this module (e.g. in the console) will be removed at each `terraform apply` cycle regardless of the role

The authoritative and additive approaches can be used together, provided different roles are managed by each. The IAM policy is incompatible with the other approaches and must be used with extreme care.

Some care must also be taken with the `group_iam` variable (and in some situations with the additive variables) to ensure that variable keys are static values, so that Terraform is able to compute the dependency graph.

An example of using the `group_iam` and `iam` variables is provided below.

```hcl
module "dataplex-datascan" {
  source     = "./fabric/modules/dataplex-datascan"
  name       = "datascan"
  prefix     = "test"
  project_id = "my-project-name"
  region     = "us-central1"
  data = {
    resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
  }
  data_profile_spec = {}
  iam = {
    "roles/dataplex.dataScanAdmin" = [
      "serviceAccount:svc-1@project-id.iam.gserviceaccount.com"
    ],
    "roles/dataplex.dataScanEditor" = [
      "user:admin-user@example.com"
    ]
  }
  group_iam = {
    "user-group@example.com" = [
      "roles/dataplex.dataScanViewer"
    ]
  }
}
# tftest modules=1 resources=4 inventory=datascan_iam.yaml
```

## TODO
<!-- BEGIN TFDOC -->
## Variables

| name | description | type | required | default |
|---|---|:---:|:---:|:---:|
| [data](variables.tf#L17) | The data source for DataScan. The source can be either a Dataplex `entity` or a BigQuery `resource`. | <code title="object({ entity = optional(string) resource = optional(string) })">object({…})</code> | ✓ | |
| [name](variables.tf#L146) | Name of Dataplex Scan. | <code>string</code> | ✓ | |
| [project_id](variables.tf#L157) | The ID of the project where the Dataplex DataScan will be created. | <code>string</code> | ✓ | |
| [region](variables.tf#L162) | Region for the Dataplex DataScan. | <code>string</code> | ✓ | |
| [data_profile_spec](variables.tf#L29) | DataProfileScan related setting. Variable descriptions are provided in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataProfileSpec. | <code title="object({ sampling_percent = optional(number) row_filter = optional(string) })">object({…})</code> | | <code>null</code> |
| [data_quality_spec](variables.tf#L38) | DataQualityScan related setting. Variable descriptions are provided in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataQualitySpec. | <code title="object({ sampling_percent = optional(number) row_filter = optional(string) rules = list(object({ column = optional(string) ignore_null = optional(bool, null) dimension = string threshold = optional(number) non_null_expectation = optional(object({})) range_expectation = optional(object({ min_value = optional(number) max_value = optional(number) strict_min_enabled = optional(bool) strict_max_enabled = optional(bool) })) regex_expectation = optional(object({ regex = string })) set_expectation = optional(object({ values = list(string) })) uniqueness_expectation = optional(object({})) statistic_range_expectation = optional(object({ statistic = string min_value = optional(number) max_value = optional(number) strict_min_enabled = optional(bool) strict_max_enabled = optional(bool) })) row_condition_expectation = optional(object({ sql_expression = string })) table_condition_expectation = optional(object({ sql_expression = string })) })) })">object({…})</code> | | <code>null</code> |
| [data_quality_spec_file](variables.tf#L80) | Path to a YAML file containing DataQualityScan related setting. Input content can use either camelCase or snake_case. Variable descriptions are provided in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataQualitySpec. | <code title="object({ path = string })">object({…})</code> | | <code>null</code> |
| [description](variables.tf#L88) | Custom description for DataScan. | <code>string</code> | | <code>null</code> |
| [execution_schedule](variables.tf#L94) | Schedule DataScan to run periodically based on a cron schedule expression. If not specified, the DataScan is created with `on_demand` schedule, which means it will not run until the user calls `dataScans.run` API. | <code>string</code> | | <code>null</code> |
| [group_iam](variables.tf#L100) | Authoritative IAM binding for organization groups, in {GROUP_EMAIL => [ROLES]} format. Group emails need to be static. Can be used in combination with the `iam` variable. | <code>map(list(string))</code> | | <code>{}</code> |
| [iam](variables.tf#L107) | Dataplex DataScan IAM bindings in {ROLE => [MEMBERS]} format. | <code>map(list(string))</code> | | <code>{}</code> |
| [iam_additive](variables.tf#L114) | IAM additive bindings in {ROLE => [MEMBERS]} format. | <code>map(list(string))</code> | | <code>{}</code> |
| [iam_additive_members](variables.tf#L121) | IAM additive bindings in {MEMBERS => [ROLE]} format. This might break if members are dynamic values. | <code>map(list(string))</code> | | <code>{}</code> |
| [iam_policy](variables.tf#L127) | IAM authoritative policy in {ROLE => [MEMBERS]} format. Roles and members not explicitly listed will be cleared, use with extreme caution. | <code>map(list(string))</code> | | <code>null</code> |
| [incremental_field](variables.tf#L133) | The unnested field (of type Date or Timestamp) that contains values which monotonically increase over time. If not specified, a data scan will run for all data in the table. | <code>string</code> | | <code>null</code> |
| [labels](variables.tf#L139) | Resource labels. | <code>map(string)</code> | | <code>{}</code> |
| [prefix](variables.tf#L151) | Optional prefix used to generate Dataplex DataScan ID. | <code>string</code> | | <code>null</code> |

## Outputs

| name | description | sensitive |
|---|---|:---:|
| [data_scan_id](outputs.tf#L17) | Dataplex DataScan ID. | |
| [id](outputs.tf#L22) | A fully qualified Dataplex DataScan identifier for the resource with format projects/{{project}}/locations/{{location}}/dataScans/{{data_scan_id}}. | |
| [name](outputs.tf#L27) | The relative resource name of the scan, of the form: projects/{project}/locations/{locationId}/dataScans/{datascan_id}, where project refers to a project_id or project_number and locationId refers to a GCP region. | |
| [type](outputs.tf#L32) | The type of DataScan. | |
<!-- END TFDOC -->

@@ -0,0 +1,89 @@
/**
|
||||
* Copyright 2023 Google LLC
|
||||
*
|
||||
* Licensed under the Apache License, Version 2.0 (the "License");
|
||||
* you may not use this file except in compliance with the License.
|
||||
* You may obtain a copy of the License at
|
||||
*
|
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
*
|
||||
* Unless required by applicable law or agreed to in writing, software
|
||||
* distributed under the License is distributed on an "AS IS" BASIS,
|
||||
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
* See the License for the specific language governing permissions and
|
||||
* limitations under the License.
|
||||
*/
|
||||
|
||||
locals {
|
||||
_group_iam_roles = distinct(flatten(values(var.group_iam)))
|
||||
_group_iam = {
|
||||
for r in local._group_iam_roles : r => [
|
||||
for k, v in var.group_iam : "group:${k}" if try(index(v, r), null) != null
|
||||
]
|
||||
}
|
||||
_iam_additive_pairs = flatten([
|
||||
for role, members in var.iam_additive : [
|
||||
for member in members : { role = role, member = member }
|
||||
]
|
||||
])
|
||||
_iam_additive_member_pairs = flatten([
|
||||
for member, roles in var.iam_additive_members : [
|
||||
      for role in roles : { role = role, member = member }
    ]
  ])
  iam = {
    for role in distinct(concat(keys(var.iam), keys(local._group_iam))) :
    role => concat(
      try(var.iam[role], []),
      try(local._group_iam[role], [])
    )
  }
  iam_additive = {
    for pair in concat(local._iam_additive_pairs, local._iam_additive_member_pairs) :
    "${pair.role}-${pair.member}" => {
      role   = pair.role
      member = pair.member
    }
  }
}

resource "google_dataplex_datascan_iam_binding" "authoritative_for_role" {
  for_each     = local.iam
  project      = google_dataplex_datascan.datascan.project
  location     = google_dataplex_datascan.datascan.location
  data_scan_id = google_dataplex_datascan.datascan.data_scan_id
  role         = each.key
  members      = each.value
}

resource "google_dataplex_datascan_iam_member" "additive" {
  for_each = (
    length(var.iam_additive) + length(var.iam_additive_members) > 0
    ? local.iam_additive
    : {}
  )
  project      = google_dataplex_datascan.datascan.project
  location     = google_dataplex_datascan.datascan.location
  data_scan_id = google_dataplex_datascan.datascan.data_scan_id
  role         = each.value.role
  member       = each.value.member
}

resource "google_dataplex_datascan_iam_policy" "authoritative_for_resource" {
  count        = var.iam_policy != null ? 1 : 0
  project      = google_dataplex_datascan.datascan.project
  location     = google_dataplex_datascan.datascan.location
  data_scan_id = google_dataplex_datascan.datascan.data_scan_id
  policy_data  = data.google_iam_policy.authoritative[0].policy_data
}

data "google_iam_policy" "authoritative" {
  count = var.iam_policy != null ? 1 : 0
  dynamic "binding" {
    for_each = try(var.iam_policy, {})
    content {
      role    = binding.key
      members = binding.value
    }
  }
}
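The IAM resources above implement the authoritative/additive split used across Fabric modules: `iam` and `group_iam` are merged into authoritative role bindings, while `iam_additive` only adds members without clearing bindings managed elsewhere. A minimal consumer sketch (module path, project, and member emails are hypothetical):

```hcl
module "datascan" {
  source     = "./fabric/modules/dataplex-datascan" # assumed module path
  project_id = "my-project"                         # hypothetical
  region     = "europe-west1"
  name       = "datascan"
  data = {
    resource = "//bigquery.googleapis.com/projects/my-project/datasets/d/tables/t"
  }
  data_profile_spec = {}
  # authoritative: these members fully replace existing binding members
  iam = {
    "roles/dataplex.dataScanEditor" = ["user:editor@example.com"]
  }
  # group bindings merged into the authoritative `iam` map
  group_iam = {
    "analysts@example.com" = ["roles/dataplex.dataScanViewer"]
  }
  # additive: only adds this member, leaves the rest of the binding alone
  iam_additive = {
    "roles/dataplex.dataScanViewer" = ["user:auditor@example.com"]
  }
}
```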

@@ -0,0 +1,178 @@
/**
 * Copyright 2023 Google LLC
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

locals {
  prefix = var.prefix == null || var.prefix == "" ? "" : "${var.prefix}-"
  _file_data_quality_spec = var.data_quality_spec_file == null ? null : {
    sampling_percent = try(local._file_data_quality_spec_raw.samplingPercent, local._file_data_quality_spec_raw.sampling_percent, null)
    row_filter       = try(local._file_data_quality_spec_raw.rowFilter, local._file_data_quality_spec_raw.row_filter, null)
    rules            = local._parsed_rules
  }
  data_quality_spec = (
    var.data_quality_spec != null || var.data_quality_spec_file != null
    ? merge(var.data_quality_spec, local._file_data_quality_spec)
    : null
  )
}

resource "google_dataplex_datascan" "datascan" {
  project      = var.project_id
  location     = var.region
  data_scan_id = "${local.prefix}${var.name}"
  display_name = "${local.prefix}${var.name}"
  description  = var.description == null ? "Terraform Managed." : "Terraform Managed. ${var.description}"
  labels       = var.labels

  data {
    resource = var.data.resource
    entity   = var.data.entity
  }

  execution_spec {
    field = var.incremental_field
    trigger {
      dynamic "on_demand" {
        for_each = var.execution_schedule == null ? [""] : []
        content {
        }
      }
      dynamic "schedule" {
        for_each = var.execution_schedule != null ? [""] : []
        content {
          cron = var.execution_schedule
        }
      }
    }
  }

  dynamic "data_profile_spec" {
    for_each = var.data_profile_spec != null ? [""] : []
    content {
      sampling_percent = try(var.data_profile_spec.sampling_percent, null)
      row_filter       = try(var.data_profile_spec.row_filter, null)
    }
  }

  dynamic "data_quality_spec" {
    for_each = local.data_quality_spec != null ? [""] : []
    content {
      sampling_percent = try(local.data_quality_spec.sampling_percent, null)
      row_filter       = try(local.data_quality_spec.row_filter, null)
      dynamic "rules" {
        for_each = local.data_quality_spec.rules
        content {
          column      = try(rules.value.column, null)
          ignore_null = try(rules.value.ignore_null, null)
          dimension   = rules.value.dimension
          threshold   = try(rules.value.threshold, null)

          dynamic "non_null_expectation" {
            for_each = try(rules.value.non_null_expectation, null) != null ? [""] : []
            content {
            }
          }

          dynamic "range_expectation" {
            for_each = try(rules.value.range_expectation, null) != null ? [""] : []
            content {
              min_value          = try(rules.value.range_expectation.min_value, null)
              max_value          = try(rules.value.range_expectation.max_value, null)
              strict_min_enabled = try(rules.value.range_expectation.strict_min_enabled, null)
              strict_max_enabled = try(rules.value.range_expectation.strict_max_enabled, null)
            }
          }

          dynamic "set_expectation" {
            for_each = try(rules.value.set_expectation, null) != null ? [""] : []
            content {
              values = rules.value.set_expectation.values
            }
          }

          dynamic "uniqueness_expectation" {
            for_each = try(rules.value.uniqueness_expectation, null) != null ? [""] : []
            content {
            }
          }

          dynamic "regex_expectation" {
            for_each = try(rules.value.regex_expectation, null) != null ? [""] : []
            content {
              regex = rules.value.regex_expectation.regex
            }
          }

          dynamic "statistic_range_expectation" {
            for_each = try(rules.value.statistic_range_expectation, null) != null ? [""] : []
            content {
              min_value          = try(rules.value.statistic_range_expectation.min_value, null)
              max_value          = try(rules.value.statistic_range_expectation.max_value, null)
              strict_min_enabled = try(rules.value.statistic_range_expectation.strict_min_enabled, null)
              strict_max_enabled = try(rules.value.statistic_range_expectation.strict_max_enabled, null)
              statistic          = rules.value.statistic_range_expectation.statistic
            }
          }

          dynamic "row_condition_expectation" {
            for_each = try(rules.value.row_condition_expectation, null) != null ? [""] : []
            content {
              sql_expression = rules.value.row_condition_expectation.sql_expression
            }
          }

          dynamic "table_condition_expectation" {
            for_each = try(rules.value.table_condition_expectation, null) != null ? [""] : []
            content {
              sql_expression = rules.value.table_condition_expectation.sql_expression
            }
          }

        }
      }
    }
  }

  lifecycle {
    precondition {
      condition     = length([for spec in [var.data_profile_spec, var.data_quality_spec, var.data_quality_spec_file] : spec if spec != null]) == 1
      error_message = "DataScan can only contain one of 'data_profile_spec', 'data_quality_spec', 'data_quality_spec_file'."
    }
    precondition {
      condition = alltrue([
        for rule in try(local.data_quality_spec.rules, []) :
        contains(["COMPLETENESS", "ACCURACY", "CONSISTENCY", "VALIDITY", "UNIQUENESS", "INTEGRITY"], rule.dimension)
      ])
      error_message = "Datascan 'dimension' field in 'data_quality_spec' must be one of ['COMPLETENESS', 'ACCURACY', 'CONSISTENCY', 'VALIDITY', 'UNIQUENESS', 'INTEGRITY']."
    }
    precondition {
      condition = alltrue([
        for rule in try(local.data_quality_spec.rules, []) :
        length([
          for k, v in rule :
          v if contains([
            "non_null_expectation",
            "range_expectation",
            "regex_expectation",
            "set_expectation",
            "uniqueness_expectation",
            "statistic_range_expectation",
            "row_condition_expectation",
            "table_condition_expectation"
          ], k) && v != null
        ]) == 1
      ])
      error_message = "Datascan rule must contain a key that is one of ['non_null_expectation', 'range_expectation', 'regex_expectation', 'set_expectation', 'uniqueness_expectation', 'statistic_range_expectation', 'row_condition_expectation', 'table_condition_expectation']."
    }
  }
}
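As a usage sketch, an inline `data_quality_spec` passing exactly one expectation per rule, which is what the preconditions above enforce (module path and values are illustrative assumptions):

```hcl
module "datascan" {
  source     = "./fabric/modules/dataplex-datascan" # assumed module path
  project_id = "my-project"                         # hypothetical
  region     = "europe-west1"
  name       = "quality-scan"
  prefix     = "test" # data_scan_id becomes "test-quality-scan"
  data = {
    resource = "//bigquery.googleapis.com/projects/my-project/datasets/d/tables/t"
  }
  # omit execution_schedule to get an on_demand trigger instead
  execution_schedule = "TZ=UTC 0 4 * * *"
  data_quality_spec = {
    sampling_percent = 100
    rules = [
      {
        # exactly one expectation per rule, as required by the precondition
        dimension            = "COMPLETENESS"
        column               = "station_id"
        non_null_expectation = {}
      },
      {
        dimension = "VALIDITY"
        column    = "price"
        threshold = 0.9
        range_expectation = {
          min_value = 1
          max_value = 10
        }
      }
    ]
  }
}
```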

@@ -0,0 +1,35 @@
/**
 * Copyright 2023 Google LLC
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

output "data_scan_id" {
  description = "Dataplex DataScan ID."
  value       = google_dataplex_datascan.datascan.data_scan_id
}

output "id" {
  description = "A fully qualified Dataplex DataScan identifier for the resource with format projects/{{project}}/locations/{{location}}/dataScans/{{data_scan_id}}."
  value       = google_dataplex_datascan.datascan.id
}

output "name" {
  description = "The relative resource name of the scan, of the form: projects/{project}/locations/{locationId}/dataScans/{datascan_id}, where project refers to a project_id or project_number and locationId refers to a GCP region."
  value       = google_dataplex_datascan.datascan.name
}

output "type" {
  description = "The type of DataScan."
  value       = google_dataplex_datascan.datascan.type
}

@@ -0,0 +1,54 @@
/**
 * Copyright 2023 Google LLC
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

locals {
  _file_data_quality_spec_raw = var.data_quality_spec_file != null ? yamldecode(file(var.data_quality_spec_file.path)) : tomap({})
  _parsed_rules = [
    for rule in try(local._file_data_quality_spec_raw.rules, []) : {
      column               = try(rule.column, null)
      ignore_null          = try(rule.ignoreNull, rule.ignore_null, null)
      dimension            = rule.dimension
      threshold            = try(rule.threshold, null)
      non_null_expectation = try(rule.nonNullExpectation, rule.non_null_expectation, null)
      range_expectation = can(rule.rangeExpectation) || can(rule.range_expectation) ? {
        min_value          = try(rule.rangeExpectation.minValue, rule.range_expectation.min_value, null)
        max_value          = try(rule.rangeExpectation.maxValue, rule.range_expectation.max_value, null)
        strict_min_enabled = try(rule.rangeExpectation.strictMinEnabled, rule.range_expectation.strict_min_enabled, null)
        strict_max_enabled = try(rule.rangeExpectation.strictMaxEnabled, rule.range_expectation.strict_max_enabled, null)
      } : null
      regex_expectation = can(rule.regexExpectation) || can(rule.regex_expectation) ? {
        regex = try(rule.regexExpectation.regex, rule.regex_expectation.regex, null)
      } : null
      set_expectation = can(rule.setExpectation) || can(rule.set_expectation) ? {
        values = try(rule.setExpectation.values, rule.set_expectation.values, null)
      } : null
      uniqueness_expectation = try(rule.uniquenessExpectation, rule.uniqueness_expectation, null)
      statistic_range_expectation = can(rule.statisticRangeExpectation) || can(rule.statistic_range_expectation) ? {
        statistic          = try(rule.statisticRangeExpectation.statistic, rule.statistic_range_expectation.statistic)
        min_value          = try(rule.statisticRangeExpectation.minValue, rule.statistic_range_expectation.min_value, null)
        max_value          = try(rule.statisticRangeExpectation.maxValue, rule.statistic_range_expectation.max_value, null)
        strict_min_enabled = try(rule.statisticRangeExpectation.strictMinEnabled, rule.statistic_range_expectation.strict_min_enabled, null)
        strict_max_enabled = try(rule.statisticRangeExpectation.strictMaxEnabled, rule.statistic_range_expectation.strict_max_enabled, null)
      } : null
      row_condition_expectation = can(rule.rowConditionExpectation) || can(rule.row_condition_expectation) ? {
        sql_expression = try(rule.rowConditionExpectation.sqlExpression, rule.row_condition_expectation.sql_expression, null)
      } : null
      table_condition_expectation = can(rule.tableConditionExpectation) || can(rule.table_condition_expectation) ? {
        sql_expression = try(rule.tableConditionExpectation.sqlExpression, rule.table_condition_expectation.sql_expression, null)
      } : null
    }
  ]
}
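The `try(camelCase, snake_case, null)` pattern above lets the YAML file use either API-style camelCase keys (e.g. as exported from the Dataplex API) or Terraform-style snake_case. A sketch of wiring a spec file in (module path and file path are hypothetical):

```hcl
module "datascan" {
  source     = "./fabric/modules/dataplex-datascan" # assumed module path
  project_id = "my-project"                         # hypothetical
  region     = "europe-west1"
  name       = "quality-scan-from-file"
  data = {
    entity = "projects/my-project/locations/europe-west1/lakes/l/zones/z/entities/e"
  }
  # The referenced YAML may use either `samplingPercent`/`rowFilter`
  # (camelCase) or `sampling_percent`/`row_filter` (snake_case); both are
  # normalized by the parsing locals above.
  data_quality_spec_file = {
    path = "data_quality_spec.yaml" # hypothetical path
  }
}
```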

@@ -0,0 +1,165 @@
/**
 * Copyright 2023 Google LLC
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

variable "data" {
  description = "The data source for DataScan. The source can be either a Dataplex `entity` or a BigQuery `resource`."
  type = object({
    entity   = optional(string)
    resource = optional(string)
  })
  validation {
    condition     = length([for k, v in var.data : v if contains(["resource", "entity"], k) && v != null]) == 1
    error_message = "Datascan data must specify one of 'entity', 'resource'."
  }
}

variable "data_profile_spec" {
  description = "DataProfileScan related setting. Variable descriptions are provided in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataProfileSpec."
  default     = null
  type = object({
    sampling_percent = optional(number)
    row_filter       = optional(string)
  })
}

variable "data_quality_spec" {
  description = "DataQualityScan related setting. Variable descriptions are provided in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataQualitySpec."
  default     = null
  type = object({
    sampling_percent = optional(number)
    row_filter       = optional(string)
    rules = list(object({
      column               = optional(string)
      ignore_null          = optional(bool, null)
      dimension            = string
      threshold            = optional(number)
      non_null_expectation = optional(object({}))
      range_expectation = optional(object({
        min_value          = optional(number)
        max_value          = optional(number)
        strict_min_enabled = optional(bool)
        strict_max_enabled = optional(bool)
      }))
      regex_expectation = optional(object({
        regex = string
      }))
      set_expectation = optional(object({
        values = list(string)
      }))
      uniqueness_expectation = optional(object({}))
      statistic_range_expectation = optional(object({
        statistic          = string
        min_value          = optional(number)
        max_value          = optional(number)
        strict_min_enabled = optional(bool)
        strict_max_enabled = optional(bool)
      }))
      row_condition_expectation = optional(object({
        sql_expression = string
      }))
      table_condition_expectation = optional(object({
        sql_expression = string
      }))
    }))
  })
}

variable "data_quality_spec_file" {
  description = "Path to a YAML file containing DataQualityScan related setting. Input content can use either camelCase or snake_case. Variable descriptions are provided in https://cloud.google.com/dataplex/docs/reference/rest/v1/DataQualitySpec."
  default     = null
  type = object({
    path = string
  })
}

variable "description" {
  description = "Custom description for DataScan."
  default     = null
  type        = string
}

variable "execution_schedule" {
  description = "Schedule DataScan to run periodically based on a cron schedule expression. If not specified, the DataScan is created with `on_demand` schedule, which means it will not run until the user calls `dataScans.run` API."
  type        = string
  default     = null
}

variable "group_iam" {
  description = "Authoritative IAM binding for organization groups, in {GROUP_EMAIL => [ROLES]} format. Group emails need to be static. Can be used in combination with the `iam` variable."
  type        = map(list(string))
  default     = {}
  nullable    = false
}

variable "iam" {
  description = "Dataplex DataScan IAM bindings in {ROLE => [MEMBERS]} format."
  type        = map(list(string))
  default     = {}
  nullable    = false
}

variable "iam_additive" {
  description = "IAM additive bindings in {ROLE => [MEMBERS]} format."
  type        = map(list(string))
  default     = {}
  nullable    = false
}

variable "iam_additive_members" {
  description = "IAM additive bindings in {MEMBERS => [ROLE]} format. This might break if members are dynamic values."
  type        = map(list(string))
  default     = {}
}

variable "iam_policy" {
  description = "IAM authoritative policy in {ROLE => [MEMBERS]} format. Roles and members not explicitly listed will be cleared, use with extreme caution."
  type        = map(list(string))
  default     = null
}

variable "incremental_field" {
  description = "The unnested field (of type Date or Timestamp) that contains values which monotonically increase over time. If not specified, a data scan will run for all data in the table."
  type        = string
  default     = null
}

variable "labels" {
  description = "Resource labels."
  type        = map(string)
  default     = {}
  nullable    = false
}

variable "name" {
  description = "Name of the Dataplex DataScan."
  type        = string
}

variable "prefix" {
  description = "Optional prefix used to generate the Dataplex DataScan ID."
  type        = string
  default     = null
}

variable "project_id" {
  description = "The ID of the project where the Dataplex DataScan will be created."
  type        = string
}

variable "region" {
  description = "Region for the Dataplex DataScan."
  type        = string
}

@@ -0,0 +1,27 @@
# Copyright 2022 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

terraform {
  required_version = ">= 1.4.4"
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 4.71.0" # tftest
    }
    google-beta = {
      source  = "hashicorp/google-beta"
      version = ">= 4.71.0" # tftest
    }
  }
}

@@ -77,18 +77,42 @@ module "addresses" {
}
# tftest modules=1 resources=2 inventory=psc.yaml
```
<!-- BEGIN TFDOC -->

# IPSec Interconnect addresses

```hcl
module "addresses" {
  source     = "./fabric/modules/net-address"
  project_id = var.project_id
  ipsec_interconnect_addresses = {
    vpn-gw-range-1 = {
      address       = "10.255.255.0"
      region        = var.region
      network       = var.vpc.self_link
      prefix_length = 29
    }
    vpn-gw-range-2 = {
      address       = "10.255.255.8"
      region        = var.region
      network       = var.vpc.self_link
      prefix_length = 29
    }
  }
}
# tftest modules=1 resources=2 inventory=ipsec-interconnect.yaml
```
<!-- BEGIN TFDOC -->
## Variables

| name | description | type | required | default |
|---|---|:---:|:---:|:---:|
| [project_id](variables.tf#L55) | Project where the addresses will be created. | <code>string</code> | ✓ | |
| [project_id](variables.tf#L67) | Project where the addresses will be created. | <code>string</code> | ✓ | |
| [external_addresses](variables.tf#L17) | Map of external address regions, keyed by name. | <code>map(string)</code> | | <code>{}</code> |
| [global_addresses](variables.tf#L29) | List of global addresses to create. | <code>list(string)</code> | | <code>[]</code> |
| [internal_addresses](variables.tf#L35) | Map of internal addresses to create, keyed by name. | <code title="map(object({ region = string subnetwork = string address = optional(string) description = optional(string, "Terraform managed.") labels = optional(map(string)) purpose = optional(string) tier = optional(string) }))">map(object({…}))</code> | | <code>{}</code> |
| [psa_addresses](variables.tf#L60) | Map of internal addresses used for Private Service Access. | <code title="map(object({ address = string network = string description = optional(string, "Terraform managed.") prefix_length = number }))">map(object({…}))</code> | | <code>{}</code> |
| [psc_addresses](variables.tf#L71) | Map of internal addresses used for Private Service Connect. | <code title="map(object({ address = string network = string description = optional(string, "Terraform managed.") }))">map(object({…}))</code> | | <code>{}</code> |
| [ipsec_interconnect_addresses](variables.tf#L49) | Map of internal addresses used for HA VPN over Cloud Interconnect. | <code title="map(object({ region = string address = string network = string description = optional(string, "Terraform managed.") prefix_length = number }))">map(object({…}))</code> | | <code>{}</code> |
| [psa_addresses](variables.tf#L72) | Map of internal addresses used for Private Service Access. | <code title="map(object({ address = string network = string description = optional(string, "Terraform managed.") prefix_length = number }))">map(object({…}))</code> | | <code>{}</code> |
| [psc_addresses](variables.tf#L83) | Map of internal addresses used for Private Service Connect. | <code title="map(object({ address = string network = string description = optional(string, "Terraform managed.") }))">map(object({…}))</code> | | <code>{}</code> |

## Outputs

@@ -97,7 +121,7 @@ module "addresses" {
| [external_addresses](outputs.tf#L17) | Allocated external addresses. | |
| [global_addresses](outputs.tf#L25) | Allocated global external addresses. | |
| [internal_addresses](outputs.tf#L33) | Allocated internal addresses. | |
| [psa_addresses](outputs.tf#L41) | Allocated internal addresses for PSA endpoints. | |
| [psc_addresses](outputs.tf#L49) | Allocated internal addresses for PSC endpoints. | |

| [ipsec_interconnect_addresses](outputs.tf#L41) | Allocated internal addresses for HA VPN over Cloud Interconnect. | |
| [psa_addresses](outputs.tf#L49) | Allocated internal addresses for PSA endpoints. | |
| [psc_addresses](outputs.tf#L57) | Allocated internal addresses for PSC endpoints. | |
<!-- END TFDOC -->

@@ -69,3 +69,16 @@ resource "google_compute_global_address" "psa" {
  purpose = "VPC_PEERING"
  # labels = lookup(var.internal_address_labels, each.key, {})
}

resource "google_compute_address" "ipsec_interconnect" {
  for_each      = var.ipsec_interconnect_addresses
  project       = var.project_id
  name          = each.key
  description   = each.value.description
  address       = each.value.address
  address_type  = "INTERNAL"
  region        = each.value.region
  network       = each.value.network
  prefix_length = each.value.prefix_length
  purpose       = "IPSEC_INTERCONNECT"
}

@@ -38,6 +38,14 @@ output "internal_addresses" {
  }
}

output "ipsec_interconnect_addresses" {
  description = "Allocated internal addresses for HA VPN over Cloud Interconnect."
  value = {
    for address in google_compute_address.ipsec_interconnect :
    address.name => address
  }
}

output "psa_addresses" {
  description = "Allocated internal addresses for PSA endpoints."
  value = {

@@ -52,4 +60,4 @@ output "psc_addresses" {
    for address in google_compute_global_address.psc :
    address.name => address
  }
}
}

@@ -46,6 +46,18 @@ variable "internal_addresses" {
  default = {}
}

variable "ipsec_interconnect_addresses" {
  description = "Map of internal addresses used for HA VPN over Cloud Interconnect."
  type = map(object({
    region        = string
    address       = string
    network       = string
    description   = optional(string, "Terraform managed.")
    prefix_length = number
  }))
  default = {}
}

# variable "internal_address_labels" {
#   description = "Optional labels for internal addresses, keyed by address name."
#   type        = map(map(string))

@@ -76,4 +88,4 @@ variable "psc_addresses" {
    description = optional(string, "Terraform managed.")
  }))
  default = {}
}
}

@@ -28,10 +28,9 @@ resource "google_compute_router" "encrypted-interconnect-overlay-router" {
}

resource "google_compute_external_vpn_gateway" "default" {
  name            = "peer-vpn-gateway"
  project         = "myproject"
  description     = "Peer IPSec over Interconnect VPN gateway"
  redundancy_type = "TWO_IPS_REDUNDANCY"
  name        = "peer-vpn-gateway"
  project     = "myproject"
  description = "Peer IPSec over Interconnect VPN gateway"
  interface {
    id         = 0
    ip_address = "10.0.0.1"

@@ -58,7 +57,7 @@ module "vpngw-a" {
  }
  router_config = {
    create = false
    name   = google_compute_router.encrypted-interconnect-overlay-router.id
    name   = google_compute_router.encrypted-interconnect-overlay-router.name
  }
  tunnels = {
    remote-0 = {

@@ -102,7 +101,6 @@ module "vpngw-a" {
# tftest modules=1 resources=16
```
<!-- BEGIN TFDOC -->

## Variables

| name | description | type | required | default |

@@ -110,11 +108,11 @@ module "vpngw-a" {
| [interconnect_attachments](variables.tf#L17) | VLAN attachments used by the VPN Gateway. | <code title="object({ a = string b = string })">object({…})</code> | ✓ | |
| [name](variables.tf#L25) | Common name to identify the VPN Gateway. | <code>string</code> | ✓ | |
| [network](variables.tf#L30) | The VPC name to which resources are associated to. | <code>string</code> | ✓ | |
| [peer_gateway_config](variables.tf#L35) | IP addresses for the external peer gateway. | <code title="object({ create = optional(bool, false) description = optional(string, "Terraform managed IPSec over Interconnect VPN gateway") name = optional(string, null) id = optional(string, null) redundancy_type = optional(string) interfaces = optional(list(string)) })">object({…})</code> | ✓ | |
| [project_id](variables.tf#L55) | The project id. | <code>string</code> | ✓ | |
| [region](variables.tf#L60) | GCP Region. | <code>string</code> | ✓ | |
| [router_config](variables.tf#L65) | Cloud Router configuration for the VPN. If you want to reuse an existing router, set create to false and use name to specify the desired router. | <code title="object({ create = optional(bool, true) asn = optional(number) name = optional(string) keepalive = optional(number) custom_advertise = optional(object({ all_subnets = bool ip_ranges = map(string) })) })">object({…})</code> | ✓ | |
| [tunnels](variables.tf#L80) | VPN tunnel configurations. | <code title="map(object({ bgp_peer = object({ address = string asn = number route_priority = optional(number, 1000) custom_advertise = optional(object({ all_subnets = bool all_vpc_subnets = bool all_peer_vpc_subnets = bool ip_ranges = map(string) })) }) bgp_session_range = string ike_version = optional(number, 2) peer_external_gateway_interface = optional(number) peer_gateway_id = optional(string, "default") router = optional(string) shared_secret = optional(string) vpn_gateway_interface = number }))">map(object({…}))</code> | | <code>{}</code> |
| [peer_gateway_config](variables.tf#L35) | IP addresses for the external peer gateway. | <code title="object({ create = optional(bool, false) description = optional(string, "Terraform managed IPSec over Interconnect VPN gateway") name = optional(string, null) id = optional(string, null) interfaces = optional(list(string), []) })">object({…})</code> | ✓ | |
| [project_id](variables.tf#L54) | The project id. | <code>string</code> | ✓ | |
| [region](variables.tf#L59) | GCP Region. | <code>string</code> | ✓ | |
| [router_config](variables.tf#L64) | Cloud Router configuration for the VPN. If you want to reuse an existing router, set create to false and use name to specify the desired router. | <code title="object({ create = optional(bool, true) asn = optional(number) name = optional(string) keepalive = optional(number) custom_advertise = optional(object({ all_subnets = bool ip_ranges = map(string) })) })">object({…})</code> | ✓ | |
| [tunnels](variables.tf#L79) | VPN tunnel configurations. | <code title="map(object({ bgp_peer = object({ address = string asn = number route_priority = optional(number, 1000) custom_advertise = optional(object({ all_subnets = bool all_vpc_subnets = bool all_peer_vpc_subnets = bool ip_ranges = map(string) })) }) bgp_session_range = string ike_version = optional(number, 2) peer_external_gateway_interface = optional(number) peer_gateway_id = optional(string, "default") router = optional(string) shared_secret = optional(string) vpn_gateway_interface = number }))">map(object({…}))</code> | | <code>{}</code> |

## Outputs

@@ -128,5 +126,4 @@ module "vpngw-a" {
| [router_name](outputs.tf#L45) | Router name. | |
| [self_link](outputs.tf#L50) | HA VPN gateway self link. | |
| [tunnels](outputs.tf#L55) | VPN tunnel resources. | |

<!-- END TFDOC -->

@@ -15,11 +15,6 @@
 */

locals {
  peer_gateway = (
    var.peer_gateway_config.create
    ? try(google_compute_external_vpn_gateway.default[0], null)
    : var.peer_gateway_config
  )
  peer_gateway_id = (
    var.peer_gateway_config.create
    ? try(google_compute_external_vpn_gateway.default[0].id, null)

@@ -35,7 +30,7 @@ locals {
}

resource "google_compute_ha_vpn_gateway" "default" {
  name    = var.name
  name    = "vpn-gw-${var.name}"
  network = var.network
  project = var.project_id
  region  = var.region

@@ -51,10 +46,10 @@ resource "google_compute_ha_vpn_gateway" "default" {

resource "google_compute_external_vpn_gateway" "default" {
  count           = var.peer_gateway_config.create ? 1 : 0
  name            = var.name
  name            = coalesce(var.peer_gateway_config.name, "peer-vpn-gw-${var.name}")
  project         = var.project_id
  description     = var.peer_gateway_config.description
  redundancy_type = var.peer_gateway_config.redundancy_type
  redundancy_type = length(var.peer_gateway_config.interfaces) == 2 ? "TWO_IPS_REDUNDANCY" : "SINGLE_IP_INTERNALLY_REDUNDANT"
  dynamic "interface" {
    for_each = var.peer_gateway_config.interfaces
    content {

@@ -66,7 +61,7 @@ resource "google_compute_external_vpn_gateway" "default" {
|
||||
resource "google_compute_router" "default" {
|
||||
count = var.router_config.create ? 1 : 0
|
||||
name = coalesce(var.router_config.name, "vpn-${var.name}")
|
||||
name = coalesce(var.router_config.name, "router-${var.name}")
|
||||
project = var.project_id
|
||||
region = var.region
|
||||
network = var.network
|
||||
|
|
|
@@ -24,7 +24,7 @@ output "bgp_peers" {

output "external_gateway" {
  description = "External VPN gateway resource."
-  value       = local.peer_gateway
+  value       = try(google_compute_external_vpn_gateway.default[0], null)
}

output "id" {
@@ -35,20 +35,19 @@ variable "network" {

variable "peer_gateway_config" {
  description = "IP addresses for the external peer gateway."
  type = object({
-    create          = optional(bool, false)
-    description     = optional(string, "Terraform managed IPSec over Interconnect VPN gateway")
-    name            = optional(string, null)
-    id              = optional(string, null)
-    redundancy_type = optional(string)
-    interfaces      = optional(list(string))
+    create      = optional(bool, false)
+    description = optional(string, "Terraform managed IPSec over Interconnect VPN gateway")
+    name        = optional(string, null)
+    id          = optional(string, null)
+    interfaces  = optional(list(string), [])
  })
  nullable = false
  validation {
    condition = anytrue([
      var.peer_gateway_config.create == false && var.peer_gateway_config.id != null,
-      var.peer_gateway_config.create == true && try(var.peer_gateway_config.redundancy_type, "") == "SINGLE_IP_INTERNALLY_REDUNDANT" && try(length(var.peer_gateway_config.interfaces) == 1, false),
-      var.peer_gateway_config.create == true && try(var.peer_gateway_config.redundancy_type, "") == "TWO_IPS_REDUNDANCY" && try(length(var.peer_gateway_config.interfaces) == 2, false),
+      var.peer_gateway_config.create == true && (try(length(var.peer_gateway_config.interfaces) == 1, false) || try(length(var.peer_gateway_config.interfaces) == 2, false))
    ])
-    error_message = "When using an existing gateway, an ID must be provided. SINGLE_IP_INTERNALLY_REDUNDANT requires exactly 1 interface, TWO_IPS_REDUNDANCY requires exactly 2."
+    error_message = "When using an existing gateway, an ID must be provided. When not, the gateway can have one or two interfaces."
  }
}
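The new validation reduces to a small predicate; as a minimal sketch, the same logic in Python (the function name and arguments are illustrative, not part of the module):

```python
def peer_gateway_config_is_valid(create, gateway_id, interfaces):
    """Mirrors the variable validation: reusing an existing gateway
    requires an id; creating one requires one or two interface IPs."""
    if not create:
        return gateway_id is not None
    return len(interfaces or []) in (1, 2)

# An existing gateway referenced by id passes; a managed gateway
# with no interfaces fails.
print(peer_gateway_config_is_valid(False, "gw-id", []))  # True
print(peer_gateway_config_is_valid(True, None, []))      # False
```

Note that, with `redundancy_type` dropped from the variable, the redundancy mode is now derived purely from the interface count in `main.tf`.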
@@ -0,0 +1,116 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name       = "datascan"
prefix     = "test"
project_id = "test-project"
region     = "us-central1"
labels = {
  billing_id = "a"
}
iam = {
  "roles/dataplex.dataScanViewer" = [
    "user:user@example.com"
  ],
  "roles/dataplex.dataScanEditor" = [
    "user:user@example.com",
  ]
}
group_iam = {
  "user-group@example.com" = [
    "roles/dataplex.dataScanEditor"
  ]
}
execution_schedule = "TZ=America/New_York 1 1 * * *"
data = {
  resource = "//bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations"
}
description       = "Custom description."
incremental_field = "modified_date"
data_quality_spec = {
  sampling_percent = 100
  row_filter       = "station_id > 1000"
  rules = [
    {
      dimension            = "VALIDITY"
      non_null_expectation = {}
      column               = "address"
      threshold            = 0.99
    },
    {
      column      = "council_district"
      dimension   = "VALIDITY"
      ignore_null = true
      threshold   = 0.9
      range_expectation = {
        min_value          = 1
        max_value          = 10
        strict_min_enabled = true
        strict_max_enabled = false
      }
    },
    {
      column    = "council_district"
      dimension = "VALIDITY"
      threshold = 0.8
      range_expectation = {
        min_value = 3
      }
    },
    {
      column      = "power_type"
      dimension   = "VALIDITY"
      ignore_null = false
      regex_expectation = {
        regex = ".*solar.*"
      }
    },
    {
      column      = "property_type"
      dimension   = "VALIDITY"
      ignore_null = false
      set_expectation = {
        values = ["sidewalk", "parkland"]
      }
    },
    {
      column                 = "address"
      dimension              = "UNIQUENESS"
      uniqueness_expectation = {}
    },
    {
      column    = "number_of_docks"
      dimension = "VALIDITY"
      statistic_range_expectation = {
        statistic          = "MEAN"
        min_value          = 5
        max_value          = 15
        strict_min_enabled = true
        strict_max_enabled = true
      }
    },
    {
      column    = "footprint_length"
      dimension = "VALIDITY"
      row_condition_expectation = {
        sql_expression = "footprint_length > 0 AND footprint_length <= 10"
      }
    },
    {
      dimension = "VALIDITY"
      table_condition_expectation = {
        sql_expression = "COUNT(*) > 0"
      }
    }
  ]
}
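The range rules above read as simple predicates over sampled rows. As an illustrative sketch only (Dataplex evaluates rules server-side; the function below is made up for explanation), a `range_expectation` like the second `council_district` rule could be modeled as:

```python
def passes_range(value, min_value=None, max_value=None,
                 strict_min=False, strict_max=False):
    """Value must lie inside [min, max]; strict bounds exclude equality."""
    if min_value is not None and (value < min_value or (strict_min and value == min_value)):
        return False
    if max_value is not None and (value > max_value or (strict_max and value == max_value)):
        return False
    return True

# Rule: min 1 (strict), max 10, threshold 0.9 — the rule passes when
# the fraction of passing rows meets the threshold.
rows = [2, 5, 10, 1]
ratio = sum(passes_range(v, 1, 10, strict_min=True) for v in rows) / len(rows)
print(ratio)  # 0.75 — below the 0.9 threshold, so this sample would fail
```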
@@ -0,0 +1,195 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

values:
  google_dataplex_datascan.datascan:
    data:
    - entity: null
      resource: //bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations
    data_profile_spec: []
    data_quality_spec:
    - row_filter: station_id > 1000
      rules:
      - column: address
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation:
        - {}
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: 0.99
        uniqueness_expectation: []
      - column: council_district
        dimension: VALIDITY
        ignore_null: true
        non_null_expectation: []
        range_expectation:
        - max_value: '10'
          min_value: '1'
          strict_max_enabled: false
          strict_min_enabled: true
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: 0.9
        uniqueness_expectation: []
      - column: council_district
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation:
        - max_value: null
          min_value: '3'
          strict_max_enabled: false
          strict_min_enabled: false
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: 0.8
        uniqueness_expectation: []
      - column: power_type
        dimension: VALIDITY
        ignore_null: false
        non_null_expectation: []
        range_expectation: []
        regex_expectation:
        - regex: .*solar.*
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: property_type
        dimension: VALIDITY
        ignore_null: false
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation:
        - values:
          - sidewalk
          - parkland
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: address
        dimension: UNIQUENESS
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation:
        - {}
      - column: number_of_docks
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation:
        - max_value: '15'
          min_value: '5'
          statistic: MEAN
          strict_max_enabled: true
          strict_min_enabled: true
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: footprint_length
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation:
        - sql_expression: footprint_length > 0 AND footprint_length <= 10
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: null
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation:
        - sql_expression: COUNT(*) > 0
        threshold: null
        uniqueness_expectation: []
      sampling_percent: 100
    data_scan_id: test-datascan
    description: Terraform Managed. Custom description.
    display_name: test-datascan
    execution_spec:
    - field: modified_date
      trigger:
      - on_demand: []
        schedule:
        - cron: TZ=America/New_York 1 1 * * *
    labels:
      billing_id: a
    location: us-central1
    project: test-project
    timeouts: null
  google_dataplex_datascan_iam_binding.authoritative_for_role["roles/dataplex.dataScanEditor"]:
    condition: []
    data_scan_id: test-datascan
    location: us-central1
    members:
    - group:user-group@example.com
    - user:user@example.com
    project: test-project
    role: roles/dataplex.dataScanEditor
  google_dataplex_datascan_iam_binding.authoritative_for_role["roles/dataplex.dataScanViewer"]:
    condition: []
    data_scan_id: test-datascan
    location: us-central1
    members:
    - user:user@example.com
    project: test-project
    role: roles/dataplex.dataScanViewer

counts:
  google_dataplex_datascan: 1
  google_dataplex_datascan_iam_binding: 2
  modules: 0
  resources: 3

outputs:
  data_scan_id: test-datascan
  id: __missing__
  name: __missing__
  type: __missing__
@@ -0,0 +1,42 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

values:
  module.dataplex-datascan.google_dataplex_datascan.datascan:
    data:
    - entity: null
      resource: //bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations
    data_profile_spec:
    - row_filter: null
      sampling_percent: null
    data_quality_spec: []
    data_scan_id: test-datascan
    description: Terraform Managed.
    display_name: test-datascan
    execution_spec:
    - field: null
      trigger:
      - on_demand: []
        schedule:
        - cron: TZ=America/New_York 0 1 * * *
    labels: null
    location: us-central1
    project: my-project-name
    timeouts: null

counts:
  google_dataplex_datascan: 1
  modules: 1
  resources: 1

outputs: {}
@@ -0,0 +1,173 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

values:
  module.dataplex-datascan.google_dataplex_datascan.datascan:
    data:
    - entity: null
      resource: //bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations
    data_profile_spec: []
    data_quality_spec:
    - row_filter: station_id > 1000
      rules:
      - column: address
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation:
        - {}
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: 0.99
        uniqueness_expectation: []
      - column: council_district
        dimension: VALIDITY
        ignore_null: true
        non_null_expectation: []
        range_expectation:
        - max_value: '10'
          min_value: '1'
          strict_max_enabled: false
          strict_min_enabled: true
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: 0.9
        uniqueness_expectation: []
      - column: council_district
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation:
        - max_value: '9'
          min_value: '3'
          strict_max_enabled: false
          strict_min_enabled: false
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: 0.8
        uniqueness_expectation: []
      - column: power_type
        dimension: VALIDITY
        ignore_null: false
        non_null_expectation: []
        range_expectation: []
        regex_expectation:
        - regex: .*solar.*
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: property_type
        dimension: VALIDITY
        ignore_null: false
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation:
        - values:
          - sidewalk
          - parkland
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: address
        dimension: UNIQUENESS
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation:
        - {}
      - column: number_of_docks
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation:
        - max_value: '15'
          min_value: '5'
          statistic: MEAN
          strict_max_enabled: true
          strict_min_enabled: true
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: footprint_length
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation:
        - sql_expression: footprint_length > 0 AND footprint_length <= 10
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation: []
        threshold: null
        uniqueness_expectation: []
      - column: null
        dimension: VALIDITY
        ignore_null: null
        non_null_expectation: []
        range_expectation: []
        regex_expectation: []
        row_condition_expectation: []
        set_expectation: []
        statistic_range_expectation: []
        table_condition_expectation:
        - sql_expression: COUNT(*) > 0
        threshold: null
        uniqueness_expectation: []
      sampling_percent: 100
    data_scan_id: test-datascan
    description: Terraform Managed.
    display_name: test-datascan
    execution_spec:
    - field: modified_date
      trigger:
      - on_demand: []
        schedule:
        - cron: TZ=America/New_York 0 1 * * *
    labels:
      billing_id: a
    location: us-central1
    project: my-project-name
    timeouts: null

counts:
  google_dataplex_datascan: 1
  modules: 1
  resources: 1

outputs: {}
@@ -0,0 +1,41 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

values:
  module.dataplex-datascan.google_dataplex_datascan.datascan:
    data:
    - entity: projects/<project_number>/locations/<locationId>/lakes/<lakeId>/zones/<zoneId>/entities/<entityId>
      resource: null
    data_profile_spec:
    - row_filter: null
      sampling_percent: null
    data_quality_spec: []
    data_scan_id: test-datascan
    description: Terraform Managed.
    display_name: test-datascan
    execution_spec:
    - field: null
      trigger:
      - on_demand:
        - {}
        schedule: []
    location: us-central1
    project: my-project-name
    timeouts: null

counts:
  google_dataplex_datascan: 1
  modules: 1
  resources: 1

outputs: {}
@@ -0,0 +1,67 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

values:
  module.dataplex-datascan.google_dataplex_datascan.datascan:
    data:
    - entity: null
      resource: //bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations
    data_profile_spec:
    - row_filter: null
      sampling_percent: null
    data_quality_spec: []
    data_scan_id: test-datascan
    description: Terraform Managed.
    display_name: test-datascan
    execution_spec:
    - field: null
      trigger:
      - on_demand:
        - {}
        schedule: []
    labels: null
    location: us-central1
    project: my-project-name
    timeouts: null
  module.dataplex-datascan.google_dataplex_datascan_iam_binding.authoritative_for_role["roles/dataplex.dataScanAdmin"]:
    condition: []
    data_scan_id: test-datascan
    location: us-central1
    members:
    - serviceAccount:svc-1@project-id.iam.gserviceaccount.com
    project: my-project-name
    role: roles/dataplex.dataScanAdmin
  module.dataplex-datascan.google_dataplex_datascan_iam_binding.authoritative_for_role["roles/dataplex.dataScanEditor"]:
    condition: []
    data_scan_id: test-datascan
    location: us-central1
    members:
    - user:admin-user@example.com
    project: my-project-name
    role: roles/dataplex.dataScanEditor
  module.dataplex-datascan.google_dataplex_datascan_iam_binding.authoritative_for_role["roles/dataplex.dataScanViewer"]:
    condition: []
    data_scan_id: test-datascan
    location: us-central1
    members:
    - group:user-group@example.com
    project: my-project-name
    role: roles/dataplex.dataScanViewer

counts:
  google_dataplex_datascan: 1
  google_dataplex_datascan_iam_binding: 3
  modules: 1
  resources: 4

outputs: {}
@@ -0,0 +1,43 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

values:
  module.dataplex-datascan.google_dataplex_datascan.datascan:
    data:
    - entity: null
      resource: //bigquery.googleapis.com/projects/bigquery-public-data/datasets/austin_bikeshare/tables/bikeshare_stations
    data_profile_spec:
    - row_filter: station_id > 1000
      sampling_percent: 100
    data_quality_spec: []
    data_scan_id: test-datascan
    description: Terraform Managed.
    display_name: test-datascan
    execution_spec:
    - field: modified_date
      trigger:
      - on_demand:
        - {}
        schedule: []
    labels:
      billing_id: a
    location: us-central1
    project: my-project-name
    timeouts: null

counts:
  google_dataplex_datascan: 1
  modules: 1
  resources: 1

outputs: {}
@@ -0,0 +1,16 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

module: modules/dataplex-datascan
tests:
  datascan_test_inputs:
@@ -0,0 +1,36 @@
# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

values:
  module.addresses.google_compute_address.ipsec_interconnect["vpn-gw-range-1"]:
    address: 10.255.255.0
    address_type: INTERNAL
    name: vpn-gw-range-1
    network: projects/xxx/global/networks/aaa
    prefix_length: 29
    project: project-id
    purpose: IPSEC_INTERCONNECT
    region: region
  module.addresses.google_compute_address.ipsec_interconnect["vpn-gw-range-2"]:
    address: 10.255.255.8
    address_type: INTERNAL
    name: vpn-gw-range-2
    network: projects/xxx/global/networks/aaa
    prefix_length: 29
    project: project-id
    purpose: IPSEC_INTERCONNECT
    region: region

counts:
  google_compute_address: 2