# Scheduled Cloud Asset Inventory Export to BigQuery

This example shows how to leverage the [Cloud Asset Inventory export to BigQuery](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery) feature to keep track of your project-wide assets over time, storing the information in BigQuery.
The data stored in BigQuery can then be used for different purposes:

- dashboarding
- analysis
The example exports resources at the project level for ease of testing; in actual use, a few changes are needed to operate at the resource hierarchy level:
- the export should be set at the folder or organization level
- the `roles/cloudasset.viewer` role should be granted to the service account at the folder or organization level
The resources created in this example are shown in the high-level diagram below:

<img src="diagram.png" width="640px">
## Prerequisites
Ensure that you grant your account one of the following roles on your project, folder, or organization:
- Cloud Asset Viewer role (`roles/cloudasset.viewer`)
- Owner primitive role (`roles/owner`)
## Running the example
Clone this repository, specify your variables in a `terraform.tfvars` file, then go through the following steps to create resources:

- `terraform init`
- `terraform apply`

Once done testing, you can clean up resources by running `terraform destroy`. To persist state, check out the `backend.tf.sample` file.
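As a minimal sketch, a `terraform.tfvars` for this example could look like the following; the project, dataset, and table names are placeholders, not values from the repository, and the fields mirror the `cai_config` variable documented below:

```hcl
# Hypothetical values; adjust to your environment.
project_id = "my-project-id"

cai_config = {
  bq_dataset         = "my_dataset"
  bq_table           = "my_table"
  bq_table_overwrite = true
  target_node        = "projects/my-project-id"
}
```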
## Testing the example
Once resources are created, you can run queries on the data you exported to BigQuery. [Here](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery#querying_an_asset_snapshot) you can find some example queries you can run.
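For instance, a simple query along the lines of the linked examples counts exported assets per type; the project, dataset, and table names below are placeholders to be replaced with your own:

```sql
-- Count exported assets per asset type (names are placeholders).
SELECT asset_type, COUNT(*) AS asset_count
FROM `my-project-id.my_dataset.my_table`
GROUP BY asset_type
ORDER BY asset_count DESC;
```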
You can also create a dashboard by connecting [Data Studio](https://datastudio.google.com/) or any other BI tool of your choice to your BigQuery dataset.
## File exporter for JSON and CSV

Regular file-based exports of Cloud Asset Inventory may be useful for scale-out network dependency discovery tools like [Planet Exporter](https://github.com/williamchanrico/planet-exporter), or for feeding asset tracking and configuration management workflows. BigQuery supports multiple [export formats](https://cloud.google.com/bigquery/docs/exporting-data#export_formats_and_compression_types), and the provided Cloud Function can upload the resulting objects to a Cloud Storage bucket. Specify `job.DestinationFormat` as defined in the [documentation](https://googleapis.dev/python/bigquery/latest/generated/google.cloud.bigquery.job.DestinationFormat.html), e.g. `NEWLINE_DELIMITED_JSON`.

It helps to create a custom [scheduled query](https://cloud.google.com/bigquery/docs/scheduling-queries#console) to match downstream systems' expected fields, and to time it with the CAI export into BigQuery for freshness. See the [sample queries](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery-sample-queries).

Note: the Cloud Function's service account needs a write-capable IAM role on `bucket`.
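As a rough, stdlib-only sketch of the mapping involved, a helper like the one below builds an extract destination from `file_config`-style fields. The function name and structure are hypothetical, not taken from the Cloud Function code; only the `DestinationFormat` string values come from the BigQuery API documentation linked above.

```python
# Hypothetical helper; illustrates how a file exporter might map a
# user-facing format name to a BigQuery job.DestinationFormat value
# and build the Cloud Storage destination URI for the extract job.

# Valid job.DestinationFormat values in the BigQuery API.
DESTINATION_FORMATS = {
    "json": "NEWLINE_DELIMITED_JSON",
    "csv": "CSV",
    "avro": "AVRO",
}

def make_extract_settings(bucket: str, filename: str, fmt: str) -> dict:
    """Return destination URI and format for a BigQuery extract job."""
    if fmt not in DESTINATION_FORMATS:
        raise ValueError(f"unsupported format: {fmt}")
    return {
        "destination_uri": f"gs://{bucket}/{filename}",
        "destination_format": DESTINATION_FORMATS[fmt],
    }
```

A real exporter would pass these values to a BigQuery extract job; the sketch only shows the format selection and URI construction.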
<!-- BEGIN TFDOC -->
## Variables

| name | description | type | required | default |
|---|---|:---:|:---:|:---:|
| cai_config | Cloud Asset Inventory export config. | <code title="object&#40;&#123;&#10;bq_dataset &#61; string&#10;bq_table &#61; string&#10;bq_table_overwrite &#61; bool&#10;target_node &#61; string&#10;&#125;&#41;">object({...})</code> | ✓ | |
| project_id | Project id that references an existing project. | <code title="">string</code> | ✓ | |
| *billing_account* | Billing account id used as default for new projects. | <code title="">string</code> | | <code title="">null</code> |
| *bundle_path* | Path used to write the intermediate Cloud Function code bundle. | <code title="">string</code> | | <code title="">./bundle.zip</code> |
| *bundle_path_cffile* | Path used to write the intermediate Cloud Function code bundle. | <code title="">string</code> | | <code title="">./bundle_cffile.zip</code> |
| *file_config* | Optional BQ table as a file export function config. | <code title="object&#40;&#123;&#10;bucket &#61; string&#10;filename &#61; string&#10;format &#61; string&#10;bq_dataset &#61; string&#10;bq_table &#61; string&#10;&#125;&#41;">object({...})</code> | | <code title="&#123;&#10;bucket &#61; null&#10;filename &#61; null&#10;format &#61; null&#10;bq_dataset &#61; null&#10;bq_table &#61; null&#10;&#125;">...</code> |
| *location* | App Engine location used in the example. | <code title="">string</code> | | <code title="">europe-west</code> |
| *name* | Arbitrary string used to name created resources. | <code title="">string</code> | | <code title="">asset-inventory</code> |
| *name_cffile* | Arbitrary string used to name created resources. | <code title="">string</code> | | <code title="">cffile-exporter</code> |
| *project_create* | Create project instead of using an existing one. | <code title="">bool</code> | | <code title="">true</code> |
| *region* | Compute region used in the example. | <code title="">string</code> | | <code title="">europe-west1</code> |
| *root_node* | The resource name of the parent folder or organization for project creation, in 'folders/folder_id' or 'organizations/org_id' format. | <code title="">string</code> | | <code title="">null</code> |

## Outputs

| name | description | sensitive |
|---|---|:---:|
| bq-dataset | BigQuery dataset details. | |
| cloud-function | Cloud Function instance details. | |
<!-- END TFDOC -->