update changelog, readmes

Lorenzo Caggioni 2020-09-17 17:38:27 +02:00
parent c5b29f3923
commit 6362185a00
4 changed files with 12 additions and 7 deletions

CHANGELOG.md

@@ -3,6 +3,7 @@
 All notable changes to this project will be documented in this file.
 ## [Unreleased]
+- end to end example: `Scheduled Cloud Asset Inventory Export to Bigquery`
 ## [3.1.1] - 2020-08-26
 - fix error in `project` module

README.md

@@ -19,7 +19,7 @@ Currently available examples:
 - **foundations** - [single level hierarchy](./foundations/environments/) (environments), [multiple level hierarchy](./foundations/business-units/) (business units + environments)
 - **networking** - [hub and spoke via peering](./networking/hub-and-spoke-peering/), [hub and spoke via VPN](./networking/hub-and-spoke-vpn/), [DNS and Google Private Access for on-premises](./networking/onprem-google-access-dns/), [Shared VPC with GKE support](./networking/shared-vpc-gke/), [ILB as next hop](./networking/ilb-next-hop)
 - **data solutions** - [GCE/GCS CMEK via centralized Cloud KMS](./data-solutions/cmek-via-centralized-kms/), [Cloud Storage to Bigquery with Cloud Dataflow](./data-solutions/gcs-to-bq-with-dataflow/)
-- **cloud operations** - [Resource tracking and remediation via Cloud Asset feeds](./cloud-operations/asset-inventory-feed-remediation), [Granular Cloud DNS IAM via Service Directory](./cloud-operations/dns-fine-grained-iam)
+- **cloud operations** - [Resource tracking and remediation via Cloud Asset feeds](./cloud-operations/asset-inventory-feed-remediation), [Scheduled Cloud Asset Inventory Export to Bigquery](./cloud-operations/scheduled-asset-inventory-export-bq), [Granular Cloud DNS IAM via Service Directory](./cloud-operations/dns-fine-grained-iam)
 For more information see the README files in the [foundations](./foundations/), [networking](./networking/), [data solutions](./data-solutions/) and [cloud operations](./cloud-operations/) folders.

cloud-operations/README.md

@@ -10,6 +10,12 @@ The example's feed tracks changes to Google Compute instances, and the Cloud Fun
 <br clear="left">
+## Scheduled Cloud Asset Inventory Export to Bigquery
+<a href="./scheduled-asset-inventory-export-bq" title="Scheduled Cloud Asset Inventory Export to Bigquery"><img src="./scheduled-asset-inventory-export-bq/diagram.png" align="left" width="280px"></a> This [example](./scheduled-asset-inventory-export-bq) shows how to leverage the [Cloud Asset Inventory export to Bigquery](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery) feature to keep track of your organization-wide assets over time, storing the information in Bigquery. The data stored in Bigquery can then be used for different purposes, for example dashboarding and analysis.
+<br clear="left">
 ## Granular Cloud DNS IAM via Service Directory
 <a href="./dns-fine-grained-iam" title="Fine-grained Cloud DNS IAM with Service Directory"><img src="./dns-fine-grained-iam/diagram.png" align="left" width="280px"></a> This [example](./dns-fine-grained-iam) shows how to leverage [Service Directory](https://cloud.google.com/blog/products/networking/introducing-service-directory) and Cloud DNS Service Directory private zones to implement fine-grained IAM controls on DNS. The example creates a Service Directory namespace, a Cloud DNS private zone that uses it as its authoritative source, service accounts with different levels of permissions, and VMs to test them.

cloud-operations/scheduled-asset-inventory-export-bq/README.md

@@ -7,19 +7,18 @@ The data stored in Bigquery can then be used for different purposes:
 - dashboarding
 - analysis
-This example shows a export to Bigquery scheduled on a daily basis.
+This example shows an export to Bigquery scheduled on a daily basis.
 The resources created in this example are shown in the high level diagram below:
 <img src="diagram.png" width="640px">
-## Running the example
-### Prerequisites
+## Prerequisites
 Ensure that you grant your account one of the following roles on your project, folder, or organization:
 - Cloud Asset Viewer role (roles/cloudasset.viewer)
 - Owner primitive role (roles/owner)
+## Running the example
 Clone this repository, specify your variables in a `terraform.tfvars` file, and then go through the following steps to create resources:
 - `terraform init`
@@ -29,7 +28,7 @@ Once done testing, you can clean up resources by running `terraform destroy`. To
 ## Testing the example
-You can now run queries on the data you exported on Bigquery. Here you can find some explample of queries you can run.
+You can now run queries on the data you exported to Bigquery. [Here](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery#querying_an_asset_snapshot) you can find some examples of queries you can run.
 You can also create a dashboard by connecting [Data Studio](https://datastudio.google.com/) or any other BI tool of your choice to your Bigquery dataset.
 <!-- BEGIN TFDOC -->
@@ -51,4 +50,3 @@ You can also create a dashboard by connecting [Data Studio](https://datastudio.google.c
 | bq-dataset | Bigquery instance details. | |
 | cloud-function | Cloud Function details. | |
 <!-- END TFDOC -->