diff --git a/CHANGELOG.md b/CHANGELOG.md
index cfc68935..6c238e84 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -3,6 +3,7 @@
All notable changes to this project will be documented in this file.
## [Unreleased]
+ - end-to-end example: `Scheduled Cloud Asset Inventory Export to BigQuery`
## [3.1.1] - 2020-08-26
- fix error in `project` module
diff --git a/README.md b/README.md
index b541bfcf..ed81ca43 100644
--- a/README.md
+++ b/README.md
@@ -19,7 +19,7 @@ Currently available examples:
- **foundations** - [single level hierarchy](./foundations/environments/) (environments), [multiple level hierarchy](./foundations/business-units/) (business units + environments)
- **networking** - [hub and spoke via peering](./networking/hub-and-spoke-peering/), [hub and spoke via VPN](./networking/hub-and-spoke-vpn/), [DNS and Google Private Access for on-premises](./networking/onprem-google-access-dns/), [Shared VPC with GKE support](./networking/shared-vpc-gke/), [ILB as next hop](./networking/ilb-next-hop)
- **data solutions** - [GCE/GCS CMEK via centralized Cloud KMS](./data-solutions/cmek-via-centralized-kms/), [Cloud Storage to Bigquery with Cloud Dataflow](./data-solutions/gcs-to-bq-with-dataflow/)
-- **cloud operations** - [Resource tracking and remediation via Cloud Asset feeds](.//cloud-operations/asset-inventory-feed-remediation), [Granular Cloud DNS IAM via Service Directory](./cloud-operations/dns-fine-grained-iam)
+- **cloud operations** - [Resource tracking and remediation via Cloud Asset feeds](./cloud-operations/asset-inventory-feed-remediation), [Scheduled Cloud Asset Inventory Export to BigQuery](./cloud-operations/scheduled-asset-inventory-export-bq), [Granular Cloud DNS IAM via Service Directory](./cloud-operations/dns-fine-grained-iam)
For more information see the README files in the [foundations](./foundations/), [networking](./networking/), [data solutions](./data-solutions/) and [cloud operations](./cloud-operations/) folders.
diff --git a/cloud-operations/README.md b/cloud-operations/README.md
index 016a5f86..a17e6de4 100644
--- a/cloud-operations/README.md
+++ b/cloud-operations/README.md
@@ -10,6 +10,12 @@ The example's feed tracks changes to Google Compute instances, and the Cloud Fun
+## Scheduled Cloud Asset Inventory Export to BigQuery
+
+This [example](./scheduled-asset-inventory-export-bq) shows how to leverage the [Cloud Asset Inventory export to BigQuery](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery) feature to keep track of your organization-wide assets over time, storing the information in BigQuery. The data stored in BigQuery can then be used for different purposes, such as dashboarding and analysis.
+
## Granular Cloud DNS IAM via Service Directory
This [example](./dns-fine-grained-iam) shows how to leverage [Service Directory](https://cloud.google.com/blog/products/networking/introducing-service-directory) and Cloud DNS Service Directory private zones, to implement fine-grained IAM controls on DNS. The example creates a Service Directory namespace, a Cloud DNS private zone that uses it as its authoritative source, service accounts with different levels of permissions, and VMs to test them.
diff --git a/cloud-operations/scheduled-asset-inventory-export-bq/README.md b/cloud-operations/scheduled-asset-inventory-export-bq/README.md
index f5dcd99e..149453d1 100644
--- a/cloud-operations/scheduled-asset-inventory-export-bq/README.md
+++ b/cloud-operations/scheduled-asset-inventory-export-bq/README.md
@@ -7,19 +7,18 @@ The data stored in Bigquery can then be used for different purposes:
- dashboarding
- analysis
-This example shows a export to Bigquery scheduled on a daily basis.
+This example shows an export to Bigquery scheduled on a daily basis.
The resources created in this example are shown in the high level diagram below:
-## Running the example
-
-### Prerequisites
+## Prerequisites
Ensure that you grant your account one of the following roles on your project, folder, or organization.
- Cloud Asset Viewer role (roles/cloudasset.viewer)
- Owner primitive role (roles/owner)
+## Running the example
Clone this repository, specify your variables in a `terraform.tfvars` file, and then go through the following steps to create resources:
- `terraform init`
@@ -29,7 +28,7 @@ Once done testing, you can clean up resources by running `terraform destroy`. To
## Testing the example
-You can now run queries on the data you exported on Bigquery. Here you can find some explample of queries you can run.
+You can now run queries on the data you exported to BigQuery. [Here](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery#querying_an_asset_snapshot) you can find some examples of queries you can run.
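+The linked snapshot queries can be adapted along these lines; the project, dataset, and table names below are placeholders for the ones created by this example:
+
+```sql
+-- Count exported assets by type in a snapshot table
+-- (replace project, dataset, and table with your own values).
+SELECT
+  asset_type,
+  COUNT(*) AS asset_count
+FROM `my-project.my_dataset.my_table`
+GROUP BY asset_type
+ORDER BY asset_count DESC;
+```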
You can also create a dashboard by connecting [Data Studio](https://datastudio.google.com/) or any other BI tool of your choice to your BigQuery dataset.
@@ -51,4 +50,3 @@ You can also create a dashborad connecting [Datalab](https://datastudio.google.c
| bq-dataset | BigQuery dataset details. | |
| cloud-function | Cloud Function details. | |
-