# Scheduled Cloud Asset Inventory Export to BigQuery
This example shows how to use the [Cloud Asset Inventory Export to BigQuery](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery) feature to keep track of your project-wide assets over time, storing the information in BigQuery.
The data stored in BigQuery can then be used for different purposes:
- dashboarding
- analysis
The example exports resources at the project level for ease of testing; in actual use, a few changes are needed to operate at the resource hierarchy level:
- the export should be set at the folder or organization level
- the `roles/cloudasset.viewer` role for the service account should be granted at the folder or organization level
The resources created in this example are shown in the high-level diagram below:
## Prerequisites
Ensure that you grant your account one of the following roles on your project, folder, or organization:
- Cloud Asset Viewer role (`roles/cloudasset.viewer`)
- Owner primitive role (`roles/owner`)
## Running the example
Clone this repository, specify your variables in a `terraform.tfvars` file, and then go through the following steps to create resources:
- `terraform init`
- `terraform apply`
Once done testing, you can clean up resources by running `terraform destroy`. To persist state, check out the `backend.tf.sample` file.
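As a starting point, a minimal `terraform.tfvars` might look like the sketch below. All values are illustrative and should be adjusted to your environment; only the scalar variables from the table further down are shown, since the fields of the required `cai_config` object are defined in the example's `variables.tf`.

```hcl
# Illustrative values only; adjust to your environment.
project_id      = "my-cai-project"           # existing or new project id
project_create  = true                       # create the project instead of reusing one
billing_account = "012345-ABCDEF-678901"     # needed when project_create = true
root_node       = "organizations/1234567890" # or "folders/folder_id"
region          = "europe-west1"

# cai_config is also required; see variables.tf for its fields.
```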
## Testing the example
Once the resources are created, you can run queries on the data you exported to BigQuery. [Here](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery#querying_an_asset_snapshot) you can find some examples of queries you can run.
You can also create a dashboard by connecting [Data Studio](https://datastudio.google.com/) or any other BI tool of your choice to your BigQuery dataset.
## File exporter for JSON and CSV
Regular file-based exports of Cloud Asset Inventory may be useful for scale-out network dependency discovery tools like [Planet Exporter](https://github.com/williamchanrico/planet-exporter), or for updating asset tracking or configuration management workflows. BigQuery supports multiple [export formats](https://cloud.google.com/bigquery/docs/exporting-data#export_formats_and_compression_types), and objects can be uploaded to a Cloud Storage bucket using the provided Cloud Function. Specify `job.DestinationFormat` as defined in the [documentation](https://googleapis.dev/python/bigquery/latest/generated/google.cloud.bigquery.job.DestinationFormat.html), e.g. `NEWLINE_DELIMITED_JSON`.
It helps to create a custom [scheduled query](https://cloud.google.com/bigquery/docs/scheduling-queries#console) to match the fields expected by downstream systems, and to time it with the CAI export into BigQuery for freshness. See the [sample queries](https://cloud.google.com/asset-inventory/docs/exporting-to-bigquery-sample-queries).
Note: the Cloud Function's service account needs a write-capable IAM role on `bucket`.
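The extract step the Cloud Function performs can be sketched with the Python BigQuery client. This is a sketch under assumptions: the function and parameter names below are illustrative, not the ones used in the bundled function code, and it requires the `google-cloud-bigquery` package plus credentials with write access to the destination bucket.

```python
def gcs_destination(bucket: str, table: str, fmt: str) -> str:
    """Build the gs:// destination URI for a given export format."""
    ext = {"NEWLINE_DELIMITED_JSON": "json", "CSV": "csv"}[fmt]
    return f"gs://{bucket}/{table}.{ext}"


def export_table(project: str, dataset: str, table: str, bucket: str,
                 fmt: str = "NEWLINE_DELIMITED_JSON") -> None:
    """Run a BigQuery extract job writing the table to Cloud Storage."""
    # Imported here so the module loads even without the library installed.
    from google.cloud import bigquery

    client = bigquery.Client(project=project)
    job = client.extract_table(
        f"{project}.{dataset}.{table}",
        gcs_destination(bucket, table, fmt),
        job_config=bigquery.job.ExtractJobConfig(destination_format=fmt),
    )
    job.result()  # block until the export job completes
```

Changing `fmt` to `CSV` (or any other supported `DestinationFormat` value) switches the export format without touching the rest of the job.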
## Variables
| name | description | type | required | default |
|---|---|:---:|:---:|:---:|
| cai_config | Cloud Asset Inventory export config. | object({...}) | ✓ | |
| project_id | Project id that references existing project. | string | ✓ | |
| *billing_account* | Billing account id used as default for new projects. | string | | null |
| *bundle_path* | Path used to write the intermediate Cloud Function code bundle. | string | | ./bundle.zip |
| *bundle_path_cffile* | Path used to write the intermediate Cloud Function code bundle. | string | | ./bundle_cffile.zip |
| *file_config* | Optional BQ table as a file export function config. | object({...}) | | ... |
| *location* | App Engine location used in the example. | string | | europe-west |
| *name* | Arbitrary string used to name created resources. | string | | asset-inventory |
| *name_cffile* | Arbitrary string used to name created resources. | string | | cffile-exporter |
| *project_create* | Create project instead of using an existing one. | bool | | true |
| *region* | Compute region used in the example. | string | | europe-west1 |
| *root_node* | The resource name of the parent folder or organization for project creation, in 'folders/folder_id' or 'organizations/org_id' format. | string | | null |
## Outputs
| name | description | sensitive |
|---|---|:---:|
| bq-dataset | BigQuery dataset details. | |
| cloud-function | Cloud Function instance details. | |