# GCP Data Services blueprints

The blueprints in this folder implement typical data service topologies and end-to-end scenarios that allow testing specific features, like Cloud KMS to encrypt your data or VPC Service Controls (VPC-SC) to mitigate data exfiltration.

They are meant to be used as minimal but complete starting points to create actual infrastructure, and as playgrounds to experiment with specific Google Cloud features.

## Blueprints

### Cloud SQL instance with multi-region read replicas

This blueprint creates a Cloud SQL instance with multi-region read replicas as described in the Cloud SQL for PostgreSQL disaster recovery article.
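
As a rough idea of the underlying setup, here is a minimal sketch using the plain Terraform `google` provider; project, instance names, regions and tiers are illustrative and do not reflect the blueprint's actual variables or modules:

```hcl
# Primary Cloud SQL instance for PostgreSQL.
resource "google_sql_database_instance" "primary" {
  project          = "my-project"   # illustrative project
  name             = "pg-primary"
  region           = "europe-west1"
  database_version = "POSTGRES_13"

  settings {
    tier = "db-g1-small"
    backup_configuration {
      # Replicas generally require backups on the primary.
      enabled                        = true
      point_in_time_recovery_enabled = true
    }
  }
}

# Cross-region read replica, usable as a DR target via promotion.
resource "google_sql_database_instance" "replica" {
  project              = "my-project"
  name                 = "pg-replica-ew4"
  region               = "europe-west4"   # different region from the primary
  database_version     = "POSTGRES_13"
  master_instance_name = google_sql_database_instance.primary.name

  settings {
    tier = "db-g1-small"
  }
}
```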


### GCE and GCS CMEK via centralized Cloud KMS

This blueprint implements CMEK for GCS and GCE via keys hosted in Cloud KMS in a centralized project. It shows the basic resources and permissions for the typical use case of application projects implementing encryption at rest with a centrally managed KMS service.
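
For the GCS side only, a minimal sketch with the plain Terraform `google` provider looks roughly like the following; project and resource names are illustrative, not the blueprint's interface:

```hcl
# Key ring and key in the centralized KMS project.
resource "google_kms_key_ring" "central" {
  project  = "kms-central-prj"   # illustrative centralized project
  name     = "central-keyring"
  location = "europe-west1"
}

resource "google_kms_crypto_key" "gcs" {
  name            = "gcs-cmek"
  key_ring        = google_kms_key_ring.central.id
  rotation_period = "7776000s" # 90 days
}

data "google_project" "app" {
  project_id = "app-prj"         # illustrative application project
}

# Allow the application project's GCS service agent to use the key.
resource "google_kms_crypto_key_iam_member" "gcs_sa" {
  crypto_key_id = google_kms_crypto_key.gcs.id
  role          = "roles/cloudkms.cryptoKeyEncrypterDecrypter"
  member        = "serviceAccount:service-${data.google_project.app.number}@gs-project-accounts.iam.gserviceaccount.com"
}

# Bucket in the application project encrypted with the centrally managed key.
resource "google_storage_bucket" "data" {
  project  = data.google_project.app.project_id
  name     = "app-prj-cmek-bucket"
  location = "EU"

  encryption {
    default_kms_key_name = google_kms_crypto_key.gcs.id
  }

  depends_on = [google_kms_crypto_key_iam_member.gcs_sa]
}
```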


### Cloud Composer version 2 private instance, supporting Shared VPC and external CMEK key

This blueprint creates a Cloud Composer version 2 instance on a VPC with a dedicated service account. The solution supports a Shared VPC and Cloud KMS CMEK keys as optional inputs.
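
A minimal sketch of such an environment with the plain Terraform `google` provider is shown below; the project, network, service account and key names are illustrative, and the blueprint wires up additional IAM and API prerequisites that are omitted here:

```hcl
# Private Cloud Composer 2 environment attached to a (Shared) VPC subnet
# and encrypted with a customer-managed key.
resource "google_composer_environment" "composer" {
  project = "app-prj"            # illustrative service project
  name    = "orchestration"
  region  = "europe-west1"

  config {
    environment_size = "ENVIRONMENT_SIZE_SMALL"

    software_config {
      image_version = "composer-2-airflow-2"   # latest Composer 2 image
    }

    node_config {
      # For Shared VPC, reference the network and subnet in the host project.
      network         = "projects/net-host-prj/global/networks/shared-vpc"
      subnetwork      = "projects/net-host-prj/regions/europe-west1/subnetworks/composer"
      service_account = "composer-sa@app-prj.iam.gserviceaccount.com"
    }

    private_environment_config {
      enable_private_endpoint = true
    }

    encryption_config {
      kms_key_name = "projects/kms-central-prj/locations/europe-west1/keyRings/central-keyring/cryptoKeys/composer-cmek"
    }
  }
}
```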


### Data Platform Foundations

This blueprint implements a robust and flexible Data Platform on GCP that provides opinionated defaults, allowing customers to build and scale out additional data pipelines quickly and reliably.


### Minimal Data Platform

This blueprint implements a minimal Data Platform on GCP that provides opinionated defaults, allowing customers to build and scale out additional data pipelines quickly and reliably.


### Data Playground starter with Cloud Vertex AI Notebook and GCS

This blueprint creates a Vertex AI Notebook running on a VPC with a private IP and a dedicated Service Account. A GCS bucket and a BigQuery dataset are created to store inputs and outputs of data experiments.
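
The core resources can be sketched with the plain Terraform `google` provider as follows; project, network and image values are illustrative and not the blueprint's actual variables:

```hcl
# User-managed Vertex AI Workbench notebook with private IP only.
resource "google_notebooks_instance" "playground" {
  project      = "app-prj"          # illustrative project
  name         = "data-playground"
  location     = "europe-west1-b"
  machine_type = "e2-standard-4"

  vm_image {
    project      = "deeplearning-platform-release"
    image_family = "tf-latest-cpu"
  }

  no_public_ip    = true
  network         = "projects/app-prj/global/networks/vpc"
  subnet          = "projects/app-prj/regions/europe-west1/subnetworks/notebooks"
  service_account = "notebook-sa@app-prj.iam.gserviceaccount.com"
}

# Bucket and dataset to store experiment inputs and outputs.
resource "google_storage_bucket" "experiments" {
  project  = "app-prj"
  name     = "app-prj-experiments"
  location = "EU"
}

resource "google_bigquery_dataset" "experiments" {
  project    = "app-prj"
  dataset_id = "experiments"
  location   = "EU"
}
```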


### Cloud Storage to BigQuery with Cloud Dataflow, with least privileges

This blueprint implements the resources required to run GCS to BigQuery Dataflow pipelines. The solution relies on a set of service accounts created following the least privilege principle.
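
The least privilege idea can be sketched like this with the plain Terraform `google` provider; the project and role set are illustrative, and the blueprint itself is more granular (e.g. bucket- and dataset-level grants):

```hcl
# Dedicated worker service account for the Dataflow pipeline.
resource "google_service_account" "df_worker" {
  project      = "app-prj"          # illustrative project
  account_id   = "dataflow-worker"
  display_name = "Dataflow worker (least privilege)"
}

# Only the roles the pipeline actually needs, granted at project level
# here for brevity.
resource "google_project_iam_member" "df_roles" {
  for_each = toset([
    "roles/dataflow.worker",
    "roles/bigquery.dataEditor",
    "roles/bigquery.jobUser",
    "roles/storage.objectViewer",
  ])
  project = "app-prj"
  role    = each.key
  member  = "serviceAccount:${google_service_account.df_worker.email}"
}
```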


### SQL Server Always On Availability Groups

This blueprint implements SQL Server Always On Availability Groups using Fabric modules. It builds a two-node cluster with a fileshare witness instance in an existing VPC and adds the necessary firewalling. The actual setup process (apart from Active Directory operations) has been scripted, so that the amount of manual work to be performed is minimal.
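
The firewalling part can be illustrated with a hedged sketch like the one below; project, network, tags and the port list are illustrative and not the exhaustive set the blueprint configures:

```hcl
# Allow WSFC / Always On traffic between the cluster nodes and the witness.
resource "google_compute_firewall" "alwayson" {
  project = "net-host-prj"         # illustrative project owning the VPC
  name    = "allow-sqlserver-alwayson"
  network = "shared-vpc"

  allow {
    protocol = "tcp"
    # SQL Server, AG endpoint, SMB (fileshare witness), RPC endpoint mapper.
    ports = ["1433", "5022", "445", "135"]
  }

  source_tags = ["sql-node", "sql-witness"]
  target_tags = ["sql-node", "sql-witness"]
}
```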


### MLOps with Vertex AI

This blueprint implements the infrastructure required for a fully functional MLOps environment using Vertex AI: activation of the required GCP services, a Vertex AI Workbench instance, GCS buckets to host Vertex AI and Cloud Build artifacts, an Artifact Registry Docker repository to host custom images, the required service accounts, networking, and an optional Workload Identity Federation provider for GitHub integration.
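
The GitHub integration part can be sketched with the plain Terraform `google` provider roughly as follows; the project, pool and provider IDs, repository and service account are illustrative:

```hcl
# Workload Identity Federation pool and OIDC provider for GitHub Actions,
# so CI pipelines can impersonate a service account without long-lived keys.
resource "google_iam_workload_identity_pool" "github" {
  project                   = "app-prj"   # illustrative project
  workload_identity_pool_id = "github-pool"
}

resource "google_iam_workload_identity_pool_provider" "github" {
  project                            = "app-prj"
  workload_identity_pool_id          = google_iam_workload_identity_pool.github.workload_identity_pool_id
  workload_identity_pool_provider_id = "github-provider"

  attribute_mapping = {
    "google.subject"       = "assertion.sub"
    "attribute.repository" = "assertion.repository"
  }

  # Restrict which identities the provider will accept.
  attribute_condition = "attribute.repository == \"my-org/my-mlops-repo\""   # illustrative repo

  oidc {
    issuer_uri = "https://token.actions.githubusercontent.com"
  }
}

# Let tokens from that repository impersonate the MLOps service account.
resource "google_service_account_iam_member" "github_wif" {
  service_account_id = "projects/app-prj/serviceAccounts/mlops-sa@app-prj.iam.gserviceaccount.com"
  role               = "roles/iam.workloadIdentityUser"
  member             = "principalSet://iam.googleapis.com/${google_iam_workload_identity_pool.github.name}/attribute.repository/my-org/my-mlops-repo"
}
```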


### Shielded Folder

This blueprint implements an opinionated folder configuration according to GCP best practices. The configurations applied to the folder benefit the workloads hosted in it, since they inherit the constraints of the folder they belong to.
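
To give an idea of the pattern, a minimal sketch with the plain Terraform `google` provider is shown below; the organization ID, folder name and the two constraints are illustrative, not the blueprint's full policy set:

```hcl
# Folder hosting the shielded workloads.
resource "google_folder" "shielded" {
  display_name = "shielded"
  parent       = "organizations/1234567890"   # illustrative org id
}

# Examples of organization policies enforced on the folder; child projects
# inherit them automatically.
resource "google_folder_organization_policy" "no_external_ip" {
  folder     = google_folder.shielded.name
  constraint = "compute.vmExternalIpAccess"

  list_policy {
    deny {
      all = true
    }
  }
}

resource "google_folder_organization_policy" "no_sa_keys" {
  folder     = google_folder.shielded.name
  constraint = "iam.disableServiceAccountKeyCreation"

  boolean_policy {
    enforced = true
  }
}
```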


### BigQuery ML and Vertex AI Pipeline

This blueprint provides the necessary infrastructure to create a complete development environment for building and deploying machine learning models using BigQuery ML and Vertex AI. With this blueprint, you can deploy your models to a Vertex AI endpoint or use them within BigQuery ML.
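
A few of the building blocks can be sketched with the plain Terraform `google` provider as below; project, names and locations are illustrative, and the blueprint also provisions service accounts, networking and API activation not shown here:

```hcl
# Dataset where BigQuery ML models are trained and stored.
resource "google_bigquery_dataset" "bqml" {
  project    = "app-prj"            # illustrative project
  dataset_id = "bqml_models"
  location   = "EU"
}

# Bucket for Vertex AI pipeline artifacts.
resource "google_storage_bucket" "pipeline_artifacts" {
  project  = "app-prj"
  name     = "app-prj-vertex-pipelines"
  location = "EU"
}

# Endpoint where a model exported from BigQuery ML can be deployed for
# online predictions.
resource "google_vertex_ai_endpoint" "bqml" {
  project      = "app-prj"
  name         = "bqml-endpoint"
  display_name = "bqml-endpoint"
  location     = "europe-west1"
}
```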