


Using Google Cloud Functions for Three-Tier Data Processing with Google Composer and Automated Deployments via GitHub Actions
Table of Contents
- About
- Tools Used
- Solution Architecture Diagram
- Deployment Process
- Prerequisites
- Secrets for GitHub Actions
- To create a new secret:
- Setting Up the DevOps Service Account
- GitHub Actions Pipeline: Steps
- enable-services
- deploy-buckets
- deploy-cloud-function
- deploy-composer-service-account
- deploy-bigquery-dataset-bigquery-tables
- deploy-composer-environment
- deploy-composer-http-connection
- deploy-dags
- GitHub Actions Workflow Explanation
- Resources Created After Deployment
- Conclusion
- References
About
This post explores the use of Google Cloud Functions for processing data in a three-tier architecture. The solution is orchestrated with Google Composer and features automated deployments using GitHub Actions. We will walk through the tools used, deployment process, and pipeline steps, providing a clear guide for building an end-to-end cloud-based data pipeline.
Tools Used
- Google Cloud Platform (GCP): The primary cloud environment.
- Cloud Storage: For storing input and processed data across different layers (Bronze, Silver, Gold).
- Cloud Functions: Serverless functions responsible for data processing in each tier.
- Google Composer: An orchestration tool based on Apache Airflow, used to schedule and manage workflows.
- GitHub Actions: Automation tool for deploying and managing the pipeline infrastructure.
Solution Architecture Diagram
*(The architecture diagram is not reproduced here; see the GitHub repository linked in the References for the full image.)*
Deployment Process
Prerequisites
Before setting up the project, ensure you have the following:
- GCP Account: A Google Cloud account with billing enabled.
- Service Account for DevOps: A service account with the required permissions to deploy resources in GCP.
- Secrets in GitHub: Store the GCP service account credentials, project ID, and bucket name as secrets in GitHub for secure access.
Secrets for GitHub Actions
To securely access your GCP project and resources, set the following secrets in GitHub Actions:
- BUCKET_DATALAKE: Your Cloud Storage bucket for the data lake.
- GCP_DEVOPS_SA_KEY: The service account key in JSON format.
- PROJECT_ID: Your GCP project ID.
- REGION_PROJECT_ID: The region where your GCP project is deployed.
To create a new secret:
1. In the project repository, go to **Settings**.
2. Under **Security**, select **Secrets and variables**, then click **Actions**.
3. Click **New repository secret**, then enter a **name** and **value** for the secret.
For more details, see:
https://docs.github.com/pt/actions/security-for-github-actions/security-guides/using-secrets-in-github-actions
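
If you prefer the command line, the GitHub CLI can set the same secrets. A minimal sketch, assuming `gh` is authenticated against the repository and the service account key was saved locally as `key.json` (a hypothetical file name; the secret values shown are placeholders):

```bash
# Run from the repository root; gh infers the target repo from the git remote.
gh secret set GCP_DEVOPS_SA_KEY < key.json            # service account key in JSON format
gh secret set PROJECT_ID --body "my-gcp-project"      # hypothetical project ID
gh secret set REGION_PROJECT_ID --body "us-central1-my-gcp-project"
gh secret set BUCKET_DATALAKE --body "my-datalake-bucket"
```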
Setting Up the DevOps Service Account
Create a service account in GCP with permissions for Cloud Functions, Composer, BigQuery, and Cloud Storage. Grant the necessary roles, such as the following (a command-line sketch appears after the list):
- Cloud Functions Admin
- Composer User
- BigQuery Data Editor
- Storage Object Admin
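
These roles can also be granted from the command line. A minimal sketch, assuming `gcloud` is authenticated as a project owner and using `devops-sa` as a hypothetical account name:

```bash
PROJECT_ID="my-gcp-project"   # hypothetical project ID
SA_NAME="devops-sa"           # hypothetical service account name
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account and grant the roles listed above
gcloud iam service-accounts create "$SA_NAME" --display-name="DevOps deployments"
for role in roles/cloudfunctions.admin roles/composer.user \
            roles/bigquery.dataEditor roles/storage.objectAdmin; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA_EMAIL}" --role="$role"
done

# Export a JSON key to store as the GCP_DEVOPS_SA_KEY secret in GitHub
gcloud iam service-accounts keys create key.json --iam-account="$SA_EMAIL"
```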
GitHub Actions Pipeline: Steps
The pipeline automates the entire deployment process, ensuring all components are set up correctly. Here's a breakdown of the key jobs from the GitHub Actions file, each responsible for a different aspect of the deployment.
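
The job snippets below reference several `env` values (REGION, CLOUD_FUNCTION_1_NAME, and so on) that are defined once at the top of the workflow file. The post does not reproduce that header, so the following is a hedged sketch of what it might look like; every value is an illustrative assumption, except `csv_to_parquet`, which appears in the deploy-cloud-function job's comments:

```yaml
name: deploy-data-pipeline
on:
  push:
    branches: [main]

env:
  REGION: us-central1                                # illustrative region
  GCP_SERVICE_API_0: cloudfunctions.googleapis.com   # APIs enabled by enable-services
  GCP_SERVICE_API_1: composer.googleapis.com
  GCP_SERVICE_API_2: bigquery.googleapis.com
  GCP_SERVICE_API_3: storage.googleapis.com
  GCP_SERVICE_API_4: run.googleapis.com
  INPUT_FOLDER: transient
  FUNCTION_SCRIPTS: cloud_function_scripts
  PYTHON_FUNCTION_RUNTIME: python310
  FUNCTION_CPU: 1
  FUNCTION_MEMORY: 512Mi
  CLOUD_FUNCTION_1_NAME: csv_to_parquet              # bronze tier (named in the post)
  CLOUD_FUNCTION_2_NAME: transform_silver            # illustrative name
  CLOUD_FUNCTION_3_NAME: load_gold                   # illustrative name
  SERVICE_ACCOUNT_NAME: composer-sa
  SERVICE_ACCOUNT_DESCRIPTION: "Composer service account"
  COMPOSER_ENV_NAME: composer-env
  COMPOSER_ENV_SIZE: small
  COMPOSER_IMAGE_VERSION: composer-2.9.5-airflow-2.9.3   # illustrative version
  BIGQUERY_DATASET: customer_dataset
  BIGQUERY_TABLE_CUSTOMER: customer
  TRANSIENT_FILE_PATH: transient
  BRONZE_PATH: bronze
  SILVER_PATH: silver
  HTTP_CONNECTION: composer_http_connection
  CONNECTION_TYPE: http
```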
enable-services
```yaml
enable-services:
  runs-on: ubuntu-22.04
  steps:
  - uses: actions/checkout@v2

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Step to configure Docker to use the gcloud command-line tool as a credential helper
  - name: Configure Docker
    run: |-
      gcloud auth configure-docker

  - name: Set up python 3.8
    uses: actions/setup-python@v2
    with:
      python-version: 3.8.16

  # Step to enable the required GCP service APIs
  - name: Enable gcp service api's
    run: |-
      gcloud services enable ${{ env.GCP_SERVICE_API_0 }}
      gcloud services enable ${{ env.GCP_SERVICE_API_1 }}
      gcloud services enable ${{ env.GCP_SERVICE_API_2 }}
      gcloud services enable ${{ env.GCP_SERVICE_API_3 }}
      gcloud services enable ${{ env.GCP_SERVICE_API_4 }}
```
deploy-buckets
```yaml
deploy-buckets:
  needs: [enable-services]
  runs-on: ubuntu-22.04
  timeout-minutes: 10
  steps:
  - name: Checkout
    uses: actions/checkout@v4

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Step to configure Docker to use the gcloud command-line tool as a credential helper
  - name: Configure Docker
    run: |-
      gcloud auth configure-docker

  # Step to create the GCP bucket if it does not already exist
  - name: Create Google Cloud Storage - datalake
    run: |-
      if ! gsutil ls -p ${{ secrets.PROJECT_ID }} gs://${{ secrets.BUCKET_DATALAKE }} &> /dev/null; then
        gcloud storage buckets create gs://${{ secrets.BUCKET_DATALAKE }} --default-storage-class=nearline --location=${{ env.REGION }}
      else
        echo "Cloud Storage : gs://${{ secrets.BUCKET_DATALAKE }} already exists"
      fi

  # Step to upload the transient input files to the GCP bucket
  - name: Upload transient files to Google Cloud Storage
    run: |-
      TARGET=${{ env.INPUT_FOLDER }}
      BUCKET_PATH=${{ secrets.BUCKET_DATALAKE }}/${{ env.INPUT_FOLDER }}
      gsutil cp -r $TARGET gs://${BUCKET_PATH}
```
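
After this job runs, the bucket contains only the transient input folder; the medallion layers are created as the functions write to them. A quick way to inspect the layout (the bucket name is illustrative):

```bash
gsutil ls -r gs://my-datalake-bucket/
# Expected prefixes over time, one per layer:
#   gs://my-datalake-bucket/transient/   (raw input uploaded by the pipeline)
#   gs://my-datalake-bucket/bronze/
#   gs://my-datalake-bucket/silver/
#   gs://my-datalake-bucket/gold/
```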
deploy-cloud-function
```yaml
deploy-cloud-function:
  needs: [enable-services, deploy-buckets]
  runs-on: ubuntu-22.04
  steps:
  - uses: actions/checkout@v2

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Step to configure Docker to use the gcloud command-line tool as a credential helper
  - name: Configure Docker
    run: |-
      gcloud auth configure-docker

  - name: Set up python 3.10
    uses: actions/setup-python@v2
    with:
      python-version: 3.10.12

  # Function sources live under the scripts folder, e.g. cloud_function_scripts/csv_to_parquet
  - name: Create cloud function - ${{ env.CLOUD_FUNCTION_1_NAME }}
    run: |-
      cd ${{ env.FUNCTION_SCRIPTS }}/${{ env.CLOUD_FUNCTION_1_NAME }}
      gcloud functions deploy ${{ env.CLOUD_FUNCTION_1_NAME }} \
      --gen2 \
      --cpu=${{ env.FUNCTION_CPU }} \
      --memory=${{ env.FUNCTION_MEMORY }} \
      --runtime ${{ env.PYTHON_FUNCTION_RUNTIME }} \
      --trigger-http \
      --region ${{ env.REGION }} \
      --entry-point ${{ env.CLOUD_FUNCTION_1_NAME }}

  - name: Create cloud function - ${{ env.CLOUD_FUNCTION_2_NAME }}
    run: |-
      cd ${{ env.FUNCTION_SCRIPTS }}/${{ env.CLOUD_FUNCTION_2_NAME }}
      gcloud functions deploy ${{ env.CLOUD_FUNCTION_2_NAME }} \
      --gen2 \
      --cpu=${{ env.FUNCTION_CPU }} \
      --memory=${{ env.FUNCTION_MEMORY }} \
      --runtime ${{ env.PYTHON_FUNCTION_RUNTIME }} \
      --trigger-http \
      --region ${{ env.REGION }} \
      --entry-point ${{ env.CLOUD_FUNCTION_2_NAME }}

  - name: Create cloud function - ${{ env.CLOUD_FUNCTION_3_NAME }}
    run: |-
      cd ${{ env.FUNCTION_SCRIPTS }}/${{ env.CLOUD_FUNCTION_3_NAME }}
      gcloud functions deploy ${{ env.CLOUD_FUNCTION_3_NAME }} \
      --gen2 \
      --cpu=${{ env.FUNCTION_CPU }} \
      --memory=${{ env.FUNCTION_MEMORY }} \
      --runtime ${{ env.PYTHON_FUNCTION_RUNTIME }} \
      --trigger-http \
      --region ${{ env.REGION }} \
      --entry-point ${{ env.CLOUD_FUNCTION_3_NAME }}
```
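
The function sources themselves are not reproduced in this post. As an illustration of what the first tier might look like, here is a minimal sketch of an HTTP-triggered gen2 function that converts a CSV file from the transient folder into Parquet in the bronze layer. The request payload keys are assumptions, not the repository's actual contract, and `requirements.txt` would need `functions-framework`, `pandas`, `pyarrow`, and `google-cloud-storage`:

```python
# main.py - hypothetical csv_to_parquet function (entry point must match --entry-point)
import functions_framework
import pandas as pd
from google.cloud import storage


@functions_framework.http
def csv_to_parquet(request):
    """Reads a CSV from the transient path and writes Parquet to the bronze layer."""
    params = request.get_json(silent=True) or {}
    bucket_name = params["bucket"]        # e.g. "my-datalake-bucket" (assumed key)
    source_blob = params["source_path"]   # e.g. "transient/customer.csv"
    dest_blob = params["dest_path"]       # e.g. "bronze/customer.parquet"

    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Download, convert, and upload via the function's writable /tmp filesystem
    bucket.blob(source_blob).download_to_filename("/tmp/input.csv")
    pd.read_csv("/tmp/input.csv").to_parquet("/tmp/output.parquet", index=False)
    bucket.blob(dest_blob).upload_from_filename("/tmp/output.parquet")

    return {"status": "ok", "output": f"gs://{bucket_name}/{dest_blob}"}, 200
```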
deploy-composer-service-account
```yaml
deploy-composer-service-account:
  needs: [enable-services, deploy-buckets, deploy-cloud-function]
  runs-on: ubuntu-22.04
  timeout-minutes: 10
  steps:
  - name: Checkout
    uses: actions/checkout@v4

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Step to configure Docker to use the gcloud command-line tool as a credential helper
  - name: Configure Docker
    run: |-
      gcloud auth configure-docker

  - name: Create service account
    run: |-
      if ! gcloud iam service-accounts list | grep -i ${{ env.SERVICE_ACCOUNT_NAME }} &> /dev/null; then
        gcloud iam service-accounts create ${{ env.SERVICE_ACCOUNT_NAME }} \
        --display-name=${{ env.SERVICE_ACCOUNT_DESCRIPTION }}
      fi

  - name: Add permissions to service account
    run: |-
      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/composer.user"

      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/storage.objectAdmin"

      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/cloudfunctions.invoker"

      # Permission to create and manage Composer environments
      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/composer.admin"

      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/composer.worker"

      # Permission to create and configure instances and resources on the VPC
      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/compute.networkAdmin"

      # Permission to interact with Cloud Storage, needed for buckets and logs
      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/storage.admin"

      # Permission to create and manage project resources, such as buckets and instances
      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/editor"

      # Permission to access and use the resources required by IAM
      gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/iam.serviceAccountUser"

      # Allow the Composer service account to invoke each function
      gcloud functions add-iam-policy-binding ${{ env.CLOUD_FUNCTION_1_NAME }} \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/cloudfunctions.invoker"

      gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_1_NAME }} \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com"

      gcloud functions add-iam-policy-binding ${{ env.CLOUD_FUNCTION_2_NAME }} \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/cloudfunctions.invoker"

      gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_2_NAME }} \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com"

      gcloud functions add-iam-policy-binding ${{ env.CLOUD_FUNCTION_3_NAME }} \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/cloudfunctions.invoker"

      gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_3_NAME }} \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com"

      # Gen2 functions run on Cloud Run, so grant run.invoker on the underlying services
      SERVICE_NAME_1=$(gcloud functions describe ${{ env.CLOUD_FUNCTION_1_NAME }} --region=${{ env.REGION }} --format="value(serviceConfig.service)")
      gcloud run services add-iam-policy-binding $SERVICE_NAME_1 \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/run.invoker"

      SERVICE_NAME_2=$(gcloud functions describe ${{ env.CLOUD_FUNCTION_2_NAME }} --region=${{ env.REGION }} --format="value(serviceConfig.service)")
      gcloud run services add-iam-policy-binding $SERVICE_NAME_2 \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/run.invoker"

      SERVICE_NAME_3=$(gcloud functions describe ${{ env.CLOUD_FUNCTION_3_NAME }} --region=${{ env.REGION }} --format="value(serviceConfig.service)")
      gcloud run services add-iam-policy-binding $SERVICE_NAME_3 \
        --region="${{ env.REGION }}" \
        --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
        --role="roles/run.invoker"

      # Also allow unauthenticated invocation of each function
      gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_1_NAME }} \
        --region="${{ env.REGION }}" \
        --member="allUsers"

      gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_2_NAME }} \
        --region="${{ env.REGION }}" \
        --member="allUsers"

      gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_3_NAME }} \
        --region="${{ env.REGION }}" \
        --member="allUsers"
```
deploy-bigquery-dataset-bigquery-tables
```yaml
deploy-bigquery-dataset-bigquery-tables:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account]
  runs-on: ubuntu-22.04
  timeout-minutes: 10
  steps:
  - name: Checkout
    uses: actions/checkout@v4

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Step to configure Docker to use the gcloud command-line tool as a credential helper
  - name: Configure Docker
    run: |-
      gcloud auth configure-docker

  # Create the dataset if it does not already exist
  - name: Create Big Query Dataset
    run: |-
      if ! bq ls --project_id ${{ secrets.PROJECT_ID }} -a | grep -w ${{ env.BIGQUERY_DATASET }} &> /dev/null; then
        bq --location=${{ env.REGION }} mk \
        --default_table_expiration 0 \
        --dataset ${{ env.BIGQUERY_DATASET }}
      else
        echo "Big Query Dataset : ${{ env.BIGQUERY_DATASET }} already exists"
      fi

  # Create the customer table from its JSON schema if it does not already exist
  - name: Create Big Query table
    run: |-
      TABLE_NAME_CUSTOMER=${{ env.BIGQUERY_DATASET }}.${{ env.BIGQUERY_TABLE_CUSTOMER }}
      c=0
      for table in $(bq ls --max_results 1000 "${{ secrets.PROJECT_ID }}:${{ env.BIGQUERY_DATASET }}" | tail -n +3 | awk '{print $1}'); do
        # Only consider entries whose type is TABLE
        if bq show --format=prettyjson "${{ env.BIGQUERY_DATASET }}.$table" | jq -r '.type' | grep -q -E "TABLE"; then
          if [ "$table" == "${{ env.BIGQUERY_TABLE_CUSTOMER }}" ]; then
            echo "Dataset ${{ env.BIGQUERY_DATASET }} already has a table named: $table"
            ((c=c+1))
          fi
        else
          echo "Ignoring $table"
          continue
        fi
      done
      echo "counter: $c"
      if [ $c == 0 ]; then
        echo "Creating table ${{ env.BIGQUERY_TABLE_CUSTOMER }} in dataset ${{ env.BIGQUERY_DATASET }}"
        bq mk --table \
        $TABLE_NAME_CUSTOMER \
        ./big_query_schemas/customer_schema.json
      fi
```
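
The table is created from `./big_query_schemas/customer_schema.json`, which is not shown in the post. A BigQuery schema file of this kind is a JSON array of field definitions; the field names below are illustrative, not the repository's actual schema:

```json
[
  {"name": "customer_id", "type": "INTEGER",   "mode": "REQUIRED"},
  {"name": "name",        "type": "STRING",    "mode": "NULLABLE"},
  {"name": "email",       "type": "STRING",    "mode": "NULLABLE"},
  {"name": "created_at",  "type": "TIMESTAMP", "mode": "NULLABLE"}
]
```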
deploy-composer-environment
```yaml
deploy-composer-environment:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account, deploy-bigquery-dataset-bigquery-tables]
  runs-on: ubuntu-22.04
  timeout-minutes: 40
  steps:
  - name: Checkout
    uses: actions/checkout@v4

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Step to configure Docker to use the gcloud command-line tool as a credential helper
  - name: Configure Docker
    run: |-
      gcloud auth configure-docker

  # Create the Composer environment if it does not already exist
  - name: Create composer environments
    run: |-
      if ! gcloud composer environments list --project=${{ secrets.PROJECT_ID }} --locations=${{ env.REGION }} | grep -i ${{ env.COMPOSER_ENV_NAME }} &> /dev/null; then
        gcloud composer environments create ${{ env.COMPOSER_ENV_NAME }} \
        --project ${{ secrets.PROJECT_ID }} \
        --location ${{ env.REGION }} \
        --environment-size ${{ env.COMPOSER_ENV_SIZE }} \
        --image-version ${{ env.COMPOSER_IMAGE_VERSION }} \
        --service-account ${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com
      else
        echo "Composer environment ${{ env.COMPOSER_ENV_NAME }} already exists!"
      fi

  # Create the Airflow variables used by the DAGs
  - name: Create composer variable PROJECT_ID
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set PROJECT_ID ${{ secrets.PROJECT_ID }}

  - name: Create composer variable REGION
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set REGION ${{ env.REGION }}

  - name: Create composer variable CLOUD_FUNCTION_1_NAME
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set CLOUD_FUNCTION_1_NAME ${{ env.CLOUD_FUNCTION_1_NAME }}

  - name: Create composer variable CLOUD_FUNCTION_2_NAME
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set CLOUD_FUNCTION_2_NAME ${{ env.CLOUD_FUNCTION_2_NAME }}

  - name: Create composer variable CLOUD_FUNCTION_3_NAME
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set CLOUD_FUNCTION_3_NAME ${{ env.CLOUD_FUNCTION_3_NAME }}

  - name: Create composer variable BUCKET_DATALAKE
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set BUCKET_NAME ${{ secrets.BUCKET_DATALAKE }}

  - name: Create composer variable TRANSIENT_FILE_PATH
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set TRANSIENT_FILE_PATH ${{ env.TRANSIENT_FILE_PATH }}

  - name: Create composer variable BRONZE_PATH
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set BRONZE_PATH ${{ env.BRONZE_PATH }}

  - name: Create composer variable SILVER_PATH
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set SILVER_PATH ${{ env.SILVER_PATH }}

  - name: Create composer variable REGION_PROJECT_ID
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set REGION_PROJECT_ID "${{ env.REGION }}-${{ secrets.PROJECT_ID }}"

  - name: Create composer variable BIGQUERY_DATASET
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set BIGQUERY_DATASET "${{ env.BIGQUERY_DATASET }}"

  - name: Create composer variable BIGQUERY_TABLE_CUSTOMER
    run: |-
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} variables \
      -- set BIGQUERY_TABLE_CUSTOMER "${{ env.BIGQUERY_TABLE_CUSTOMER }}"
```
deploy-composer-http-connection
```yaml
deploy-composer-http-connection:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account, deploy-bigquery-dataset-bigquery-tables, deploy-composer-environment]
  runs-on: ubuntu-22.04
  steps:
  - name: Checkout
    uses: actions/checkout@v4

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Step to configure Docker to use the gcloud command-line tool as a credential helper
  - name: Configure Docker
    run: |-
      gcloud auth configure-docker

  # Create the Airflow HTTP connection pointing at the Cloud Functions base URL
  - name: Create composer http connection HTTP_CONNECTION
    run: |-
      HOST="https://${{ env.REGION }}-${{ secrets.PROJECT_ID }}.cloudfunctions.net"
      gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} connections \
      -- add ${{ env.HTTP_CONNECTION }} \
      --conn-type ${{ env.CONNECTION_TYPE }} \
      --conn-host $HOST
```
deploy-dags
```yaml
deploy-dags:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account, deploy-bigquery-dataset-bigquery-tables, deploy-composer-environment, deploy-composer-http-connection]
  runs-on: ubuntu-22.04
  steps:
  - name: Checkout
    uses: actions/checkout@v4

  # Step to authenticate with GCP
  - name: Authorize GCP
    uses: 'google-github-actions/auth@v2'
    with:
      credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}

  # Step to configure the Cloud SDK
  - name: Set up Cloud SDK
    uses: google-github-actions/setup-gcloud@v2
    with:
      version: '>= 363.0.0'
      project_id: ${{ secrets.PROJECT_ID }}

  # Look up the environment's DAG bucket and copy the DAG files into it
  - name: Get Composer bucket name and Deploy DAG to Composer
    run: |-
      COMPOSER_BUCKET=$(gcloud composer environments describe ${{ env.COMPOSER_ENV_NAME }} \
      --location ${{ env.REGION }} \
      --format="value(config.dagGcsPrefix)")
      gsutil -m cp -r ./dags/* $COMPOSER_BUCKET/dags/
```
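
The DAG files under `./dags/` are not reproduced here. As a minimal sketch of what one might look like, the following DAG chains the three functions using the HTTP connection and Airflow variables created by the previous jobs; the task structure, connection ID, and request payload are assumptions, not the repository's actual code:

```python
# dags/three_tier_pipeline.py - hypothetical DAG invoking the three functions in order
from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.providers.http.operators.http import SimpleHttpOperator

BUCKET = Variable.get("BUCKET_NAME")

with DAG(
    dag_id="three_tier_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    previous = None
    # One HTTP task per tier, chained bronze -> silver -> gold
    for var_name in ("CLOUD_FUNCTION_1_NAME", "CLOUD_FUNCTION_2_NAME", "CLOUD_FUNCTION_3_NAME"):
        function_name = Variable.get(var_name)
        task = SimpleHttpOperator(
            task_id=f"invoke_{function_name}",
            http_conn_id="composer_http_connection",  # assumed name of the connection created above
            endpoint=function_name,                   # appended to the cloudfunctions.net base URL
            method="POST",
            headers={"Content-Type": "application/json"},
            data='{"bucket": "' + BUCKET + '"}',      # hypothetical payload
        )
        if previous:
            previous >> task
        previous = task
```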
Resources Created After Deployment
After the deployment process is complete, the following resources will be available:
- Cloud Storage Buckets: Organized into Bronze, Silver, and Gold layers.
- Cloud Functions: Responsible for processing data across the three layers.
- Service Account for Composer: With appropriate permissions for invoking Cloud Functions.
- BigQuery Dataset and Tables: Created for storing the processed data.
- Google Composer Environment: Orchestrates the Cloud Functions with daily executions.
- Composer DAG: Manages the workflow that invokes the Cloud Functions and processes the data.
Conclusion
This solution demonstrates how to leverage Google Cloud Functions, Composer, and BigQuery to create a robust three-tier data processing pipeline. The automation using GitHub Actions ensures a smooth, reproducible deployment process, making it easier to manage cloud-based data pipelines at scale.
References
- Google Cloud Platform Documentation: https://cloud.google.com/docs
- GitHub Actions Documentation: https://docs.github.com/en/actions
- Google Composer Documentation: https://cloud.google.com/composer/docs
- Cloud Functions Documentation: https://cloud.google.com/functions/docs
- GitHub Repo: https://github.com/jader-lima/gcp-cloud-functions-to-bigquery