GoogleBigQuery Automate

This workflow automates data integration with Google BigQuery by fetching satellite position data every minute and storing it in a structured table. The automation simplifies data management, enabling timely insights and analysis while reducing manual effort.

7/8/2025
4 nodes
Simple
schedule, simple, googlebigquery, automation, api, integration
Categories:
Schedule Triggered, Simple Workflow
Integrations:
GoogleBigQuery

Target Audience

  • Data Analysts: Need to retrieve and analyze satellite position data efficiently.
  • Developers: Looking for a simple automation to integrate Google BigQuery with APIs.
  • Business Analysts: Interested in tracking satellite data for reporting and insights.
  • Researchers: Require up-to-date satellite position information for studies or projects.
Problem Solved

This workflow automates the process of fetching satellite position data from an external API and storing it in Google BigQuery. It eliminates the need for manual data retrieval and entry, saving time and reducing errors. Because it runs on a schedule, the stored data stays current, giving users accurate and timely information.
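
To illustrate what the HTTP Request step receives, here is a minimal Python sketch of the fetch. It is not part of the workflow export; the endpoint and the latitude/longitude/timestamp field names come from this description, and the response shape is handled defensively since only those fields are assumed.

```python
# Hypothetical sketch of the fetch step; assumes the `requests` package is installed.
import requests

API_URL = "https://api.wheretheiss.at/v1/satellites/25544/positions"

response = requests.get(API_URL, timeout=10)
response.raise_for_status()

data = response.json()
# The /positions endpoint is expected to return a JSON array; wrap a single
# object in a list so the loop below works either way.
positions = data if isinstance(data, list) else [data]

for position in positions:
    # Per the workflow description, each position carries at least these fields.
    print(position["latitude"], position["longitude"], position["timestamp"])
```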

Workflow Steps

  1. Cron Node: Triggers the workflow every minute, ensuring timely data retrieval.
  2. HTTP Request Node: Makes a call to the API endpoint https://api.wheretheiss.at/v1/satellites/25544/positions, fetching the latest satellite position data, including latitude, longitude, and timestamp.
  3. Set Node: Extracts the relevant fields from the HTTP response and formats them as key-value pairs, ready for insertion into Google BigQuery.
  4. Google BigQuery Node: Inserts the formatted data into the specified table within the dataset, so the satellite position data is stored for future analysis (a standalone Python equivalent of steps 2-4 is sketched after this list).
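
The sketch below mirrors steps 2-4 in plain Python as an illustrative equivalent, not the workflow itself. It assumes the `requests` and `google-cloud-bigquery` packages are installed, application default credentials are configured, and a destination table with latitude, longitude, and timestamp columns already exists; the project, dataset, and table names are placeholders. Step 1 (the Cron node) is not reproduced here; a standalone script would be run from an external scheduler.

```python
# Hypothetical standalone equivalent of steps 2-4 (fetch, shape, insert).
import requests
from google.cloud import bigquery

API_URL = "https://api.wheretheiss.at/v1/satellites/25544/positions"
TABLE_ID = "my-project.satellite_data.iss_positions"  # placeholder names


def fetch_positions():
    """Step 2: call the API and return a list of position objects."""
    response = requests.get(API_URL, timeout=10)
    response.raise_for_status()
    data = response.json()
    return data if isinstance(data, list) else [data]


def shape_rows(positions):
    """Step 3: keep only the fields the table expects (Set node equivalent)."""
    return [
        {
            "latitude": p["latitude"],
            "longitude": p["longitude"],
            "timestamp": p["timestamp"],
        }
        for p in positions
    ]


def insert_rows(rows):
    """Step 4: stream the shaped rows into the BigQuery table."""
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert errors: {errors}")


if __name__ == "__main__":
    insert_rows(shape_rows(fetch_positions()))
```
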
Customization Guide

  • Change the API Endpoint: Modify the URL in the HTTP Request node to fetch data from a different API if needed.
  • Adjust the Schedule: Update the Cron node to change how frequently the workflow runs (e.g., every hour instead of every minute).
  • Modify Data Fields: In the Set node, add or remove fields based on the data you want to store in Google BigQuery.
  • Change BigQuery Configuration: Update the projectId, datasetId, and tableId in the Google BigQuery node to point to your own Google Cloud project and dataset (see the configuration sketch after this list).
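
These customization points map onto a small set of settings. The block below is a hypothetical summary of where each one would live, with placeholder values; it is not taken from the workflow export.

```python
# Hypothetical configuration summarizing the customization points above.
CONFIG = {
    # Change the API Endpoint: point this at a different data source if needed.
    "api_url": "https://api.wheretheiss.at/v1/satellites/25544/positions",
    # Adjust the Schedule: cron expression used by the trigger.
    # "* * * * *" runs every minute; "0 * * * *" runs every hour.
    "schedule": "* * * * *",
    # Modify Data Fields: fields extracted from the response and stored.
    "fields": ["latitude", "longitude", "timestamp"],
    # Change BigQuery Configuration: projectId, datasetId, and tableId (placeholders).
    "bigquery": {
        "projectId": "my-project",
        "datasetId": "satellite_data",
        "tableId": "iss_positions",
    },
}
```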