The GCP BigQuery connector comes as standard with Stood. It acts both as a bridge to any analytics/BI tool on the market (most of them connect to BigQuery in minutes) and as a bridge to any data warehouse (if not BigQuery itself).
This document describes the BigQuery sync functionality that automatically exports deals data from Stood CRM to Google BigQuery for analytics and reporting purposes.
The BigQuery sync feature allows teams to automatically export their deals data to BigQuery on a daily basis. This enables advanced analytics, reporting, and data integration with other business intelligence tools.
- **Scheduled Sync**: Runs daily at 11:59 PM UTC
- **Team-based Activation**: Each team can enable or disable the sync independently
- **Comprehensive Data**: Includes all deals, plus aggregated names or counts from the accounts, activities, posts, and user collections
- **Partitioned Tables**: Data is partitioned by snapshot date for efficient querying
- **Idempotent**: Safe to run multiple times; replaces data for the same snapshot date
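The idempotent behavior above can be sketched as a delete-then-insert against the snapshot partition. This is a minimal illustration, not the function's actual implementation (which may instead use a load job with a partition decorator); the column subset and the inserted values are placeholders.

```sql
-- Step 1: remove any rows already written for today's snapshot date,
-- so re-running the sync never duplicates data.
DELETE FROM `your-project.stood_deals.deals_snapshot`
WHERE snapshot_date = CURRENT_DATE();

-- Step 2: append the freshly collected rows (illustrative values only).
INSERT INTO `your-project.stood_deals.deals_snapshot`
  (team_id, team_name, amount, stage_label, snapshot_date, sync_timestamp)
VALUES
  ('team-1', 'EMEA Sales', 50000, 'Negotiation', CURRENT_DATE(), CURRENT_TIMESTAMP());
```

Because step 1 always clears the target partition first, running the pair twice for the same `snapshot_date` yields the same final state.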
Make sure the BigQuery API is enabled in your Google Cloud project:
```bash
gcloud services enable bigquery.googleapis.com
```

Deploy the Cloud Functions with the BigQuery sync:

```bash
cd functions
npm install
firebase deploy --only functions
```

Then enable the sync for each team:

1. Go to the Admin panel in Stood CRM
2. Navigate to Teams management
3. For each team you want to sync, toggle the "BigQuery Sync" switch
4. The sync will start on the next scheduled run (11:59 PM UTC)
The table contains the following columns:

| Column | Type | Description |
|---|---|---|
| | STRING | Unique deal identifier |
| | STRING | Name of the deal |
| | STRING | Associated account ID |
| | STRING | Associated account name |
| `team_id` | STRING | Team ID (required) |
| `team_name` | STRING | Team name (required) |
| `amount` | INTEGER | Deal amount |
| | STRING | Deal stage (s0, s1, s2, s3, s4) |
| `stage_label` | STRING | Human-readable stage label |
| | STRING | Deal description |
| | STRING | Deal solution |
| `creation_date` | DATE | Deal creation date |
| `closing_date` | DATE | Deal closing date |
| | STRING[] | Array of deal tags |
| | STRING | Deal owner user ID |
| | STRING | Deal owner full name |
| | STRING | Deal owner email |
| | STRING | Parent deal ID (for deal hierarchy) |
| `post_count` | INTEGER | Number of posts for this deal |
| `open_activities_count` | INTEGER | Number of open activities |
| `closed_activities_count` | INTEGER | Number of closed activities |
| `snapshot_date` | DATE | Date of the snapshot (partitioning field) |
| | TIMESTAMP | When the sync was performed |
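As a reference point, a table with this shape could be created with DDL like the following. This is a sketch, not the sync's actual DDL: it includes only the columns whose names appear in the example queries in this document, plus `sync_timestamp`, which is an assumed name for the sync-time column.

```sql
-- Minimal sketch of the snapshot table; the real schema has more fields.
-- PARTITION BY snapshot_date gives the per-day partitioning described above.
CREATE TABLE IF NOT EXISTS `your-project.stood_deals.deals_snapshot` (
  team_id                 STRING,
  team_name               STRING,
  amount                  INT64,
  stage_label             STRING,
  creation_date           DATE,
  closing_date            DATE,
  post_count              INT64,
  open_activities_count   INT64,
  closed_activities_count INT64,
  snapshot_date           DATE,
  sync_timestamp          TIMESTAMP  -- assumed column name
)
PARTITION BY snapshot_date;
```

Partitioning by `snapshot_date` means queries that filter on a single snapshot (as all the examples below do) scan only that day's partition, keeping query costs low.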
```sql
-- Get all deals for a specific team
SELECT *
FROM `your-project.stood_deals.deals_snapshot`
WHERE team_id = 'your-team-id'
  AND snapshot_date = '2024-01-15';
```

```sql
-- Get deals by stage
SELECT
  stage_label,
  COUNT(*) as deal_count,
  SUM(amount) as total_amount
FROM `your-project.stood_deals.deals_snapshot`
WHERE snapshot_date = '2024-01-15'
GROUP BY stage_label;
```

```sql
-- Get team performance over time
SELECT
  snapshot_date,
  team_name,
  COUNT(*) as total_deals,
  SUM(amount) as total_value,
  AVG(amount) as avg_deal_size
FROM `your-project.stood_deals.deals_snapshot`
WHERE snapshot_date >= '2024-01-01'
GROUP BY snapshot_date, team_name
ORDER BY snapshot_date DESC;
```

```sql
-- Deal velocity analysis
SELECT
  team_name,
  AVG(DATE_DIFF(closing_date, creation_date, DAY)) as avg_days_to_close,
  COUNT(*) as total_deals
FROM `your-project.stood_deals.deals_snapshot`
WHERE closing_date IS NOT NULL
  AND snapshot_date = '2024-01-15'
GROUP BY team_name;
```

```sql
-- Activity correlation with deal success
SELECT
  stage_label,
  AVG(open_activities_count) as avg_open_activities,
  AVG(closed_activities_count) as avg_closed_activities,
  AVG(post_count) as avg_posts
FROM `your-project.stood_deals.deals_snapshot`
WHERE snapshot_date = '2024-01-15'
GROUP BY stage_label;
```

You can manually trigger the sync for testing:
```bash
# Call the test endpoint
curl -X POST https://your-region-your-project.cloudfunctions.net/bigQuerySyncDeals
```

Monitor the Cloud Functions logs to see sync activity:

```bash
firebase functions:log --only bigQuerySyncDeals
```

- **Cloud Scheduler**: 3 free jobs per billing account, then $0.10/month per job
- **Cloud Functions**: Free tier covers 2M invocations/month
- **BigQuery**: Storage and query costs apply based on usage
- **Firestore**: Read operations for data collection
- **Permission Errors**: Ensure the Cloud Functions service account has BigQuery permissions
- **Dataset Creation Fails**: Check that the BigQuery API is enabled
- **No Data Synced**: Verify that teams have `bigQuerySyncEnabled: true`
- **Timeout Errors**: Increase the function timeout when dealing with large datasets
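When investigating "No Data Synced", a quick check is to count rows per team for the most recent snapshot; teams missing from the result either have the sync disabled or had no deals to export:

```sql
-- Which teams actually landed data in the latest snapshot?
SELECT
  team_id,
  team_name,
  COUNT(*) AS rows_synced
FROM `your-project.stood_deals.deals_snapshot`
WHERE snapshot_date = (
  SELECT MAX(snapshot_date)
  FROM `your-project.stood_deals.deals_snapshot`
)
GROUP BY team_id, team_name
ORDER BY rows_synced DESC;
```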
The Cloud Functions service account needs these BigQuery permissions:

- `bigquery.datasets.create`
- `bigquery.tables.create`
- `bigquery.tables.updateData`
- `bigquery.jobs.create`
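The dataset-level pieces of this (table creation and data writes) can be granted with BigQuery's SQL DCL, as a sketch. The service-account address below is the 1st-gen Cloud Functions default and is an assumption; substitute your project's actual account. Note that `bigquery.datasets.create` and `bigquery.jobs.create` are project-level permissions and must instead be granted through IAM (e.g. `roles/bigquery.jobUser` for job creation).

```sql
-- Grant dataset-level write access, which covers
-- bigquery.tables.create and bigquery.tables.updateData.
GRANT `roles/bigquery.dataEditor`
ON SCHEMA `your-project.stood_deals`
TO "serviceAccount:your-project@appspot.gserviceaccount.com";
```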
- Check Cloud Functions logs for sync status
- Monitor BigQuery usage in the Google Cloud Console
- Set up alerts for failed syncs
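A simple failed-sync alert can be driven by a freshness query, for example run on a schedule in BigQuery (a sketch only; wiring the result into an alerting channel is outside the scope of this document):

```sql
-- Returns a row only when the newest snapshot is older than one day,
-- i.e. last night's sync appears to have failed or been skipped.
SELECT MAX(snapshot_date) AS last_snapshot
FROM `your-project.stood_deals.deals_snapshot`
HAVING MAX(snapshot_date) < CURRENT_DATE();
```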
- Data is synced within the same Google Cloud project
- No external API keys or credentials are required
- Team data is isolated by `team_id` in the BigQuery table
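Because rows are isolated by `team_id`, a single team's slice can be exposed without granting access to the whole table, for example through a view (a sketch; the view name and team ID below are placeholders):

```sql
-- One team's deals only; can be shared as an authorized view
-- so consumers never see other teams' rows.
CREATE OR REPLACE VIEW `your-project.stood_deals.team_acme_deals` AS
SELECT *
FROM `your-project.stood_deals.deals_snapshot`
WHERE team_id = 'your-team-id';
```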