
Export Firebase Authentication users and Cloud Firestore Collections to BigQuery

  • Export Firebase Authentication users to Cloud Firestore Collection
  • Export All Cloud Firestore Collections and Specified collections to Cloud Storage
  • Export Specified Cloud Firestore Collections to BigQuery (partitioned tables)

See

Inspired by

Setup

Cloud SDK

See: Cloud SDK quickstarts
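Once the SDK is installed, a minimal setup is to authenticate and point gcloud at the Firebase project (PROJECT-ID is a placeholder for your own project ID):

# Authenticate and select the project used by this sample
gcloud auth login
gcloud config set project PROJECT-ID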

firebase

See: Firebase CLI

firebase login
firebase use <projectId>

See: Schedule data exports

Before continuing, complete the prerequisite tasks described there (billing plan, an export bucket, and export permissions for the App Engine default service account).
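As a rough sketch of those prerequisites, assuming the default App Engine service account and an export bucket named PROJECT-ID_backups-firestore (the bucket name used in the examples below), the setup with the Cloud SDK looks like this:

# Create the Cloud Storage bucket that will receive the exports
gsutil mb gs://PROJECT-ID_backups-firestore

# Allow the App Engine default service account to run managed exports
gcloud projects add-iam-policy-binding PROJECT-ID \
  --member serviceAccount:PROJECT-ID@appspot.gserviceaccount.com \
  --role roles/datastore.importExportAdmin

# Give the same account write access to the export bucket
gsutil iam ch serviceAccount:PROJECT-ID@appspot.gserviceaccount.com:roles/storage.admin gs://PROJECT-ID_backups-firestore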

BigQuery

See: Creating datasets

Create a new dataset. The dataset ID is firestore in this sample code.
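If you prefer the command line, one way to create it (the US location here is just an example):

# Create the dataset that the export endpoint loads into
bq mk --dataset --location=US PROJECT-ID:firestore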

node_modules

npm i
cd appengine
npm i
cd ../functions/
npm i

Deploy

appengine

cd appengine/
npm run deploy

functions

cd functions/
npm run deploy
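The deploy scripts are assumed here to be thin wrappers around the standard tooling; roughly, they should be equivalent to the following (check each package.json for the exact commands):

# appengine/ (assumed equivalent)
gcloud app deploy

# functions/ (assumed equivalent)
firebase deploy --only functions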

Creating and Configuring Cron Jobs with Cloud Scheduler

See: Creating and Configuring Cron Jobs

Export Firebase Authentication users to Cloud Firestore Collection

Choose the Pub/Sub target and set the topic.

The topic is cron-export-user-list-job in this sample code.

The payload string must not be empty, but any string will do because the payload is not used in this sample code.
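Instead of the console, the same job can be created with the Cloud SDK; the job name and schedule below are only examples:

gcloud scheduler jobs create pubsub export-user-list \
  --schedule="0 3 * * *" \
  --topic=cron-export-user-list-job \
  --message-body="start"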

If you don't have any Firebase Authentication users yet, you can create one using the Firebase Console.

If you haven't created the Cloud Firestore database yet, create it using the Firebase Console.

When you create the Cloud Firestore database, select Locked mode for your Cloud Firestore Security Rules.

Run the Cloud Scheduler job, then check the Cloud Firestore collection named users using the Firebase Console.

Export All Cloud Firestore Collections and Specified collections to Cloud Storage

Choose the App Engine HTTP target, set the URL, and choose the GET method.

Examples:

  • /cloud-firestore-export?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users
  • /cloud-firestore-export?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users,etc
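The equivalent gcloud command for this job, as a sketch (job name and schedule are examples):

gcloud scheduler jobs create app-engine firestore-export \
  --schedule="0 4 * * *" \
  --http-method=GET \
  --relative-url="/cloud-firestore-export?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users"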

Run the Cloud Scheduler job, then check the Cloud Storage bucket using the GCP Console.

Export Specified Cloud Firestore Collections to BigQuery (partitioned tables)

Choose the App Engine HTTP target, set the URL, and choose the GET method.

Examples:

  • /cloud-firestore-export-to-bigquery?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users

  • /cloud-firestore-export-to-bigquery?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users,etc

Run the Cloud Scheduler job, then check the BigQuery dataset using the GCP Console.
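Under the hood, loading a Firestore export into a date-partitioned BigQuery table comes down to a load job with the DATASTORE_BACKUP source format. The command below is only an illustrative sketch of that step; the export prefix, date, and partition decorator are assumptions, not the sample's exact values:

# Load one day's export of the users collection into the matching partition
bq load --source_format=DATASTORE_BACKUP --replace \
  'firestore.users$20200101' \
  gs://PROJECT-ID_backups-firestore/2020-01-01T00:00:00_00000/all_namespaces/kind_users/all_namespaces_kind_users.export_metadata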

Additional

You can create additional tables using scheduled queries.

You can create interactive dashboards using Data Portal (Google Data Studio).

Caution

If the collection fields change, set the projectionFields property or update the BigQuery table schema.

Also edit the scheduled queries and run them manually if needed.
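If you would rather pin the loaded fields than keep the table schema in sync, the load job for a Firestore export also accepts a projection fields option; with the bq CLI that looks roughly like this (EXPORT_PREFIX and the field names are placeholders):

bq load --source_format=DATASTORE_BACKUP \
  --projection_fields="name,email,createdAt" \
  firestore.users \
  gs://PROJECT-ID_backups-firestore/EXPORT_PREFIX/all_namespaces/kind_users/all_namespaces_kind_users.export_metadata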