- Export Firebase Authentication users to a Cloud Firestore collection
- Export all Cloud Firestore collections, or only specified collections, to Cloud Storage
- Export specified Cloud Firestore collections to BigQuery (partitioned tables)
- Cloud Pub/Sub triggers
- admin.auth().listUsers()
- Schedule data exports
- Loading data from Cloud Firestore exports
- BigQuery API For Jobs Resource
- Partitioned tables
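The user export combines a Cloud Pub/Sub trigger with admin.auth().listUsers(), which returns users one page at a time. A minimal sketch of that paging loop, with the lister injected so it runs without firebase-admin (the function and variable names here are illustrative, not the sample's actual code):

```javascript
// Sketch: collect every Firebase Authentication user by paging through
// listUsers() 1000 records at a time. The lister is injected so this
// loop can run without firebase-admin; in the real Cloud Function it
// would be admin.auth()'s listUsers method.
async function collectAllUsers(listUsers) {
  const users = [];
  let pageToken = undefined;
  do {
    // listUsers(maxResults, pageToken) resolves to { users, pageToken };
    // pageToken is undefined on the last page.
    const result = await listUsers(1000, pageToken);
    users.push(...result.users);
    pageToken = result.pageToken;
  } while (pageToken);
  return users;
}
```

The collected users can then be written to the users collection in batches.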
See: Firebase CLI
```shell
firebase login
firebase use <projectId>
```
You must complete the following tasks.
- Before you begin
- Configure access permissions
- If an error occurs when you run gsutil, assign the Storage Admin role on your bucket using the GCP Console
- Create a Cloud Storage bucket for export and import operations (e.g. gs://PROJECT-ID_backups-firestore)
- Optionally, enable Object Lifecycle Management for the bucket
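As a sketch, the steps above can also be done with gsutil; the bucket name, region, and service-account email below are placeholders, not values from this sample:

```shell
# Create the backup bucket (name and region are examples).
gsutil mb -l us-central1 gs://PROJECT-ID_backups-firestore

# Grant Storage Admin on the bucket if export operations fail with a
# permission error (the service-account email is a placeholder).
gsutil iam ch serviceAccount:PROJECT-ID@appspot.gserviceaccount.com:roles/storage.admin gs://PROJECT-ID_backups-firestore

# Optional: expire old backups automatically with a lifecycle policy
# defined in lifecycle.json.
gsutil lifecycle set lifecycle.json gs://PROJECT-ID_backups-firestore
```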
See: Creating datasets
Create a new dataset. The dataset ID is firestore in this sample code.
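If you prefer the command line over the console, the dataset can also be created with the bq CLI (the project ID is a placeholder):

```shell
# Create the dataset this sample expects; its ID is "firestore".
bq mk --dataset PROJECT-ID:firestore
```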
Install dependencies:

```shell
npm i
cd appengine
npm i
cd ../functions/
npm i
cd ..
# back at the repository root
```

Deploy (from the repository root):

```shell
cd appengine/
npm run deploy
cd ../functions/
npm run deploy
```
See: Creating and Configuring Cron Jobs
Choose the Pub/Sub target and set the topic.
The topic is cron-export-user-list-job in this sample code.
The payload string must not be empty, but any string will do, because the payload is not used in this sample code.
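The same job can be created from the command line with gcloud; the job name, schedule, and payload below are illustrative placeholders:

```shell
# Create a Cloud Scheduler job that publishes to the sample's topic.
# The payload ("run") is arbitrary; it just must not be empty.
gcloud scheduler jobs create pubsub export-user-list \
  --schedule="0 3 * * *" \
  --topic=cron-export-user-list-job \
  --message-body="run"
```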
If you don't have any Firebase Authentication users yet, you can create one using the Firebase Console.
If you haven't created the Cloud Firestore database yet, create it using the Firebase Console.
When you create the Cloud Firestore database, select Locked mode for your Cloud Firestore Security Rules.
Run the Cloud Scheduler job, then check the Cloud Firestore collection named users using the Firebase Console.
Choose the App Engine HTTP target, set the URL, and choose the GET method.
Ex.
- /cloud-firestore-export?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users
- /cloud-firestore-export?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users,etc
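The query parameters above map onto the Firestore Admin API's exportDocuments call. A hedged sketch of how a handler might turn them into a request — buildExportRequest is a hypothetical name, not the sample's code, though the v1 endpoint shown is the public Firestore Admin REST API:

```javascript
// Sketch (hypothetical helper): translate the /cloud-firestore-export
// query string into an exportDocuments request for the Firestore
// Admin API.
function buildExportRequest(projectId, query) {
  const body = { outputUriPrefix: query.outputUriPrefix };
  if (query.collections) {
    // Omitting collectionIds exports every collection.
    body.collectionIds = query.collections.split(',');
  }
  return {
    url: `https://firestore.googleapis.com/v1/projects/${projectId}` +
      '/databases/(default):exportDocuments',
    body,
  };
}
```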
Run the Cloud Scheduler job, then check the Cloud Storage bucket using the GCP Console.
Choose the App Engine HTTP target, set the URL, and choose the GET method.
Ex.
- /cloud-firestore-export-to-bigquery?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users
- /cloud-firestore-export-to-bigquery?outputUriPrefix=gs://PROJECT-ID_backups-firestore&collections=users,etc
Run the Cloud Scheduler job, then check the BigQuery dataset using the GCP Console.
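Behind this endpoint, each exported collection is loaded into BigQuery as a DATASTORE_BACKUP source. A hedged sketch of the load-job configuration — buildLoadConfig and its parameters are hypothetical, but the export_metadata path follows the documented layout of Cloud Firestore exports, and the dataset ID firestore matches this sample:

```javascript
// Sketch (hypothetical helper): build a BigQuery load-job configuration
// that imports one exported collection into a date-partitioned table.
function buildLoadConfig(projectId, exportPrefix, collection, partitionDate) {
  return {
    configuration: {
      load: {
        // Firestore exports are loaded with the Datastore backup format.
        sourceFormat: 'DATASTORE_BACKUP',
        writeDisposition: 'WRITE_TRUNCATE',
        sourceUris: [
          `${exportPrefix}/all_namespaces/kind_${collection}/` +
            `all_namespaces_kind_${collection}.export_metadata`,
        ],
        destinationTable: {
          projectId,
          datasetId: 'firestore',
          // "$YYYYMMDD" targets a single partition of the table.
          tableId: `${collection}$${partitionDate}`,
        },
      },
    },
  };
}
```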
You can create tables using scheduled queries.
You can create interactive dashboards using Data Studio.
If the fields of a collection change, set the projectionFields property or update the BigQuery table schema.
Then edit the scheduled queries and run them manually if needed.