Cloud Functions: How to copy Firestore Collection to a new document?

余生分开走 2020-12-16 19:11

I'd like to make a copy of a collection in Firestore upon an event using Cloud Functions.

I already have code that iterates over the collection and copies each document.

3 Answers
  • 2020-12-16 19:14

    There is no fast way at the moment. I recommend you rewrite your code like this though:

    import { firestore } from "firebase-admin";

    // Copies every document in "products" into a collection named after `uid`,
    // keyed by each document's "barcode" field.
    async function copyCollection(uid) {
        const products = await firestore().collection("products").get();
        // Use a plain loop: forEach does not wait for async callbacks, so the
        // function would otherwise resolve before the copies finish.
        for (const doc of products.docs) {
            await firestore().collection(uid).doc(doc.get('barcode')).set(doc.data());
        }
    }
    
    
  • 2020-12-16 19:32

    I wrote a small Node.js snippet for this.

    const firebaseAdmin = require('firebase-admin');
    const serviceAccount = '../../firebase-service-account-key.json';
    const firebaseUrl = 'https://my-app.firebaseio.com';
    
    firebaseAdmin.initializeApp({
        credential: firebaseAdmin.credential.cert(require(serviceAccount)),
        databaseURL: firebaseUrl
    });
    const firestore = firebaseAdmin.firestore();
    
    async function copyCollection(srcCollectionName, destCollectionName) {
        const documents = await firestore.collection(srcCollectionName).get();
        let writeBatch = firestore.batch();
        const destCollection = firestore.collection(destCollectionName);
        let i = 0;
        for (const doc of documents.docs) {
            writeBatch.set(destCollection.doc(doc.id), doc.data());
            i++;
            if (i > 400) {  // a write batch allows at most 500 writes, so commit before hitting the limit
                i = 0;
                console.log('Intermediate committing of batch operation');
                await writeBatch.commit();
                writeBatch = firestore.batch();
            }
        }
        if (i > 0) {
            console.log('Firebase batch operation completed. Doing final committing of batch operation.');
            await writeBatch.commit();
        } else {
            console.log('Firebase batch operation completed.');
        }
    }
    
    copyCollection('customers', 'customers_backup').then(() => console.log('copy complete')).catch(error => console.log('copy failed. ' + error));
    
  • 2020-12-16 19:35

    Currently, no. Looping through each document using Cloud Functions and then setting a new document to a different collection with the specified data is the only way to do this. Perhaps this would make a good feature request.

    How many documents are we talking about? For something like 10,000 it should only take a few minutes, tops.
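
    A minimal sketch of that loop-and-set approach, assuming a hypothetical onCreate trigger on a `backupRequests/{requestId}` document and the `customers` / `customers_backup` collection names borrowed from the answer above; collections with more than 500 documents would need the chunked batching shown in that answer:

    const functions = require('firebase-functions');
    const admin = require('firebase-admin');

    admin.initializeApp();

    // Hypothetical trigger: creating a document in "backupRequests" kicks off the copy.
    exports.copyCustomers = functions.firestore
        .document('backupRequests/{requestId}')
        .onCreate(async () => {
            const db = admin.firestore();
            const source = await db.collection('customers').get();
            const batch = db.batch();
            // Queue one set() per document; a single batch holds at most 500 writes.
            source.docs.forEach((doc) => {
                batch.set(db.collection('customers_backup').doc(doc.id), doc.data());
            });
            await batch.commit();
        });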
