Why does writeBatch only write up to 3 documents? (Firestore)


I am using Firestore to store data for my Reactjs app. I have the following function:

export async function batchAddProduct(data) {
  const productRef = doc(collection(db, "product"));
  const batch = writeBatch(db);

  for (const datum of data) {
    batch.set(productRef, datum);
  }

  return await batch
    .commit()
    .then(() => {
      return { data: true, error: null };
    })
    .catch((err) => {
      return { data: null, error: err };
    });
}

So basically, I want to add a lot of data at once, which is why I'm using the writeBatch method. I saw an answer on SO where they use doc(collection(db, "product")) to generate an empty doc reference first and then use batch.set() to fill it, so that's what I'm doing here. I'm passing up to 500 items at once (which is the maximum number of writes per batch), but somehow only up to 3 documents end up in the database. Why is that? Am I missing something?
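
For reference, doc(collection(db, "product")) does not create anything in the database by itself; each call returns a new DocumentReference with its own auto-generated ID, and data only appears once something writes to that reference. A minimal sketch, assuming db is an initialized Firestore instance exported from ./firebase:

import { collection, doc } from "firebase/firestore";
import { db } from "./firebase"; // assumed: your initialized Firestore instance

// Each call returns a distinct reference with a fresh auto-generated ID;
// nothing is written until the reference is used in a set() or a batch.
const refA = doc(collection(db, "product"));
const refB = doc(collection(db, "product"));
console.log(refA.id !== refB.id); // true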

Update:

In response to the comments:

  1. When I console.log(data), it basically prints out an array with 500 objects in it (which I definitely can't paste in here). But I can assure you that it is receiving the correct data.
  2. batchAddProduct is called in a Redux saga like this:
function* BATCH_ADD_PRODUCT(input) {
  yield put({
    type: actions.SET_STATE,
    payload: {
      loadingUpdate: true,
    },
  });

  const { data, error } = yield call(batchAddProduct, input.payload.data);
  if (data) {
    yield put({
      type: actions.GET_PRODUK,
    });

    yield put({
      type: actions.SET_STATE,
      payload: {
        loadingUpdate: false,
        alert: {
          type: "success",
          message: "Product is added successfully.",
        },
      },
    });
  }

  if (error) {
    console.log(error);
    yield put({
      type: actions.SET_STATE,
      payload: {
        loadingUpdate: false,
        alert: {
          type: "error",
        message: error.message || "Error occurred.",
        },
      },
    });
  }
}

and I dispatch it like this:

dispatch({
    type: actions.BATCH_ADD_PRODUK,
    payload: {
        data: data, // WHICH CONTAINS UP TO 500 OBJECTS
    },
});

CodePudding user response:

I haven't tried calling a batched write from a generator function, but the main issue is that your code creates a single document reference outside the loop, so every batch.set() call targets that same document. Create a new reference for each item instead, and split the items into batches of at most 500 writes:

import { collection, doc, writeBatch, WriteBatch } from 'firebase/firestore'

const myArray: any[] = [] // your product objects
const batches: WriteBatch[] = []

myArray.forEach((datum, i) => {
  // start a new batch every 500 writes (the per-batch limit)
  if (i % 500 === 0) {
    batches.push(writeBatch(db))
  }

  // a fresh document reference (new auto-generated ID) for each item
  const productRef = doc(collection(db, 'colName'))
  const batch = batches[batches.length - 1]
  batch.set(productRef, { ...datum })
})

await Promise.all(batches.map((batch) => batch.commit()))
console.log('done')
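
Applied to the batchAddProduct function from the question, a possible rewrite (a sketch only, keeping the original collection name and return shape, and assuming db is your initialized Firestore instance exported from ./firebase) would create a new reference per item and chunk the input into batches of 500 writes:

import { collection, doc, writeBatch } from "firebase/firestore";
import { db } from "./firebase"; // assumed: your initialized Firestore instance

export async function batchAddProduct(data) {
  const batches = [];

  data.forEach((datum, i) => {
    // start a new batch every 500 writes (the per-batch limit)
    if (i % 500 === 0) {
      batches.push(writeBatch(db));
    }
    // a new document reference (auto-generated ID) for every item
    const productRef = doc(collection(db, "product"));
    batches[batches.length - 1].set(productRef, datum);
  });

  try {
    await Promise.all(batches.map((batch) => batch.commit()));
    return { data: true, error: null };
  } catch (err) {
    return { data: null, error: err };
  }
}

The calling saga does not need to change, since the function still resolves to the same { data, error } shape.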