Angular NGRX: split a large array into chunks and combine API request results


So the problem is that I have a large array of objects, say 1000, that I need to send to our API from an NGRX Effect and then dispatch a new Action with the response. The API takes too long to process such a large request at once, so I want to divide the array into smaller chunks, send a request with each of them one after another, and then combine the responses from all of them into a single action. I've tried mapping this large array of data into an array of Observables with the from() operator, but couldn't figure out what to use next after it was mapped; I've also tried casting it to Promises and using Promise.all() on them, but again, no luck. In summary, what I need to do is this:

  1. An Action triggers an Effect that needs to send an API call with an array of data from the Action,
  2. In the Effect, the array of data gets divided into chunks of 100 objects,
  3. The chunks are mapped into API requests, sent one after another, and the response from each needs to be combined with the previous ones,
  4. After all of these requests are done, an Action with the combined response needs to be dispatched to update the Store.

Here is a simplified version of the Effect I'm currently using; what I need to divide is action.rows:

loadRows$ = createEffect(() => {
  return this.actions$.pipe(
    ofType(RowsActions.LoadRows),
    switchMap((action) => {
      // action.rows is the large array (~1000 objects) that needs chunking
      return this.eventsService.getBatchRows(action.rows).pipe(
        switchMap((response) => {
          return [
            new RowsLoaded({ rows: response.rows }),
            new LoadedTableData(response.rows),
          ];
        }),
        catchError(() => {
          return of(new RowsFailedToLoad());
        })
      );
    })
  );
});

CodePudding user response:

I have implemented similar logic. If one chunk has 100 items, that is still a large number.

I am running ALL items in parallel.

UPD:

If you change mergeMap to concatMap, all calls will be made sequentially.

It does not matter whether each emission is a single row or a chunk of rows.

from(action.rows)
  .pipe(
    mergeMap(id => api(id).pipe(catchError(() => EMPTY))),
    toArray()
  )
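As a middle ground between running everything in parallel and making every call sequential, mergeMap also accepts an optional second argument that caps how many inner subscriptions run at once. A minimal runnable sketch, where rows and api are hypothetical stand-ins for the data and service call above:

import { from, of, EMPTY, Observable } from "rxjs";
import { mergeMap, toArray, catchError } from "rxjs/operators";

// Hypothetical stand-ins for the question's data and API call.
const rows = ["id1", "id2", "id3", "id4", "id5"];
const api = (id: string): Observable<string> => of(`result-${id}`);

from(rows)
  .pipe(
    // The second argument to mergeMap limits concurrency: at most 2
    // requests are in flight at a time; results arrive in completion order.
    mergeMap((id) => api(id).pipe(catchError(() => EMPTY)), 2),
    toArray()
  )
  .subscribe((results) => console.log(results));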

here is working example: https://codesandbox.io/s/rxjs-playground-forked-u8c7jd?file=/src/index.js

FINAL VERSION:

const { of, from } = require("rxjs");
const {
  switchMap,
  concatMap,
  toArray,
  catchError,
  map,
  tap
} = require("rxjs/operators");

// Split an array into chunks of at most `size` items.
const chunkArray = (arr, size) =>
  arr.length > size
    ? [arr.slice(0, size), ...chunkArray(arr.slice(size), size)]
    : [arr];

// Stand-in for the real API call from the question.
const getBatchRows = (rows) => of({ response: { rows } });

of({ rows: ["id1", "id2", "id3", "id4", "id5"] })
  .pipe(
    switchMap((action) =>
      from(chunkArray(action.rows, 2)).pipe(
        // concatMap keeps the requests sequential, one chunk after another.
        concatMap((rows) =>
          getBatchRows(rows).pipe(
            // On failure, substitute an empty result so the stream continues.
            catchError(() => of({ response: { rows: [] } }))
          )
        ),
        // Collect the response from every chunk into a single array.
        toArray()
      )
    ),

    tap((data) => console.log(data)),
    // Flatten the per-chunk responses back into one array of rows.
    map((responseArray) => responseArray.map((i) => i.response.rows).flat()),
    // RowsLoaded and LoadedTableData are the action classes from the question.
    map(rows => [
      new RowsLoaded({ rows }),
      new LoadedTableData(rows),
    ])
  )
  .subscribe((data) => {
    console.log(data);
  });

To switch to the chunked approach, almost nothing needs to change.

Use this helper:

const chunkArray = (arr, size) =>
  arr.length > size
    ? [arr.slice(0, size), ...chunkArray(arr.slice(size), size)]
    : [arr];

and call it in the effect like this:

 from(chunkArray(action.rows, 100))

and change the request mapping so it operates on a chunk of rows (swap mergeMap for concatMap if the calls must stay sequential):

 mergeMap(rows => this.eventsService.getBatchRows(rows))
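Putting the pieces together, here is a minimal sketch of the adapted effect, assuming the eventsService, the action classes, and the response shape from the question, plus the chunkArray helper above; the chunk size of 100 matches the requirement:

loadRows$ = createEffect(() => {
  return this.actions$.pipe(
    ofType(RowsActions.LoadRows),
    switchMap((action) =>
      // Split the large array into chunks of 100 rows each.
      from(chunkArray(action.rows, 100)).pipe(
        // Send one request per chunk, strictly one after another.
        concatMap((rows) => this.eventsService.getBatchRows(rows)),
        // Wait for every request, collecting all responses into one array.
        toArray(),
        // Flatten the per-chunk rows back into a single combined array.
        map((responses) => responses.map((r) => r.rows).flat()),
        // Emit both actions with the combined result, as in the original effect.
        switchMap((rows) => [
          new RowsLoaded({ rows }),
          new LoadedTableData(rows),
        ]),
        catchError(() => of(new RowsFailedToLoad()))
      )
    )
  );
});

Note that catchError sits at the end of the inner pipe, so a failed chunk aborts the whole batch and dispatches RowsFailedToLoad, while the effect itself stays alive for future LoadRows actions.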