What would be the best way to filter huge data using JavaScript/React/Redux?

Time:04-08

In my case the smallest JSON dataset I am using is 90k [...] and I currently filter it with the .filter method. Nothing is wrong; everything works perfectly without any issue. But from a performance point of view I am wondering whether this is the best approach, and I would appreciate suggestions on how to improve it.

The data coming from the backend cannot be modified or split.

For reference, here is a 5k-record example which takes around 1 second.

I value all developers' time, so I have added a code snippet as well.

Appreciate any help and suggestion.

const load5kData = async () => {
    const url = 'https://jsonplaceholder.typicode.com/photos';
    const obj = await (await fetch(url)).json();
    const filteredValue = obj.filter(item => item.albumId === 36);
    console.log(filteredValue);
};
load5kData();

CodePudding user response:

It looks like the response is returned with the albumId ordered in ascending order. You could make use of that by using a traditional for loop and short circuiting once you reach id 37.
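As a sketch of that idea, assuming the response really is sorted in ascending order by albumId (the helper name and sample shape are illustrative, not part of the original code):

```javascript
// Collect items matching targetAlbumId, stopping as soon as we pass it.
// Only valid if `data` is sorted ascending by albumId.
const filterSorted = (data, targetAlbumId) => {
    const result = [];
    for (const item of data) {
        if (item.albumId > targetAlbumId) break; // short circuit: no later matches possible
        if (item.albumId === targetAlbumId) result.push(item);
    }
    return result;
};
```

On the photos endpoint this skips roughly 99% of the 5k records once album 36 has been passed, while plain `.filter` always scans all of them.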

In my opinion, if you aren't having performance issues just using the filter method, I would say just leave it and dont over-optimize!

Another option: there are only 50 items with albumId == 36, so you could just put those objects in your own JSON file. However, you would obviously lose out on fetching the latest images if the results of the API ever change.

CodePudding user response:

Currently using: obj.filter(item => item.albumId == 36);

CodePudding user response:

filter is really your only general solution, and any approach will involve iterating over each element at least once.

If you need to run multiple searches against the same data, you can index the data by the key you will be searching on. Finding data with a specific albumId then requires no additional filtering, although building the index still requires one pass over every element.

const indexByAlbumId = data =>
    data.reduce((a, c) => {
        if (a[c.albumId] === undefined) {
            a[c.albumId] = [c];
        } else {
            a[c.albumId].push(c);
        }
        return a;
    }, {});

const load5kData = async () => {
    const url = 'https://jsonplaceholder.typicode.com/photos';
    const data = await (await fetch(url)).json();

    const indexedData = indexByAlbumId(data);

    console.log('36', indexedData[36]);
    console.log('28', indexedData[28]);
};

load5kData();

Another optimisation: if the data is sorted by the key you are searching on, you can take advantage of that with a divide-and-conquer (binary) search. First find any entry with the value you need, then locate where the chunk begins and ends by applying the same divide-and-conquer search to the left and right of that element.
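A minimal sketch of that divide-and-conquer lookup, assuming the data is sorted ascending by albumId and that albumId is an integer (the helper names are illustrative):

```javascript
// Binary search for the first index whose albumId is >= target.
// Requires `data` to be sorted ascending by albumId.
const lowerBound = (data, target) => {
    let lo = 0, hi = data.length;
    while (lo < hi) {
        const mid = (lo + hi) >> 1;
        if (data[mid].albumId < target) lo = mid + 1;
        else hi = mid;
    }
    return lo;
};

// Slice out the contiguous block of items with the given albumId.
// Using target + 1 for the end works because albumId is an integer here.
const sliceByAlbumId = (data, target) => {
    const start = lowerBound(data, target);
    const end = lowerBound(data, target + 1);
    return data.slice(start, end);
};
```

Each lookup is O(log n) comparisons plus the size of the matching block, instead of a full O(n) scan per query.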
