Second axios function not running after first one is called - React


I have two axios functions set up in React. The first scrapes a page for a list of links and randomly returns one of them. The second takes that link as an argument, scrapes that page, and returns the title and the first paragraph in an array. So the second function depends on the first.

When I implement this with the useEffect hook, I can get the first function to work, but the second never receives the link returned by the first and throws an error. It's as if the second call isn't waiting for the first to resolve, even though I'm using await. Can anyone help me see where I've gone wrong?

import getLink from "./getLink.js";
// Scrapes a page for an array of links,
// returns one of those links randomly (string)

import getPage from "./getPage.js";
// Scrapes the page from getLink,
// returns [h1 text, p text] (array)

import {useEffect,useState} from "react";

const Body = () => {
    const [text, setText] = useState([]);
    
    useEffect(() => {
        const getData = async () => {
            const link = await getLink();
            const data = await getPage(link);
            setText(data);
        }
        getData();
    }, []);
    
    return (
        <div>
            <h1>{text[0]}</h1>
            <div dangerouslySetInnerHTML={{__html: text[1]}}></div>
        </div>
    )
}

export default Body;

Error message: Access to XMLHttpRequest at 'link from getLink' from origin 'http://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.

CodePudding user response:

You need to make sure CORS is enabled on the server you are requesting the text from. If you are using Express.js, you can include the right headers using middleware; have a look at the cors package docs.
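If the resource comes from an Express server you control, a minimal sketch of that middleware setup might look like this (the port and route are illustrative, not taken from the original post):

const express = require("express");
const cors = require("cors");

const app = express();

// Allow cross-origin requests from the React dev server
app.use(cors({ origin: "http://localhost:3000" }));

// Illustrative endpoint; replace with whatever resource the React app requests
app.get("/scrape", (req, res) => {
    res.json({ title: "...", firstParagraph: "..." });
});

app.listen(4000);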

You can check the response headers by running

curl -v http://localhost:3000

A temporary workaround would be to include { mode: 'no-cors' } in your request options.
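Note that mode is a Fetch API option rather than an axios config key, so a rough sketch of that workaround would use fetch (the URL is a placeholder); be aware the resulting response is opaque:

// Sketch only: no-cors returns an opaque response whose body cannot be read
fetch("https://example.com/page", { mode: "no-cors" })
    .then((response) => {
        console.log(response.type); // "opaque"
    });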

CodePudding user response:

The scraping worked once I added "proxy": "[domain name]" to package.json and removed the domain names from the links I use in my project.
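For reference, that setup looks roughly like this (the domain is a placeholder). In package.json of a Create React App project:

"proxy": "https://example.com"

Then in getLink.js and getPage.js, request relative paths so the dev server forwards them to that domain, for example:

// Inside the async getLink function; assumes axios is imported there
const response = await axios.get("/links"); // proxied to https://example.com/links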

From: https://www.youtube.com/watch?v=hxyp_LkKDdk
