Does inserting meta elements into the <head> using JS affect SEO?


I am developing a static HTML/CSS website with multiple meta tags that I add to the head of each page. As it stands, whenever I want to change the meta information I have to update every page manually. I could easily write a script to insert the meta elements into the head, but I am not sure whether this will have a negative effect on SEO performance. Will Google's crawler pick up the meta elements if they are loaded dynamically?

A basic example of what I am doing:

index.html

<!DOCTYPE html>
<html>
<head>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js" defer></script>
    <script src="/assets/js/main.js" defer></script>
</head>
<body>
    <h1>Example Page</h1>
</body>
</html>

head_elements.html

<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=0">
<meta name="robots" content="index, follow">
<meta name="googlebot" content="index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1">
<meta name="bingbot" content="index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1">
<meta property="og:locale" content="en_US">
<meta property="og:type" content="website">

main.js

$(() => {
    // Once the DOM is ready, fetch the shared fragment and
    // prepend its <meta> elements to <head>.
    const $head = $("head");
    $.get("head_elements.html", (component) => {
        $head.prepend(component);
    });
});
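
For completeness, the same injection without the jQuery dependency would look roughly like this (a sketch using fetch against the same head_elements.html):

document.addEventListener("DOMContentLoaded", () => {
    // Fetch the shared fragment and insert its <meta> elements
    // at the start of <head>, mirroring the jQuery version above.
    fetch("head_elements.html")
        .then((response) => response.text())
        .then((html) => {
            document.head.insertAdjacentHTML("afterbegin", html);
        });
});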

Does this process of dynamically adding the meta tags to the head have any drawbacks?

CodePudding user response:

We ran a series of tests that verified Google is able to execute and index JavaScript with a multitude of implementations. We also confirmed Google is able to render the entire page and read the DOM, thereby indexing dynamically generated content.

Even though Google's crawlers can execute JavaScript and index dynamic content, experts still don't recommend relying on it, because it can hurt rankings and indexing may take longer.

Experts recommend using pre-rendering or server-side rendering instead.
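
For a static site like yours, pre-rendering can be as simple as stitching the shared fragment into every page at build time instead of in the browser. A minimal Node.js sketch (the pages and dist directory names are assumptions for illustration, not part of your setup):

const fs = require("fs");
const path = require("path");

// Shared fragment from the question, inlined at build time so
// crawlers receive the meta tags in the initial HTML response.
const fragment = fs.readFileSync("head_elements.html", "utf8");
const pagesDir = "pages"; // source pages (assumed layout)
const outDir = "dist";    // built output (assumed layout)

fs.mkdirSync(outDir, { recursive: true });
for (const file of fs.readdirSync(pagesDir)) {
    if (!file.endsWith(".html")) continue;
    const html = fs.readFileSync(path.join(pagesDir, file), "utf8");
    // Insert the fragment right after the opening <head> tag.
    const built = html.replace("<head>", "<head>\n" + fragment);
    fs.writeFileSync(path.join(outDir, file), built);
}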

CodePudding user response:

It's fine to use JavaScript to inject links into the DOM, as long as such links follow the guidelines for crawlable links.
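
In practice, a crawlable link is an <a> element with a resolvable href attribute; links that only work through JavaScript event handlers are not followed. For example:

<!-- Crawlable: a real <a> element with an href Googlebot can follow -->
<a href="/products">Products</a>

<!-- Not crawlable: no <a href>, navigation happens only in JS -->
<span onclick="window.location='/products'">Products</span>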

Crawling a URL and parsing the HTML response works well for classical websites or server-side rendered pages where the HTML in the HTTP response contains all content. Some JavaScript sites may use the app shell model where the initial HTML does not contain the actual content and Googlebot needs to execute JavaScript before being able to see the actual page content that JavaScript generates.

Googlebot queues all pages for rendering, unless a robots meta tag or header tells Googlebot not to index the page. The page may stay on this queue for a few seconds, but it can take longer than that. Once Googlebot's resources allow, a headless Chromium renders the page and executes the JavaScript. Googlebot parses the rendered HTML for links again and queues the URLs it finds for crawling. Googlebot also uses the rendered HTML to index the page.

Keep in mind that server-side rendering or pre-rendering is still a great idea, because it makes your website faster for users and crawlers, and not all bots can run JavaScript. For example, the Open Graph tags in your example are read by social media crawlers such as Facebook's, which do not execute JavaScript, so client-side injection would leave those tags invisible to them.

You can read more about how Google processes JavaScript in Google's JavaScript SEO documentation.
