I have read about the render tree and how it is built from the CSSOM and DOM trees, and I want to achieve optimal UI performance on my sites.
I'm wondering: what best practices can I apply to my external CSS files so that the CSSOM is constructed as quickly as possible and, in turn, the render tree is built as fast as possible?
CodePudding user response:
First, minify your style.css with gulp or other build tools, or with online tools like Cssminifier, to produce a style.min.css. Then add it to the site header as shown below:
<link rel="stylesheet" href="style.min.css">
CodePudding user response:
How to Maximize Rendering of a Web Page in Web Browsers
I must warn you that CSS is not an issue at all when it comes to slow website rendering. Small CSS text files download extremely fast. There is very little you can change in CSS design today that will have much of an impact, simply because most external CSS files are teeny-tiny compared to other downloaded resources. The main killers of performance are HUGE JavaScript API libraries, slow web servers, and poor connection speeds on the user's end. But some tricks you can use are listed below, though I do not recommend you hijack your users' browsers to gain a few hundred milliseconds like this:
- Create a single external style sheet with just the exact amount of styles needed to render your homepage, or the first page visitors hit on your site. Move all secondary CSS to a separate sheet (see below). This is your "synchronous" CSS that will load prior to the HTML download, parsing, and rendering of the DOM. Link to this file using a simple <link> tag. If it has a small footprint, and you have removed all other CSS, it reduces the wait time for the download, parse, render, and paint of the DOM tree. (Because of HTTP/2, increased browser connections, parallel CSS downloading, and faster networks, this is almost a non-issue today.)
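A minimal sketch of that split, using hypothetical file names (critical.css holds only what the first view needs; everything else moves to a secondary sheet deferred with the "print" trick shown further down):
<head>
  <!-- hypothetical critical.css: only the styles needed to render the first view -->
  <link rel="stylesheet" href="critical.css">
  <!-- all remaining styles live in a secondary sheet, loaded later (see the "print" trick below) -->
</head>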
"defer"
when downloading. This means they will load in order but in parallel to other files over shared connections and not interfere with web page, CSS download, and HTML parsing. They will not run until all HTML and the DOM is rendered and painted, as well:
<script src="myscript.js" type="text/javascript" defer="defer"></script>
- As an option, consider setting external or global JavaScript files that affect all pages in your website, but which ARE required to draw the DOM, to download asynchronously. Like "defer", these files will not stop the HTML from rendering, but as soon as an asynchronous file arrives it executes immediately and can block rendering of the page. Avoid this unless the script files are mandatory to rendering the DOM and very small. In the old days we dropped these external script links at the bottom of the web page, after the last HTML inside the body element, so they only started downloading after the HTML had downloaded. The async feature, where supported, may speed that process up slightly. Be sure to listen for the window load event (window.onload) or use other tricks to verify the page is rendered before running your async scripts:
<script type="text/javascript" src="{your path}/MyJavaScript.js?v=1.0.0.0" async="async"></script>
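A rough sketch of that guard (not from the original answer): wrap the async script's work in a handler for the window load event so it only runs once the page is fully rendered:
<script>
  // Sketch: run this async script's DOM work only after the whole page (DOM + resources) has loaded
  window.addEventListener('load', function () {
    // safe to read layout and touch the rendered DOM here
  });
</script>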
- All the other CSS you don't need to render the homepage, but which can load after the DOM, set to "preload". However, rel="preload" on the <link> tag is not yet 100% supported in modern browsers and not at all in older (pre-HTML5) ones. So, use the "print" trick below instead to get the browser to download these external CSS files via a "queue", which often means they download last in the order of resources, after the DOM is already rendered. This "print" media type trick has very wide browser support. Browsers have naturally been doing this with media="print" linked style sheets for 20 years, so this is just a "hack" to force regular sheets to use that model. After these sheets are downloaded, all the other pages will use them, including your homepage and its scripts, if needed:
< link rel="stylesheet" href="style.css" media="print" type="text/css" onl oad="this.media='screen';this.onload=null;" / >
- Make sure your server uses HTTP/2, which comes with multiplexing. That is just a fancy term meaning the browser can download multiple files in parallel over a single connection, if needed. But that rarely matters in modern browsers over fast network connections today unless your page requires lots and lots of scripts and media files to download. If it does, I would question the design of your website.
- There are a number of prefetch, prerender, preconnect, and other <link> tricks you can add, though in most cases they won't give you huge speed gains, as most servers and browsers negotiate these connections quite fast and many of them are already cached by host providers, etc. These might come into play for cell phone users in poorer countries whose devices must negotiate uncached DNS against distant Western servers. But here are some "tricks":
Add DNS prefetch calls like the one below for external URLs or websites your page needs in order to access content quickly. These will fire calls to uncached DNS routes and prepare those connections for the browser to use. Only do this if you are accessing resources from foreign servers your page depends upon:
< link rel="dns-prefetch" href="//somewebsite.com" / >
Add preconnect calls like the one below for external URLs or websites your page requires to access content quickly. This works the same as above but for the actual URLs and their IPs, which are often already cached in your network:
< link rel="preconnect" href="//somewebsite.com" / >
Add a resource prefetch call like the one below for assets in other pages you hope users will access AFTER they visit your homepage. These files will be cached, but honoring the hint is optional for the browser. This assumes your users will access those resources after clicking through the site, which often is not true, so I would avoid this. Note: browser support of this feature is limited.
< link rel="prefetch" href="someresource.jpg" / >
Add a resource prerender call like the one below for web pages, and all their associated resources, that you hope users will access after viewing the current web page. These files will be cached, but honoring the hint is optional for the browser. As above, you have to assume a user will navigate to those cached pages. Also, "prerender" not only downloads the pages and resources, it builds the DOM, runs the JavaScript, and then swaps in the page when the user clicks the link. This means you're asking the browser for more CPU, and that's wasteful for a page the user may never visit. This only works properly in modern Chrome browsers, I believe. Note: browser support of this feature is very limited.
< link rel="prerender" href="somewebpage.html" / >
- Give all your images, iframes, videos, and other HTML media elements width and height attributes so the browser can reserve space for them in the painted browser canvas while it downloads those items. This mainly prevents a shift in the layout, but it can also gain you speed if the browser would otherwise do several paints.
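For example, with a hypothetical image and its real pixel dimensions, so its box is reserved before the file arrives:
<img src="hero.jpg" width="800" height="450" alt="Hero banner">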
- Give all your <img> elements a loading="lazy" attribute, which means they do not load until the user scrolls them into view. This only works in modern HTML5 browsers.
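A quick sketch with a hypothetical below-the-fold image:
<img src="gallery-photo.jpg" loading="lazy" width="400" height="300" alt="Gallery photo">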
- Give all your <video>, <audio>, etc. elements a preload="metadata" attribute, which means only the metadata (duration, dimensions, and so on) is downloaded up front; the full media does not load or play until the user starts playback. This only works in modern HTML5 browsers.
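A minimal sketch with a hypothetical clip:
<video src="intro-clip.mp4" preload="metadata" controls width="640" height="360"></video>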
Keep in mind, most of these hacks and tricks for CSS and other things are rarely the culprit in slow sites. CSS has only recently gotten a bad rap when it comes to browser rendering speed, and it's simply untrue. CSS preloading and minimizing are solutions without a problem which some search engine companies (i.e. Google) have asked developers to implement, but which have shown very little effect on overall page rendering and paint performance, much less on page rank, compared to the download of today's gigantic JavaScript APIs. Changing CSS design is like squeezing blood from a turnip: there is no evidence today that changing CSS architecture or download strategies has any major effect on page rendering. These tech corporations are now starting to move away from that myth. Besides, CSS has been downloading lightning fast in browsers for almost 25 years with no extra help.
So, with 5G and fiber-optic Internet, why would there suddenly be a delay downloading 20 kilobytes of CSS?
The fact is, there is NO delay. It's almost zero. Minification of CSS and HTML does nothing if your CSS is just a few dozen kilobytes, for example. A typical CSS file is very small compared to the 1-5 megabytes of JavaScript, images, video, or other resources downloaded by a typical web browser today. A lot of CSS is less than 100 KB, and the majority is probably less than 30 KB in a typical website today, even after minification. If you use external scripts and styles in your site, they will be cached for long periods of time in the user's browser. Revisits by your users will reuse those same style sheets for days, weeks, or even months and keep your page rendering lightning fast when pulled from the browser cache, compared to, say, the embedded styles many JavaScript APIs use, which force reloads of the same CSS.
In addition, web browsers today are very smart at negotiating DNS calls and server connections, even though in the past browsers only had 2 connections by default (many use 6 connections by default today). That means external CSS and scripts often download in parallel, and very quickly, before HTML parsing and rendering begin. Because you never want unstyled HTML, this design works great and prevents the flash of unstyled content. Some kids today hijack this natural browser process using the tricks mentioned above, which can create a huge mess where layouts shift, single-threaded scripts eat up CPU on phones to manipulate DOMs, HTML layouts rendered by scripts jump around, or half the web page fails to load text content. Uncompressed images, on-demand video, large downloaded files, and huge JavaScript libraries create most of the major delays in web pages. So trust me... CSS is NOT the problem!
Megabytes of huge, blocking JavaScript are often the main culprit!
Many client-side JavaScript APIs today only get around their huge payload problem by trying to hijack the browser: they preload very tiny files on the homepage but then, behind the scenes, make gigantic calls for volumes and volumes of script libraries, HTML pages, and resources, on the assumption that a user might visit other pages in the website. This is a gamble, and it is wasteful and unnecessary when users don't click where expected. It also loads a typical web browser with megabytes of cached but unnecessary scripts and resources. These scripting circus tricks are a poorly thought out model, if you ask me.
Stay heavy server-side, not heavy client-side, and your pages will be lightning fast!
For those reasons and more, as mentioned, I would avoid CSS and HTML "speed" tricks and focus on all the other bottlenecks.