Browsers are pretty good at loading pages, it turns out

Carter Sande • 5 min read

The <a> tag is one of the most important building blocks of the Internet. It lets you create a hyperlink: a piece of text, usually colored blue, that you can use to go to a new page. When you click on a hyperlink, your web browser downloads the new page from the server and displays it on the screen. Most web browsers also store the pages you previously visited so you can quickly go back to them. The best part is, the <a> tag gives you all of that behavior for free! Just tell the browser where you want to go, and it handles the rest.
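The whole thing really is just one line of HTML (the URL here is made up):

```html
<!-- Navigation, history, loading feedback, and error handling,
     all provided by the browser: -->
<a href="/articles/browsers-are-good">Read the next article</a>
```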

Lately, though, that hasn’t been enough for website developers. The new fad is “client-side navigation”, where instead of relying on the browser to load new pages for you, you write a bunch of JavaScript code to do it instead. It’s actually really hard to get it right—loading the new page is simple enough, but you also have to write code to display a loading bar, make the Back and Forward buttons work, show an error page if the connection drops, and so on.
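To give a sense of how much you're signing up for, here's a minimal sketch of the do-it-yourself approach. The `#content` container and the `showLoadingBar`/`hideLoadingBar`/`showErrorPage` helpers are stand-ins I made up; a real router also has to worry about scroll position, canceled requests, and links that open in new tabs.

```js
// A bare-bones client-side router: intercept clicks on links, fetch the
// new page ourselves, and keep the Back button working via the History API.
document.addEventListener('click', async (event) => {
  const link = event.target.closest('a');
  // Leave external links (and anything that isn't a link) to the browser.
  if (!link || link.origin !== location.origin) return;
  event.preventDefault();

  showLoadingBar(); // you have to build this yourself now
  try {
    const response = await fetch(link.href);
    const html = await response.text();
    const newDoc = new DOMParser().parseFromString(html, 'text/html');
    document.querySelector('#content').innerHTML =
      newDoc.querySelector('#content').innerHTML;
    history.pushState(null, '', link.href); // make Back/Forward meaningful
  } catch (err) {
    showErrorPage(err); // ...and this
  } finally {
    hideLoadingBar();
  }
});

// The Back and Forward buttons are now your problem, too.
window.addEventListener('popstate', () => {
  location.reload(); // the lazy version; a real router re-renders in place
});
```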

For a while, I didn’t understand why anyone did this. Was it just silly make-work, like how every social network redesigns their website every couple years for no discernible reason? Did <a> tags interfere with some creepy ad-tracking technique? Was there some really complicated technical reason why you shouldn’t use them?

I finally got my answer from a website called MDN. If you haven’t heard of it, MDN is a documentation/tutorial website run by the creators of Firefox. They basically wrote the book on how to make websites. So when they created a beta version of their site that used client-side navigation, I asked them why, and one of their developers responded that it was to load pages more quickly. With the normal <a> tag, the browser has to download the HTML code for the whole page, but if you write the behavior yourself in JavaScript, you can make sure to only download the part of the new page that’s different from the old one. I’m usually lucky enough to have a pretty fast Internet connection, so I never really noticed the difference.
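I don’t know exactly what MDN’s beta code looks like, but the idea goes something like this (the `/api/page` endpoint and the JSON shape are invented for the sketch):

```js
// Instead of downloading a whole new HTML document, ask the server for
// just the data that differs between pages and splice it into the DOM.
async function navigateTo(url) {
  const response = await fetch('/api/page?url=' + encodeURIComponent(url));
  const page = await response.json(); // e.g. { title, bodyHtml }

  document.title = page.title;
  document.querySelector('#content').innerHTML = page.bodyHtml;
  history.pushState(null, '', url);
}
```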

But I got the chance to experience the benefits of client-side navigation on a recent trip to Canada. Canada is a socialist state in North America where every citizen is given free healthcare and a 128 kilobit cellular Internet connection. (T-Mobile told me I could buy an “all-access high-speed day pass” for $5 on the black market, but I didn’t want to get arrested by the Royal Canadian Mounted Secret Police.) I fired up both versions of MDN on my phone, and clicked a link on each one to see how much faster the JavaScript version would be.

Hang on a second! The JavaScript version wasn’t faster, it was way slower! What gives?

It turns out that browsers have a lot of tricks up their sleeves that help them put pages on the screen more quickly. A big one we’re seeing here is called progressive rendering: browsers download the top part of the page first, then show it on the screen while the rest of the page finishes downloading. The JavaScript version has to wait for the entire JSON response to come back before it can show anything on the screen, so it feels slower. This is also part of why the Reddit redesign is so awful to use on a slow connection—the old version could show you the top comment as soon as it loaded, while the new version has to load all the comments before showing you anything.
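You can see where the wait comes from in one line of code. A sketch, with a made-up `/api/comments` endpoint and `render` function:

```js
// response.json() can't resolve until the last byte of the response has
// arrived, so on a slow connection the screen stays blank the whole time.
const response = await fetch('/api/comments');
const comments = await response.json(); // waits for the FULL download
render(comments);                       // only now does anything appear
// The browser's HTML parser, by contrast, streams: it lays out each chunk
// of markup as it arrives, so the top of the page appears long before the
// bottom finishes downloading.
```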

While it’s technically possible to re-create progressive rendering in JavaScript, that might not be a great idea. See, another reason these pages feel so slow on my mid-range smartphone is that their JavaScript code takes a long time to run. Part of this comes down to developers not bothering to optimize their code (it probably runs pretty fast on their top-of-the-line MacBook Pros and iPhones), part of it comes down to technology choices (some currently-fashionable libraries, like React, make it difficult to write fast code), but also, it just isn’t possible for a highly dynamic language like JavaScript to run as fast as the C++ code in browsers.
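For the curious, the “technically possible” version means reading the response as a stream instead of waiting on `response.json()`. A sketch, assuming a hypothetical endpoint that sends one JSON comment per line and a `renderComment` function:

```js
// The do-it-yourself version of progressive rendering: read the response
// as a stream and render each record as soon as it arrives.
const response = await fetch('/api/comments');
const reader = response.body.getReader();
const decoder = new TextDecoder();
let buffered = '';

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  buffered += decoder.decode(value, { stream: true });
  const lines = buffered.split('\n');
  buffered = lines.pop(); // keep the trailing partial line for next time
  for (const line of lines) {
    if (line.trim()) renderComment(JSON.parse(line));
  }
}
if (buffered.trim()) renderComment(JSON.parse(buffered)); // final flush
```

That’s a fair amount of machinery to re-implement, slowly, what the browser’s streaming HTML parser already does for free.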

I think it’s best to build small, simple, standards-based web pages. Browser developers are really good at their jobs, and they’ve spent a lot of time on features like progressive rendering that help your sites feel fast without any effort on your part. It’s not worth trying to go behind their backs—premature optimizations like client-side navigation are hard to build, don’t work very well, will probably be obsolete in a couple years, and make life worse for a decent portion of your users.