How to avoid enormous network payloads

If your websites are suffering from extremely large page download sizes, then there are steps that you can take to help fix that issue, such as real-time image resizing and configuring your websites to use a CDN.

Performance on the web is important in order to provide a fast and smooth browsing experience to your site visitors. I check my performance regularly using various online tools, such as Google's PageSpeed Insights, which I highly recommend to any and all web developers looking to get a feel for how people actually experience their websites. Our sites may appear fast to us, while we remain blind to the harsher truth of real-world network speeds.

While some of the 'faults' that these tools list are difficult to avoid due to the current nature of web standards and developer habits, others are fixable with a relatively small amount of work.

Certain third-party libraries, for example, will almost always appear as a red flag, and correcting those issues tends to be more trouble than the value it provides. If you are using jQuery heavily on your websites, that could potentially be flagged as an issue to resolve; however, removing jQuery altogether might not be the best solution.

One of the latest test runs that I performed on a page on this blog yielded a result that caught my eye: an "Avoid enormous network payloads" warning.

The word "enormous" definitely is attention grabbing. Essentially, the page was loading way too much data for a single page load. And if you look at the file size of roughly 5MB, I would say that while a bit exaggerated, it is definitely not an ideal scenario. Users visiting this blog on a mobile device might find the page load times way too slow for them and look elsewhere for a solution to their question.

The real issue above is that the images flagged as being too large were never resized to fit the size at which they are actually displayed. They were uploaded at their original size, and the CSS was simply set to display a tiny, thumbnail-sized image.

So it's time to get to work and correct that issue. Below are a few potential solutions that you can use to avoid this situation, including the primary method that I use on this blog myself.

Scale physical image sizes

While we all want the highest-resolution images and best-quality photographs on our websites, they don't come without a cost: page load times and bandwidth. You don't need a 4000 x 3000px photo on your website, particularly when the largest size any user will ever see is, for example, an 800 x 600px thumbnail.

If you scale down, you might lose some of the clarity and fidelity of the images. But there is a good chance that, without the proper hardware, you wouldn't be able to make out the difference anyway. So if you don't have a 4K or 8K resolution display, then losing some of that detail might be just fine.

Scaling on the server is the key here. You can set the maximum width of your images to 500px in CSS, but if the original file is still 5000px wide, the browser will still have to download that entire file.

There are various solutions for resizing image files on the fly. Almost every modern server-side language, such as C# or PHP, includes graphics libraries for image manipulation. I myself use a custom C# script that does just this: it draws the image onto a smaller in-memory bitmap, shrinking it until a certain minimum size is reached, then saves the result on the server and deletes the original to save space.
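
As a rough sketch of what that kind of on-the-fly resizing can look like in C# with System.Drawing (the 800px cap, class name, and paths here are illustrative assumptions, not my exact script):

using System;
using System.Drawing;                 // System.Drawing.Common on modern .NET
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;

public static class ImageResizer
{
    // Illustrative cap: anything wider than this gets scaled down.
    private const int MaxWidth = 800;

    public static void ResizeToMax(string sourcePath, string destinationPath)
    {
        using (var original = Image.FromFile(sourcePath))
        {
            // Already small enough, nothing to do.
            if (original.Width <= MaxWidth)
                return;

            // Cap the width and preserve the aspect ratio.
            int width = MaxWidth;
            int height = (int)Math.Round(original.Height * (double)MaxWidth / original.Width);

            using (var resized = new Bitmap(width, height))
            using (var graphics = Graphics.FromImage(resized))
            {
                graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
                graphics.DrawImage(original, 0, 0, width, height);
                resized.Save(destinationPath, ImageFormat.Jpeg);
            }
        }
    }
}

// Once the smaller copy is saved, the full-size original can be deleted (System.IO):
// File.Delete(sourcePath);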

Resize on upload

This is the approach that I take on my projects. I have written before about the custom file upload widget that I use, written entirely in JavaScript. One of the features that I coded on the server side is exactly this ability to resize and scale down files.

Why weren't the files resized in the screenshot above, you may ask? Well, they were. The newly generated files on the server are renamed by appending a _thumb suffix to the original file name (keeping the extension), and that suffix was omitted when rendering that particular section, so the full-size originals were served instead.
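
For illustration, that naming convention could be expressed with a small helper along these lines (a sketch with a made-up method name, not my actual code):

using System.IO;

public static class ThumbNaming
{
    // e.g. photo.jpg -> photo_thumb.jpg
    public static string ThumbPath(string originalPath)
    {
        string directory = Path.GetDirectoryName(originalPath) ?? string.Empty;
        string name = Path.GetFileNameWithoutExtension(originalPath);
        string extension = Path.GetExtension(originalPath);

        return Path.Combine(directory, name + "_thumb" + extension);
    }
}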

I bring that up just to point out that when working with custom code and architecture, these things will happen, which is why the following approach might be more appropriate for you depending on the nature of your project.

Using a CDN

If you don't want to do the work yourself, or don't have direct access to the server, then you can consider going the CDN route. If you are not familiar with a CDN, here is a quick breakdown:

- CDN - Content Delivery Network
- Serves files to site visitors
- Chooses servers that are closest to site visitors to improve performance
- Can optimize files before they are delivered
- Can cache data to further improve speed

Because CDNs can manipulate files before they are actually served to your users, they can be a useful tool for serving appropriately sized image files.

Cloudflare, for example, offers image resizing when serving files and exposes the feature directly through the image tag's URL, like so:

<img src="/cdn-cgi/image/width=80,height=75/uploads/avatar1.jpg">

The only downside here is that most CDNs are not completely free to use and require some work in order to fully integrate them into your websites.

Note that I am not affiliated with Cloudflare; I just bring them up as an example. Since I don't yet use a CDN myself on this website, I don't have a particular recommendation for where to go.

Lazy Loading

Lazy loading is a web development technique in which the loading of "non-critical" resources is deferred until they are needed on the page. Think along the lines of scrolling through a page and having images pop into view as you scroll downward.

By adding the following attribute to your images, you can signal to the browser that loading the image should be deferred until it is needed.

<img src='image1.png' loading='lazy' />

This, combined with image resizing, can vastly improve the initial page load time.

You can read a more detailed description of lazy loading right over here.

The benefits

It's definitely more work to monitor the assets on your web pages, and setting up the architecture for it is no simple task. I know, as I am still working on improving the setup that I use. But it does come with big benefits for your overall projects, a few of those being:

- Faster load times
- Potentially better SEO traffic
- User retention
- Financial savings to your user base

That last item is interesting and goes unnoticed much of the time. But data costs money, folks. At the current data rates of most phone plans, you are looking at around $10 per 1GB of data (after you hit your cap), so an extra 5MB per article works out to roughly 5 cents of data per page view. I hit my cap all the time, as I work remotely, freelancing my way through life, and having to download an extra 5MB of data just to read a quick article is definitely not an ideal scenario.

And if I can save my blog visitors money in the long term, then I shall do my best to optimize as much as possible on this site.

I hope that you found this article helpful in optimizing the performance of your own websites. Stay tuned for future posts where I will tackle other areas of this site that may need a fix or two.
