How to reduce website loading time and increase its responsiveness

Friday, 15th March 2019

The following are the things you should do as a website owner to make your website load faster, respond better, and meet other good website requirements.

1. Ensure that page title looks good

Titles give your potential customers a quick insight into the content of a result and why it’s relevant to their query. Many times it’s the primary piece of information used to decide which result to click on, so it's essential to use high-quality titles on your web pages.

Page titles should be descriptive and concise. Avoid vague descriptors like "Home" for your home page or "Profile" for a specific person.

Avoid unnecessarily long or verbose titles, which are likely to get truncated when they show up in search results. Try to keep page titles short, i.e. less than 60 characters.
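For example, a descriptive title that fits within the limit might look like this (the page name is a placeholder):

```html
<head>
  <!-- Descriptive and under 60 characters, so it won't be truncated in results -->
  <title>Handmade Leather Wallets – Acme Goods</title>
</head>
```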

2. Meta Description

A Meta description is a concise, human-readable summary of a page's content.

Accurate Meta descriptions can help improve your click-through rate.

Here are some guidelines on how to use the Meta description properly.

Use site-level descriptions on the main home page or other aggregation pages, and use page-level descriptions everywhere else.
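A Meta description is declared in the page head; the wording here is illustrative:

```html
<!-- In the page <head>: a one- or two-sentence summary of this specific page -->
<meta name="description" content="Handmade leather wallets and belts, crafted in small batches and shipped worldwide.">
```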

3. HTML Headings

An h1 check verifies that there is content inside the h1 tag. Headings are not as important as Meta titles and descriptions for search engine ranking, but they are still a good way to describe your content in search results.

h2 headings are less important, but they should still be used so that the structure of your website is easy to understand.
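A simple heading hierarchy looks like this (the headings themselves are placeholders):

```html
<!-- One h1 describing the whole page, h2s for its main sections -->
<h1>Handmade Leather Wallets</h1>
<h2>Bifold Wallets</h2>
<h2>Card Holders</h2>
```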

4. robots.txt

A robots.txt file is a file at the root of your site that indicates those parts of your site you do not want accessed by search engine crawlers. The file uses the Robots Exclusion Standard, a protocol with a small set of commands that can be used to indicate access to your site by section and by specific kinds of web crawlers (such as mobile crawlers vs. desktop crawlers).
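A minimal robots.txt, served from the site root, might look like this (the domain and paths are placeholders):

```
# https://example.com/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```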

5. Sitemap

A sitemap is a file where you can list the web pages of your site to tell search engines about the arrangement of your site content. Search engine web crawlers read this file to more intelligently crawl your site.

Sitemaps can be created either manually or using third-party tools.

Using a sitemap does not guarantee that every item in it will be crawled and indexed, since Google relies on sophisticated algorithms to schedule crawling.

However, in most cases, your site will benefit from having a sitemap, and you'll never be penalized for having one.
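A minimal XML sitemap looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml at the site root; list one <url> entry per page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-03-15</lastmod>
  </url>
</urlset>
```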

6. Image 'alt' Text

For any image on your site, the alt attribute should describe what is in the image. Alt and title attributes strengthen the message for search engine spiders and improve the accessibility of your website.
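For example (the file name and description are placeholders):

```html
<!-- The alt text describes the image content for crawlers and screen readers -->
<img src="brown-leather-wallet.jpg" alt="Brown leather bifold wallet, front view">
```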

7. Doc Type: <!DOCTYPE html>

The doctype is not an SEO factor, but validators check for it, so make sure every page on your site declares one.

8. Avoid Deprecated HTML Tag

Older HTML tags and attributes that have been superseded by more functional or flexible alternatives (whether in HTML or in CSS) are declared deprecated in HTML4 by the W3C, the consortium that sets the HTML standards.

Browsers should continue to support deprecated tags and attributes, but eventually these tags are likely to become obsolete, so future support cannot be guaranteed.
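For instance, the deprecated center tag can be replaced with CSS:

```html
<!-- Deprecated HTML4 markup: -->
<center>Welcome</center>

<!-- Modern equivalent using CSS: -->
<p style="text-align: center;">Welcome</p>
```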

9. HTML Page Size

To make pages load faster, reduce the size of the data (HTML markup, images, CSS, JavaScript and other web resources) that is needed to render each page on your website.

We recommend structuring your website so that the most critical content loads first.

10. GZIP Compression

Enable and test gzip compression. Gzip can reduce the size of a transferred text response by as much as 80%. Compression reduces download time and significantly improves the time to render your webpages.

Most web servers ship with sample configuration files for enabling gzip, and all modern browsers support gzip compression.
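As an illustration, on an Apache server with mod_deflate available, gzip can typically be enabled with a fragment like this (adjust the MIME types to your site):

```apacheconf
# .htaccess – compress common text resources (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json image/svg+xml
</IfModule>
```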

11. Inline CSS

Internal CSS is CSS code that resides on the HTML page inside a style tag; inline CSS, similarly, lives in style attributes on individual elements.

Both increase loading time, since the browser cannot cache CSS that is embedded in the page.

Try to put your CSS code in an external file instead.
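Moving the styles out is a one-line change (the file path is a placeholder):

```html
<!-- External stylesheet: cacheable across pages, unlike a <style> block -->
<link rel="stylesheet" href="styles/main.css">
```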

12. Microdata Schema Test

Microdata is machine-readable information embedded in an HTML string or paragraph.

Consider the word "avatar": it could refer to a profile picture on a forum, blog, or social networking site, or to a highly successful 3D movie. Microdata specifies the reference or underlying meaning of an HTML string. It gives search engines and other applications a better understanding of your content and can lead to significantly better search results.
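A minimal example using the schema.org vocabulary (the name and job title are placeholders):

```html
<!-- Microdata tells crawlers this text describes a person, not just words -->
<div itemscope itemtype="https://schema.org/Person">
  <span itemprop="name">Jane Doe</span> –
  <span itemprop="jobTitle">Web Developer</span>
</div>
```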

13. IP Canonicalization Test

If your site is reachable both through its domain name and directly through its IP address, search bots can index the two versions as duplicates; checking for this is IP canonicalization. To solve it, redirect requests for the raw IP to your canonical domain.

14. Plain Text Email Test

A plain text email address is vulnerable to email scraping agents. An email scraping agent crawls your website and collects every email address written in plain text.

So the existence of plain text email addresses on your site can help spammers in email harvesting.

Plain text email addresses can also be a bad sign for search engines. To fight this, you can obfuscate your email addresses in several ways:

1) Using CSS pseudo-classes.

2) Writing your email address backwards and reversing it with CSS.

3) Turning off display using CSS.

4) Obfuscating your email address using JavaScript.

5) Using WordPress and PHP to send emails to your email address.
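Option 4 can be sketched in a few lines of JavaScript (the address and element id are placeholders):

```javascript
// Hypothetical helper: assemble the address at runtime so that scrapers
// reading the raw HTML never see a plain-text email address.
function buildEmail(user, domain) {
  return user + "@" + domain;
}

// In a browser you would then inject it into the page, e.g.:
//   document.getElementById("contact").textContent = buildEmail("info", "example.com");
console.log(buildEmail("info", "example.com")); // info@example.com
```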

15. Configure Viewport

A viewport dictates how a web page should be displayed on a mobile device. If a viewport is not specified then a mobile device will display the page as if it were a desktop screen and not scaled to fit the device. Setting a viewport gives control over the page's width and scaling on different devices.
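The standard viewport declaration goes in the page head:

```html
<!-- Match the device width and start at 1:1 zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```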

16. Size Content to Viewport

Screen dimensions vary widely across devices so always configure the viewport in such a way that your pages render correctly on many different devices.

Always try to make sure that the web page content doesn't rely on a particular viewport width to render well.

17. Size Tap Targets Appropriately

Buttons that are small or too close together are more difficult for users to press on a touchscreen than with a traditional mouse cursor.

The average adult finger pad size is about 10mm wide, and it is recommended to have a minimum tap target size of roughly 7mm or 48 CSS pixels on a site with a correctly set mobile viewport.

18. Avoid the use of some plugins

Plugins such as Flash and Silverlight may not work on mobile devices.

Native web technologies have advanced a great deal in recent years, so most content that once needed these plugins can now be built with native web technologies instead.

For example, audio and video can now be played easily using HTML5 multimedia elements.
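An HTML5 video player, for instance, needs no plugin at all (the file path is a placeholder):

```html
<!-- Native playback replaces a Flash-based player -->
<video controls width="640">
  <source src="intro.mp4" type="video/mp4">
  Your browser does not support the video tag.
</video>
```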

19. Use Legible Font Sizes

Configure the viewport to make sure fonts will be scaled as expected across various devices.

Once a viewport is configured, implement these additional recommendations:

1. Use a base font size of 16 CSS pixels. Adjust the size as needed based on properties of the font being used.

2. Use sizes relative to the base size to define the typographic scale.

3. Text needs vertical space between its characters and may need to be adjusted for each font. The general recommendation is to use the browser default line-height of 1.2em.

4. Restrict the number of fonts used and the typographic scale. Too many fonts and font sizes lead to messy and overly complex page layouts.
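The recommendations above can be sketched in CSS (the scale values are illustrative):

```css
/* 1 & 3: a 16px base size with the default line-height */
html  { font-size: 16px; line-height: 1.2; }

/* 2 & 4: a small typographic scale defined relative to the base */
h1    { font-size: 2rem; }
h2    { font-size: 1.5rem; }
small { font-size: 0.875rem; }
```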

20. Landing Page Redirects

When a website serves different devices using different URLs, search engines try to redirect users to the best-serving URL automatically.

The best practice is to use either HTTP or JavaScript redirects, so that your website is shown correctly on different devices.

21. Leverage Browser Caching

Fetching web resources like CSS and JavaScript again and again over the network is not only taxing but may also prevent the page from rendering.

For better page speed, all server responses should specify a caching policy to help the client determine if and when it can reuse a previously fetched response.

A minimum cache lifetime of about a week is commonly recommended for static assets.
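On Apache, for example, a caching policy can be set with mod_expires (the lifetimes here are illustrative):

```apacheconf
# .htaccess – cache static assets for a week (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 7 days"
  ExpiresByType application/javascript "access plus 7 days"
  ExpiresByType image/png "access plus 7 days"
</IfModule>
```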

22. Server Response Time

The server should respond in less than 0.2 seconds.

Server response time is the time the server takes to return the initial HTML, before the browser can start rendering the page.

The first step is to measure server response times and identify potential problems. Then, with that data, consult the appropriate guides on how to address the issues.

23. Minify CSS and JavaScript

Minification is the process of removing unnecessary data (whitespace, comments, redundant syntax) without any impact on how the browser processes the resource. For CSS minification, use tools such as cssnano.
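A before-and-after illustration (the rule itself is a placeholder):

```css
/* Before minification */
.button {
    color: #ffffff;
    background-color: #ff0000;
}

/* After minification – identical effect, fewer bytes */
.button{color:#fff;background-color:red}
```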

24. Optimize Images

Image optimization is very tricky; you need to reduce the file size of an image without reducing its quality. Give more importance to vector-format images, because they are independent of resolution and scale cleanly.

Compress images with tools such as Adobe Photoshop before uploading them to a webpage.

25. SEO Friendly Links

An SEO-friendly link follows these basic rules: the URL uses a dash as a separator, avoids parameters and numbers, and is a static URL. To achieve this, use these techniques:

1) Replace underscores or other separators with a dash, and clean the URL by deleting or replacing numbers and parameters.

2) Merge your www and non-www URLs.

3) Do not use dynamic and related URLs. Create an XML sitemap for proper indexing by search engines.

4) Block unfriendly and irrelevant links with robots.txt.

5) Declare your preferred URLs in canonical tags.
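A canonical tag is a single line in the head of each variant URL (the domain and path are placeholders):

```html
<!-- Tells search engines which URL is the preferred version of this page -->
<link rel="canonical" href="https://example.com/leather-wallets/">
```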


This is a long list of tasks, but as a website owner, if you put all these points into action, your users will commend you on the responsiveness of your website.
