How to get the massive results of Google PageSpeed with no installation

Emulating the best configuration of the Google PageSpeed module

 

Since mod_pagespeed made its appearance, websites have been equipped with a new and powerful weapon.

But how can a low-profile webmaster fight against the huge competition without this weapon?

 

In this in-depth article we will look at all the techniques and best practices needed to get the same benefits as the PageSpeed module, and more, so you can compete with the fastest websites on the web.

 

Here is a quick outline of the topics we will cover:

 

Introduction: The tools of Google PageSpeed project

 

What is Google PageSpeed module 

How mod_pagespeed impacts your website performance

How mod_pagespeed affects SEO and boosts your ranking

 

The features of mod_pagespeed: the filters

1. CSS filters

2. JavaScript filters

3. Image filters

4. HTML filters

5. Analytics filters

 

How to enable mod_pagespeed on your server

Building from packages

Building from source

 

Why you may want a mod_pagespeed alternative

 

How you can emulate the effects of mod_pagespeed best configuration

1) Optimize CSS performance

2) Optimize JavaScript

3) Optimize images

4) Optimize HTML

5) Optimize analytics

 

Conclusions

 

 

Introduction: The tools of Google PageSpeed project

 

 

Google PageSpeed is a project by Google Inc. created to help with website performance optimization. It was introduced at Google's developer conference in 2010.

 

The project includes 4 tools:

 

  • the Page Speed extension, available for Google Chrome and Firefox, which allows you to evaluate the performance of a web page by displaying it through your browser

 

  • Page Speed Online, which offers the same functions as the extension but works online, requires no installation, and can be used with any browser

 

  • the Page Speed Service (deprecated), an online service that rewrites the pages served by your server according to the best practices

 

  • the mod_pagespeed module for Apache and Nginx, the subject of this article, which offers the opportunity to speed up your web pages by automatically intervening on problems highlighted by the Page Speed algorithms.

 

These components are designed to recognize faults in a website’s compliance with Google’s Web Performance Best Practices, as well as automate the modification process.

 

 

 

What is Google PageSpeed module

 


 

 

Mod_pagespeed is a module designed for the Apache and Nginx web servers, released under an open-source license, which aims to automatically improve the optimization of web pages and other resources on a site.

 

To do that, it automatically applies chosen filters to pages and associated assets, such as style sheets, JavaScript files, HTML files and images.

 

The module works according to the so-called "best practices" developed by Google itself, which constitute a set of rules that every webmaster should follow when creating a website.

 

The major advantage of this module is that it generally does not require changes to existing content or workflows: all internal optimizations and file changes are made on the server side, and the modified documents are served directly to the user.

 

Each filter corresponds to one of Google’s web performance guidelines rules.

 

Since the PageSpeed module is an open-source library, it is kept up to date by numerous programmers from across the world and can be deployed by any site, web hosting provider, or CDN.

 

 

 

How Mod_pagespeed impacts your website performance

 

 

The PageSpeed module has shown the greatest impact on improving page load speed, payload size, and number of requests in comparison with other options on the market.

According to many experts, mod_pagespeed can improve loading speed by up to 80%, while the number of bytes over the wire can be reduced by 30% and the total number of requests lowered by over 20%.

 

 

How Mod_pagespeed affects SEO and boosts your ranking

 

Since many search engines (including Google) use loading speed in their ranking algorithms, these optimizations can impact a website's placement in search results.

 

In February 2015, Google began testing "Slow" labels on mobile devices for websites that exceed a certain loading time, prompting developers to look for ways to cut loading time.

How Google PageSpeed module can boost your SEO ranking (infographic by searchenginewatch.com)

 

 

 

The features of mod_pagespeed: the filters

 

 

Its operation is based on page-rewriting filters and allows developers to greatly improve the performance of blogs and portals, optimizing HTML, JavaScript and CSS sheets and speeding up the loading of images in JPEG and PNG format.

 

To optimize the pages, the module offers about forty filters dedicated to the different components of a web resource.

 

Webmasters and server administrators therefore can take advantage of a powerful tool to easily optimize websites and make them faster.

 

The real novelty of mod_pagespeed is in fact the possibility of automating many processes that previously could only be performed manually, saving considerable effort and time.

 

Now let’s see how these web page filters work to understand which factors have the most impact on the optimization of a site according to Google’s best practices.
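As a reference point for what we will later emulate, here is a minimal sketch of how these filters are typically switched on in an Apache pagespeed.conf; the path and the filter selection are illustrative, not a recommended configuration:

```apache
# /etc/apache2/mods-available/pagespeed.conf (path may differ per distribution)
ModPagespeed on

# Enable a hand-picked set of filters on top of the defaults
ModPagespeedEnableFilters combine_css,inline_css,prioritize_critical_css
ModPagespeedEnableFilters defer_javascript,rewrite_javascript
ModPagespeedEnableFilters lazyload_images,rewrite_images,sprite_images
ModPagespeedEnableFilters collapse_whitespace,remove_comments
```

Each filter name maps to one of the features described below.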

 

1. CSS filters

 

 

Combine CSS

 

If a page requires several style sheets, this filter combines them into a single style sheet, reducing the number of requests to the web server

 

 

Extend Cache for Style sheets

 

Increases the time that cached CSS files are kept in local memory, avoiding extra requests and data loading after a user returns to the same web page.

 

 

Flatten CSS Imports

 

Replaces all “@import” rules with the contents of the imported files, if the affected style sheet files are smaller than the byte limit set by the sub-filter “CssFlattenMaxBytes”. This technique is suitable for reducing the number of requests made by the browser

 

 

Inline @import To Link

 

Transforms “@import” rules into equivalent “<link>” tags. It is largely used so that filters applied later can work properly

 

 

Inline CSS

 

Inserts the content of small external style sheets directly into the HTML document, thereby reducing the number of requests. This filter is applied to style sheets that are smaller than the size set by sub-filter “CssInlineMaxBytes”

 

 

Inline Google Fonts API CSS

 

Inlines any style sheets used by the Google Fonts API, if they are smaller than the value defined by sub-filter “CssInlineMaxBytes”

 

 

Move CSS Above Scripts

 

Rearranges the loading order of style sheets and JavaScript files, guaranteeing that scripts are not blocking CSS resources. This improves the rendering of a website, resulting in shorter loading times

 

 

Move CSS to Head

 

Moves style sheet references into the document head, before any <body> elements, which decreases loading time by eliminating re-flows of the webpage

 

 

Outline CSS

 

Experimental filter which moves inlined CSS to an external resource. The idea behind it is to make parallel connections to different servers rather than consecutive ones to the same host. This filter will attempt to outline only style sheets bigger than the size set by sub-filter “CssOutlineMinBytes”

 

 

Prioritize Critical CSS

 

Replaces style sheets with inlined CSS containing only the rules necessary for the first viewport, and defers the rest of the CSS rules to be loaded after the page is fully rendered

 

 

Rewrite CSS

 

Allows other style-sheet- and image-related filters to be applied to local style sheets. This filter also minifies all CSS, which reduces the payload size.

The same technique as the “Rewrite CSS” filter is applied to all rules declared inside “<style>” attributes

 

 

 

2. JavaScript filters

 

 

Canonicalize JavaScript libraries

 

Replaces popular JavaScript libraries with the latest free versions hosted remotely, by default on Google Hosted Libraries. This likely decreases the number of future requests to the server, since these libraries may already be in the user's browser cache from other websites

 

 

Combine JavaScript

 

Combines multiple JavaScript files into one, thereby reducing the total number of requests made by the browser

 

 

Defer JavaScript

 

Loads the page before executing JavaScript files. This ensures that the loading of visible content is not interrupted

 

 

Extend Cache for JavaScript

 

Increases the time frame during which cached JavaScript files are kept in local memory, avoiding extra requests and data loading after a user returns to the same webpage

 

 

Include JavaScript Source Maps

 

Creates a source map between the minified and the original JavaScript files for much better readability during the debugging process

 

 

Inline JavaScript

 

Inserts the content of small external JavaScript files directly into the HTML document, thereby reducing the number of requests.

This filter is applied to files smaller than the size set by sub-filter “JsInlineMaxBytes”

 

 

Minify External JavaScript, Minify Internal JavaScript

 

Removes all comments, whitespace, and redundant or outdated code, which reduces the number of bytes loaded

 

 

Outline JavaScript

 

Experimental filter which moves inlined JavaScript to an external resource.

The idea behind it is to make parallel connections to different servers rather than consecutive ones to the same host.



This filter will only attempt to outline scripts bigger than the size set by sub-filter “JsOutlineMinBytes”

 

 

 

3. Image filters

 

 

Deduplicate Inlined Images

 

Eliminates repeated loading of the same inlined image data, reducing the number of requests and the transfer size

 

 

Extend Cache for Images

 

Increases the time that cached images are kept in local memory, avoiding extra requests and data loading after a user returns to the same website

 

 

Inline Preview Images

 

Generates low-quality placeholder versions for a number of inlined images set by sub-filter “MaxInlinedPreviewImagesIndex”, improving perceived loading. After the page finishes loading, the images are switched to higher quality

 

 

Lazy load Images

 

Postpones the loading of images that are not in the user's initial viewport, decreasing the number of requests the browser makes

 

 

Inline Images

 

Embeds image data directly into the page code instead of referencing it as an external resource, getting rid of the extra connections needed to fetch image data

 

 

Convert GIFs to PNG

 

Losslessly converts non-animated GIFs into PNGs, reducing data size

 

 

Convert JPEGs to Progressive JPEGs

 

Automatically transforms larger non-progressive JPEG images into the smaller progressive variant

 

 

Recompress JPEG

 

Automatically recompresses JPEG images if their original compression quality was greater than the value set by sub-filters “ImageRecompressionQuality” or “JpegRecompressionQuality”. This enables a reduction in payload

 

 

Recompress PNGs

 

Losslessly converts PNG images into more highly compressed versions

 

 

Recompress WEBPs

 

Replaces WebP images with smaller recompressed variants in web browsers which support the WebP format

 

 

Strip Image Color Profile

 

Strips color profile information from images, since it is not supported by most browsers

 

 

Remove Image Meta Data

 

Strips EXIF metadata from all image files of a website

 

 

Reduce JPEG Subsampling

 

Reduces the color sampling rate to 4:2:0, exploiting the human eye's sensitivity to changes in brightness but not to hue or saturation. This makes it possible to significantly reduce image size

 

 

Convert PNGs to JPEGs

 

Substitutes PNG images that have no transparency channel with smaller JPEG equivalents

 

 

Resize Images

 

Replaces any image whose dimensions are bigger than its “width” and “height” attributes with a smaller, resized one

 

 

Convert JPEGs to WEBPs

 

Converts JPEG images into the much smaller WebP format if supported by the browser

 

 

Convert JPEGs to WEBPs Lossless

 

Converts JPEG images into the much smaller lossless WebP format, if supported by the browser, in case the image is sensitive to compression noise

 

 

Insert Image Dimensions

 

Automatically adds “width” and “height” attributes to “<img>” tags if they are absent

 

 

Resize Images To Rendered Dimensions

 

Attempts to resize any image to its rendered dimensions, overriding any “width” or “height” attributes

 

 

Sprite Images

 

Combines all background images referenced from style sheet rules into one large image, to lessen the number of requests made by the browser

 

 

 

4. HTML filters

 

 

Add Head

 

Adds a <head> HTML tag if one is not found before the <body> tag. This is needed mainly for other filters to work properly, since many of them add or modify data in the region marked by the <head> tag

 

 

Collapse Whitespace

 

Deletes extra, unnecessary whitespace in the HTML page

 

 

Combine Heads

 

Groups the content of multiple <head> tags into one when several are present, protecting against incorrect browser behavior

 

 

Convert Meta Tags

 

Adds response headers corresponding to the “http-equiv” meta tags present in the page, avoiding re-parsing delays caused by some browsers' handling of the “http-equiv” attribute

 

 

Elide Attributes

 

Removes HTML attributes that are set to their default values, which reduces the amount of data transferred

 

 

Local Storage Cache

 

Saves inlined resources into the browser's local storage on the first view of a page, and loads them from local storage on subsequent views rather than inlining them again. This reduces the number of requests

 

 

Pedantic

 

Forces PageSpeed module optimizations to be more HTML4-compliant

 

 

Remove Comments

 

Deletes HTML comments, which developers create for easier readability and navigation through the HTML page

 

 

Remove Quotes

 

Deletes quotation marks from HTML attributes where they are not required, reducing the size of HTML files

 

 

Trim URLs

 

Substitutes absolute URLs with URLs relative to the page's base URL

 

 

 

5. Analytics filters

 

 

Add Instrumentation

 

Measures the time a client takes to load and render the web page, for analytics and traffic-monitoring purposes

 

 

Async Google Analytics snippet

 

Forces the Google Analytics tracking code to download asynchronously, so that critical resources are not blocked during loading

 

 

Insert Google Analytics

 

Adds the asynchronous Google Analytics tracking snippet to observe activity on the website with Google Analytics reporting tools

 

 

 

How to enable mod_pagespeed on your server

 

 

Installation is very simple. It varies depending on the operating system you use.

You can download and install prebuilt packages if you use Debian or Ubuntu (or any Linux distribution that uses .deb packages).

 

Other Linux distributions can download the source and build from that.

 

 

Building from source

 

Here are the Google tutorials for installation from source:

Installing from source: Apache

Installing from source: Nginx

 

 

Building from packages

 

Here are the Google tutorials for installation from packages:
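As a sketch of the Debian/Ubuntu route, the download-and-install steps usually look like this; the package URL below is the stable 64-bit build published by Google at the time of writing, so check the official tutorial for the current one:

```shell
# Download the stable 64-bit .deb package published by Google
wget https://dl-ssl.google.com/dl/linux/direct/mod-pagespeed-stable_current_amd64.deb

# Install it (adds the Apache module and a default configuration)
sudo dpkg -i mod-pagespeed-stable_current_amd64.deb
sudo apt-get -f install   # resolve any missing dependencies

# Restart Apache so the module is loaded
sudo service apache2 restart
```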

 

 

 

Why you may want a Mod_pagespeed alternative

 


 

Is the installation process too complex for you?

 

Do you have no coding skills, and do you think this will get you into trouble?

 

Don't have a server/OS that supports the module? (The project only supports Apache 2.2, so it cannot be used with earlier versions.)

 

Do you not trust the module security when applied to WordPress?

 

 

If your answer to even one of these questions is yes, then you might want to implement your own strategy for loading speed optimization.

I realize that making changes to your website configuration can be a terrifying thought, particularly if you are not an expert.

 

But don’t worry, you’re not alone!

 

 

I will show you ways to increase your site speed with simple procedures.

If you can point-and-click, you can do this!

 

 

However, understand that the module cannot work on server-side code: it is therefore unable to improve badly written PHP scripts or speed up SQL queries whose code has not been optimized.

 

Therefore, where possible, the best answer is to build a site that is optimized from its inception, following the best practices we'll see soon and carrying out the necessary steps manually, in order to obtain positive results.

 

 

 

How you can emulate the Mod_pagespeed best configuration

 

 


 

Now we'll see how to implement points 1 to 5 of the PageSpeed module mentioned above, in alternative ways.

 

I'll rank the ways to speed up your website, showing you which optimizations have the most effect on load times.

 

This means that we'll begin by addressing common issues, like unoptimized queries, that slow down a website's performance and are easy to recognize and fix.

 

Let’s start!

 

 

 

1) Optimize CSS Performance

 


 

Combine your CSS scripts

 

First, combine all the CSS scripts that can be combined into one bigger CSS script.

 

This is very important, because if you have a big number of external CSS files your webpages will load more slowly.

 

That is partly because loading multiple CSS files creates unneeded additional requests for the browser to handle.

 

 

Even using one external CSS file can be considered bad practice when it comes to page speed.

 

If your website includes CSS rules that are used for the above-the-fold content, calling an external file to load that CSS will block its rendering.

 

Inlining these CSS rules will solve this issue.

 

You should therefore always inline your CSS scripts unless they are large in size.

 

 

CSS, or Cascading Style Sheets, can be used to turn your HTML-based content into a clean and professional document.

 

Most CSS setups require HTTP requests (unless you use inline CSS), which means you should try to trim bloated CSS files without eliminating essential features.

 

If your banner, plugin, and layout link styles are located in separate CSS files, your visitors' web browsers will have to load numerous files simultaneously.

 

 

Although this is now less of a problem thanks to HTTP/2, it can certainly still contribute to longer load times if the files are loaded from external sources.

 

Read our WordPress performance article to see how reducing HTTP requests improved loading times dramatically.

 

 

Minify CSS

 

Once you've combined your entire CSS into one bigger script, you should compress that script to lessen the amount of data users have to download when loading your webpages.

 

For every extra KB your visitors have to load, your website slows down a little. Use a CSS compressor to compress your CSS.

 

Minifying CSS simply means compressing your CSS file into a new one by removing white space.

 

Extra white space can take up lots of bytes and can be the difference between a faster and a slower site in large-scale projects, so you should try to reduce it in your CSS.

 

You don't need to do it yourself, of course, as there are automated compressors on the Internet.

 

Here’s one CSS minifier: http://www.cssminifier.com/
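To see what a minifier actually does, here is a naive sketch in JavaScript; real tools like the one above handle many edge cases (strings, complex selectors) that this toy version ignores:

```javascript
// Toy CSS minifier: strips comments and collapses whitespace.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // remove /* ... */ comments
    .replace(/\s+/g, ' ')               // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1')  // drop spaces around punctuation
    .trim();
}
```

For example, `minifyCss('a {\n  color: red;\n}')` yields `'a{color:red;}'`.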

 

 

Remove unnecessary code

 

Another trick to boost page load speed is removing unnecessary CSS code.

Organizing your CSS code is a method that many wouldn't want to tell you about.

 

On many levels, organizing your CSS code will let you reduce your CSS size by a great margin and increase your website speed.

Are you wondering how? It is rather simple. Putting your CSS classes into the right group of branches ensures minimal duplication, a common problem with modern web designs. In some scenarios, organizing your sloppy CSS code into hierarchical branches is all you need to do to speed up your website.

You can achieve this by checking for redundant or duplicate CSS code with a tool such as this one: http://unused-css.com/

A final word: make sure everything you do follows "the golden rule" of reworking your CSS: stay oriented towards well-written, organized, and clean CSS code.

 

 

   Prioritize CSS rules for the above-the-fold content

 

This is the most important factor when it comes to optimizing CSS delivery.

 

Follow these steps:

 

– First, determine whether your new combined CSS script is large or small in size. When the script is small, you should inline it within the HTML head tag. Deferring the loading of a small CSS script is usually unnecessary, because you won't gain anything by doing so (page-speed wise).

Please continue with the next step if the script is much larger.

 

– Extract the critical above-the-fold CSS from your larger CSS script, meaning the CSS rules strictly necessary to display the above-the-fold content of your website.

A sensible way to do that is using Addy Osmani's Node.js package, which can do this for you automatically.

This technique requires SSH access, so it may not be an ideal solution for everybody.

 

The other available options are using one of the online critical CSS generators, or simply doing it manually.

 

Here you can find the critical CSS generator by jonassebastianohlsson.com

 

– Now that you have the critical-rendering-path CSS script in your possession, you should inline it within the HTML head tag of your website and defer-load the rest of your new large CSS script at the bottom of your webpage.
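A minimal sketch of the final result might look like this; the file name styles.css and the rules are placeholders, and the media="print" trick is one common way to defer a non-critical style sheet:

```html
<head>
  <!-- Critical above-the-fold rules, inlined -->
  <style>
    header { background: #fff; height: 60px; }
    .hero  { font-size: 2rem; }
  </style>

  <!-- Rest of the CSS, deferred: fetched as "print" first, then applied -->
  <link rel="stylesheet" href="styles.css"
        media="print" onload="this.media='all'">
  <noscript><link rel="stylesheet" href="styles.css"></noscript>
</head>
```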

 

 Avoid CSS @import

Additionally, many site owners mistakenly use the @import directive to add external style sheets to a webpage.

 

This is an obsolete method, and it stops browsers from performing parallel downloads.

 

The link tag is your best option and can also increase the front-end performance of your website.

 

Furthermore, external style sheets requested with the link tag do not block parallel downloads.

 

The CSS @import rule lets you import an external CSS file from within a CSS script.

 

This is bad for website speed, because the @import function loads every imported external file separately rather than in parallel with all the other files needed to render the page it is used on. It also creates pointless HTTP requests.

 

You can check whether your webpages use CSS @import with the CSS Delivery Tool, or read more about why and how to avoid CSS @import.
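To make the difference concrete, here is a sketch (file names are placeholders): the first form chains the downloads one after another, while the second lets the browser fetch both sheets in parallel:

```html
<!-- Bad: main.css must be downloaded and parsed before fonts.css starts -->
<style>
  @import url("fonts.css");
</style>

<!-- Good: both style sheets can download in parallel -->
<link rel="stylesheet" href="main.css">
<link rel="stylesheet" href="fonts.css">
```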

 

 

 Avoid using STYLE tags in the HTML body

You should remove all the CSS you use in the HTML body (example: <p style="margin-left:20px;"></p>) and place these CSS rules inside the HTML head tag.

 

Not doing this is harmful to page speed, because style tags located inside the HTML body are render-blocking, and simply an overall coding no-no.

 

It is good practice to place your CSS style sheet at the very top (between the <head> </head> tags) and your JavaScript at the bottom.

 

Make sure that your CSS code is loaded before the rest of the page. The reason for putting JavaScript at the bottom is its larger size and its consequent influence on page rendering speed.

 

 

  Use modern CSS and valid markup

 

Use of modern CSS reduces the amount of markup, can decrease the need for (spacer) images in terms of design, and can frequently replace images of stylized text, which "cost" much more than the equivalent text-and-CSS.

 

Using valid markup has other advantages.

 

First, web browsers will have no need to perform error correction when parsing the HTML.

 

Additionally, valid markup permits the free use of other tools which can pre-process your webpages.

 

For instance, HTML Tidy can remove whitespace and optional closing tags; however, it will refuse to operate on a page with serious markup errors.

 

Here is a useful tool to tidy up your HTML document

 

 

Splitting CSS files targeting different browsers

 

You can split CSS files, i.e. use various style sheets, if you are targeting multiple web browsers like IE, Chrome or Firefox.

 

For instance, rather than resorting to CSS hacks within a single style sheet, you can use IE conditional statements to load a different style sheet (targeting IE6, for example).

 

This way, you wouldn't be loading IE-specific code when using Chrome, and you'll decrease the CSS size by a great margin.

 

 

  Mod_pagespeed and WordPress

The optimization guidelines above are the same techniques for both WordPress and non-WordPress websites.

 

There’s no difference when it comes to the CSS delivery.

 

 

   Test your website

Have you optimized your website's CSS delivery well?

 

Check how well you did with this tool.

 

 

 

2) Optimize JavaScript

 


 

Minify JavaScript code for smaller file sizes

 

Minifying code and obfuscating code are both ways of making JavaScript smaller, though they are different things.

 

Minification reduces file sizes, which decreases page load times.

 

Line breaks, blank spaces, and comments all add to the size of a JavaScript file, and they influence page load speed.

 

Compressing the code solves this problem well. Servers and browsers aren't sensitive to the aesthetic layout of code the way a human is.

 

Computers can read and launch minified code even if all your JavaScript fits in one single string.

 

Let’s compress!

 

 

Avoid dead JavaScript code by optimizing it

 

Optimization is a kind of JavaScript minification.

 

These kinds of minifiers not only delete useless white space, commas, comments, etc., but also help remove 'dead code'

 

–           Google Closure Compiler

 

 

Exclude unused components of .js libraries

 

Most coders use libraries like jQuery UI or jQuery Mobile as is.

 

This means that the code includes all possible components of each library, when you might only need a few.

 

A similar situation occurs with other JavaScript libraries as well.

 

When you have the ability to control which components are included in your bundle of the library, use it.

 

Your site will load considerably faster, and your users will get a better experience.

 

Go to the Download Builder of jQuery Mobile or jQuery UI to build your own package of these famous libraries.

 

Gzip module for Apache, Nginx, and Node.js

 

Gzip is an amazing technology, created when the internet wasn't as high-speed as it is today.

 

Archivers were a popular technology.

The idea was to apply archivers to Internet traffic, so Gzip was created to deflate files on web servers, compressing static data down to a fraction of its original size.

 

Because JavaScript is a textual format, Gzip can be used to compress JavaScript files and help decrease page load times.

 

Determine if your web server technology has support for Gzip here:

 

There are modules for some of the most famous web servers, including Apache and Nginx.

 

Because JavaScript is used not only for front-end development but for back-end development as well, you can compress JS files with the zlib module for Node.js.

 

 

  Reduce the number of inline scripts

Inline scripts can be costly for page loading, because the parser must assume that an inline script could change the page structure while parsing is in progress.

 

Reducing the use of inline scripts in general, and reducing the use of document.write() to output content in particular, can improve overall page loading speed.

 

Use modern AJAX methods to manipulate page content for modern web browsers, rather than the older approaches based on document.write().

 

 

 

3) Optimize Images

 


 

Use Image sprites

 

If you use background images a lot in your CSS, you can decrease the number of HTTP lookups needed by combining the images into one, called an image sprite.

The image sprite technique cuts your CSS requests and decreases site load time; it can be considered an age-old technique.

 

An image sprite is a great choice if you are using a lot of icons and graphics on your site.

 

You then apply the same image every time you need it for a background, and adjust the x/y coordinates appropriately.

 

This technique works best with elements that have fixed dimensions, and cannot work for every use of a background image.

 

However, the fewer HTTP requests and the caching of a single image can lessen page load time.

 

Here you can generate your first image sprite:
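As a sketch, assuming a hypothetical icons.png containing two 16×16 icons stacked vertically, the sprite is referenced once and shifted with background-position:

```css
/* One HTTP request serves every icon */
.icon {
  background-image: url("icons.png"); /* hypothetical sprite sheet */
  width: 16px;
  height: 16px;
}

.icon-home   { background-position: 0 0;     }  /* top icon */
.icon-search { background-position: 0 -16px; }  /* second icon */
```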

 

 

  Minify and compress SVG assets

SVG files produced by many graphics applications often include metadata, comments and hidden elements, which are unnecessary and removable.

 

Also configure your server to apply Gzip compression to SVG resources.

 

 

 Lazy Loading Images

 

There are at least a couple of excellent reasons why you should consider lazy loading images for your website:

 

 

  • If your website uses JavaScript to show content or provide some kind of functionality to users, loading the DOM quickly becomes critical.

 

It's common for scripts to wait until the DOM has completely loaded before they start running.

On a site with a substantial number of images, lazy loading (loading them asynchronously) can make the difference between users staying or leaving your website.

 

  • Since with this method images are loaded only when the user has scrolled to the position where they appear, if users never reach that point those images won't be loaded.

 

This implies considerable savings in bandwidth, for which most users, especially those accessing the web on mobile devices and slow connections, will be thanking you.

Several websites use this approach, but it's especially noticeable on image-heavy sites.

 

So, lazy loading images helps website performance, but what's the best way to go about it?

 

There is no perfect way.

 

If you live and breathe JavaScript, implementing your own lazy loading solution shouldn't be an issue.

 

Nothing gives you more control than coding something yourself.

 

 

Otherwise, you can search the web for practical approaches and start experimenting with them.

 

Look at these interesting techniques:

 

#1 David Walsh's Simple Image Lazy Load and Fade

 

#2 Robin Osborne’s Progressively Enhanced Lazy Loading

 

#3 Lazy Load XT jQuery Plugin
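As a minimal sketch of the idea behind these techniques (the data-src attribute and image file name are placeholders), an IntersectionObserver can swap in the real source only when the image approaches the viewport:

```html
<img data-src="photo.jpg" alt="A lazily loaded photo">

<script>
  // Load each image only when it is about to enter the viewport
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        observer.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
</script>
```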

 

 

   Specify sizes for images and tables

If the web browser can immediately determine the dimensions of your images and tables, it will be able to display the page without having to reflow the content.

 

This not only speeds up the display of the webpage, but prevents aggravating shifts in a page's layout when the page finishes loading.

 

For this reason, dimensions should be specified for images whenever possible.

 

Furthermore, you should specify the widths of columns using the COL and COLGROUP HTML tags.
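A sketch of both cases (the file name and sizes are placeholders):

```html
<!-- The browser can reserve the 640×480 box before the file arrives -->
<img src="photo.jpg" width="640" height="480" alt="Example photo">

<table>
  <colgroup>
    <col style="width: 70%">
    <col style="width: 30%">
  </colgroup>
  <tr><td>Main content</td><td>Sidebar</td></tr>
</table>
```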

 

 

 

4) Optimize HTML

 

optimize html document to reach best configuration mod_pagespeed results

 

 Minimize the number of files

Reducing the number of files referenced in a web page lowers the number of HTTP requests necessary to download the full page.

 

Depending on its cache configuration, a browser may send an If-Modified-Since request to the server for every single CSS, JavaScript or image file, asking whether the file has been modified since the last time it was downloaded.

 

By reducing the number of files referenced in a page, you reduce the time needed for these requests to be sent and their responses received.

 

Time spent querying the last-modified time of referenced files can delay the initial display of the page, because the browser must check the modification time of each CSS or JavaScript file before rendering the page.

  Reduce domain lookups

Since each distinct domain costs time in a DNS lookup, page load time grows with the number of different domains appearing in CSS links and in JavaScript and image sources.

 

This may not always be practical; however, you should take care to use the minimum necessary number of different domains in your pages.

 

 

  Cache reused content

Ensure that any content that can be cached is cached, with appropriate expiration times.
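On Apache, for instance, expiration times can be set with mod_expires; a minimal sketch, assuming the module is enabled (the lifetimes are illustrative, so tune them to how often your assets change):

```apache
# .htaccess sketch using mod_expires
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```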

 

Focus on the Last-Modified header.

It enables efficient site caching: through this header, the user agent is told about the file it needs to load, such as when it was last modified.

 

Most servers automatically append the Last-Modified header to static pages, based on the last-modified date stored in the file system.

 

With dynamic web pages, this cannot be done, and the header is not sent.

So, especially for pages that are generated dynamically, a little research on this subject is worthwhile.

 

It can be somewhat involved, but it will save a great deal in page requests on pages that would otherwise not be cacheable.
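For a dynamically generated page, this typically means computing the header yourself and honoring conditional requests. A minimal sketch in JavaScript, with `dataChangedAt` standing in for whatever change-tracking your application already has (the function names are made up):

```javascript
// Format a timestamp (ms) as an HTTP-date for the Last-Modified header,
// e.g. "Thu, 01 Jan 1970 00:00:00 GMT".
function lastModifiedHeader(timestampMs) {
  return new Date(timestampMs).toUTCString();
}

// Decide whether a 304 Not Modified response is appropriate, given the
// client's If-Modified-Since header and when the underlying data changed.
function notModified(ifModifiedSince, dataChangedAt) {
  if (!ifModifiedSince) return false;
  const since = Date.parse(ifModifiedSince);
  // HTTP dates have one-second resolution, so compare at that granularity.
  return !isNaN(since) && Math.floor(dataChangedAt / 1000) * 1000 <= since;
}
```

On a request, you would send `Last-Modified: lastModifiedHeader(dataChangedAt)` and answer 304 without a body whenever `notModified(...)` returns true.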

 

  Optimally order the components of the page

Download page content first, along with any CSS or JavaScript that may be needed for its initial display, so that the user gets the quickest visible response while the page loads.

This content is typically text, and can therefore benefit from text compression in transit, providing an even quicker response to the user.

 

Any dynamic features that require the page to finish loading before being used should be initially disabled, and only enabled after the page has loaded.

 

This allows the JavaScript to be loaded after the page contents, which improves the perceived speed of the page load.

 

  Chunk your content

Tables for layout should not be used anymore: they are a legacy method.

 

 It is a better solution to use positioning, floats, flexbox, or grids instead.

 

Tables are still considered valid markup if they are used for showing tabular data.

 

To help the browser render your page faster, avoid nesting your tables.

 

Instead of deeply nesting tables as in:

<table>
  <table>
    <table>
    </table>
  </table>
</table>

use non-nested tables or divs, such as:

<table>…</table>
<table>…</table>
<table>…</table>

 

See also: CSS Flexible Box Layout and CSS Grid Layout specifications.
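As an illustration of the flexbox alternative, a layout that once required a nested table can be expressed in a few lines of CSS (class names are made up):

```css
/* A two-column layout without tables; class names are illustrative */
.layout  { display: flex; }
.sidebar { flex: 0 0 200px; }  /* fixed-width column */
.content { flex: 1; }          /* takes the remaining space */
```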

  Choose your user-agent requirements wisely

To achieve the greatest improvements in site design, make sure that reasonable user-agent requirements are specified for projects.

 

Do not require your content to appear pixel-perfect in all browsers, especially not in down-version browsers.

 

Ideally, your basic minimum requirements should be based on modern browsers that support the relevant standards.

 

This may include recent versions of Firefox, Internet Explorer, Chrome, Opera, and Safari.

 

Example web page structure:

HTML

 HEAD

 LINK …

CSS files required for page appearance. Minimize the number of files for performance while keeping unrelated CSS in separate files for maintenance.

 

 SCRIPT …

JavaScript files for functions required during the loading of the page, but not any DHTML that can only run after the page loads.

Minimize the number of files for performance while keeping unrelated JavaScript in separate files for maintenance.

 

BODY

User-visible page content in small chunks (tables/divs) that can be displayed without waiting for the full page to download.

 

SCRIPT …

Any scripts used to perform DHTML. DHTML scripts typically can only run after the page has completely loaded and all necessary objects have been initialized.

 

There is no need to download these scripts before the page content; that only slows the initial appearance of the page load.

 

Minimize the number of files for performance while keeping unrelated JavaScript in separate files for maintenance.

 

If any images are used for rollover effects, you should preload them here, after the page content has downloaded.
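The outline above can be sketched as a concrete skeleton (file names are placeholders):

```html
<html>
<head>
  <link rel="stylesheet" href="page.css"> <!-- CSS needed for first display -->
  <script src="init.js"></script>         <!-- JS needed during load only -->
</head>
<body>
  <!-- User-visible content in small chunks, displayable before full download -->
  <div>…</div>
  <div>…</div>

  <!-- DHTML scripts last: they only run once the page has loaded -->
  <script src="dhtml.js"></script>
</body>
</html>
```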

Use async and defer when possible

 

Write your JavaScript so that it is compatible with both async and defer, and use async whenever possible, especially if you have multiple script tags.

 

The page can stop rendering while JavaScript loads: by default, the browser will not render anything that comes after script tags lacking these attributes.

 

Note: even though these attributes help a great deal the first time a page is loaded, you should use them but not rely on them working in every browser. If you follow all the rules for writing good JavaScript code, there is no need to change your code.
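For example (file names are placeholders):

```html
<!-- async: download in parallel, run as soon as ready (order not guaranteed) -->
<script async src="analytics.js"></script>

<!-- defer: download in parallel, run in order after the document is parsed -->
<script defer src="ui.js"></script>
```

async suits independent scripts such as analytics; defer suits scripts that depend on the DOM or on one another.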

5) Optimize Analytics

 

optimize analytics to improve seo rankings and speed loading page

 

 

Here you can find Google’s best practices for implementing analytics on your website.

 

 

 

Conclusions

 

 

Load time matters because users, like you, hate to wait.

After clicking a link to a site, you expect almost instant results.

 

Every extra second a website takes to load increases the chance that you’ll click away and search elsewhere.

 

Everyone expects a high-performance site

According to Kissmetrics, 47% of people expect a loading time of less than 2 seconds, and 40% will leave the website completely if it takes more than 3.

 

In particular, up to 85% expect a mobile site to load as fast as or faster than on desktop.

 

While you may be used to waiting a couple of seconds for websites to load, anything beyond a few seconds greatly reduces your attention span with each passing second.

The outcome of poor loading speed is a higher bounce rate, with users never making it to your website content.

 

Load Speed Impacts SEO too

Slow load speeds also hinder website success in another, more significant way.

 

Site loading speed is one of the many ranking factors that Google’s search algorithms use to position a page.

 

So, what every webmaster must understand is that ranking is awarded based on website value, among other factors.

 

A site that does not load fast enough creates bounces and reduced traffic, and is not valuable to visitors, who will not stay on the page.

 

In turn, the design problems that cause slow loading drive audiences away and, because of the impact on SEO, prevent more users from finding the site.

 

So, unless your site belongs to a big brand, it should be as fast as possible; otherwise, people will soon abandon it.

 

These days, users want information as quickly as possible.

 

Now It’s Up To You! What Do You Think?

 

What do you think of this article?

Did I miss something you know?

Either way, leave a comment below right now to let me know!

 
