

  • August 16, 2019


4.0 Configure your robots.txt file

A robots.txt file is critical in helping the search engines identify what they should and should not look at on your website, and in telling them where to find your sitemaps.

4.0.1 - Check your robots.txt file is present


Joomla - Use the JSitemap Pro extension to check whether the robots.txt file is present:

Components/JSitemap Pro/Robots.txt Editor Icon

If you get a yellow warning reading "Inactive robots.txt detected", contact Wrap up Web, who will activate the robots.txt file for you.

Once that is done, recheck; your robots.txt file should now be present.


WordPress - Use the Yoast SEO file editor to check whether the robots.txt file is present:

Yoast SEO/Tools/File Editor

4.0.2 - Apply the default robots.txt text

The robots.txt file can be configured in many ways, but here at Wrap up Web we have a standard configuration that we apply to robots.txt on all our websites.

Simply copy the code below and paste it into robots.txt.

Wrap up Web does not block anything in the robots.txt file. This is our code:

# This space intentionally left blank
# If you want to learn about why our robots.txt looks like this, read this post: https://yoa.st/robots-txt
User-Agent: *
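As a quick local sanity check, Python's built-in `urllib.robotparser` shows how a crawler interprets this default file (a sketch only; the example.com URL is a placeholder and nothing is fetched from your live site):

```python
from urllib.robotparser import RobotFileParser

# The default Wrap up Web robots.txt: a comment plus a wildcard
# User-Agent group with no Disallow rules, so nothing is blocked.
default_robots = [
    "# This space intentionally left blank",
    "User-Agent: *",
]

parser = RobotFileParser()
parser.parse(default_robots)

# With no Disallow directives, any crawler may fetch any path.
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))
```

Because the file blocks nothing, `can_fetch` reports `True` for every path.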

4.0.3 - Insert your sitemap links into your robots.txt

The search engines can use the robots.txt file to find your sitemaps, so it is important that links to your sitemaps are included in the file for the search engines to see. Use the relevant method above to check whether your sitemaps are listed in your robots.txt file. For example, you should see something similar to the following at the very bottom of the file:


# Sitemap entries

Joomla (JSitemap Pro):

Sitemap: https://liverpoolbuildingservices.co.uk/index.php?option=com_jmap&view=sitemap&format=xml
Sitemap: https://liverpoolbuildingservices.co.uk/index.php?option=com_jmap&view=sitemap&format=images
Sitemap: https://liverpoolbuildingservices.co.uk/index.php?option=com_jmap&view=sitemap&format=mobile

WordPress (Yoast SEO):

Sitemap: https://liverpoolbuildingservices.co.uk/sitemap.xml

If they are not present or are incorrect, delete them manually and re-add them as follows.

Joomla - Use JSitemap Pro to insert them automatically:

  • Components/JSitemap Pro/
  • Scroll down to sitemap links
  • Click the Pencil Icon of XML sitemap  
  • Click the Pencil Icon of XML Images sitemap  
  • Click the Pencil Icon of XML Mobile sitemap
Click the Robots.txt Editor Icon in JSitemap Pro again to check that your sitemap link entries have now been added to the bottom of the robots.txt file.


WordPress - Use the Yoast SEO file editor to insert them manually:

Yoast SEO/Tools/File Editor

4.1 SEO tasks Wrap up Web will complete to further improve ranking

The guidelines so far will greatly improve your website's chances of ranking higher when a user searches for one of your keywords or phrases in the search engines.

So far you have been able to complete the tasks yourself. In addition, Wrap up Web will perform the following tasks as part of our Search Engine Optimisation service; they are summarised below so you understand what this entails. Please note that these are server-side SEO configuration tasks that can only be completed by Wrap up Web, but they are just as critical in helping your website rank in the search listings.


There is a file called .htaccess in the root of your website that is critical in making sure your website displays correctly.

4.1.1 - Canonicalization to your preferred domain in .htaccess

Using this file we can make sure that if your preferred display domain is liverpoolbuildingservices.co.uk and someone types in www.liverpoolbuildingservices.co.uk then www.liverpoolbuildingservices.co.uk redirects to liverpoolbuildingservices.co.uk.

This also works the other way round so if your preferred display domain is www.liverpoolbuildingservices.co.uk and someone types in liverpoolbuildingservices.co.uk then liverpoolbuildingservices.co.uk redirects to www.liverpoolbuildingservices.co.uk.

We add the following code to your .htaccess file by default so you don't have to.

To Redirect to Secure https non-www

# Redirect http non-www to https non-www - Begin
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://liverpoolbuildingservices.co.uk/$1 [R=301,L]
# Redirect http non-www to https non-www - End

# Redirect http www and https www to https non-www - Begin
RewriteCond %{HTTP_HOST} ^www\.liverpoolbuildingservices\.co\.uk [NC]
RewriteRule (.*) https://liverpoolbuildingservices.co.uk/$1 [R=301,L]
# Redirect http www and https www to https non-www - End

To Redirect to Secure https www

# Redirect http non-www to https www - Begin
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.liverpoolbuildingservices.co.uk/$1 [R=301,L]
# Redirect http non-www to https www - End

# Redirect http non-www and https non-www to https www - Begin
RewriteCond %{HTTP_HOST} ^liverpoolbuildingservices\.co\.uk [NC]
RewriteRule (.*) https://www.liverpoolbuildingservices.co.uk/$1 [R=301,L]
# Redirect http non-www and https non-www to https www - End
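The effect of the non-www rule set above can be sketched as a small Python function (an illustration of the redirect logic only, not of Apache itself; `canonical_url` is a hypothetical helper name):

```python
def canonical_url(scheme: str, host: str, path: str) -> str:
    """Mimic the non-www .htaccess rules: drop a leading www. and
    always serve over https, whatever scheme the visitor typed."""
    if host.startswith("www."):
        host = host[len("www."):]
    return f"https://{host}{path}"

# All four ways of typing the address collapse to one canonical URL.
for scheme, host in [
    ("http", "liverpoolbuildingservices.co.uk"),
    ("http", "www.liverpoolbuildingservices.co.uk"),
    ("https", "www.liverpoolbuildingservices.co.uk"),
    ("https", "liverpoolbuildingservices.co.uk"),
]:
    print(canonical_url(scheme, host, "/contact"))
```

Every variant prints `https://liverpoolbuildingservices.co.uk/contact`, which is exactly the behaviour the 301 redirects enforce.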

4.1.2 - Enable Image Expiry Tag (Leverage Browser Caching) in .htaccess

An image expires tag specifies a future expiration date for your images. Users' browsers see this tag and cache the image until the specified date (so they do not keep re-fetching the unchanged image from your server). This speeds up your site when returning visitors arrive and require the same image.

Wrap up Web will add the following code to your .htaccess file.

## Begin Image Expires Tag (LEVERAGE BROWSER CACHING) ##

<IfModule mod_expires.c>

ExpiresActive On
ExpiresByType image/jpg "access 1 year"
ExpiresByType image/jpeg "access 1 year"
ExpiresByType image/gif "access 1 year"
ExpiresByType image/png "access 1 year"
ExpiresByType text/css "access 1 month"
ExpiresByType application/pdf "access 1 month"
ExpiresByType application/x-javascript "access 1 month"
ExpiresByType application/javascript "access 1 month"
ExpiresByType application/x-shockwave-flash "access 1 month"
ExpiresByType image/x-icon "access 1 year"
ExpiresDefault "access 2 days"

</IfModule>

## End Image Expires Tag (LEVERAGE BROWSER CACHING) ##
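The "access 1 year" syntax starts the expiry clock at the moment the browser fetches the file. As a rough illustration of the Expires header this produces (the fetch time below is just an example):

```python
from datetime import datetime, timedelta, timezone

access_time = datetime(2019, 8, 16, tzinfo=timezone.utc)  # example fetch time
expires = access_time + timedelta(days=365)               # "access 1 year"

# The browser caches the image until this date instead of re-fetching it.
print(expires.strftime("Expires: %a, %d %b %Y %H:%M:%S GMT"))
```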

4.1.3 - Block any Autobots who are ignoring the robots.txt file in .htaccess (Optional)

Autobots roam the internet looking for websites to scan and read. Some, like Google, are welcome, but others are out to take your information and use it against you, or to target or defraud you.

The robots.txt file discussed above is used to block these unwanted Autobots. However, certain Autobots sometimes ignore the robots.txt file. An extra level of prevention can be put in place by adding a piece of code to your .htaccess file that stops these Autobots from continuing to crawl your site.

We add this code to your .htaccess file by default so you don't have to.

###### Begin Block user agents not reading the robots.txt file

RewriteCond %{HTTP_USER_AGENT} ^bot.*$
RewriteRule .* - [F,L]

RewriteCond %{HTTP_USER_AGENT} ^BOT.*$
RewriteRule .* - [F,L]

RewriteCond %{HTTP_USER_AGENT} ^Bot.*$
RewriteRule .* - [F,L]

RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule ^(.*)$ - [F,L]

######  End Block user agents not reading the robots.txt file
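The intent of those rules, refusing any request whose User-Agent starts with "bot" in any capitalisation or whose User-Agent is empty, can be sketched in Python (an illustration of the matching logic, not the Apache engine; `allow_request` is a hypothetical helper):

```python
import re

# User agents starting with "bot" (any capitalisation) are blocked,
# as are requests that send no User-Agent header at all.
BLOCKED = re.compile(r"^bot", re.IGNORECASE)

def allow_request(user_agent: str) -> bool:
    if user_agent == "":          # empty User-Agent header
        return False
    return not BLOCKED.match(user_agent)

print(allow_request("Botnet-scanner"))  # False - blocked
print(allow_request(""))                # False - blocked
print(allow_request("Googlebot"))       # True  - allowed, "bot" is not at the start
```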

4.1.4 - Enable Gzip Compression

Website gzip compression reduces the file size of a web file (such as an HTML, PHP, CSS or JavaScript file) to about 30% or less of its original size before the file is sent to the user's browser. The browser then decompresses the file automatically to load the full original file again.

Enabling gzip compression is great for improving page speed because your visitors download much smaller web files than the original ones when browsing your web pages, which speeds up the download process. To turn it on...


Joomla - Set System/Global Configuration/Server/Gzip Page Compression to ‘Yes’


WordPress - Ensure Settings/WP Super Cache/Advanced/Miscellaneous/'Compress pages so they’re served more quickly to visitors' is NOT ticked.

The following is then added to the .htaccess file


<IfModule mod_deflate.c>

AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

</IfModule>
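For a quick feel of why this matters, Python's standard `gzip` module shows how much a typical repetitive HTML page shrinks (a local illustration; real savings depend on the file being compressed):

```python
import gzip

# Repetitive markup, as real pages tend to be, compresses extremely well.
html = ("<html><body>"
        + "<p>Wrap up Web sample paragraph.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)

ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} of original)")
```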


