37. Before publishing websites

by Double Bastion - Updated May 18, 2022

When your websites are ready for publishing, don’t forget to remove the header

add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";

from the server block of every website that you want to be indexed by search engines.
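
For reference, this is roughly where that header sits in a website's configuration; a minimal sketch in which the listen, server_name and root values are only placeholders, since your actual server blocks will differ:

server {
    listen 80;
    server_name example.com www.example.com;
    root /var/www/example.com;

    # Comment out or delete this line before publishing,
    # otherwise search engines will never index the site:
    # add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
}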

Also change the content of the robots.txt file for every website that you want to allow public access to. First open the file:

nano /var/www/example.com/robots.txt

Change its content to make it look like this:

User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml

The empty Disallow directive allows crawlers to access the entire site. The Sitemap directive is optional, but it's recommended to include it for SEO purposes.
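
Conversely, if a public website has sections that you want to keep out of search results, you can list them explicitly. A short sketch, assuming a hypothetical /admin/ directory that shouldn't be crawled:

User-agent: *
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml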

If you configured Nginx to serve an ‘under construction’ page to outside visitors, you will have to reverse those changes to make the website accessible to everyone. Comment out the lines that restricted access, like this:

#    location = /underconstruction.html {}
#    error_page 403 =200 /underconstruction.html;

    location / {
#          allow 123.123.123.123;
#          allow 124.124.124.124;
#          allow 2a05:6ef0:407::c3b4:d1e5;
#          deny all;
          try_files $uri $uri/ /index.php?$args;
    } 
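
After editing the configuration of each website, test it and reload Nginx so the changes take effect. Assuming Nginx runs under systemd:

nginx -t
systemctl reload nginx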